From haberland at ucla.edu Sun Feb 5 00:13:21 2017 From: haberland at ucla.edu (Matt Haberland) Date: Sat, 4 Feb 2017 21:13:21 -0800 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? Message-ID: Hi all, I'm new to the list. 1. I am interested in contributing a binary integer linear programming solver now, and possibly a more general MILP solver in the future. Existing solutions (lp_solve, cvxopt, PuLp w/ GLPK) are not as easy to use as they should be, at least for Windows users. As I was writing the BILP code, I found a bug in linprog in which it would report success, yet the bounds were not at all respected by the returned solution. I will report that bug, but in the meantime I started writing an interior-point linear programming solver, which I'd be interested in contributing as well. Please let me know if you'd be interested in me contributing: * A binary integer linear programming solver similar to Matlab's old bintprog, but with options for both Balas' method and bintprog's branch and bound algorithm * A primal-dual interior-point method for linprog 2. I was trying to search the archives to see if there is already discussion about integer programming or additional methods for linprog, but it seems that the server: http://dir.gmane.org/ isn't working? I checked: http://www.isitdownrightnow.com/dir.gmane.org.html and it's down for everyone. I just thought I should report that. Thanks! Matt -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... URL: From josh.craig.wilson at gmail.com Sun Feb 5 12:56:32 2017 From: josh.craig.wilson at gmail.com (Joshua Wilson) Date: Sun, 5 Feb 2017 11:56:32 -0600 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? 
In-Reply-To: References: Message-ID: > As I was writing the BILP code, I found a bug in linprog in which it would > report success, yet the bounds were not at all respected by the returned > solution. I wonder if it's the same as either of these? https://github.com/scipy/scipy/issues/5400 https://github.com/scipy/scipy/issues/6690 From haberland at ucla.edu Sun Feb 5 14:11:15 2017 From: haberland at ucla.edu (Matt Haberland) Date: Sun, 5 Feb 2017 11:11:15 -0800 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: References: Message-ID: Thanks. I thought it might be related, too, so I just posted my example in issue 6690. https://github.com/scipy/scipy/issues/6690 In any case, please let me know what you think about the addition of a bintprog-like function and interior-point method for linprog. Matt On Sun, Feb 5, 2017 at 9:56 AM, Joshua Wilson wrote: > > As I was writing the BILP code, I found a bug in linprog in which it > would > > report success, yet the bounds were not at all respected by the returned > > solution. > > I wonder if it's the same as either of these? > > https://github.com/scipy/scipy/issues/5400 > https://github.com/scipy/scipy/issues/6690 > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre.haessig at crans.org Tue Feb 7 03:37:30 2017 From: pierre.haessig at crans.org (Pierre Haessig) Date: Tue, 7 Feb 2017 09:37:30 +0100 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? 
In-Reply-To: References: Message-ID: <08d5e58e-d110-ffa3-cb49-86b0bfc12e33@crans.org> Hi, On 05/02/2017 at 06:13, Matt Haberland wrote: > 1. I am interested in contributing a binary integer linear programming > solver now, and possibly a more general MILP solver in the future. > Existing solutions (lp_solve, cvxopt, PuLp w/ GLPK) are not as easy to > use as they should be, at least for Windows users. A MILP routine would be a great companion to the existing LP. As you said, there are many existing solvers but the installation is not always easy. For cvxopt on Windows I found that the binary wheel from Christoph Gohlke http://www.lfd.uci.edu/~gohlke/pythonlibs/#cvxopt did work with Anaconda (last time I checked, a year ago or so), but it's quite a miracle to me. best, Pierre From ralf.gommers at gmail.com Wed Feb 8 03:53:16 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 8 Feb 2017 21:53:16 +1300 Subject: [SciPy-Dev] GSoC'17 participation? In-Reply-To: References: Message-ID: On Fri, Jan 20, 2017 at 12:47 PM, Joshua Wilson wrote: > P.S. If anyone wants to co-mentor that would be welcome. > > On Thu, Jan 19, 2017 at 5:29 PM, Joshua Wilson < > josh.craig.wilson at gmail.com> wrote: > >> > Would you be able to add this to the wiki? >> >> Here it is: >> >> https://github.com/scipy/scipy/wiki/GSoC-2016-project-ideas# >> improve-the-parabolic-cylinder-functions >> > I have created a new page for 2017: https://github.com/scipy/scipy/wiki/GSoC-2017-project-ideas More ideas/mentors welcome, please edit! A link from http://python-gsoc.org/#ideas to our ideas page will be available within 24 hours I expect. Cheers, Ralf >> >> On Thu, Jan 19, 2017 at 10:18 AM, Ted Pudlik wrote: >> >>> I agree that it would be very challenging. It would also be more of a >>> research project in applied maths than a straightforward implementation >>> exercise.
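To give a flavor of what verifying a hypergeometric implementation involves, here is a minimal, hedged sketch: it spot-checks SciPy's existing hyp1f1 against the closed-form identity 1F1(1; 2; x) = (exp(x) - 1)/x. This identity covers only one easy slice of parameter space; it is illustrative, not a substitute for the systematic testing the thread is about.

```python
# Minimal sketch: spot-check SciPy's confluent hypergeometric function
# against a closed-form special case. Assumes the identity
# 1F1(1; 2; x) = (exp(x) - 1) / x, valid for x != 0.
import numpy as np
from scipy.special import hyp1f1

x = np.array([0.1, 1.0, 5.0, 10.0])
computed = hyp1f1(1.0, 2.0, x)
reference = np.expm1(x) / x  # expm1 avoids cancellation for small x
relative_error = np.abs(computed - reference) / np.abs(reference)

# This easy region agrees to near machine precision; the reliability
# problems discussed in the thread appear for extreme (a, b, x).
assert np.all(relative_error < 1e-10)
```

Passing a check like this says nothing about large or awkward (a, b, x) combinations, which is exactly where the known algorithms break down.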
For the confluent hypergeometric function specifically (on which >>> I've worked unsuccessfully), the main difficulty is that no known algorithm >>> (https://arxiv.org/abs/1407.7786) is reliable throughout parameter >>> space, and in fact finding any combination of algorithms that works well is >>> rather hard. >>> >>> I have more specific results tucked away somewhere, including >>> interesting plots (one attached as an example---number of correct digits >>> computed by the optimally-truncated asymptotic series as a function of the >>> parameters). I'll try to clean them up and post somewhere for posterity's >>> sake, assuming I won't be able to finish this work myself.[image: 3.png] >>> >>> On Wed, Jan 18, 2017 at 11:32 AM Evgeni Burovski < >>> evgeny.burovskiy at gmail.com> wrote: >>> >>>> 3. hypergeometric functions would be great, but this might be too >>>> difficult. Josh, Nikolay, Ted --- you guys looked at this at some >>>> point; any comments? >>>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 3.png Type: image/png Size: 82262 bytes Desc: not available URL: From sylvain.corlay at gmail.com Wed Feb 8 11:59:16 2017 From: sylvain.corlay at gmail.com (Sylvain Corlay) Date: Wed, 8 Feb 2017 17:59:16 +0100 Subject: [SciPy-Dev] ANN: xtensor 0.4.1 Message-ID: Hi All, On behalf of the xtensor development team, I am pleased to announce the release of xtensor 0.4.1. xtensor is a C++ tensor algebra library supporting numpy-style broadcasting and universal functions in C++.
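For readers who want a concrete picture of the numpy-style semantics this announcement refers to, here is a short sketch in Python. It illustrates the numpy broadcasting and universal-function behavior that xtensor mirrors, not xtensor's own C++ API.

```python
# Illustrative only: the numpy broadcasting/ufunc semantics that xtensor
# reproduces in C++ (this is numpy code, not the xtensor API).
import numpy as np

a = np.arange(3).reshape(3, 1)   # shape (3, 1)
b = np.arange(4).reshape(1, 4)   # shape (1, 4)
s = a + b                        # broadcasts to shape (3, 4); s[i, j] == i + j
assert s.shape == (3, 4)

# numpy's where() is eager: the result array is materialized immediately.
# xtensor's xt::where() instead builds a lazy expression, evaluated only
# on access or on assignment to a container.
w = np.where(s % 2 == 0, s, -s)
assert w.shape == (3, 4)
```

The key design difference is that in numpy each intermediate above is a concrete array, while the equivalent xtensor expression would stay unevaluated until assigned.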
The xtensor-python project enables viewing numpy arrays as xtensor expressions, exposing an STL-compliant API and making xtensor a very simple tool for C++ Python extension authors to operate on numpy data structures. More than an array manipulation library, xtensor is a lazy expression system. Results of universal functions such as xt::where() don't hold any value and are only evaluated upon access or when assigned to a container. This also enables the use of other data structures as xtensor expressions through adaptors. xtensor can easily be installed with conda: conda install xtensor -c conda-forge Highlights of the 0.4.1 release: - support for newaxis in xtensor views. - new xtensor::random module similar to that of numpy. - support for numpy-style pretty-print of xtensor expressions. New contributors are welcome to join us at https://github.com/QuantStack/xtensor Comprehensive documentation is available at http://xtensor.readthedocs.org Thanks, Sylvain -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Wed Feb 8 12:42:02 2017 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 8 Feb 2017 10:42:02 -0700 Subject: [SciPy-Dev] GSoC'17 participation? In-Reply-To: References: Message-ID: Nice selection. The move to PyTest would also be relevant for NumPy. On Wed, Feb 8, 2017 at 1:53 AM, Ralf Gommers wrote: > > > On Fri, Jan 20, 2017 at 12:47 PM, Joshua Wilson < > josh.craig.wilson at gmail.com> wrote: > >> P.S. If anyone wants to co-mentor that would be welcome. >> >> On Thu, Jan 19, 2017 at 5:29 PM, Joshua Wilson < >> josh.craig.wilson at gmail.com> wrote: >> >>> > Would you be able to add this to the wiki? >>> >>> Here it is: >>> >>> https://github.com/scipy/scipy/wiki/GSoC-2016-project-ideas# >>> improve-the-parabolic-cylinder-functions >>> >> > I have created a new page for 2017: https://github.com/scipy/scipy/wiki/GSoC-2017-project-ideas > More ideas/mentors welcome, please edit!
> > A link from http://python-gsoc.org/#ideas to our ideas page will be > available within 24 hours I expect. > > Cheers, > Ralf > > Nice selection and write up. The move to PyTest would also be relevant for NumPy. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Wed Feb 8 12:57:58 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 8 Feb 2017 20:57:58 +0300 Subject: [SciPy-Dev] branching 0.19.x soon? In-Reply-To: References: Message-ID: On Tue, Jan 17, 2017 at 11:54 AM, Ralf Gommers wrote: > > > On Tue, Jan 17, 2017 at 8:56 AM, Evgeni Burovski > wrote: >> >> > We had planned the 0.19.0 release for the end of January. We won't make >> > that >> > anymore, but branching this month seems doable (nothing particularly >> > large >> > or blocking in https://github.com/scipy/scipy/milestone/30). >> >> Thanks for bringing this up Ralf! >> >> The dblquad regression would be good to fix. >> (Ah, as noted on GH already) >> Can't think of other blockers off the top of my head. Some more time >> for merging open PRs would be nice to have, but optional. > > > It seems like we're always hovering around 130-140 PRs, no matter how much > time we take. But probably good to give it say two weeks, so everyone has a > chance to go and review/merge the PRs they would really like to get in? > >> >> >> > @Evgeni, did you plan / want to be the release manager for this release >> > again, or do you want someone else to take that role? >> >> If someone wants it, great. Volunteers? >> Otherwise I'll happily do it. > > > "want" is a big word:) I'm also happy to take a turn again, either for this > release or a following one. But if someone new to the role volunteers, > that'd be quite nice. Having discussed this with Ralf off-list, I am volunteering to do the release legwork. The contents of the milestone: https://github.com/scipy/scipy/milestones/0.19.0. 
Several of these are rolled over from a previous release or even earlier ones, where an issue or a PR does not make the release X, and gets re-milestoned to X+1 so that we don't forget (tm). These look like blockers: A flaky test in linalg: https://github.com/scipy/scipy/issues/6984 A regression in integrate: https://github.com/scipy/scipy/issues/6898 A failure in stats with numpy-dev: https://github.com/scipy/scipy/pull/7012#issuecomment-278368718 The rest seem to be possible to roll over again unless fixed. Of course, any help with troubleshooting, reviewing and merging is most appreciated. If you think something is missing and should go into 0.19.0, please add it to the milestone, or comment on github or in this email thread. Here's an optimistic schedule: 15 Feb : branch 0.19.x, cut 0.19.0rc1 22 Feb : 0.19.0rc2, if needed 09 Mar : 0.19.0 final Please speak up if the timing does not work for you. Cheers, Evgeni From ashwin.pathak at students.iiit.ac.in Wed Feb 8 14:47:58 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Thu, 09 Feb 2017 01:17:58 +0530 Subject: [SciPy-Dev] Scipy Development queries Message-ID: I am new to scipy and am trying to contribute to the organization, can someone tell me how to go about reading the source code so as to get an abstract idea of the functionality? Also, how can I claim for the issues that I want to solve? 
Thanks in advance :) From shoyer at gmail.com Wed Feb 8 23:49:30 2017 From: shoyer at gmail.com (Stephan Hoyer) Date: Wed, 8 Feb 2017 20:49:30 -0800 Subject: [SciPy-Dev] Scipy Development queries In-Reply-To: References: Message-ID: Hi Ashwin, SciPy uses GitHub for managing issues and source code: http://github.com/scipy/scipy Cheers, Stephan On Wed, Feb 8, 2017 at 11:47 AM, ashwin.pathak < ashwin.pathak at students.iiit.ac.in> wrote: > I am new to scipy and am trying to contribute to the organization, can > someone tell me how to go about reading the source code so as to get an > abstract idea of the functionality? Also, how can I claim for the issues > that I want to solve? > Thanks in advance :) > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Thu Feb 9 04:25:34 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 9 Feb 2017 22:25:34 +1300 Subject: [SciPy-Dev] GSoC'17 participation? In-Reply-To: References: Message-ID: On Thu, Feb 9, 2017 at 6:42 AM, Charles R Harris wrote: > Nice selection. The move to PyTest would also be relevant for NumPy. > > > > On Wed, Feb 8, 2017 at 1:53 AM, Ralf Gommers > wrote: > >> >> >> On Fri, Jan 20, 2017 at 12:47 PM, Joshua Wilson < >> josh.craig.wilson at gmail.com> wrote: >> >>> P.S. If anyone wants to co-mentor that would be welcome. >>> >>> On Thu, Jan 19, 2017 at 5:29 PM, Joshua Wilson < >>> josh.craig.wilson at gmail.com> wrote: >>> >>>> > Would you be able to add this to the wiki? >>>> >>>> Here it is: >>>> >>>> https://github.com/scipy/scipy/wiki/GSoC-2016-project-ideas# >>>> improve-the-parabolic-cylinder-functions >>>> >>> >> I have created a new page for 2017: https://github.com/scipy/scipy >> /wiki/GSoC-2017-project-ideas >> More ideas/mentors welcome, please edit! 
>> >> A link from http://python-gsoc.org/#ideas to our ideas page will be >> available within 24 hours I expect. >> >> Cheers, >> Ralf >> >> > Nice selection and write up. The move to PyTest would also be relevant for > NumPy. > Indeed. I did mention changes to numpy.testing, but changing the Numpy test suite could be added if there's time left. I'd expect that a good student would be able to do both Scipy and Numpy in a single GSoC. The PSF admins asked us to list two mentors per project. I've added myself to 2 ideas, now still need 1 extra name on the parabolic cylinder functions idea and 2 names on the scipy.diff idea. Any takers? This is not a hard commitment; at this point it would be helpful to list yourself if you would feel comfortable with mentoring on that topic and may possibly want to co-mentor. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From haberland at ucla.edu Thu Feb 9 14:13:18 2017 From: haberland at ucla.edu (Matt Haberland) Date: Thu, 9 Feb 2017 11:13:18 -0800 Subject: [SciPy-Dev] GSoC'17 participation? In-Reply-To: References: Message-ID: I am interested in helping with scipy.diff. I'm in the process of contributing to SciPy for the first time myself, so I don't have a ton of experience there. But I have experience mentoring code-heavy undergraduate applied math research. On Feb 9, 2017 1:43 AM, "Ralf Gommers" wrote: > I am interested in scipy.diff. Numerical differentiation is a great topic. > I'm in the process of contributing to SciPy for the first time myself, so > I don't have a ton of experience there. > But I have experience mentoring summer undergraduate code-heavy applied > math research. > > On Feb 9, 2017 1:43 AM, "Ralf Gommers" wrote:
>>> >>> >>> >>> On Wed, Feb 8, 2017 at 1:53 AM, Ralf Gommers >>> wrote: >>> >>>> >>>> >>>> On Fri, Jan 20, 2017 at 12:47 PM, Joshua Wilson < >>>> josh.craig.wilson at gmail.com> wrote: >>>> >>>>> P.S. If anyone wants to co-mentor that would be welcome. >>>>> >>>>> On Thu, Jan 19, 2017 at 5:29 PM, Joshua Wilson < >>>>> josh.craig.wilson at gmail.com> wrote: >>>>> >>>>>> > Would you be able to add this to the wiki? >>>>>> >>>>>> Here it is: >>>>>> >>>>>> https://github.com/scipy/scipy/wiki/GSoC-2016-project-ideas# >>>>>> improve-the-parabolic-cylinder-functions >>>>>> >>>>> >>>> I have created a new page for 2017: https://github.com/scipy/scipy >>>> /wiki/GSoC-2017-project-ideas >>>> More ideas/mentors welcome, please edit! >>>> >>>> A link from http://python-gsoc.org/#ideas to our ideas page will be >>>> available within 24 hours I expect. >>>> >>>> Cheers, >>>> Ralf >>>> >>>> >>> Nice selection and write up. The move to PyTest would also be relevant >>> for NumPy. >>> >> >> Indeed. I did mention changes to numpy.testing, but changing the Numpy >> test suite could be added if there's time left. I'd expect that a good >> student would be able to do both Scipy and Numpy in a single GSoC. >> >> The PSF admins asked us to list two mentors per project. I've added >> myself to 2 ideas, now still need 1 extra name on the parabolic cylinder >> functions idea and 2 names on the scipy.diff idea. >> >> Any takers? This is not a hard commitment; at this point it would be >> helpful to list yourself if you would feel comfortable with mentoring on >> that topic and may possible want to co-mentor. >> >> Ralf >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From josh.craig.wilson at gmail.com Thu Feb 9 14:21:29 2017 From: josh.craig.wilson at gmail.com (Joshua Wilson) Date: Thu, 9 Feb 2017 13:21:29 -0600 Subject: [SciPy-Dev] Scipy Development queries In-Reply-To: References: Message-ID: > can someone tell me how to go about reading the source code so as to get an abstract idea of the functionality? Just my opinion, but rather than trying to get an abstract idea I would propose starting very small and concrete. Find a very simple issue (e.g. documentation) and try to fix that. Depending on your experience, just the process of getting familiar with git and how to submit pull requests can be a lot to take in. Then try to find a simple bug to fix, and keep moving up from there. > Also, how can I claim for the issues that I want to solve? We don't really claim issues, if you fix an issue send a PR and someone will review it. - Josh On Wed, Feb 8, 2017 at 10:49 PM, Stephan Hoyer wrote: > Hi Ashwin, > > SciPy uses GitHub for managing issues and source code: > http://github.com/scipy/scipy > > Cheers, > Stephan > > On Wed, Feb 8, 2017 at 11:47 AM, ashwin.pathak > wrote: >> >> I am new to scipy and am trying to contribute to the organization, can >> someone tell me how to go about reading the source code so as to get an >> abstract idea of the functionality? Also, how can I claim for the issues >> that I want to solve? 
>> Thanks in advance :) >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > From ashwin.pathak at students.iiit.ac.in Sat Feb 11 13:28:58 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Sat, 11 Feb 2017 23:58:58 +0530 Subject: [SciPy-Dev] Doubt regarding Docs Message-ID: Dear developers, I was trying to solve an issue related to the documentation of hyper-geometric function, 2F1. But, I could not find the file in the source code where 2F1 is addressed. Can someone help me find the file. Issue #6966. Thanks :) From tpudlik at gmail.com Sat Feb 11 14:23:47 2017 From: tpudlik at gmail.com (Ted Pudlik) Date: Sat, 11 Feb 2017 19:23:47 +0000 Subject: [SciPy-Dev] Doubt regarding Docs In-Reply-To: References: Message-ID: The documentation of the functions in the `special` module is regrettably somewhat scattered. A small number are documented right by their definitions, in https://github.com/scipy/scipy/blob/master/scipy/special/basic.py. Most, however, have their docstrings defined in https://github.com/scipy/scipy/blob/master/scipy/special/add_newdocs.py. This includes the hypergeometric functions. After changing the docstrings in add_newdocs.py, you will have to run generate_ufuncs.py for the docstrings to actually become attached to the appropriate ufuncs (and render in the html documentation). It's customary to do this in a separate commit, because it produces quite a bit of generated code. If you are trying to find not the docstrings but the actual source code of the functions, take a look at `generate_ufuncs.py`. 
It contains lines such as `hyp1f1 -- hyp1f1_wrap: ddd->d, chyp1f1_wrap: ddD->D -- specfun_wrappers.h`, which indicate that the hyp1f1 SciPy function wraps the specfun implementation. A look at `specfun_wrappers.c` will show that the original specfun function is called `CHGM`; it is defined in https://github.com/scipy/scipy/blob/master/scipy/special/specfun/specfun.f. The documentation provided in the Fortran file is quite minimal (if still more than what you can currently find in the SciPy docstrings). However, as the header of the specfun.f file informs you, these implementations originally come from a book by Shanjie Zhang and Jianming Jin, "Computation of Special Functions" (1996). If you have access to a university library, you might be able to track down this book and find more information about the implementation. Hope this helps! Ted On Sat, Feb 11, 2017 at 10:30 AM ashwin.pathak < ashwin.pathak at students.iiit.ac.in> wrote: Dear developers, I was trying to solve an issue related to the documentation of hyper-geometric function, 2F1. But, I could not find the file in the source code where 2F1 is addressed. Can someone help me find the file. Issue #6966. Thanks :) _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Feb 13 03:41:11 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 13 Feb 2017 21:41:11 +1300 Subject: [SciPy-Dev] GSoC'17 participation? In-Reply-To: References: Message-ID: On Fri, Feb 10, 2017 at 8:13 AM, Matt Haberland wrote: > I am interested in helping with scipy.diff. > I'm in the process of contributing to SciPy for the first time myself, so > I don't have a ton of experience there. > But I have experience mentoring code-heavy undergraduate applied math > research. > Thanks Matt, that's awesome.
No worries about lack of SciPy experience - if you were to co-mentor with a SciPy core dev then your expertise on the math side can be very helpful. I've just added your name and mine to the scipy.diff idea. Our ideas page now looks pretty good; more ideas still welcome of course. Cheers, Ralf > > On Feb 9, 2017 1:43 AM, "Ralf Gommers" wrote: > >> I am interested in scipy.diff. Numerical differentiation is a great topic. >> I'm in the process of contributing to SciPy for the first time myself, so >> I don't have a ton of experience there. >> But I have experience mentoring summer undergraduate code-heavy applied >> math research. >> >> On Feb 9, 2017 1:43 AM, "Ralf Gommers" wrote: >> >>> >>> >>> On Thu, Feb 9, 2017 at 6:42 AM, Charles R Harris < >>> charlesr.harris at gmail.com> wrote: >>> >>>> Nice selection. The move to PyTest would also be relevant for NumPy. >>>> >>>> >>>> >>>> On Wed, Feb 8, 2017 at 1:53 AM, Ralf Gommers >>>> wrote: >>>> >>>>> >>>>> >>>>> On Fri, Jan 20, 2017 at 12:47 PM, Joshua Wilson < >>>>> josh.craig.wilson at gmail.com> wrote: >>>>> >>>>>> P.S. If anyone wants to co-mentor that would be welcome. >>>>>> >>>>>> On Thu, Jan 19, 2017 at 5:29 PM, Joshua Wilson < >>>>>> josh.craig.wilson at gmail.com> wrote: >>>>>> >>>>>>> > Would you be able to add this to the wiki? >>>>>>> >>>>>>> Here it is: >>>>>>> >>>>>>> https://github.com/scipy/scipy/wiki/GSoC-2016-project-ideas# >>>>>>> improve-the-parabolic-cylinder-functions >>>>>>> >>>>>> >>>>> I have created a new page for 2017: https://github.com/scipy/scipy >>>>> /wiki/GSoC-2017-project-ideas >>>>> More ideas/mentors welcome, please edit! >>>>> >>>>> A link from http://python-gsoc.org/#ideas to our ideas page will be >>>>> available within 24 hours I expect. >>>>> >>>>> Cheers, >>>>> Ralf >>>>> >>>>> >>>> Nice selection and write up. The move to PyTest would also be relevant >>>> for NumPy. >>>> >>> >>> Indeed. 
I did mention changes to numpy.testing, but changing the Numpy >>> test suite could be added if there's time left. I'd expect that a good >>> student would be able to do both Scipy and Numpy in a single GSoC. >>> >>> The PSF admins asked us to list two mentors per project. I've added >>> myself to 2 ideas, now still need 1 extra name on the parabolic cylinder >>> functions idea and 2 names on the scipy.diff idea. >>> >>> Any takers? This is not a hard commitment; at this point it would be >>> helpful to list yourself if you would feel comfortable with mentoring on >>> that topic and may possible want to co-mentor. >>> >>> Ralf >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Feb 13 04:27:44 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 13 Feb 2017 22:27:44 +1300 Subject: [SciPy-Dev] branching 0.19.x soon? In-Reply-To: References: Message-ID: On Thu, Feb 9, 2017 at 6:57 AM, Evgeni Burovski wrote: > On Tue, Jan 17, 2017 at 11:54 AM, Ralf Gommers > wrote: > > > > > > On Tue, Jan 17, 2017 at 8:56 AM, Evgeni Burovski > > wrote: > >> > >> > We had planned the 0.19.0 release for the end of January. We won't > make > >> > that > >> > anymore, but branching this month seems doable (nothing particularly > >> > large > >> > or blocking in https://github.com/scipy/scipy/milestone/30). > >> > >> Thanks for bringing this up Ralf! > >> > >> The dblquad regression would be good to fix. > >> (Ah, as noted on GH already) > >> Can't think of other blockers off the top of my head. Some more time > >> for merging open PRs would be nice to have, but optional. 
> > > > > > It seems like we're always hovering around 130-140 PRs, no matter how > much > > time we take. But probably good to give it say two weeks, so everyone > has a > > chance to go and review/merge the PRs they would really like to get in? > > > >> > >> > >> > @Evgeni, did you plan / want to be the release manager for this > release > >> > again, or do you want someone else to take that role? > >> > >> If someone wants it, great. Volunteers? > >> Otherwise I'll happily do it. > > > > > > "want" is a big word:) I'm also happy to take a turn again, either for > this > > release or a following one. But if someone new to the role volunteers, > > that'd be quite nice. > > > Having discussed this with Ralf off-list, I am volunteering to do the > release legwork. > Thanks again Evgeni! > > The contents of the milestone: https://github.com/scipy/ > scipy/milestones/0.19.0. > Several of these are rolled over from a previous release or even > earlier ones, where an issue or a PR does not make the release X, and > gets re-milestoned to X+1 so that we don't forget (tm). > > These look like blockers: > > A flaky test in linalg: https://github.com/scipy/scipy/issues/6984 > A regression in integrate: https://github.com/scipy/scipy/issues/6898 There's a PR which needs completing (tests to add): https://github.com/scipy/scipy/pull/6965. If someone would want to have a go at that, that'd be great. > > A failure in stats with numpy-dev: > https://github.com/scipy/scipy/pull/7012#issuecomment-278368718 > > The rest seem to be possible to roll over again unless fixed. I'll check the ``make dist`` issue mentioned in gh-3250; from the description it can't be rolled over but isn't a problem for RC1. > Of > course, any help with troubleshooting, reviewing and merging is most > appreciated. > > If you think something is missing and should go into 0.19.0, please > add it to the milestone, or comment on github or in this email thread. 
> > Here's an optimistic schedule: > > 15 Feb : branch 0.19.x, cut 0.19.0rc1 > 22 Feb : 0.19.0rc2, if needed > 09 Mar : 0.19.0 final > > Please speak up if the timing does not work for you. > LGTM Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Mon Feb 13 04:31:29 2017 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 13 Feb 2017 20:31:29 +1100 Subject: [SciPy-Dev] branching 0.19.x soon? In-Reply-To: References: Message-ID: I'm currently running all the module tests on OSX=10.12.3, Python=3.5.2, NumPy (nomkl) =1.12.0, with the master branch. scipy.integrate is reporting one test failure, #7038. scipy.sparse is reporting lots of errors, then segfaults. On 13 February 2017 at 20:27, Ralf Gommers wrote: > > > On Thu, Feb 9, 2017 at 6:57 AM, Evgeni Burovski < > evgeny.burovskiy at gmail.com> wrote: > >> On Tue, Jan 17, 2017 at 11:54 AM, Ralf Gommers >> wrote: >> > >> > >> > On Tue, Jan 17, 2017 at 8:56 AM, Evgeni Burovski >> > wrote: >> >> >> >> > We had planned the 0.19.0 release for the end of January. We won't >> make >> >> > that >> >> > anymore, but branching this month seems doable (nothing particularly >> >> > large >> >> > or blocking in https://github.com/scipy/scipy/milestone/30). >> >> >> >> Thanks for bringing this up Ralf! >> >> >> >> The dblquad regression would be good to fix. >> >> (Ah, as noted on GH already) >> >> Can't think of other blockers off the top of my head. Some more time >> >> for merging open PRs would be nice to have, but optional. >> > >> > >> > It seems like we're always hovering around 130-140 PRs, no matter how >> much >> > time we take. But probably good to give it say two weeks, so everyone >> has a >> > chance to go and review/merge the PRs they would really like to get in? >> > >> >> >> >> >> >> > @Evgeni, did you plan / want to be the release manager for this >> release >> >> > again, or do you want someone else to take that role? 
>> >> >> >> If someone wants it, great. Volunteers? >> >> Otherwise I'll happily do it. >> > >> > >> > "want" is a big word:) I'm also happy to take a turn again, either for >> this >> > release or a following one. But if someone new to the role volunteers, >> > that'd be quite nice. >> >> >> Having discussed this with Ralf off-list, I am volunteering to do the >> release legwork. >> > > Thanks again Evgeni! > > >> >> The contents of the milestone: https://github.com/scipy/scipy >> /milestones/0.19.0. >> Several of these are rolled over from a previous release or even >> earlier ones, where an issue or a PR does not make the release X, and >> gets re-milestoned to X+1 so that we don't forget (tm). >> >> These look like blockers: >> >> A flaky test in linalg: https://github.com/scipy/scipy/issues/6984 >> A regression in integrate: https://github.com/scipy/scipy/issues/6898 > > > There's a PR which needs completing (tests to add): > https://github.com/scipy/scipy/pull/6965. If someone would want to have a > go at that, that'd be great. > > >> >> A failure in stats with numpy-dev: >> https://github.com/scipy/scipy/pull/7012#issuecomment-278368718 >> >> The rest seem to be possible to roll over again unless fixed. > > > I'll check the ``make dist`` issue mentioned in gh-3250; from the > description it can't be rolled over but isn't a problem for RC1. > > >> Of >> course, any help with troubleshooting, reviewing and merging is most >> appreciated. >> >> If you think something is missing and should go into 0.19.0, please >> add it to the milestone, or comment on github or in this email thread. >> >> Here's an optimistic schedule: >> >> 15 Feb : branch 0.19.x, cut 0.19.0rc1 >> 22 Feb : 0.19.0rc2, if needed >> 09 Mar : 0.19.0 final >> >> Please speak up if the timing does not work for you. 
>> > > LGTM > > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Feb 13 04:42:22 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 13 Feb 2017 22:42:22 +1300 Subject: [SciPy-Dev] branching 0.19.x soon? In-Reply-To: References: Message-ID: On Mon, Feb 13, 2017 at 10:31 PM, Andrew Nelson wrote: > I'm currently running all the module tests on OSX=10.12.3, Python=3.5.2, > NumPy (nomkl) =1.12.0, with the master branch. > > scipy.integrate is reporting one test failure, #7038. > scipy.sparse is reporting lots of errors, then segfaults. > Can you put your build log in a gist and open an issue with a link to that? Likely something recent if it's real. Ralf > On 13 February 2017 at 20:27, Ralf Gommers wrote: > >> >> >> On Thu, Feb 9, 2017 at 6:57 AM, Evgeni Burovski < >> evgeny.burovskiy at gmail.com> wrote: >> >>> On Tue, Jan 17, 2017 at 11:54 AM, Ralf Gommers >>> wrote: >>> > >>> > >>> > On Tue, Jan 17, 2017 at 8:56 AM, Evgeni Burovski >>> > wrote: >>> >> >>> >> > We had planned the 0.19.0 release for the end of January. We won't >>> make >>> >> > that >>> >> > anymore, but branching this month seems doable (nothing particularly >>> >> > large >>> >> > or blocking in https://github.com/scipy/scipy/milestone/30). >>> >> >>> >> Thanks for bringing this up Ralf! >>> >> >>> >> The dblquad regression would be good to fix. >>> >> (Ah, as noted on GH already) >>> >> Can't think of other blockers off the top of my head. Some more time >>> >> for merging open PRs would be nice to have, but optional. >>> > >>> > >>> > It seems like we're always hovering around 130-140 PRs, no matter how >>> much >>> > time we take. 
But probably good to give it say two weeks, so everyone >>> has a >>> > chance to go and review/merge the PRs they would really like to get in? >>> > >>> >> >>> >> >>> >> > @Evgeni, did you plan / want to be the release manager for this >>> release >>> >> > again, or do you want someone else to take that role? >>> >> >>> >> If someone wants it, great. Volunteers? >>> >> Otherwise I'll happily do it. >>> > >>> > >>> > "want" is a big word:) I'm also happy to take a turn again, either for >>> this >>> > release or a following one. But if someone new to the role volunteers, >>> > that'd be quite nice. >>> >>> >>> Having discussed this with Ralf off-list, I am volunteering to do the >>> release legwork. >>> >> >> Thanks again Evgeni! >> >> >>> >>> The contents of the milestone: https://github.com/scipy/scipy >>> /milestones/0.19.0. >>> Several of these are rolled over from a previous release or even >>> earlier ones, where an issue or a PR does not make the release X, and >>> gets re-milestoned to X+1 so that we don't forget (tm). >>> >>> These look like blockers: >>> >>> A flaky test in linalg: https://github.com/scipy/scipy/issues/6984 >>> A regression in integrate: https://github.com/scipy/scipy/issues/6898 >> >> >> There's a PR which needs completing (tests to add): >> https://github.com/scipy/scipy/pull/6965. If someone would want to have >> a go at that, that'd be great. >> >> >>> >>> A failure in stats with numpy-dev: >>> https://github.com/scipy/scipy/pull/7012#issuecomment-278368718 >>> >>> The rest seem to be possible to roll over again unless fixed. >> >> >> I'll check the ``make dist`` issue mentioned in gh-3250; from the >> description it can't be rolled over but isn't a problem for RC1. >> >> >>> Of >>> course, any help with troubleshooting, reviewing and merging is most >>> appreciated. >>> >>> If you think something is missing and should go into 0.19.0, please >>> add it to the milestone, or comment on github or in this email thread. 
>>> >>> Here's an optimistic schedule: >>> >>> 15 Feb : branch 0.19.x, cut 0.19.0rc1 >>> 22 Feb : 0.19.0rc2, if needed >>> 09 Mar : 0.19.0 final >>> >>> Please speak up if the timing does not work for you. >>> >> >> LGTM >> >> Ralf >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > > -- > _____________________________________ > Dr. Andrew Nelson > > > _____________________________________ > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Mon Feb 13 05:30:15 2017 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 13 Feb 2017 21:30:15 +1100 Subject: [SciPy-Dev] branching 0.19.x soon? In-Reply-To: References: Message-ID: Just cleaned source directory and rebuilt, and #7038 disappears, as does the sparse segfault. But there were a couple of different issues instead, #7039 On 13 February 2017 at 20:42, Ralf Gommers wrote: > > > On Mon, Feb 13, 2017 at 10:31 PM, Andrew Nelson > wrote: > >> I'm currently running all the module tests on OSX=10.12.3, Python=3.5.2, >> NumPy (nomkl) =1.12.0, with the master branch. >> >> scipy.integrate is reporting one test failure, #7038. >> scipy.sparse is reporting lots of errors, then segfaults. >> > > Can you put your build log in a gist and open an issue with a link to > that? Likely something recent if it's real. 
> > Ralf > > > >> On 13 February 2017 at 20:27, Ralf Gommers >> wrote: >> >>> >>> >>> On Thu, Feb 9, 2017 at 6:57 AM, Evgeni Burovski < >>> evgeny.burovskiy at gmail.com> wrote: >>> >>>> On Tue, Jan 17, 2017 at 11:54 AM, Ralf Gommers >>>> wrote: >>>> > >>>> > >>>> > On Tue, Jan 17, 2017 at 8:56 AM, Evgeni Burovski >>>> > wrote: >>>> >> >>>> >> > We had planned the 0.19.0 release for the end of January. We won't >>>> make >>>> >> > that >>>> >> > anymore, but branching this month seems doable (nothing >>>> particularly >>>> >> > large >>>> >> > or blocking in https://github.com/scipy/scipy/milestone/30). >>>> >> >>>> >> Thanks for bringing this up Ralf! >>>> >> >>>> >> The dblquad regression would be good to fix. >>>> >> (Ah, as noted on GH already) >>>> >> Can't think of other blockers off the top of my head. Some more time >>>> >> for merging open PRs would be nice to have, but optional. >>>> > >>>> > >>>> > It seems like we're always hovering around 130-140 PRs, no matter how >>>> much >>>> > time we take. But probably good to give it say two weeks, so everyone >>>> has a >>>> > chance to go and review/merge the PRs they would really like to get >>>> in? >>>> > >>>> >> >>>> >> >>>> >> > @Evgeni, did you plan / want to be the release manager for this >>>> release >>>> >> > again, or do you want someone else to take that role? >>>> >> >>>> >> If someone wants it, great. Volunteers? >>>> >> Otherwise I'll happily do it. >>>> > >>>> > >>>> > "want" is a big word:) I'm also happy to take a turn again, either >>>> for this >>>> > release or a following one. But if someone new to the role volunteers, >>>> > that'd be quite nice. >>>> >>>> >>>> Having discussed this with Ralf off-list, I am volunteering to do the >>>> release legwork. >>>> >>> >>> Thanks again Evgeni! >>> >>> >>>> >>>> The contents of the milestone: https://github.com/scipy/scipy >>>> /milestones/0.19.0. 
>>>> Several of these are rolled over from a previous release or even >>>> earlier ones, where an issue or a PR does not make the release X, and >>>> gets re-milestoned to X+1 so that we don't forget (tm). >>>> >>>> These look like blockers: >>>> >>>> A flaky test in linalg: https://github.com/scipy/scipy/issues/6984 >>>> A regression in integrate: https://github.com/scipy/scipy/issues/6898 >>> >>> >>> There's a PR which needs completing (tests to add): >>> https://github.com/scipy/scipy/pull/6965. If someone would want to have >>> a go at that, that'd be great. >>> >>> >>>> >>>> A failure in stats with numpy-dev: >>>> https://github.com/scipy/scipy/pull/7012#issuecomment-278368718 >>>> >>>> The rest seem to be possible to roll over again unless fixed. >>> >>> >>> I'll check the ``make dist`` issue mentioned in gh-3250; from the >>> description it can't be rolled over but isn't a problem for RC1. >>> >>> >>>> Of >>>> course, any help with troubleshooting, reviewing and merging is most >>>> appreciated. >>>> >>>> If you think something is missing and should go into 0.19.0, please >>>> add it to the milestone, or comment on github or in this email thread. >>>> >>>> Here's an optimistic schedule: >>>> >>>> 15 Feb : branch 0.19.x, cut 0.19.0rc1 >>>> 22 Feb : 0.19.0rc2, if needed >>>> 09 Mar : 0.19.0 final >>>> >>>> Please speak up if the timing does not work for you. >>>> >>> >>> LGTM >>> >>> Ralf >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> >> >> -- >> _____________________________________ >> Dr. 
Andrew Nelson >> >> >> _____________________________________ >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashwin.pathak at students.iiit.ac.in Mon Feb 13 17:05:29 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Tue, 14 Feb 2017 03:35:29 +0530 Subject: [SciPy-Dev] Issues on parabolic functions Message-ID: <59e01f8ad02223dfd9da670a5ba9d343@students.iiit.ac.in> Hello developers, I am new to this organization. I have solved some easy-fix issues related to documentation, and I now want to work on issues related to parabolic functions of easy or moderate difficulty, but I could not find any when I looked through the issues list. Can someone point me to such issues? From ralf.gommers at gmail.com Tue Feb 14 04:15:10 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 14 Feb 2017 22:15:10 +1300 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2017 at 6:13 PM, Matt Haberland wrote: > Hi all, I'm new to the list. > > 1. I am interested in contributing a binary integer linear programming > solver now, and possibly a more general MILP solver in the future. Existing > solutions (lp_solve, cvxopt, PuLp w/ GLPK) are not as easy to use as they > should be, at least for Windows users. > > As I was writing the BILP code, I found a bug in linprog in which it would > report success, yet the bounds were not at all respected by the returned > solution.
> > I will report that bug, but in the meantime I started writing an > interior-point linear programming solver, which I'd be interested in > contributing as well. > > Please let me know if you'd be interested in me contributing: > * A binary integer linear programming solver similar to Matlab's old > bintprog, but with options for both Balas' method and bintprog's branch and > bound algorithm > * A primal-dual interior-point method for linprog > I'm not a user of these optimization methods, so not sure. What would be helpful is if someone can make a proposal of what the scope of linear programming methods in SciPy should be, rather than adding them one by one. There's an argument to be made for having a consistent set of well-known algorithms in SciPy, but knowing the boundary of the scope here is helpful especially if there are more powerful solutions like cvxopt that could become easy to install soon. https://github.com/conda-forge/cvxopt-feedstock exists, and once the OpenBLAS Windows issue is solved (I would bet on that happening within a year) there'll be win64 binaries. > > 2. I was trying to search the archives to see if there is already > discussion about integer programming or additional methods for linprog, but > it seems that the server: > http://dir.gmane.org/ > isn't working? > Yes, Gmane suffered from lack of maintenance and is still recovering (it's like that for months already). Our mailman archives are fine ( https://mail.scipy.org/pipermail/scipy-dev/) but aren't searchable. We're still hoping for Gmane to recover - no one is actively working on migrating AFAIK. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Wed Feb 15 15:36:55 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 15 Feb 2017 23:36:55 +0300 Subject: [SciPy-Dev] maintenance/0.19.x branched Message-ID: Hi, Scipy 0.19.x has been branched, 0.19.0rc1 has been tagged. 
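Apropos of the bounds bug discussed in this thread: a quick, generic way to sanity-check any `linprog` result against its own bounds and constraints. This is a minimal sketch on made-up toy data, not the failing case from the bug report:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy problem: minimize -x0 - 2*x1 subject to x0 + x1 <= 4,
# with explicit variable bounds. NOT the reproducer from the bug report.
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0]])
b_ub = np.array([4.0])
bounds = [(0, 3), (0, 3)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

# Independently verify that the reported solution respects the bounds
# and the inequality constraints (within a small tolerance).
tol = 1e-8
within_bounds = all(lo - tol <= xi <= hi + tol
                    for xi, (lo, hi) in zip(res.x, bounds))
constraints_ok = bool(np.all(A_ub @ res.x <= b_ub + tol))
```

A check like this after every solve is cheap insurance against a solver that reports success while silently violating its bounds.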
An announcement will follow shortly. The master is now open for 1.0.0 development [sic!] Cheers, Evgeni From evgeny.burovskiy at gmail.com Wed Feb 15 15:48:04 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 15 Feb 2017 23:48:04 +0300 Subject: [SciPy-Dev] ANN: scipy 0.19.0 release candidate 1 Message-ID: Hi, I'm pleased to announce the availability of the first release candidate for scipy 0.19.0. It contains contributions from 120 people over the course of seven months. Please try this release and report any issues on Github tracker, https://github.com/scipy/scipy, or scipy-dev mailing list. Source tarballs and release notes are available from Github releases, https://github.com/scipy/scipy/releases/tag/v0.19.0rc1 Please note that this is a source-only release. We do not provide Windows binaries for this release. OS X and Linux wheels will be provided for the final release. The current release schedule is 22 Feb : 0.19.0rc2, if needed 09 Mar : 0.19.0 final Thanks to everyone who contributed to this release! Cheers, Evgeni A part of the release notes follows below: =================================================== ========================== SciPy 0.19.0 Release Notes ========================== .. note:: Scipy 0.19.0 is not released yet! .. contents:: SciPy 0.19.0 is the culmination of X months of hard work. It contains many new features, numerous bug-fixes, improved test coverage and better documentation. There have been a number of deprecations and API changes in this release, which are documented below. All users are encouraged to upgrade to this release, as there are a large number of bug-fixes and optimizations. Moreover, our development attention will now shift to bug-fix releases on the 0.19.x branch, and on adding new features on the master branch. This release requires Python 2.7 or 3.4-3.6 and NumPy 1.8.2 or greater. Highlights of this release include: - A unified foreign function interface layer, `scipy.LowLevelCallable`. 
- Cython API for scalar, typed versions of the universal functions from the `scipy.special` module, via `cimport scipy.special.cython_special`. New features ============ Foreign function interface improvements --------------------------------------- `scipy.LowLevelCallable` provides a new unified interface for wrapping low-level compiled callback functions in the Python space. It supports Cython imported "api" functions, ctypes function pointers, CFFI function pointers, ``PyCapsules``, Numba jitted functions and more. See `gh-6509 `_ for details. `scipy.linalg` improvements --------------------------- The function `scipy.linalg.solve` gained two new keywords, ``assume_a`` and ``transposed``. The underlying LAPACK routines are replaced with "expert" versions and can now also be used to solve symmetric, Hermitian and positive definite coefficient matrices. Moreover, ill-conditioned matrices now cause a warning to be emitted with the estimated condition number information. The old ``sym_pos`` keyword is kept for backwards compatibility; it is identical to using ``assume_a='pos'``. Moreover, the ``debug`` keyword, which did nothing except print the ``overwrite_`` values, is deprecated. The function `scipy.linalg.matrix_balance` was added to perform the so-called matrix balancing using the LAPACK xGEBAL routine family. This can be used to approximately equate the row and column norms through diagonal similarity transformations. The functions `scipy.linalg.solve_continuous_are` and `scipy.linalg.solve_discrete_are` have numerically more stable algorithms. These functions can also solve generalized algebraic matrix Riccati equations. Moreover, both gained a ``balanced`` keyword to turn balancing on and off. `scipy.spatial` improvements ---------------------------- `scipy.spatial.SphericalVoronoi.sort_vertices_of_regions` has been re-written in Cython to improve performance.
`scipy.spatial.SphericalVoronoi` can handle > 200 k points (at least 10 million) and has improved performance. The function `scipy.spatial.distance.directed_hausdorff` was added to calculate the directed Hausdorff distance. The ``count_neighbors`` method of `scipy.spatial.cKDTree` gained the ability to perform weighted pair counting via the new keywords ``weights`` and ``cumulative``. See `gh-5647 `_ for details. `scipy.ndimage` improvements ---------------------------- The callback function C API supports PyCapsules in Python 2.7. Multidimensional filters now allow having different extrapolation modes for different axes. `scipy.optimize` improvements ----------------------------- The `scipy.optimize.basinhopping` global minimizer gained a new keyword, `seed`, which can be used to seed the random number generator and obtain repeatable minimizations. The keyword `sigma` in `scipy.optimize.curve_fit` was overloaded to also accept the covariance matrix of errors in the data. `scipy.signal` improvements --------------------------- The functions `scipy.signal.correlate` and `scipy.signal.convolve` have a new optional parameter `method`. The default value of `auto` estimates the faster of the two computation methods, the direct approach and the Fourier transform approach. A new function has been added to choose the convolution/correlation method, `scipy.signal.choose_conv_method`, which may be appropriate if convolutions or correlations are performed on many arrays of the same size. New functions have been added to calculate complex short-time Fourier transforms of an input signal, and to invert the transform to recover the original signal: `scipy.signal.stft` and `scipy.signal.istft`. This implementation also fixes the previously incorrect output of `scipy.signal.spectrogram` when complex output data were requested. The function `scipy.signal.sosfreqz` was added to compute the frequency response from second-order sections.
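The new STFT/ISTFT pair described above can be round-tripped; with the default Hann window the overlap-add constraint (COLA) is satisfied, so the inverse recovers the signal up to numerical error. A minimal sketch with made-up signal parameters:

```python
import numpy as np
from scipy.signal import stft, istft

# A short test signal: two sinusoids (sample rate and frequencies arbitrary).
fs = 1024.0
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# Forward transform: f = frequency bins, seg_times = segment times,
# Zxx = complex STFT matrix.
f, seg_times, Zxx = stft(x, fs=fs, nperseg=256)

# Inverse transform recovers the original signal (up to numerical error)
# because the default Hann window satisfies the COLA constraint.
_, x_rec = istft(Zxx, fs=fs, nperseg=256)
```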
The function `scipy.signal.unit_impulse` was added to conveniently generate an impulse function. The function `scipy.signal.iirnotch` was added to design second-order IIR notch filters that can be used to remove a frequency component from a signal. The dual function `scipy.signal.iirpeak` was added to compute the coefficients of a second-order IIR peak (resonant) filter. The function `scipy.signal.minimum_phase` was added to convert linear-phase FIR filters to minimum phase. The functions `scipy.signal.upfirdn` and `scipy.signal.resample_poly` are now substantially faster when operating on some n-dimensional arrays when n > 1. The largest reduction in computation time is realized in cases where the size of the array is small (<1k samples or so) along the axis to be filtered. `scipy.fftpack` improvements ---------------------------- Fast Fourier transform routines now accept `np.float16` inputs and upcast them to `np.float32`. Previously, they would raise an error. `scipy.cluster` improvements ---------------------------- Methods ``"centroid"`` and ``"median"`` of `scipy.cluster.hierarchy.linkage` have been significantly sped up. Long-standing issues with using ``linkage`` on large input data (over 16 GB) have been resolved. `scipy.sparse` improvements --------------------------- The functions `scipy.sparse.save_npz` and `scipy.sparse.load_npz` were added, providing simple serialization for some sparse formats. The `prune` method of classes `bsr_matrix`, `csc_matrix`, and `csr_matrix` was updated to reallocate backing arrays under certain conditions, reducing memory usage. The methods `argmin` and `argmax` were added to classes `coo_matrix`, `csc_matrix`, `csr_matrix`, and `bsr_matrix`. New function `scipy.sparse.csgraph.structural_rank` computes the structural rank of a graph with a given sparsity pattern. New function `scipy.sparse.linalg.spsolve_triangular` solves a sparse linear system with a triangular left hand side matrix. 
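The new ``save_npz``/``load_npz`` serialization helpers mentioned above can be exercised with a short round-trip sketch (matrix contents and file name are made up):

```python
import os
import tempfile

import numpy as np
from scipy import sparse

# Build a small CSR matrix with a few nonzeros.
m = sparse.csr_matrix(np.array([[0.0, 1.0, 0.0],
                                [2.0, 0.0, 3.0]]))

# Round-trip through the new .npz serialization helpers.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "m.npz")
    sparse.save_npz(path, m)
    m2 = sparse.load_npz(path)

# Compare dense views; for a matrix this small the cost is negligible.
same = np.array_equal(m.toarray(), m2.toarray())
```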
`scipy.special` improvements ---------------------------- Scalar, typed versions of universal functions from `scipy.special` are available in the Cython space via ``cimport`` from the new module `scipy.special.cython_special`. These scalar functions can be expected to be significantly faster than the universal functions for scalar arguments. See the `scipy.special` tutorial for details. Better control over special-function errors is offered by the functions `scipy.special.geterr` and `scipy.special.seterr` and the context manager `scipy.special.errstate`. The names of orthogonal polynomial root functions have been changed to be consistent with other functions relating to orthogonal polynomials. For example, `scipy.special.j_roots` has been renamed `scipy.special.roots_jacobi` for consistency with the related functions `scipy.special.jacobi` and `scipy.special.eval_jacobi`. To preserve backwards compatibility the old names have been left as aliases. The Wright omega function is implemented as `scipy.special.wrightomega`. `scipy.stats` improvements -------------------------- The function `scipy.stats.weightedtau` was added. It provides a weighted version of Kendall's tau. New class `scipy.stats.multinomial` implements the multinomial distribution. New class `scipy.stats.rv_histogram` constructs a continuous univariate distribution with a piecewise linear CDF from a binned data sample. New class `scipy.stats.argus` implements the Argus distribution. `scipy.interpolate` improvements -------------------------------- New class `scipy.interpolate.BSpline` represents splines. ``BSpline`` objects contain knots and coefficients and can evaluate the spline.
The format is consistent with FITPACK, so that one can do, for example:: >>> t, c, k = splrep(x, y, s=0) >>> spl = BSpline(t, c, k) >>> np.allclose(spl(x), y) ``spl*`` functions, `scipy.interpolate.splev`, `scipy.interpolate.splint`, `scipy.interpolate.splder` and `scipy.interpolate.splantider`, accept both ``BSpline`` objects and ``(t, c, k)`` tuples for backwards compatibility. For multidimensional splines, ``c.ndim > 1``, ``BSpline`` objects are consistent with piecewise polynomials, `scipy.interpolate.PPoly`. This means that ``BSpline`` objects are not immediately consistent with `scipy.interpolate.splprep`, and one *cannot* do ``>>> BSpline(*splprep([x, y])[0])``. Consult the `scipy.interpolate` test suite for examples of the precise equivalence. In new code, prefer using ``scipy.interpolate.BSpline`` objects instead of manipulating ``(t, c, k)`` tuples directly. New function `scipy.interpolate.make_interp_spline` constructs an interpolating spline given data points and boundary conditions. New function `scipy.interpolate.make_lsq_spline` constructs a least-squares spline approximation given data points. `scipy.integrate` improvements ------------------------------ Now `scipy.integrate.fixed_quad` supports vector-valued functions. Deprecated features =================== `scipy.interpolate.splmake`, `scipy.interpolate.spleval` and `scipy.interpolate.spline` are deprecated. The format used by `splmake/spleval` was inconsistent with `splrep/splev` which was confusing to users. `scipy.special.errprint` is deprecated. Improved functionality is available in `scipy.special.seterr`. Backwards incompatible changes ============================== The deprecated ``scipy.weave`` submodule was removed. `scipy.spatial.distance.squareform` now returns arrays of the same dtype as the input, instead of always float64. `scipy.special.errprint` now returns a boolean. The function `scipy.signal.find_peaks_cwt` now returns an array instead of a list. 
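Returning to the `scipy.interpolate` additions described above: the new ``make_interp_spline`` constructor and the FITPACK-compatible ``BSpline`` wrapping can be exercised side by side. A minimal sketch on made-up data:

```python
import numpy as np
from scipy.interpolate import BSpline, make_interp_spline, splrep

# Made-up sample data on a smooth function.
x = np.linspace(0, 2 * np.pi, 15)
y = np.sin(x)

# New-style: build an interpolating cubic spline directly.
spl = make_interp_spline(x, y, k=3)      # returns a BSpline object

# Old-style FITPACK tuple, wrapped in the new BSpline class,
# as shown in the release notes.
t, c, k = splrep(x, y, s=0)
spl2 = BSpline(t, c, k)

# Both splines interpolate the data points exactly (to round-off).
err = float(np.max(np.abs(spl(x) - y)))
err2 = float(np.max(np.abs(spl2(x) - y)))
```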
`scipy.stats.kendalltau` now computes the correct p-value in case the input contains ties. The p-value is also identical to that computed by `scipy.stats.mstats.kendalltau` and by R. If the input does not contain ties there is no change w.r.t. the previous implementation. The function `scipy.linalg.block_diag` will not ignore zero-sized matrices anymore. Instead it will insert rows or columns of zeros of the appropriate size. See gh-4908 for more details. Other changes ============= SciPy wheels will now report their dependency on ``numpy`` on all platforms. This change was made because Numpy wheels are available, and because the pip upgrade behavior is finally changing for the better (use ``--upgrade-strategy=only-if-needed`` for ``pip >= 8.2``; that behavior will become the default in the next major version of ``pip``). Numerical values returned by `scipy.interpolate.interp1d` with ``kind="cubic"`` and ``"quadratic"`` may change relative to previous scipy versions. If your code depended on specific numeric values (i.e., on implementation details of the interpolators), you may want to double-check your results. Authors ======= * @endolith * Max Argus + * Hervé
Audren * Alessandro Pietro Bardelli + * Michael Benfield + * Felix Berkenkamp * Matthew Brett * Per Brodtkorb * Evgeni Burovski * Pierre de Buyl * CJ Carey * Brandon Carter + * Tim Cera * Klesk Chonkin * Christian Häggström + * Luca Citi * Peadar Coyle + * Daniel da Silva + * Greg Dooper + * John Draper + * drlvk + * David Ellis + * Yu Feng * Baptiste Fontaine + * Jed Frey + * Siddhartha Gandhi + * GiggleLiu + * Wim Glenn + * Akash Goel + * Ralf Gommers * Alexander Goncearenco + * Richard Gowers + * Alex Griffing * Radoslaw Guzinski + * Charles Harris * Callum Jacob Hays + * Ian Henriksen * Randy Heydon + * Lindsey Hiltner + * Gerrit Holl + * Hiroki IKEDA + * jfinkels + * Mher Kazandjian + * Thomas Keck + * keuj6 + * Kornel Kielczewski + * Sergey B Kirpichev + * Vasily Kokorev + * Eric Larson * Denis Laxalde * Gregory R. Lee * Josh Lefler + * Julien Lhermitte + * Evan Limanto + * Nikolay Mayorov * Geordie McBain + * Josue Melka + * Matthieu Melot * michaelvmartin15 + * Surhud More + * Brett M.
Morris + * Chris Mutel + * Paul Nation * Andrew Nelson * David Nicholson + * Aaron Nielsen + * Joel Nothman * nrnrk + * Juan Nunez-Iglesias * Mikhail Pak + * Gavin Parnaby + * Thomas Pingel + * Ilhan Polat + * Aman Pratik + * Sebastian Pucilowski * Ted Pudlik * puenka + * Eric Quintero * Tyler Reddy * Joscha Reimer * Antonio Horta Ribeiro + * Edward Richards + * Roman Ring + * Rafael Rossi + * Colm Ryan + * Sami Salonen + * Alvaro Sanchez-Gonzalez + * Johannes Schmitz * Kari Schoonbee * Yurii Shevchuk + * Jonathan Siebert + * Jonathan Tammo Siebert + * Scott Sievert + * Sourav Singh * Byron Smith + * Srikiran + * Samuel St-Jean + * Yoni Teitelbaum + * Bhavika Tekwani * Martin Thoma * timbalam + * Svend Vanderveken + * Sebastiano Vigna + * Aditya Vijaykumar + * Santi Villalba + * Ze Vinicius * Pauli Virtanen * Matteo Visconti * Yusuke Watanabe + * Warren Weckesser * Phillip Weinberg + * Nils Werner * Jakub Wilk * Josh Wilson * wirew0rm + * David Wolever + * Nathan Woods * ybeltukov + * G Young * Evgeny Zhurko + A total of 120 people contributed to this release. People with a "+" by their names contributed a patch for the first time. This list of names is automatically generated, and may not be fully complete. From ashwin.pathak at students.iiit.ac.in Fri Feb 17 13:25:41 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Fri, 17 Feb 2017 23:55:41 +0530 Subject: [SciPy-Dev] Documentation of scipy.interpolate.splprep Message-ID: <75196847283f879824dc02b62b43def7@students.iiit.ac.in> I was working on issue #6618, but I could not find the part of the source code where the doc-string of scipy.interpolate.splprep is written. Can someone please point out where I can find the doc-string?
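One generic way to answer questions like this programmatically, using the standard library's `inspect` module (a sketch, not something from the thread; it works for any pure-Python function, which `splprep` is):

```python
import inspect

import scipy.interpolate as si

# Locate the file and the source text that define splprep (and thus
# its doc-string).
src_file = inspect.getsourcefile(si.splprep)
src_text = inspect.getsource(si.splprep)

has_doc = si.splprep.__doc__ is not None
```

For compiled (C/Cython) objects `inspect.getsource` raises ``TypeError``, which is when grepping the repository, as suggested later in the thread, becomes necessary.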
From josef.pktd at gmail.com Sat Feb 18 00:00:05 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 18 Feb 2017 00:00:05 -0500 Subject: [SciPy-Dev] tricky unary versus binary minus Message-ID: just an observation >>> bwi 25 >>> - bwi // 2 # not what I wanted -13 >>> 0 - bwi // 2 -12 >>> - (bwi // 2) -12 different rules from float division >>> 0 - bwi / 2 -12.5 >>> - bwi / 2 -12.5 bug in my code Josef -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Sat Feb 18 00:03:38 2017 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 17 Feb 2017 21:03:38 -0800 Subject: [SciPy-Dev] tricky unary versus binary minus In-Reply-To: References: Message-ID: On Fri, Feb 17, 2017 at 9:00 PM, wrote: > > just an observation > > >>> bwi > 25 > >>> - bwi // 2 # not what I wanted > -13 > >>> 0 - bwi // 2 > -12 > >>> - (bwi // 2) > -12 > > different rules from float division > > >>> 0 - bwi / 2 > -12.5 > >>> - bwi / 2 > -12.5 > > bug in my code The rules (of operator precedence) are the same in both cases; unary negation binds tighter than either division, which in turn binds tighter than binary subtraction. It's just that in the latter case, both orders of operations just happen to give the same result for these values. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From alan.isaac at gmail.com Sat Feb 18 00:04:22 2017 From: alan.isaac at gmail.com (Alan Isaac) Date: Sat, 18 Feb 2017 00:04:22 -0500 Subject: [SciPy-Dev] tricky unary versus binary minus In-Reply-To: References: Message-ID: <8c32315d-ad7d-a75a-0db4-2714bd42165d@gmail.com> http://python-history.blogspot.com/2010/08/why-pythons-integer-division-floors.html fwiw, Alan From josef.pktd at gmail.com Sat Feb 18 00:25:09 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 18 Feb 2017 00:25:09 -0500 Subject: [SciPy-Dev] tricky unary versus binary minus In-Reply-To: <8c32315d-ad7d-a75a-0db4-2714bd42165d@gmail.com> References: <8c32315d-ad7d-a75a-0db4-2714bd42165d@gmail.com> Message-ID: On Sat, Feb 18, 2017 at 12:04 AM, Alan Isaac wrote: > http://python-history.blogspot.com/2010/08/why-pythons- > integer-division-floors.html I need floor division for the remaining computation, so that part I'm happy with. (Actually, when I start to use divmod I have only non-negative numbers.) >>> bwi = 5 >>> np.arange(-bwi // 2, bwi // 2 + 1) array([-3, -2, -1, 0, 1, 2]) >>> np.arange(-(bwi // 2), bwi // 2 + 1) array([-2, -1, 0, 1, 2]) The bug hunting was: Why is the window asymmetric? What threw me off is the operator precedence; what Robert said about operator precedence is kind of obvious ex-post, but the case where it matters doesn't show up often enough to automatically think about it, and the familiar float analogy doesn't apply. e.g. I avoid remembering some rules by using explicit, defensive parentheses >>> 1.5**(-2) it's commutative >>> --2 / -3 -0.6666666666666666 >>> ---2 / 3 -0.6666666666666666 >>> ---(2 / 3) -0.6666666666666666 Josef > > > fwiw, > Alan > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed...
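Robert's explanation of the precedence rules can be condensed into a few checkable identities (plain Python, no SciPy needed):

```python
# In Python, unary minus binds tighter than // and /, which in turn bind
# tighter than binary subtraction; only ** binds tighter than unary minus.
bwi = 25

a = -bwi // 2      # parsed as (-bwi) // 2 = (-25) // 2 -> floor(-12.5) = -13
b = 0 - bwi // 2   # parsed as 0 - (bwi // 2) = 0 - 12 = -12
c = -(bwi // 2)    # explicit grouping, same value as b

# With true division the two orders happen to agree, which is what
# hid the bug: negation and halving commute for floats.
d = -bwi / 2       # (-25) / 2 = -12.5
e = 0 - bwi / 2    # 0 - 12.5 = -12.5
```

The `//` case differs because flooring does not commute with negation: `floor(-x/2)` is not `-floor(x/2)` for odd `x`.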
URL: From alan.isaac at gmail.com Sat Feb 18 00:30:53 2017 From: alan.isaac at gmail.com (Alan Isaac) Date: Sat, 18 Feb 2017 00:30:53 -0500 Subject: [SciPy-Dev] tricky unary versus binary minus In-Reply-To: References: <8c32315d-ad7d-a75a-0db4-2714bd42165d@gmail.com> Message-ID: On 2/18/2017 12:25 AM, josef.pktd at gmail.com wrote: > What threw me off is the operator precedence Here's a related example most likely to appear tricky:: -1**2 == -1 Cheers, Alan From ralf.gommers at gmail.com Sat Feb 18 03:28:22 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 18 Feb 2017 21:28:22 +1300 Subject: [SciPy-Dev] Documentation of scipy.interpolate.splprep In-Reply-To: <75196847283f879824dc02b62b43def7@students.iiit.ac.in> References: <75196847283f879824dc02b62b43def7@students.iiit.ac.in> Message-ID: On Sat, Feb 18, 2017 at 7:25 AM, ashwin.pathak < ashwin.pathak at students.iiit.ac.in> wrote: > I was solving the issue #6618, but I could not find the part of the source > code where doc-string of scipy.interpolate.splprep is written. Can someone > please point me out where I can find the doc-string? > It's here: https://github.com/scipy/scipy/blob/master/scipy/interpolate/fitpack.py#L15 There are several ways to find out where the source code is. In IPython, do "interpolate.splprep??" - that will work for all Python code. For compiled code, you can use grep or grin to search through all files in the repo. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.mikolas1 at gmail.com Mon Feb 20 05:20:13 2017 From: david.mikolas1 at gmail.com (David Mikolas) Date: Mon, 20 Feb 2017 18:20:13 +0800 Subject: [SciPy-Dev] dense output option in integrate.ode In-Reply-To: References: Message-ID: Anne, Thanks for the info. I don't see anything... [wow, 14 month delay; Dec 2015 to Feb 2017] in this documentation that explains how to use the dense output (interpolation coefficients) for the last step when using vode, as you've mentioned. 
https://docs.scipy.org/doc/scipy-0.18.1/reference/generated/scipy.integrate.ode.html#scipy.integrate.ode I have a few work-arounds available, but I'd really like to try the following. If I detect a result sufficiently close to a target, I'd like to call the interpolation from a minimizer to find the moment and location of closest approach within this step. If it is not close enough, I'd then like to take the next step and test again. ultimately this may not be the best way to do this, but I'd like to try this none-the-less. How can I do this with vode - as you described in December 2015? Is there a tutorial or example of something like this being done within Python 2.7 / SciPy? Thanks! David On Fri, Dec 4, 2015 at 9:47 PM, Anne Archibald wrote: > Dense output is actually already available from some of the integrators - > I believe vode in particular. It is limited to evaluation within the last > step, so the most reliable way to use it is to allow the adaptive stepper > to take a single step then evaluate within it. > > For many purposes, for example stopping criteria, evaluation within the > last step is cheap and sufficient, and many solver implementations provide > an interface for it. But it would occasionally be useful to have a solution > object that can be evaluated anywhere - preferably including one or more > derivatives. > > Anne > > On Mon, Nov 23, 2015, 12:43 David Mikolas > wrote: > >> OK, Jim Martin has started this and I'm trying to add momentum and any >> help that I can. I'm already using this feature (access to interpolation >> coefficients) by installing an environment with his changes, so I can run >> the integration flat out, yet get results at arbitrary density with nearly >> the same precision. >> >> I'm new to SciPy-dev and mail lists in general, appreciate any/all >> suggestions. Thanks! 
>> >> David >> >> On Mon, Nov 23, 2015 at 4:28 PM, Evgeni Burovski < >> evgeny.burovskiy at gmail.com> wrote: >> >>> Hi David, >>> >>> This sounds like a useful feature. >>> I guess a pull request would be welcome. (even though I'm not a user, >>> and I won't be able to review it). >>> >>> Evgeni >>> 23.11.2015 10:47 ???????????? "David Mikolas" >>> ???????: >>> >>> One way that dense output for dopri5 and dop853 can be very useful is if >>>> the integration is expensive/long and you don't want to repeat it, but you >>>> want to obtain results at new time points at a later date, or even iterate >>>> on it - for example, find the time of closest approac. This is done by >>>> saving the interpolation coefficients. I put a simple example here, though >>>> there is no saving to disk yet. >>>> >>>> http://pastebin.com/e6qNjbL9 dendop_test_v00.py >>>> >>>> I wonder if this could be developed into an option to return an >>>> interpolator object or function. If I can help let me know. >>>> >>>> On Sat, Nov 14, 2015 at 12:19 AM, Evgeni Burovski < >>>> evgeny.burovskiy at gmail.com> wrote: >>>> >>>>> Hi David, Hi Jim, >>>>> >>>>> > I am new to SciPy-dev. Will the dense output option in >>>>> scipy.integrate.ode >>>>> > become available in 0.17.0? This is a feature already available in >>>>> the >>>>> > original FORTAN, but wasn't implemented in the wrapper. >>>>> >>>>> If the feature is sent as a pull request against the scipy master >>>>> branch, the PR is reviewed by the maintainers of the integrate package >>>>> and merged into master before the release split, then yes, it would be >>>>> available in 0.17.0. >>>>> >>>>> So far I do not see any progress towards it. >>>>> >>>>> >>>>> > I wrote these dense output extensions that you listed: >>>>> > >>>>> > https://github.com/jddmartin/scipy/tree/dense_output_from_ >>>>> dopri5_and_dop853 >>>>> > https://github.com/jddmartin/dense_output_example_usage >>>>> > >>>>> > but I didn't issue any pull request to scipy. 
I sent this message >>>>> to the >>>>> > scipy developers list: >>>>> > http://article.gmane.org/gmane.comp.python.scientific.devel/19635/ >>>>> > explaining the changes. I was hoping for some feedback before >>>>> issuing a >>>>> > pull request. >>>>> >>>>> Ah, I see that the email likely fell through the cracks back in April. >>>>> Sorry about that. >>>>> You might want to ping that email thread once more or send a pull >>>>> request on github (or both). >>>>> >>>>> >>>>> >>>>> Cheers, >>>>> >>>>> Evgeni >>>>> _______________________________________________ >>>>> SciPy-Dev mailing list >>>>> SciPy-Dev at scipy.org >>>>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>>>> >>>> >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at scipy.org >>>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>>> >>>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From haberland at ucla.edu Wed Feb 22 01:52:15 2017 From: haberland at ucla.edu (Matt Haberland) Date: Tue, 21 Feb 2017 22:52:15 -0800 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: References: Message-ID: On Tue, Feb 14, 2017 at 1:15 AM, Ralf Gommers wrote: > I'm not a user of these optimization methods, so not sure. 
What would be > helpful is if someone can make a proposal of what the scope of linear > programming methods in SciPy should be, rather than adding them one by > one. > > There's an argument to be made for having a consistent set of well-known > algorithms in SciPy, but knowing the boundary of the scope here is > helpful especially if there are more powerful solutions like cvxopt that > could become easy to install soon. https://github.com/conda- > forge/cvxopt-feedstock exists, and once the OpenBLAS Windows issue is > solved (I would bet on that happening within a year) there'll be win64 > binaries. > I'm not an expert in the scope of LP methods; I've only picked up some information as I've been implementing this interior point method. But since there haven't been any replies to your question, I'll attempt to answer, and hopefully others will correct me. I'll take two approaches: first identify the algorithms most popular in the literature, then look at the algorithms used in the most popular software. *In the literature...* There seem to be two main branches of LP methods: basis exchange algorithms (like the simplex method currently used in scipy.optimize.linprog) and interior point methods (like the primal-dual path following code I want to contribute). A moderately deep search into the basis exchange algorithms brings up only two names: simplex and criss-cross, each with a number of variations. The simplex method is the easier of the two to visualize: it starts at a vertex of the polytope defined by the problem constraints, traverses an edge in a direction that decreases the objective function, and continues until there are no such edges. Because it has to start at a vertex (and only visits vertices), the simplex method often requires the solution of a "phase 1" problem (often as time-consuming as the LP itself) to find a vertex. 
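For readers who haven't used it, the simplex-based solver currently in SciPy can be exercised on a toy problem like this (a minimal sketch; note that in newer SciPy versions the default `method` may no longer be the simplex implementation discussed here):

```python
from scipy.optimize import linprog

# Maximize x + 2*y (i.e. minimize -x - 2*y) subject to
#   x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
c = [-1, -2]
A_ub = [[1, 1], [1, 0]]
b_ub = [4, 2]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
# optimum at the vertex x = 0, y = 4, objective value -8
```

The solver walks from vertex to vertex of the feasible polytope, which is why it needs a feasible vertex ("phase 1") to start from.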
The criss-cross method, on the other hand, is not restricted to feasible bases, so it does not need to solve a "phase 1" problem and gets straight to work on the LP itself. Unfortunately, this does not guarantee better performance, as most criss-cross algorithms cannot guarantee that the objective function is non-decreasing. Both have exponential complexity in the worst case but fare much better in practice. There does not seem to be consensus in the literature on whether one typically outperforms the other. The poor worst-case complexity of basis-exchange algorithms spurred a lot of work on (provably polynomial-complexity) interior point methods. The ellipsoidal algorithm, Karmarkar's projective algorithm, and affine scaling method are a few names of historical importance, but I've seen several sources (not just Wikipedia...) state that primal-dual path-following algorithms are the preferred choice today. In a sense, the approach turns the linear programming problem with equality and inequality constraints into a nonlinear programming problem with equality constraints (only). The objective function is augmented with a "logarithmic barrier" that ensures plenty of slack in the inequality constraints, and the (mildly nonlinear) KKT conditions for this problem are iteratively solved as the logarithmic barrier is gradually removed. I like to visualize it as if there is a force pushing the current iterate point toward the center of the polytope while another pulls it in the direction of decreasing objective, but I imagine this is even more simplified/bastardized than my description of the simplex method above. Between basis-exchange and interior point branches, I haven't seen any claims that one is generally superior in practice. While the worst-case polynomial complexity of interior point methods is obviously better than the worst-case exponential complexity of simplex, both tend to scale approximately linearly with problem size for practical problems. 
Which of the two performs better depends on the particulars of the problem. *In software...* Sandia published a "Comparison of Open-Source Linear Programming Solvers ". Hans Mittelman does benchmarks of optimization software . Here are the names of the algorithms used in each. Basically there are just three - primal simplex, dual simplex (which can be viewed as the primal simplex method applied to the dual problem), and primal-dual path following. CVXOPT : primal-dual path following QSOPT : primal simplex, dual simplex COIN-OR : primal simplex, dual simplex, primal-dual path following GLPK (see this also): primal simplex, dual simplex, primal-dual path following lp_solve : simplex MINOS : primal simplex GUROBI : "Primal and dual simplex algorithms, parallel barrier algorithm with crossover, concurrent optimization, and sifting algorithm" (Parallel refers to the engagement of multiple cores. Barrier probably refers to the logarithmic barrier function used in the "primal-dual path following" above. Crossover seems to refer to the generation of the optimal basis, as that is not automatically produced by interior point methods. Concurrent optimization seems to refer to running different algorithms simultaneously on separate cores. Sifting is described here . I don't think there are any fundamentally different algorithms here; just extensions.) CPLEX : primal simplex, dual simplex, barrier interior point (which, if they're going to be so vague, can be considered synonymous with primal-dual path following) XPRESS : primal simplex, dual simplex, and primal-dual path following. (The active-set method for SQP is also listed.) MOSEK : primal simplex, dual simplex, and primal-dual path following Matlab : dual simplex (I suspect that it actually chooses between primal and dual automatically), primal-dual path following (two variants) *My conclusion:* While there are lots of variants, there are only two major approaches: simplex and primal-dual path following. 
While biased (as I've already written the code and want to contribute it), I hope I've also shown that compared to popular software, SciPy is somewhat incomplete without an interior point method, and could be considered relatively complete with an interior point method. I don't know much about whether faster, free code will become more accessible in Python. If this does happen, would linprog be removed from SciPy? If not, I would suggest adding the interior point method now. In my next email I'll compare the performance of my interior point code to the current simplex implementation. In short, I've tried four types of randomly generated problems and two non-random but scalable problems at several scales, and the interior point code is faster for all of them. I'm actually very surprised, but I have two explanations: * interior point typically requires fewer iterations than simplex, which means less overhead in calling compiled code (and more time spent *in* compiled/optimized code) * the interior point code is written to leverage scipy.sparse matrices and routines If you have benchmark problems already written for linprog that you'd like me to test (beyond those in the unit tests), please let me know. Matt -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... URL: From gael.varoquaux at normalesup.org Wed Feb 22 02:07:45 2017 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Wed, 22 Feb 2017 08:07:45 +0100 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? 
In-Reply-To: References: Message-ID: <20170222070745.GI3818748@phare.normalesup.org> > On Tue, Feb 14, 2017 at 1:15 AM, Ralf Gommers wrote: > There's an argument to be made for having a consistent set of well-known > algorithms in SciPy, but knowing the boundary of the scope here is helpful > especially if there are more powerful solutions like cvxopt that could > become easy to install soon. cvxopt is GPLv3 and scipy is BSD, so cvxopt cannot be a supplement of scipy for everybody. Ga?l From haberland at ucla.edu Wed Feb 22 19:44:41 2017 From: haberland at ucla.edu (Matt Haberland) Date: Wed, 22 Feb 2017 16:44:41 -0800 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: <20170222070745.GI3818748@phare.normalesup.org> References: <20170222070745.GI3818748@phare.normalesup.org> Message-ID: Thanks for pointing out the more restrictive license of cvxopt, Ga?l. Along those lines: lp_solve is Lesser GPL. GLPK is GPLv3 (as one would expect) COIN-OR is CPL. QSOPT is GPLv2.1+ or for research use one (user's choice). The rest of the codes are commercial, I believe. As mentioned in my last email, I wanted to compare the performance of linprog and the interior point code, which I'll refer to as linprog_ip. The benchmark problems used were scalable and typically randomly-generated. They were selected purely for their availability/ease of implementation (except for the Klee-Minty Hypercube, which is the bane of simplex solvers. It is possible that many of these tests are particularly challenging to simplex solvers, but the only one I know to punish simplex is the Klee-Minty hypercube.) I'd love to test with the more practical NETLIB benchmarks but these are in MPS format and I haven't looked into how to convert yet. The results are available at: https://www.dropbox.com/s/8vfye61kvn9h3zg/linprog_benchmarks.pdf?dl=0 and details about the tests are in the postscript. 
In summary, linprog_ip is significantly faster for all but the smallest problems (those that are solved in hundredths of a second), and is more reliable at finding solutions when they exist.. As problem size increases, the performance gap tends to increase. For sparse problems, the ability of linprog_ip to leverage scipy.sparse linear algebra yields great improvement. For certain types of problems, linprog has difficulty finding a starting point, and takes longer to fail than linprog_ip does to return the optimal solution. Besides the test results, please also see all open linprog bug reports: 5400 , 6139 , 6690 , 7044 . linprog_ip suffers from none of these. Of course, it also passes all the linprog unit tests and many more I have written. I attribute the success of linprog_ip to the algorithm it implements and the suitability of the algorithm to NumPy/SciPy. For instance, for lpgen_2D with 60 constraints and 60 variables, linprog reports about 140 iterations whereas linprog_ip finishes in 9. Each of linprog's iterations has loops that do one elementary row operations per loop iteration, whereas at the heart of a linprog_ip iteration is the Cholesky factorization of a matrix and back/forward substitution for four different right hand sides (all LAPACK via scipy.linalg.solve). Consequently, linprog_ip spends larger chunks of time in compiled code with far fewer loops/function calls in Python. Please let me know if you'd like to see the interior point code included in SciPy, and I'll integrate with linprog and figure out the submission process. (If you are familiar with compiling from source on Windows, please shoot me an email. After pushing through a few roadblocks, I'm stuck with a very vague error message.) Matt ----- All tests were performed on a Dell XPS 13, IntelCore i5-4200 CPU at 1.6-2.3 GHz, 8GB RAM, and Windows 10 64-bit. Tests were timed using line_profiler ( kernprof). 
For randomized problems, each of N problems of arbitrary size (m constraints, n variables) was generated and solved by linprog and linprog_ip in either (random) order. For non-random problems, the same problem of a given dimension was solved by linprog and linprog_ip in either (random) order N times. Unless otherwise noted, the consistency of the solutions returned by linprog and linprog_ip was checked by comparing the objective function value using numpy.allclose with default tolerances. Times reported are the average per hit (call to linprog or linprog_ip) in units of 4.46E-07 s and also as normalized by the faster of the two times (1 is better; the other number indicates how many times longer it took). 1. lpgen_2D, the randomized test case currently part of the linprog unit tests. *Note that in these tests there are actually m*n variables and m+n constraints. For example, in the last test there are 800 constraints and 160000 variables.* (Unfortunately, I realized after testing that I left np.random.seed(0) in the code, and thus the same problem was solved N times per test. However, unreported testing after fixing this yielded similar results.) 2. zionts1, a randomized test with inequality constraints originally used by Zionts to compare the performance of his criss-cross algorithm to the simplex. Occasionally, linprog fails to find a feasible initial point, even where a solution exists, as reported in Issue 6139. 3. zionts2, a randomized test with both equality and inequality constraints originally used by Zionts to compare the performance of his criss-cross algorithm to the simplex. For problems of moderate size, linprog fails to find a feasible initial point, even where a solution exists, as reported in Issue 6139. 4. bonates, a sparse, randomized test used in "Performance evaluation of a family of criss-cross algorithms for linear programming". Here, percentage of nonzeros (density) is reported as p (the sparsity is 1-p).
The solutions are typically infeasible or unbounded; the time reported is that required for the solver to reach this conclusion. linprog always terminates when it fails to find a feasible initial point. linprog_ip reports whether the problem is infeasible or unbounded. 5. magic_square, an LP-relaxed version of a binary integer linear program designed to find a magic square of size D. The cost vector is random; any feasible solution represents a magic square. The problem includes equality constraints and simple bounds (0, 1). As formulated, the problem's equality constraint matrix is rank deficient (in a non-obvious way), causing linprog to report success despite the fact that the returned solution is infeasible and the objective value is inconsistent with the solution, as reported in Issue 6690 . To enable performance comparison, I've lent linprog the use of linprog_ip's presolve routine, which removes redundant rows such that linprog is able to solve the problem correctly. 6. klee_minty , a problem that elicits the worst-case performance of Dantzig's original simplex method. Here, Bland's rule is used by linprog. (Its runtime is far worse with the default pivoting rule.) On Tue, Feb 21, 2017 at 11:07 PM, Gael Varoquaux < gael.varoquaux at normalesup.org> wrote: > > On Tue, Feb 14, 2017 at 1:15 AM, Ralf Gommers > wrote: > > There's an argument to be made for having a consistent set of > well-known > > algorithms in SciPy, but knowing the boundary of the scope here is > helpful > > especially if there are more powerful solutions like cvxopt that > could > > become easy to install soon. > > cvxopt is GPLv3 and scipy is BSD, so cvxopt cannot be a supplement of > scipy for everybody. 
> > Gaël > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... URL: From gael.varoquaux at normalesup.org Thu Feb 23 01:46:15 2017 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 23 Feb 2017 07:46:15 +0100 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: References: <20170222070745.GI3818748@phare.normalesup.org> Message-ID: <20170223064615.GR282619@phare.normalesup.org> > Please let me know if you'd like to see the interior point code included in > SciPy, I personally think that it would be a good thing, given that it is a fairly general and core set of problems. But I am not very active in scipy anymore, so it's more a comment from the peanut gallery than an educated opinion. Cheers, Gaël From pierre.haessig at crans.org Thu Feb 23 09:49:23 2017 From: pierre.haessig at crans.org (Pierre Haessig) Date: Thu, 23 Feb 2017 15:49:23 +0100 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: References: <20170222070745.GI3818748@phare.normalesup.org> Message-ID: <2ad3a609-d29d-205e-ccfe-5151c9202261@crans.org> Hi, On 23/02/2017 at 01:44, Matt Haberland wrote: > > Please let me know if you'd like to see the interior point code > included in SciPy, and I'll integrate with linprog and figure out the > submission process. (If you are familiar with compiling from source on > Windows, please shoot me an email. After pushing through a few > roadblocks, I'm stuck with a very vague error message.) I haven't looked at your detailed results, only your email.
Still, the fact you checked both the speed and the accuracy looks very convincing to me (I've been bitten by the lack of accuracy of Scipy linprog in the past). In terms of API, my only suggestion would be to keep one single linprog routine, with a switch (between IP and simplex). Given your demonstration of the superiority of your algo, it should be the default. best, Pierre From evgeny.burovskiy at gmail.com Fri Feb 24 09:36:51 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Fri, 24 Feb 2017 17:36:51 +0300 Subject: [SciPy-Dev] ANN: Scipy 0.19.0 release candidate 2 Message-ID: Hi, I'm pleased to announce the availability of the second release candidate for Scipy 0.19.0. It contains contributions from 121 people over the course of seven months. The main difference to the rc1 is several Windows specific issues that were fixed (special thanks to Christoph Gohlke). Please try this release and report any issues on Github tracker, https://github.com/scipy/scipy, or scipy-dev mailing list. Source tarballs and release notes are available from Github releases, https://github.com/scipy/scipy/releases/tag/v0.19.0rc2 We appreciate if you could both run self-tests on your hardware, and test your code against this release. If no issues are reported, this will graduate to Scipy 0.19.0 final on the 9th March 2017. Thanks to everyone who contributed to this release! Cheers, Evgeni -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA256 ========================== SciPy 0.19.0 Release Notes ========================== .. note:: Scipy 0.19.0 is not released yet! .. contents:: SciPy 0.19.0 is the culmination of X months of hard work. It contains many new features, numerous bug-fixes, improved test coverage and better documentation. There have been a number of deprecations and API changes in this release, which are documented below. All users are encouraged to upgrade to this release, as there are a large number of bug-fixes and optimizations. 
Moreover, our development attention will now shift to bug-fix releases on the 0.19.x branch, and on adding new features on the master branch. This release requires Python 2.7 or 3.4-3.6 and NumPy 1.8.2 or greater. Highlights of this release include: - - A unified foreign function interface layer, `scipy.LowLevelCallable`. - - Cython API for scalar, typed versions of the universal functions from the `scipy.special` module, via `cimport scipy.special.cython_special`. New features ============ Foreign function interface improvements - --------------------------------------- `scipy.LowLevelCallable` provides a new unified interface for wrapping low-level compiled callback functions in the Python space. It supports Cython imported "api" functions, ctypes function pointers, CFFI function pointers, ``PyCapsules``, Numba jitted functions and more. See `gh-6509 `_ for details. `scipy.linalg` improvements - --------------------------- The function `scipy.linalg.solve` obtained two more keywords ``assume_a`` and ``transposed``. The underlying LAPACK routines are replaced with "expert" versions and now can also be used to solve symmetric, hermitian and positive definite coefficient matrices. Moreover, ill-conditioned matrices now cause a warning to be emitted with the estimated condition number information. Old ``sym_pos`` keyword is kept for backwards compatibility reasons however it is identical to using ``assume_a='pos'``. Moreover, the ``debug`` keyword, which had no function but only printing the ``overwrite_`` values, is deprecated. The function `scipy.linalg.matrix_balance` was added to perform the so-called matrix balancing using the LAPACK xGEBAL routine family. This can be used to approximately equate the row and column norms through diagonal similarity transformations. The functions `scipy.linalg.solve_continuous_are` and `scipy.linalg.solve_discrete_are` have numerically more stable algorithms. 
These functions can also solve generalized algebraic matrix Riccati equations. Moreover, both gained a ``balanced`` keyword to turn balancing on and off. `scipy.spatial` improvements - ---------------------------- `scipy.spatial.SphericalVoronoi.sort_vertices_of_regions` has been re-written in Cython to improve performance. `scipy.spatial.SphericalVoronoi` can handle > 200 k points (at least 10 million) and has improved performance. The function `scipy.spatial.distance.directed_hausdorff` was added to calculate the directed Hausdorff distance. ``count_neighbors`` method of `scipy.spatial.cKDTree` gained an ability to perform weighted pair counting via the new keywords ``weights`` and ``cumulative``. See `gh-5647 `_ for details. `scipy.spatial.distance.pdist` and `scipy.spatial.distance.cdist` now support non-double custom metrics. `scipy.ndimage` improvements - ---------------------------- The callback function C API supports PyCapsules in Python 2.7 Multidimensional filters now allow having different extrapolation modes for different axes. `scipy.optimize` improvements - ----------------------------- The `scipy.optimize.basinhopping` global minimizer obtained a new keyword, `seed`, which can be used to seed the random number generator and obtain repeatable minimizations. The keyword `sigma` in `scipy.optimize.curve_fit` was overloaded to also accept the covariance matrix of errors in the data. `scipy.signal` improvements - --------------------------- The function `scipy.signal.correlate` and `scipy.signal.convolve` have a new optional parameter `method`. The default value of `auto` estimates the fastest of two computation methods, the direct approach and the Fourier transform approach. A new function has been added to choose the convolution/correlation method, `scipy.signal.choose_conv_method` which may be appropriate if convolutions or correlations are performed on many arrays of the same size. 
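For example, `choose_conv_method` can be called once and the result reused across many same-sized arrays (a minimal sketch based on the description above):

```python
import numpy as np
from scipy.signal import convolve, choose_conv_method

rng = np.random.default_rng(0)
a = rng.standard_normal(10000)
b = rng.standard_normal(100)

# Estimate the fastest method once, then reuse it for repeated convolutions.
method = choose_conv_method(a, b)      # returns 'direct' or 'fft'
out = convolve(a, b, method=method)    # full convolution: len(a) + len(b) - 1 samples
```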
New functions have been added to calculate complex short-time Fourier transforms of an input signal, and to invert the transform to recover the original signal: `scipy.signal.stft` and `scipy.signal.istft`. This implementation also fixes the previously incorrect output of `scipy.signal.spectrogram` when complex output data were requested. The function `scipy.signal.sosfreqz` was added to compute the frequency response from second-order sections. The function `scipy.signal.unit_impulse` was added to conveniently generate an impulse function. The function `scipy.signal.iirnotch` was added to design second-order IIR notch filters that can be used to remove a frequency component from a signal. The dual function `scipy.signal.iirpeak` was added to compute the coefficients of a second-order IIR peak (resonant) filter. The function `scipy.signal.minimum_phase` was added to convert linear-phase FIR filters to minimum phase. The functions `scipy.signal.upfirdn` and `scipy.signal.resample_poly` are now substantially faster when operating on some n-dimensional arrays when n > 1. The largest reduction in computation time is realized in cases where the size of the array is small (<1k samples or so) along the axis to be filtered. `scipy.fftpack` improvements - ---------------------------- Fast Fourier transform routines now accept `np.float16` inputs and upcast them to `np.float32`. Previously, they would raise an error. `scipy.cluster` improvements - ---------------------------- Methods ``"centroid"`` and ``"median"`` of `scipy.cluster.hierarchy.linkage` have been significantly sped up. Long-standing issues with using ``linkage`` on large input data (over 16 GB) have been resolved. `scipy.sparse` improvements - --------------------------- The functions `scipy.sparse.save_npz` and `scipy.sparse.load_npz` were added, providing simple serialization for some sparse formats.
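A minimal round-trip sketch of the new sparse serialization functions (file name and temporary directory are illustrative):

```python
import os
import tempfile
import scipy.sparse as sp

# Build a small random CSR matrix and round-trip it through save_npz/load_npz.
m = sp.random(50, 40, density=0.1, format='csr', random_state=0)
path = os.path.join(tempfile.mkdtemp(), 'matrix.npz')
sp.save_npz(path, m)
m2 = sp.load_npz(path)
assert (m - m2).nnz == 0  # identical sparsity pattern and values
```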
The `prune` method of classes `bsr_matrix`, `csc_matrix`, and `csr_matrix` was updated to reallocate backing arrays under certain conditions, reducing memory usage. The methods `argmin` and `argmax` were added to classes `coo_matrix`, `csc_matrix`, `csr_matrix`, and `bsr_matrix`. New function `scipy.sparse.csgraph.structural_rank` computes the structural rank of a graph with a given sparsity pattern. New function `scipy.sparse.linalg.spsolve_triangular` solves a sparse linear system with a triangular left-hand side matrix. `scipy.special` improvements - ---------------------------- Scalar, typed versions of universal functions from `scipy.special` are available in the Cython space via ``cimport`` from the new module `scipy.special.cython_special`. These scalar functions can be expected to be significantly faster than the universal functions for scalar arguments. See the `scipy.special` tutorial for details. Better control over special-function errors is offered by the functions `scipy.special.geterr` and `scipy.special.seterr` and the context manager `scipy.special.errstate`. The names of orthogonal polynomial root functions have been changed to be consistent with other functions relating to orthogonal polynomials. For example, `scipy.special.j_roots` has been renamed `scipy.special.roots_jacobi` for consistency with the related functions `scipy.special.jacobi` and `scipy.special.eval_jacobi`. To preserve back-compatibility the old names have been left as aliases. The Wright Omega function is implemented as `scipy.special.wrightomega`. `scipy.stats` improvements - -------------------------- The function `scipy.stats.weightedtau` was added. It provides a weighted version of Kendall's tau. New class `scipy.stats.multinomial` implements the multinomial distribution. New class `scipy.stats.rv_histogram` constructs a continuous univariate distribution with a piecewise linear CDF from a binned data sample. New class `scipy.stats.argus` implements the Argus distribution.
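A short sketch of the new multinomial distribution class (the numbers are illustrative):

```python
import numpy as np
from scipy.stats import multinomial

# Eight trials over three categories with probabilities 0.3, 0.2, 0.5.
rv = multinomial(8, [0.3, 0.2, 0.5])
mean = rv.mean()          # expected counts: n * p = [2.4, 1.6, 4.0]
p = rv.pmf([2, 2, 4])     # probability of observing counts (2, 2, 4), ~0.0945
```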
`scipy.interpolate` improvements - -------------------------------- New class `scipy.interpolate.BSpline` represents splines. ``BSpline`` objects contain knots and coefficients and can evaluate the spline. The format is consistent with FITPACK, so that one can do, for example:: >>> t, c, k = splrep(x, y, s=0) >>> spl = BSpline(t, c, k) >>> np.allclose(spl(x), y) ``spl*`` functions, `scipy.interpolate.splev`, `scipy.interpolate.splint`, `scipy.interpolate.splder` and `scipy.interpolate.splantider`, accept both ``BSpline`` objects and ``(t, c, k)`` tuples for backwards compatibility. For multidimensional splines, ``c.ndim > 1``, ``BSpline`` objects are consistent with piecewise polynomials, `scipy.interpolate.PPoly`. This means that ``BSpline`` objects are not immediately consistent with `scipy.interpolate.splprep`, and one *cannot* do ``>>> BSpline(*splprep([x, y])[0])``. Consult the `scipy.interpolate` test suite for examples of the precise equivalence. In new code, prefer using ``scipy.interpolate.BSpline`` objects instead of manipulating ``(t, c, k)`` tuples directly. New function `scipy.interpolate.make_interp_spline` constructs an interpolating spline given data points and boundary conditions. New function `scipy.interpolate.make_lsq_spline` constructs a least-squares spline approximation given data points. `scipy.integrate` improvements - ------------------------------ `scipy.integrate.fixed_quad` now supports vector-valued functions. Deprecated features =================== `scipy.interpolate.splmake`, `scipy.interpolate.spleval` and `scipy.interpolate.spline` are deprecated. The format used by `splmake/spleval` was inconsistent with `splrep/splev`, which was confusing to users. `scipy.special.errprint` is deprecated. Improved functionality is available in `scipy.special.seterr`. Calling `scipy.spatial.distance.pdist` or `scipy.spatial.distance.cdist` with arguments not needed by the chosen metric is deprecated. 
Also, metrics `"old_cosine"` and `"old_cos"` are deprecated. Backwards incompatible changes ============================== The deprecated ``scipy.weave`` submodule was removed. `scipy.spatial.distance.squareform` now returns arrays of the same dtype as the input, instead of always float64. `scipy.special.errprint` now returns a boolean. The function `scipy.signal.find_peaks_cwt` now returns an array instead of a list. `scipy.stats.kendalltau` now computes the correct p-value in case the input contains ties. The p-value is also identical to that computed by `scipy.stats.mstats.kendalltau` and by R. If the input does not contain ties there is no change w.r.t. the previous implementation. The function `scipy.linalg.block_diag` will not ignore zero-sized matrices anymore. Instead it will insert rows or columns of zeros of the appropriate size. See gh-4908 for more details. Other changes ============= SciPy wheels will now report their dependency on ``numpy`` on all platforms. This change was made because Numpy wheels are available, and because the pip upgrade behavior is finally changing for the better (use ``--upgrade-strategy=only-if-needed`` for ``pip >= 8.2``; that behavior will become the default in the next major version of ``pip``). Numerical values returned by `scipy.interpolate.interp1d` with ``kind="cubic"`` and ``"quadratic"`` may change relative to previous scipy versions. If your code depended on specific numeric values (i.e., on implementation details of the interpolators), you may want to double-check your results. Authors ======= * @endolith * Max Argus + * Hervé 
Audren * Alessandro Pietro Bardelli + * Michael Benfield + * Felix Berkenkamp * Matthew Brett * Per Brodtkorb * Evgeni Burovski * Pierre de Buyl * CJ Carey * Brandon Carter + * Tim Cera * Klesk Chonkin * Christian Häggström + * Luca Citi * Peadar Coyle + * Daniel da Silva + * Greg Dooper + * John Draper + * drlvk + * David Ellis + * Yu Feng * Baptiste Fontaine + * Jed Frey + * Siddhartha Gandhi + * GiggleLiu + * Wim Glenn + * Akash Goel + * Christoph Gohlke * Ralf Gommers * Alexander Goncearenco + * Richard Gowers + * Alex Griffing * Radoslaw Guzinski + * Charles Harris * Callum Jacob Hays + * Ian Henriksen * Randy Heydon + * Lindsey Hiltner + * Gerrit Holl + * Hiroki IKEDA + * jfinkels + * Mher Kazandjian + * Thomas Keck + * keuj6 + * Kornel Kielczewski + * Sergey B Kirpichev + * Vasily Kokorev + * Eric Larson * Denis Laxalde * Gregory R. Lee * Josh Lefler + * Julien Lhermitte + * Evan Limanto + * Nikolay Mayorov * Geordie McBain + * Josue Melka + * Matthieu Melot * michaelvmartin15 + * Surhud More + * Brett M. 
Morris + * Chris Mutel + * Paul Nation * Andrew Nelson * David Nicholson + * Aaron Nielsen + * Joel Nothman * nrnrk + * Juan Nunez-Iglesias * Mikhail Pak + * Gavin Parnaby + * Thomas Pingel + * Ilhan Polat + * Aman Pratik + * Sebastian Pucilowski * Ted Pudlik * puenka + * Eric Quintero * Tyler Reddy * Joscha Reimer * Antonio Horta Ribeiro + * Edward Richards + * Roman Ring + * Rafael Rossi + * Colm Ryan + * Sami Salonen + * Alvaro Sanchez-Gonzalez + * Johannes Schmitz * Kari Schoonbee * Yurii Shevchuk + * Jonathan Siebert + * Jonathan Tammo Siebert + * Scott Sievert + * Sourav Singh * Byron Smith + * Srikiran + * Samuel St-Jean + * Yoni Teitelbaum + * Bhavika Tekwani * Martin Thoma * timbalam + * Svend Vanderveken + * Sebastiano Vigna + * Aditya Vijaykumar + * Santi Villalba + * Ze Vinicius * Pauli Virtanen * Matteo Visconti * Yusuke Watanabe + * Warren Weckesser * Phillip Weinberg + * Nils Werner * Jakub Wilk * Josh Wilson * wirew0rm + * David Wolever + * Nathan Woods * ybeltukov + * G Young * Evgeny Zhurko + A total of 121 people contributed to this release. People with a "+" by their names contributed a patch for the first time. This list of names is automatically generated, and may not be fully complete. Issues closed for 0.19.0 - ------------------------ - - `#1767 `__: Function definitions in __fitpack.h should be moved. (Trac #1240) - - `#1774 `__: _kmeans chokes on large thresholds (Trac #1247) - - `#2089 `__: Integer overflows cause segfault in linkage function with large... - - `#2190 `__: Are odd-length window functions supposed to be always symmetrical?... - - `#2251 `__: solve_discrete_are in scipy.linalg does (sometimes) not solve... - - `#2580 `__: scipy.interpolate.UnivariateSpline (or a new superclass of it)... 
- - `#2592 `__: scipy.stats.anderson assumes gumbel_l - - `#3054 `__: scipy.linalg.eig does not handle infinite eigenvalues - - `#3160 `__: multinomial pmf / logpmf - - `#3904 `__: scipy.special.ellipj dn wrong values at quarter period - - `#4044 `__: Inconsistent code book initialization in kmeans - - `#4234 `__: scipy.signal.flattop documentation doesn't list a source for... - - `#4831 `__: Bugs in C code in __quadpack.h - - `#4908 `__: bug: unnessesary validity check for block dimension in scipy.sparse.block_diag - - `#4917 `__: BUG: indexing error for sparse matrix with ix_ - - `#4938 `__: Docs on extending ndimage need to be updated. - - `#5056 `__: sparse matrix element-wise multiplying dense matrix returns dense... - - `#5337 `__: Formula in documentation for correlate is wrong - - `#5537 `__: use OrderedDict in io.netcdf - - `#5750 `__: [doc] missing data index value in KDTree, cKDTree - - `#5755 `__: p-value computation in scipy.stats.kendalltau() in broken in... - - `#5757 `__: BUG: Incorrect complex output of signal.spectrogram - - `#5964 `__: ENH: expose scalar versions of scipy.special functions to cython - - `#6107 `__: scipy.cluster.hierarchy.single segmentation fault with 2**16... - - `#6278 `__: optimize.basinhopping should take a RandomState object - - `#6296 `__: InterpolatedUnivariateSpline: check_finite fails when w is unspecified - - `#6306 `__: Anderson-Darling bad results - - `#6314 `__: scipy.stats.kendaltau() p value not in agreement with R, SPSS... - - `#6340 `__: Curve_fit bounds and maxfev - - `#6377 `__: expm_multiply, complex matrices not working using start,stop,ect... - - `#6382 `__: optimize.differential_evolution stopping criterion has unintuitive... - - `#6391 `__: Global Benchmarking times out at 600s. - - `#6397 `__: mmwrite errors with large (but still 64-bit) integers - - `#6413 `__: scipy.stats.dirichlet computes multivariate gaussian differential... 
- - `#6428 `__: scipy.stats.mstats.mode modifies input - - `#6440 `__: Figure out ABI break policy for scipy.special Cython API - - `#6441 `__: Using Qhull for halfspace intersection : segfault - - `#6442 `__: scipy.spatial : In incremental mode volume is not recomputed - - `#6451 `__: Documentation for scipy.cluster.hierarchy.to_tree is confusing... - - `#6490 `__: interp1d (kind=zero) returns wrong value for rightmost interpolation... - - `#6521 `__: scipy.stats.entropy does *not* calculate the KL divergence - - `#6530 `__: scipy.stats.spearmanr unexpected NaN handling - - `#6541 `__: Test runner does not run scipy._lib/tests? - - `#6552 `__: BUG: misc.bytescale returns unexpected results when using cmin/cmax... - - `#6556 `__: RectSphereBivariateSpline(u, v, r) fails if min(v) >= pi - - `#6559 `__: Differential_evolution maxiter causing memory overflow - - `#6565 `__: Coverage of spectral functions could be improved - - `#6628 `__: Incorrect parameter name in binomial documentation - - `#6634 `__: Expose LAPACK's xGESVX family for linalg.solve ill-conditioned... - - `#6657 `__: Confusing documentation for `scipy.special.sph_harm` - - `#6676 `__: optimize: Incorrect size of Jacobian returned by `minimize(...,... - - `#6681 `__: add a new context manager to wrap `scipy.special.seterr` - - `#6700 `__: BUG: scipy.io.wavfile.read stays in infinite loop, warns on wav... - - `#6721 `__: scipy.special.chebyt(N) throw a 'TypeError' when N > 64 - - `#6727 `__: Documentation for scipy.stats.norm.fit is incorrect - - `#6764 `__: Documentation for scipy.spatial.Delaunay is partially incorrect - - `#6811 `__: scipy.spatial.SphericalVoronoi fails for large number of points - - `#6841 `__: spearmanr fails when nan_policy='omit' is set - - `#6869 `__: Currently in gaussian_kde, the logpdf function is calculated... - - `#6875 `__: SLSQP inconsistent handling of invalid bounds - - `#6876 `__: Python stopped working (Segfault?) with minimum/maximum filter... 
- - `#6889 `__: dblquad gives different results under scipy 0.17.1 and 0.18.1 - - `#6898 `__: BUG: dblquad ignores error tolerances - - `#6901 `__: Solving sparse linear systems in CSR format with complex values - - `#6903 `__: issue in spatial.distance.pdist docstring - - `#6917 `__: Problem in passing drop_rule to scipy.sparse.linalg.spilu - - `#6926 `__: signature mismatches for LowLevelCallable - - `#6961 `__: Scipy contains shebang pointing to /usr/bin/python and /bin/bash... - - `#6972 `__: BUG: special: `generate_ufuncs.py` is broken - - `#6984 `__: Assert raises test failure for test_ill_condition_warning - - `#6990 `__: BUG: sparse: Bad documentation of the `k` argument in `sparse.linalg.eigs` - - `#6991 `__: Division by zero in linregress() - - `#7011 `__: possible speed improvment in rv_continuous.fit() - - `#7015 `__: Test failure with Python 3.5 and numpy master Pull requests for 0.19.0 - ------------------------ - - `#2908 `__: Scipy 1.0 Roadmap - - `#3174 `__: add b-splines - - `#4606 `__: ENH: Add a unit impulse waveform function - - `#5608 `__: Adds keyword argument to choose faster convolution method - - `#5647 `__: ENH: Faster count_neighour in cKDTree / + weighted input data - - `#6021 `__: Netcdf append - - `#6058 `__: ENH: scipy.signal - Add stft and istft - - `#6059 `__: ENH: More accurate signal.freqresp for zpk systems - - `#6195 `__: ENH: Cython interface for special - - `#6234 `__: DOC: Fixed a typo in ward() help - - `#6261 `__: ENH: add docstring and clean up code for signal.normalize - - `#6270 `__: MAINT: special: add tests for cdflib - - `#6271 `__: Fix for scipy.cluster.hierarchy.is_isomorphic - - `#6273 `__: optimize: rewrite while loops as for loops - - `#6279 `__: MAINT: Bessel tweaks - - `#6291 `__: Fixes gh-6219: remove runtime warning from genextreme distribution - - `#6294 `__: STY: Some PEP8 and cleaning up imports in stats/_continuous_distns.py - - `#6297 `__: Clarify docs in misc/__init__.py - - `#6300 `__: ENH: sparse: 
Loosen input validation for `diags` with empty inputs - - `#6301 `__: BUG: standardizes check_finite behavior re optional weights,... - - `#6303 `__: Fixing example in _lazyselect docstring. - - `#6307 `__: MAINT: more improvements to gammainc/gammaincc - - `#6308 `__: Clarified documentation of hypergeometric distribution. - - `#6309 `__: BUG: stats: Improve calculation of the Anderson-Darling statistic. - - `#6315 `__: ENH: Descending order of x in PPoly - - `#6317 `__: ENH: stats: Add support for nan_policy to stats.median_test - - `#6321 `__: TST: fix a typo in test name - - `#6328 `__: ENH: sosfreqz - - `#6335 `__: Define LinregressResult outside of linregress - - `#6337 `__: In anderson test, added support for right skewed gumbel distribution. - - `#6341 `__: Accept several spellings for the curve_fit max number of function... - - `#6342 `__: DOC: cluster: clarify hierarchy.linkage usage - - `#6352 `__: DOC: removed brentq from its own 'see also' - - `#6362 `__: ENH: stats: Use explicit formulas for sf, logsf, etc in weibull... - - `#6369 `__: MAINT: special: add a comment to hyp0f1_complex - - `#6375 `__: Added the multinomial distribution. - - `#6387 `__: MAINT: special: improve accuracy of ellipj's `dn` at quarter... - - `#6388 `__: BenchmarkGlobal - getting it to work in Python3 - - `#6394 `__: ENH: scipy.sparse: add save and load functions for sparse matrices - - `#6400 `__: MAINT: moves global benchmark run from setup_cache to track_all - - `#6403 `__: ENH: seed kwd for basinhopping. Closes #6278 - - `#6404 `__: ENH: signal: added irrnotch and iirpeak functions. - - `#6406 `__: ENH: special: extend `sici`/`shichi` to complex arguments - - `#6407 `__: ENH: Window functions should not accept non-integer or negative... 
- - `#6408 `__: MAINT: _differentialevolution now uses _lib._util.check_random_state - - `#6427 `__: MAINT: Fix gmpy build & test that mpmath uses gmpy - - `#6439 `__: MAINT: ndimage: update callback function c api - - `#6443 `__: BUG: Fix volume computation in incremental mode - - `#6447 `__: Fixes issue #6413 - Minor documentation fix in the entropy function... - - `#6448 `__: ENH: Add halfspace mode to Qhull - - `#6449 `__: ENH: rtol and atol for differential_evolution termination fixes... - - `#6453 `__: DOC: Add some See Also links between similar functions - - `#6454 `__: DOC: linalg: clarify callable signature in `ordqz` - - `#6457 `__: ENH: spatial: enable non-double dtypes in squareform - - `#6459 `__: BUG: Complex matrices not handled correctly by expm_multiply... - - `#6465 `__: TST DOC Window docs, tests, etc. - - `#6469 `__: ENH: linalg: better handling of infinite eigenvalues in `eig`/`eigvals` - - `#6475 `__: DOC: calling interp1d/interp2d with NaNs is undefined - - `#6477 `__: Document magic numbers in optimize.py - - `#6481 `__: TST: Supress some warnings from test_windows - - `#6485 `__: DOC: spatial: correct typo in procrustes - - `#6487 `__: Fix Bray-Curtis formula in pdist docstring - - `#6493 `__: ENH: Add covariance functionality to scipy.optimize.curve_fit - - `#6494 `__: ENH: stats: Use log1p() to improve some calculations. - - `#6495 `__: BUG: Use MST algorithm instead of SLINK for single linkage clustering - - `#6497 `__: MRG: Add minimum_phase filter function - - `#6505 `__: reset scipy.signal.resample window shape to 1-D - - `#6507 `__: BUG: linkage: Raise exception if y contains non-finite elements - - `#6509 `__: ENH: _lib: add common machinery for low-level callback functions - - `#6520 `__: scipy.sparse.base.__mul__ non-numpy/scipy objects with 'shape'... 
- - `#6522 `__: Replace kl_div by rel_entr in entropy - - `#6524 `__: DOC: add next_fast_len to list of functions - - `#6527 `__: DOC: Release notes to reflect the new covariance feature in optimize.curve_fit - - `#6532 `__: ENH: Simplify _cos_win, document it, add symmetric/periodic arg - - `#6535 `__: MAINT: sparse.csgraph: updating old cython loops - - `#6540 `__: DOC: add to documentation of orthogonal polynomials - - `#6544 `__: TST: Ensure tests for scipy._lib are run by scipy.test() - - `#6546 `__: updated docstring of stats.linregress - - `#6553 `__: commited changes that I originally submitted for scipy.signal.cspline? - - `#6561 `__: BUG: modify signal.find_peaks_cwt() to return array and accept... - - `#6562 `__: DOC: Negative binomial distribution clarification - - `#6563 `__: MAINT: be more liberal in requiring numpy - - `#6567 `__: MAINT: use xrange for iteration in differential_evolution fixes... - - `#6572 `__: BUG: "sp.linalg.solve_discrete_are" fails for random data - - `#6578 `__: BUG: misc: allow both cmin/cmax and low/high params in bytescale - - `#6581 `__: Fix some unfortunate typos - - `#6582 `__: MAINT: linalg: make handling of infinite eigenvalues in `ordqz`... - - `#6585 `__: DOC: interpolate: correct seealso links to ndimage - - `#6588 `__: Update docstring of scipy.spatial.distance_matrix - - `#6592 `__: DOC: Replace 'first' by 'smallest' in mode - - `#6593 `__: MAINT: remove scipy.weave submodule - - `#6594 `__: DOC: distance.squareform: fix html docs, add note about dtype... - - `#6598 `__: [DOC] Fix incorrect error message in medfilt2d - - `#6599 `__: MAINT: linalg: turn a `solve_discrete_are` test back on - - `#6600 `__: DOC: Add SOS goals to roadmap - - `#6601 `__: DEP: Raise minimum numpy version to 1.8.2 - - `#6605 `__: MAINT: 'new' module is deprecated, don't use it - - `#6607 `__: DOC: add note on change in wheel dependency on numpy and pip... 
- - `#6609 `__: Fixes #6602 - Typo in docs - - `#6616 `__: ENH: generalization of continuous and discrete Riccati solvers... - - `#6621 `__: DOC: improve cluster.hierarchy docstrings. - - `#6623 `__: CS matrix prune method should copy data from large unpruned arrays - - `#6625 `__: DOC: special: complete documentation of `eval_*` functions - - `#6626 `__: TST: special: silence some deprecation warnings - - `#6631 `__: fix parameter name doc for discrete distributions - - `#6632 `__: MAINT: stats: change some instances of `special` to `sc` - - `#6633 `__: MAINT: refguide: py2k long integers are equal to py3k integers - - `#6638 `__: MAINT: change type declaration in cluster.linkage, prevent overflow - - `#6640 `__: BUG: fix issue with duplicate values used in cluster.vq.kmeans - - `#6641 `__: BUG: fix corner case in cluster.vq.kmeans for large thresholds - - `#6643 `__: MAINT: clean up truncation modes of dendrogram - - `#6645 `__: MAINT: special: rename `*_roots` functions - - `#6646 `__: MAINT: clean up mpmath imports - - `#6647 `__: DOC: add sqrt to Mahalanobis description for pdist - - `#6648 `__: DOC: special: add a section on `cython_special` to the tutorial - - `#6649 `__: ENH: Added scipy.spatial.distance.directed_hausdorff - - `#6650 `__: DOC: add Sphinx roles for DOI and arXiv links - - `#6651 `__: BUG: mstats: make sure mode(..., None) does not modify its input - - `#6652 `__: DOC: special: add section to tutorial on functions not in special - - `#6653 `__: ENH: special: add the Wright Omega function - - `#6656 `__: ENH: don't coerce input to double with custom metric in cdist... 
- - `#6658 `__: Faster/shorter code for computation of discordances - - `#6659 `__: DOC: special: make __init__ summaries and html summaries match - - `#6661 `__: general.rst: Fix a typo - - `#6664 `__: TST: Spectral functions' window correction factor - - `#6665 `__: [DOC] Conditions on v in RectSphereBivariateSpline - - `#6668 `__: DOC: Mention negative masses for center of mass - - `#6675 `__: MAINT: special: remove outdated README - - `#6677 `__: BUG: Fixes computation of p-values. - - `#6679 `__: BUG: optimize: return correct Jacobian for method 'SLSQP' in... - - `#6680 `__: ENH: Add structural rank to sparse.csgraph - - `#6686 `__: TST: Added Airspeed Velocity benchmarks for SphericalVoronoi - - `#6687 `__: DOC: add section "deciding on new features" to developer guide. - - `#6691 `__: ENH: Clearer error when fmin_slsqp obj doesn't return scalar - - `#6702 `__: TST: Added airspeed velocity benchmarks for scipy.spatial.distance.cdist - - `#6707 `__: TST: interpolate: test fitpack wrappers, not _impl - - `#6709 `__: TST: fix a number of test failures on 32-bit systems - - `#6711 `__: MAINT: move function definitions from __fitpack.h to _fitpackmodule.c - - `#6712 `__: MAINT: clean up wishlist in stats.morestats, and copyright statement. - - `#6715 `__: DOC: update the release notes with BSpline et al. - - `#6716 `__: MAINT: scipy.io.wavfile: No infinite loop when trying to read... - - `#6717 `__: some style cleanup - - `#6723 `__: BUG: special: cast to float before in-place multiplication in... - - `#6726 `__: address performance regressions in interp1d - - `#6728 `__: DOC: made code examples in `integrate` tutorial copy-pasteable - - `#6731 `__: DOC: scipy.optimize: Added an example for wrapping complex-valued... 
- - `#6732 `__: MAINT: cython_special: remove `errprint` - - `#6733 `__: MAINT: special: fix some pyflakes warnings - - `#6734 `__: DOC: sparse.linalg: fixed matrix description in `bicgstab` doc - - `#6737 `__: BLD: update `cythonize.py` to detect changes in pxi files - - `#6740 `__: DOC: special: some small fixes to docstrings - - `#6741 `__: MAINT: remove dead code in interpolate.py - - `#6742 `__: BUG: fix ``linalg.block_diag`` to support zero-sized matrices. - - `#6744 `__: ENH: interpolate: make PPoly.from_spline accept BSpline objects - - `#6746 `__: DOC: special: clarify use of Condon-Shortley phase in `sph_harm`/`lpmv` - - `#6750 `__: ENH: sparse: avoid densification on broadcasted elem-wise mult - - `#6751 `__: sinm doc explained cosm - - `#6753 `__: ENH: special: allow for more fine-tuned error handling - - `#6759 `__: Move logsumexp and pade from scipy.misc to scipy.special and... - - `#6761 `__: ENH: argmax and argmin methods for sparse matrices - - `#6762 `__: DOC: Improve docstrings of sparse matrices - - `#6763 `__: ENH: Weighted tau - - `#6768 `__: ENH: cythonized spherical Voronoi region polygon vertex sorting - - `#6770 `__: Correction of Delaunay class' documentation - - `#6775 `__: ENH: Integrating LAPACK "expert" routines with conditioning warnings... - - `#6776 `__: MAINT: Removing the trivial f2py warnings - - `#6777 `__: DOC: Update rv_continuous.fit doc. - - `#6778 `__: MAINT: cluster.hierarchy: Improved wording of error msgs - - `#6786 `__: BLD: increase minimum Cython version to 0.23.4 - - `#6787 `__: DOC: expand on ``linalg.block_diag`` changes in 0.19.0 release... - - `#6789 `__: ENH: Add further documentation for norm.fit - - `#6790 `__: MAINT: Fix a potential problem in nn_chain linkage algorithm - - `#6791 `__: DOC: Add examples to scipy.ndimage.fourier - - `#6792 `__: DOC: fix some numpydoc / Sphinx issues. 
- - `#6793 `__: MAINT: fix circular import after moving functions out of misc - - `#6796 `__: TST: test importing each submodule. Regression test for gh-6793. - - `#6799 `__: ENH: stats: Argus distribution - - `#6801 `__: ENH: stats: Histogram distribution - - `#6803 `__: TST: make sure tests for ``_build_utils`` are run. - - `#6804 `__: MAINT: more fixes in `loggamma` - - `#6806 `__: ENH: Faster linkage for 'centroid' and 'median' methods - - `#6810 `__: ENH: speed up upfirdn and resample_poly for n-dimensional arrays - - `#6812 `__: TST: Added ConvexHull asv benchmark code - - `#6814 `__: ENH: Different extrapolation modes for different dimensions in... - - `#6826 `__: Signal spectral window default fix - - `#6828 `__: BUG: SphericalVoronoi Space Complexity (Fixes #6811) - - `#6830 `__: RealData docstring correction - - `#6834 `__: DOC: Added reference for skewtest function. See #6829 - - `#6836 `__: DOC: Added mode='mirror' in the docstring for the functions accepting... - - `#6838 `__: MAINT: sparse: start removing old BSR methods - - `#6844 `__: handle incompatible dimensions when input is not an ndarray in... - - `#6847 `__: Added maxiter to golden search. - - `#6850 `__: BUG: added check for optional param scipy.stats.spearmanr - - `#6858 `__: MAINT: Removing redundant tests - - `#6861 `__: DEP: Fix escape sequences deprecated in Python 3.6. - - `#6862 `__: DOC: dx should be float, not int - - `#6863 `__: updated documentation curve_fit - - `#6866 `__: DOC : added some documentation to j1 referring to spherical_jn - - `#6867 `__: DOC: cdist move long examples list into Notes section - - `#6868 `__: BUG: Make stats.mode return a ModeResult namedtuple on empty... - - `#6871 `__: Corrected documentation. 
- - `#6874 `__: ENH: gaussian_kde.logpdf based on logsumexp - - `#6877 `__: BUG: ndimage: guard against footprints of all zeros - - `#6881 `__: python 3.6 - - `#6885 `__: Vectorized integrate.fixed_quad - - `#6886 `__: fixed typo - - `#6891 `__: TST: fix failures for linalg.dare/care due to tightened test... - - `#6892 `__: DOC: fix a bunch of Sphinx errors. - - `#6894 `__: TST: Added asv benchmarks for scipy.spatial.Voronoi - - `#6908 `__: BUG: Fix return dtype for complex input in spsolve - - `#6909 `__: ENH: fftpack: use float32 routines for float16 inputs. - - `#6911 `__: added min/max support to binned_statistic - - `#6913 `__: Fix 6875: SLSQP raise ValueError for all invalid bounds. - - `#6914 `__: DOCS: GH6903 updating docs of Spatial.distance.pdist - - `#6916 `__: MAINT: fix some issues for 32-bit Python - - `#6924 `__: BLD: update Bento build for scipy.LowLevelCallable - - `#6932 `__: ENH: Use OrderedDict in io.netcdf. Closes gh-5537 - - `#6933 `__: BUG: fix LowLevelCallable issue on 32-bit Python. - - `#6936 `__: BUG: sparse: handle size-1 2D indexes correctly - - `#6938 `__: TST: fix test failures in special on 32-bit Python. - - `#6939 `__: Added attributes list to cKDTree docstring - - `#6940 `__: improve efficiency of dok_matrix.tocoo - - `#6942 `__: DOC: add link to liac-arff package in the io.arff docstring. - - `#6943 `__: MAINT: Docstring fixes and an additional test for linalg.solve - - `#6944 `__: DOC: Add example of odeint with a banded Jacobian to the integrate... 
- - `#6946 `__: ENH: hypergeom.logpmf in terms of betaln - - `#6947 `__: TST: speedup distance tests - - `#6948 `__: DEP: Deprecate the keyword "debug" from linalg.solve - - `#6950 `__: BUG: Correctly treat large integers in MMIO (fixes #6397) - - `#6952 `__: ENH: Minor user-friendliness cleanup in LowLevelCallable - - `#6956 `__: DOC: improve description of 'output' keyword for convolve - - `#6957 `__: ENH more informative error in sparse.bmat - - `#6962 `__: Shebang fixes - - `#6964 `__: DOC: note argmin/argmax addition - - `#6965 `__: BUG: Fix issues passing error tolerances in dblquad and tplquad. - - `#6971 `__: fix the docstring of signaltools.correlate - - `#6973 `__: Silence expected numpy warnings in scipy.ndimage.interpolation.zoom() - - `#6975 `__: BUG: special: fix regex in `generate_ufuncs.py` - - `#6976 `__: Update docstring for griddata - - `#6978 `__: Avoid division by zero in zoom factor calculation - - `#6979 `__: BUG: ARE solvers did not check the generalized case carefully - - `#6985 `__: ENH: sparse: add scipy.sparse.linalg.spsolve_triangular - - `#6994 `__: MAINT: spatial: updates to plotting utils - - `#6995 `__: DOC: Bad documentation of k in sparse.linalg.eigs See #6990 - - `#6997 `__: TST: Changed the test with a less singular example - - `#7000 `__: DOC: clarify interp1d 'zero' argument - - `#7007 `__: BUG: Fix division by zero in linregress() for 2 data points - - `#7009 `__: BUG: Fix problem in passing drop_rule to scipy.sparse.linalg.spilu - - `#7012 `__: speed improvment in _distn_infrastructure.py - - `#7014 `__: Fix Typo: add a single quotation mark to fix a slight typo - - `#7021 `__: MAINT: stats: use machine constants from np.finfo, not machar - - `#7026 `__: MAINT: update .mailmap - - `#7032 `__: Fix layout of rv_histogram docs - - `#7035 `__: DOC: update 0.19.0 release notes - - `#7036 `__: ENH: Add more boundary options to signal.stft - - `#7040 `__: TST: stats: skip too slow tests - - `#7042 `__: MAINT: sparse: speed up 
setdiag tests - - `#7043 `__: MAINT: refactory and code cleaning Xdist - - `#7053 `__: Fix msvc 9 and 10 compile errors - - `#7060 `__: DOC: updated release notes with #7043 and #6656 - - `#7062 `__: MAINT: Change defaut STFT boundary kwarg to "zeros" - - `#7064 `__: Fix ValueError: path is on mount 'X:', start on mount 'D:' on... - - `#7067 `__: TST: Fix PermissionError: [Errno 13] Permission denied on Windows - - `#7068 `__: TST: Fix UnboundLocalError: local variable 'data' referenced... - - `#7069 `__: Fix OverflowError: Python int too large to convert to C long... - - `#7071 `__: TST: silence RuntimeWarning for nan test of stats.spearmanr - - `#7072 `__: Fix OverflowError: Python int too large to convert to C long... - - `#7084 `__: TST: linalg: bump tolerance in test_falker Checksums ========= MD5 ~~~ a1d4a08cb0408fda554787e502a99d89 scipy-0.19.0rc2.tar.gz e5531359c06c6cccb544e0fcb728d832 scipy-0.19.0rc2.tar.xz aad1a08d3eee196a6d3f850b61d8aec5 scipy-0.19.0rc2.zip SHA256 ~~~~~~ a9f978f8cc569138d16f0b115476da8355fd149cbacefed7f283682d52540d05 scipy-0.19.0rc2.tar.gz 643d47551d16d965efe4f16613ee71ac6396481df649df3ad3a63848776cc084 scipy-0.19.0rc2.tar.xz e8f80a26a1089b35b7f410509fa703598a4cd74dd42636f22aa1b65085beaf9f scipy-0.19.0rc2.zip -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.11 (GNU/Linux) iQEcBAEBCAAGBQJYsEDoAAoJEIp0pQ0zQcu+UjAH/1gErT0trZxgNqtxEtIbQ0RP V9EIzQ3kj1QJHx7KKgheswHT4Ee6LYhA5MXzMAIDRx2TIMzE3ByBktK/So6+8U96 jCkceIHMwHgVnzxA8lkoB6AILHWMaaN/jNKk0u9Y6IKwUY4S1kQkYysAlMOw3gaG g1fOJzw1Bn4Yq0dEXw+tAAFRVPSAYKohJgc7U6AvW435VI3QmOqsCCbwh2ua1oN6 egaIBD0QBOGVYb7vlhOMiKPKKkIhEY2DVXnA7EVCiJMykq/dc+pReIL8YNZ+xI6F WUNv1KUAmVfKXU5MfSwTcxLLH89HJ7WfE8asCmVpgavhqdk0i7D4zVzN20uGe18= =f4kz -----END PGP SIGNATURE----- From matthew.brett at gmail.com Fri Feb 24 13:51:24 2017 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 24 Feb 2017 10:51:24 -0800 Subject: [SciPy-Dev] Drop i386 part of OSX wheel build? 
Message-ID: Hi, We're running into travis-ci time limits when building OSX wheels: https://travis-ci.org/MacPython/scipy-wheels/builds/204105101 The main reason for this is that, for OSX, we are building twice, first for i386, then for x86_64, and then combining these builds into a 'fat' binary to be compatible with either architecture. The builds take a long time, hence the timeout. I believe that the i386 architecture on OSX is pretty much unused now. For example, it looks like south of 0.5% of CPUs running OSX cannot do 64-bit: https://www.adium.im/sparkle/#cpu64bit I propose that we drop the i386 part of the scipy build, but continue to distribute apparently dual-arch wheels (architecture 'intel') so that the wheel installs into the Python.org and System Python builds [1]. This has the added advantage of making the wheel build process a lot simpler. Thoughts? Best, Matthew [1] https://bitbucket.org/pygame/pygame/issues/300/os-x-wheels-dmg-and-zip-builds-with-travis#comment-29275485 From olivier.grisel at ensta.org Fri Feb 24 14:50:11 2017 From: olivier.grisel at ensta.org (Olivier Grisel) Date: Fri, 24 Feb 2017 20:50:11 +0100 Subject: [SciPy-Dev] Drop i386 part of OSX wheel build? In-Reply-To: References: Message-ID: +1 From ralf.gommers at gmail.com Fri Feb 24 16:49:55 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 25 Feb 2017 10:49:55 +1300 Subject: [SciPy-Dev] Drop i386 part of OSX wheel build? In-Reply-To: References: Message-ID: On Sat, Feb 25, 2017 at 7:51 AM, Matthew Brett wrote: > Hi, > > We're running into travis-ci time limits when building OSX wheels: > > https://travis-ci.org/MacPython/scipy-wheels/builds/204105101 > > The main reason for this is that, for OSX, we are building twice, > first for i386, then for x86_64, and then combining these builds into > a 'fat' binary to be compatible with either architecture. The builds > take a long time, hence the timeout. 
> > I believe that the i386 architecture on OSX is pretty much unused now. > For example, it looks like south of 0.5% of CPUs running OSX cannot do > 64-bit: > > https://www.adium.im/sparkle/#cpu64bit > > I propose that we drop the i386 part of the scipy build, but continue > to distribute apparently dual-arch wheels (architecture 'intel') so > that the wheel installs into the Python.org and System Python builds > [1]. This has the added advantage of making the wheel build process a > lot simpler. > > Thoughts? > Sounds like a good idea to me. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Sat Feb 25 09:27:02 2017 From: cournape at gmail.com (David Cournapeau) Date: Sat, 25 Feb 2017 14:27:02 +0000 Subject: [SciPy-Dev] Drop i386 part of OSX wheel build? In-Reply-To: References: Message-ID: FWIW, we dropped osx 32 bits a few years ago at Enthought and nobody complained. On Fri, Feb 24, 2017 at 6:51 PM, Matthew Brett wrote: > Hi, > > We're running into travis-ci time limits when building OSX wheels: > > https://travis-ci.org/MacPython/scipy-wheels/builds/204105101 > > The main reason for this is that, for OSX, we are building twice, > first for i386, then for x86_64, and then combining these builds into > a 'fat' binary to be compatible with either architecture. The builds > take a long time, hence the timeout. > > I believe that the i386 architecture on OSX is pretty much unused now. > For example, it looks like south of 0.5% of CPUs running OSX cannot do > 64-bit: > > https://www.adium.im/sparkle/#cpu64bit > > I propose that we drop the i386 part of the scipy build, but continue > to distribute apparently dual-arch wheels (architecture 'intel') so > that the wheel installs into the Python.org and System Python builds > [1]. This has the added advantage of making the wheel build process a > lot simpler. > > Thoughts?
> > Best, > > Matthew > > [1] https://bitbucket.org/pygame/pygame/issues/300/os-x-wheels- > dmg-and-zip-builds-with-travis#comment-29275485 > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From karandesai281196 at gmail.com Mon Feb 27 01:44:28 2017 From: karandesai281196 at gmail.com (Karan Desai) Date: Mon, 27 Feb 2017 06:44:28 +0000 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. Message-ID: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Hello developers, I am Karan Desai, a third-year undergraduate from the Indian Institute of Technology, Roorkee. I went through the GSoC Ideas page on the Scipy github wiki, and I think that I'll be most comfortable with the idea "Make the SciPy test suite work with pytest". I believe that I match the required prerequisites, and I'll present some facts to support my belief. 1. I have completed my GSoC'16 with TARDIS. Their integration tests took more than a day to execute, hence Travis was not fit for that. I developed an Integration Testing Framework in pytest to schedule their execution on a local server when a commit occurs on Github. Also, it generated an automated HTML Report with comparison plots between various astronomical quantities in order to debug failed tests. 2. I have contributed to joblib, a multiprocessing library in Python. I have helped with the migration of the joblib testing framework from nose to pytest. Including review cycles, it took about 25 days to do so. Scipy and numpy will be bigger than that.
With sufficient knowledge of how pytest-flavored tests are designed, and some important differences between nose and pytest styles, I think I'll be very comfortable with this project. I saw recent mail threads in the archive and was happily surprised to see a few familiar names; Gael Varoquaux and Olivier Grisel might remember my contributions to joblib back in December. Relevant Links: 1. Github Profile: https://www.github.com/karandesai-96/ 2. Issues and Pull Request thread about Pytest Migration: https://www.github.com/joblib/joblib/issues/411 3. GSoC 2016 project: https://summerofcode.withgoogle.com/archive/2016/projects/4796129849901056/ I will be happy to hear from the community. Thanks. Karan Desai, Department of Electrical Engineering, IIT Roorkee. -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Mon Feb 27 11:13:48 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Mon, 27 Feb 2017 19:13:48 +0300 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: Hi Karan, Glad to see interest in this project! Your credentials look great. We'll need a small patch to numpy or scipy, too. A few scattered thoughts about the project: - This is not exactly a transition to pytest as such. The goal is to allow an alternative test runner, with minimal changes to the test suite. - At the moment, nose is an optional dependency of numpy and scipy. This should be the same with pytest: both numpy and scipy should remain importable without either nose or pytest. - All tests should be run with either test runner (this will need to be checked early on). - We should try to decouple the test runner as much as reasonable. pytest-specific things should be kept to a minimum. E.g. rewriting of assertions should be left out, I think.
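The runner-agnostic layout described in the points above can be sketched roughly as follows. This is an editorial illustration, not actual SciPy code: plain `test_*` functions and framework-free `Test*` classes that nose, pytest, and stdlib tooling all collect by naming convention, with no imports from either framework.

```python
# Illustrative only: a test module with no nose- or pytest-specific
# imports.  Both runners collect test_* functions and Test* classes by
# naming convention, so the module stays importable and runnable even
# when neither framework is installed.
import math

def test_sqrt_roundtrip():
    # Plain asserts work under any runner; pytest additionally rewrites
    # them to produce detailed failure messages.
    for x in (0.0, 1.0, 2.0, 1e6):
        assert math.isclose(math.sqrt(x) ** 2, x, rel_tol=1e-12, abs_tol=1e-12)

class TestCosine:
    # Per-test fixture by naming convention rather than by decorator:
    # pytest calls setup_method() before each test method, keeping the
    # class free of unittest.TestCase and of framework imports.
    def setup_method(self, method=None):
        self.angles = [0.0, math.pi / 2, math.pi]

    def test_bounds(self):
        for a in self.angles:
            assert -1.0 <= math.cos(a) <= 1.0
```

Either `nosetests` or `pytest` pointed at such a module would pick up both tests; nothing here depends on which runner is used.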
A few possibly sticky points: * both test suites are organized per submodule. This does not seem to mesh well with pytest's idea of collecting things in a conftest.py file: we either need to duplicate conftest.py files in each submodule/tests/ folder or there hopefully is some better magic. At least I did not manage it in a quick look I took a while ago, https://github.com/ev-br/numpy/commits/pytest * yield-based test generators. pytest says they are deprecated and are going to be removed in pytest 4. I don't know what the status of pytest 4 is --- is it something with a defined release date, some uncertain future horizon, or something in between? I guess we'll need to ask the pytest devs. Meanwhile, there is a switch to silence the deprecation warnings in pytest 3. * The solution should work for both numpy and scipy. It seems reasonable to hide the nose/pytest-specific stuff in the numpy.testing module. However scipy (and other downstream packages) have to support older versions of numpy, so anything new in numpy 1.13 is not going to be immediately useful downstream. It would also be good to hear from numpy devs (which I am not, so take what I'm saying about numpy with a spoonful of salt :-)) Cheers, Evgeni On Mon, Feb 27, 2017 at 9:44 AM, Karan Desai wrote: > Hello developers, > > I am Karan Desai, a third year undergraduate from Indian Institute of > technology, Roorkee. I went through the GSoC Ideas page on Scipy github > wiki, and I think that I'll be most comfortable with the idea "Make the > SciPy test suite work with pytest". > > I believe that I match the required prerequisites, and I'll present some > facts to support my belief. > > > 1. I have completed my GSoC'16 with TARDIS. Their integration tests > took more than a day to execute, hence Travis was not fit for that. I > developed an Integration Testing Framework in *pytest* to schedule > their execution on a local server when a commit occurs on Github.
Also, it > generated an automated HTML Report with comparison plots between various > astronomical quantities in order to debug failed tests. > > 2. I have contributed to *joblib*, a multiprocessing library in > Python. I have helped with the migration of joblib testing framework from > nose to pytest. Including review cycles, it took about 25 days to do so. > Scipy and numpy will be bigger than that. > > > With sufficient knowledge of how pytest flavored tests are designed, and > some important differences between nose and pytest styles, I think I'll be > very much comfortable with this project. > I saw recent mail threads in the archive and I was happily surprised to > see a few familiar names, Gael Varoquaux and Olivier Grisel might have > remembered my contributions in Joblib back in December? > > Relevant Links: > > 1. Github Profile: https://www.github.com/karandesai-96/ > 2. Issues and Pull Request thread about Pytest Migration: > https://www.github.com/joblib/joblib/issues/411 > > 3. GSoC 2016 project: https://summerofcode.withgoogle.com/archive/2016/ > projects/4796129849901056/ > > > I will be happy to hear from the community. > Thanks. > > Karan Desai, > Department of Electrical Engineering, > > IIT Roorkee. > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Mon Feb 27 11:44:29 2017 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 27 Feb 2017 09:44:29 -0700 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Mon, Feb 27, 2017 at 9:13 AM, Evgeni Burovski wrote: > Hi Karan, > > Glad to see interest in this project! > > Your credentials look great. We'll need a small patch to numpy or scipy, > too. 
> > A few scattered thoughts about the project: > > - This is not exactly a transition to pytest as such. The goal is to allow > an alternative test runner, with minimal changes to the test suite. > - At the moment, nose is an optional dependency of numpy and scipy. This > should be the same with pytest: both numpy and scipy should remain > importable without either nose or pytest. > - All tests should be run with either test runner (this will need to be > checked early on). > - We should try to decouple the test runner as much as reasonable. > pytest-specific things should be kept to a minimum. E.g. rewriting of > assertions should be left out, I think. > > A few possibly sticky points: > > * both test suites are organized per submodule. This does not seem to mesh > well with pytest's idea of collecting things in a conftest.py file: we > either need to duplicate conftest.py files in each submodule/tests/ folder > or there hopefully is some better magic. At least I did not manage to in a > quick look I took a while ago, https://github.com/ev-br/ > numpy/commits/pytest > > * yield-based test generators. pytest says they are deprecated and are > going to be removed in pytest 4. I don't know what is the status of pytest > 4 --- is it something with a defined release date or just some uncertain > future horizon or something in between. I guess we'll need to ask pytest > devs. Meanwhile, there is a switch to silence the deprecation warnings in > pytest 3. > > * The solution should work for both numpy and scipy. It seems reasonable > to hide the nose/pytest specific stuff in the numpy.testing module. However > scipy (and other downstream packages) have to support older versions of > numpy, so anything new in numpy 1.13 is not going to be immediately useful > downstream. 
> > It would also be good to hear from numpy devs (which I am not, so take > what I'm saying about numpy with a spoonful of salt :-)) > I think we need to redo the generator-based tests to not use generators. If we confine numpy/scipy testing to pytest we may look at parametrized tests as a replacement. Supporting both pytest and nose for downstream projects may be tricky. Note that the downstream projects already using pytest don't need us, so the number requiring the numpy testing apparatus may be limited. I don't know much about pytest, so suggestions from more knowledgeable folks are welcome. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Mon Feb 27 16:03:27 2017 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 27 Feb 2017 21:03:27 +0000 (UTC) Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: [clip] > A few scattered thoughts about the project: > > - This is not exactly a transition to pytest as such. The goal is to > allow an alternative test runner, with minimal changes to the test > suite. I'd suggest we should consider actually dropping nose, if we make the suite pytest compatible. I think retaining nose compatibility in the numpy/scipy test suites does not bring advantages --- if we are planning to retain it, I would like to understand why. Supporting two systems is more complicated, makes it harder to add new tests, and if the other one is not regularly used, it'll probably break often. I think we should avoid the temptation to build compatibility layers, or custom test runners --- there's already complication in numpy.testing that IIRC originates from working around issues in nose, and I think not many remember how it works in depth.
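The generator-based rewrite discussed in this thread can be sketched as below. This is an editorial illustration only; the function names and test cases are invented, not taken from the numpy or scipy suites.

```python
# Illustrative conversion of a nose-style yield test generator
# (deprecated in pytest 3) to pytest.mark.parametrize.
CASES = [(0, 0), (2, 4), (3, 9), (-4, 16)]

# nose style: the runner iterates the generator and runs each yielded
# (callable, *args) tuple as a separate test case.
def test_square_generator():
    def check(x, expected):
        assert x * x == expected
    for x, expected in CASES:
        yield check, x, expected

# pytest style: each tuple becomes an independently collected and
# reported test.  Guarded so the module still imports without pytest,
# mirroring the "optional dependency" constraint discussed above.
try:
    import pytest
except ImportError:
    pytest = None

if pytest is not None:
    @pytest.mark.parametrize("x, expected", CASES)
    def test_square_parametrized(x, expected):
        assert x * x == expected
```

The parametrized form reports each case separately just as the yield form did, but without relying on the deprecated generator protocol.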
I think the needs for Scipy and Numpy are not demanding, so sticking with "vanilla" pytest features and using only its native test runner would sound best. The main requirement is numerical assert functions, but AFAIK the ones in numpy.testing are pytest compatible (and independent of the nose machinery). For numpy.testing, retaining nose backward compatibility is important. However, for the scipy/numpy test suites this seems less important to me. Thoughts? Best, Pauli From evgeny.burovskiy at gmail.com Mon Feb 27 16:35:04 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Tue, 28 Feb 2017 00:35:04 +0300 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Tue, Feb 28, 2017 at 12:03 AM, Pauli Virtanen wrote: > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: > [clip] >> A few scattered thoughts about the project: >> >> - This is not exactly a transition to pytest as such. The goal is to >> allow an alternative test runner, with minimal changes to the test >> suite. > > I'd suggest we should consider actually dropping nose, if we make the > suite pytest compatible. > > I think retaining nose compatibility in the numpy/scipy test suites does > not bring advantages --- if we are planning to retain it, I would like to > understand why. Supporting two systems is more complicated, makes it > harder to add new tests, and if the other one is not regularly used, > it'll probably break often. > > I think we should avoid temptation to build compatibility layers, or > custom test runners --- there's already complication in numpy.testing > that IIRC originates from working around issues in nose, and I think not > many remember how it works in depth. > > I think the needs for Scipy and Numpy are not demanding, so sticking with > "vanilla" pytest features and using only its native test runner would > sound best. 
The main requirement is numerical assert functions, but AFAIK > the ones in numpy.testing are pytest compatible (and independent of the > nose machinery). > > For numpy.testing, retaining nose backward compatibility is important. > However, for the scipy/numpy test suites this seems less important to me. > > Thoughts? My reasoning was simple: there is already a compatibility layer of numpy.testing. As soon as numpy.testing works in a virtualenv which has pytest and does not have nose, the scipy test suite just works, modulo maybe a handful of nose-isms to clean up (a stray import from nose.tools, a setUp/tearDown pair in a class derived from object etc). To catch those, we anyway need to have two parallel installations, one with nose, and one without, at least temporarily. At this point we can either stop or do additional work of removing nose from numpy.testing, which is invisible for scipy --- because numpy.testing just shields it all. I certainly don't understand all details of numpy.testing. That said, ISTM that we only need to port a few bits: submodule.test() and np.testing.run_module_suite. This of course leaves two elephants in the room: - backwards compat for older numpy versions. A straightforward solution is to make a separate package, numpy-testing and mechanically import from it instead of numpy.testing (or make the latter import from the former). - yield based test generators. Evgeni From robert.kern at gmail.com Mon Feb 27 16:35:54 2017 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Feb 2017 13:35:54 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Mon, Feb 27, 2017 at 1:03 PM, Pauli Virtanen wrote: > > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: > [clip] > > A few scattered thoughts about the project: > > > > - This is not exactly a transition to pytest as such. 
The goal is to > > allow an alternative test runner, with minimal changes to the test > > suite. > > I'd suggest we should consider actually dropping nose, if we make the > suite pytest compatible. > > I think retaining nose compatibility in the numpy/scipy test suites does > not bring advantages --- if we are planning to retain it, I would like to > understand why. Supporting two systems is more complicated, makes it > harder to add new tests, and if the other one is not regularly used, > it'll probably break often. > > I think we should avoid temptation to build compatibility layers, or > custom test runners --- there's already complication in numpy.testing > that IIRC originates from working around issues in nose, and I think not > many remember how it works in depth. > > I think the needs for Scipy and Numpy are not demanding, so sticking with > "vanilla" pytest features and using only its native test runner would > sound best. The main requirement is numerical assert functions, but AFAIK > the ones in numpy.testing are pytest compatible (and independent of the > nose machinery). If we're migrating test runners, I'd rather drop all dependencies and target `python -m unittest discover` as the lowest common denominator rather than target a fuzzy "vanilla" subset of pytest in particular. This may well be the technical effect of what you are describing, but I think it's worth explicitly stating that focus. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Mon Feb 27 17:30:18 2017 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 27 Feb 2017 22:30:18 +0000 (UTC) Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: Tue, 28 Feb 2017 00:35:04 +0300, Evgeni Burovski kirjoitti: [clip] >> Thoughts? > > My reasoning was simple: there is already a compatibility layer of > numpy.testing. 
> > As soon as numpy.testing works in a virtualenv which has pytest and does > not have nose, the scipy test suite just works, modulo maybe a handful > of nose-isms to clean up (a stray import from nose.tools, a > setUp/tearDown pair in a class derived from object etc). To catch those, > we anyway need to have two parallel installations, one with nose, and > one without, at least temporarily. > > At this point we can either stop or do additional work of removing nose > from numpy.testing, which is invisible for scipy --- because > numpy.testing just shields it all. I'm actually advocating for abandoning numpy.testing (i.e. the test runner part --- the assert functions are fine). I think this is what other numpy-dependent projects that use pytest do. Making numpy.testing test runners work with pytest sounds like a complication that's not really necessary, and adding up more things to maintain. If we are going to port the test suite to work with pytest, I would go all the way. By "vanilla" pytest, I mean whatever pytest supports out of the box, without inventing a new framework with custom conventions. > This of course leaves two elephants in the room: > > - backwards compat for older numpy versions. A straightforward solution > is to make a separate package, numpy-testing and mechanically import > from it instead of numpy.testing (or make the latter import from the > former). > - yield based test generators. I think the first issue goes away if we abandon numpy.testing --- the nose-dependent parts will be retained unchanged and maybe deprecated in the long run, and the nose-independent parts kept. The second issue probably doesn't --- these tests would have to be rewritten with the corresponding pytest-isms (fixtures afaics) in any case.
Pauli From charlesr.harris at gmail.com Mon Feb 27 17:32:36 2017 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 27 Feb 2017 15:32:36 -0700 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Mon, Feb 27, 2017 at 2:35 PM, Robert Kern wrote: > On Mon, Feb 27, 2017 at 1:03 PM, Pauli Virtanen wrote: > > > > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: > > [clip] > > > A few scattered thoughts about the project: > > > > > > - This is not exactly a transition to pytest as such. The goal is to > > > allow an alternative test runner, with minimal changes to the test > > > suite. > > > > I'd suggest we should consider actually dropping nose, if we make the > > suite pytest compatible. > > > > I think retaining nose compatibility in the numpy/scipy test suites does > > not bring advantages --- if we are planning to retain it, I would like to > > understand why. Supporting two systems is more complicated, makes it > > harder to add new tests, and if the other one is not regularly used, > > it'll probably break often. > > > > I think we should avoid temptation to build compatibility layers, or > > custom test runners --- there's already complication in numpy.testing > > that IIRC originates from working around issues in nose, and I think not > > many remember how it works in depth. > > > > I think the needs for Scipy and Numpy are not demanding, so sticking with > > "vanilla" pytest features and using only its native test runner would > > sound best. The main requirement is numerical assert functions, but AFAIK > > the ones in numpy.testing are pytest compatible (and independent of the > > nose machinery). > > If we're migrating test runners, I'd rather drop all dependencies and > target `python -m unittest discover` as the lowest common denominator > rather than target a fuzzy "vanilla" subset of pytest in particular. 
> I'd certainly expect to make full use of the pytest features; why use it otherwise? There is a reason that both nose and pytest extended unittest. The only advantage I see to unittest is that it is less likely to become abandonware. > > This may well be the technical effect of what you are describing, but I > think it's worth explicitly stating that focus. > I don't think that's where we were headed. Can you make the case for unittest? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Mon Feb 27 17:49:44 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 27 Feb 2017 17:49:44 -0500 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Mon, Feb 27, 2017 at 5:30 PM, Pauli Virtanen wrote: > Tue, 28 Feb 2017 00:35:04 +0300, Evgeni Burovski kirjoitti: > [clip] > >> Thoughts? > > > > My reasoning was simple: there is already a compatibility layer of > > numpy.testing. > > > > As soon as numpy.testing works in a virtualenv which has pytest and does > > not have nose, the scipy test suite just works, modulo maybe a handful > > of nose-isms to clean up (a stray import from nose.tools, a > > setUp/tearDown pair in a class derived from object etc). To catch those, > > we anyway need to have two parallel installations, one with nose, and > > one without, at least temporarily. > > > > At this point we can either stop or do additional work of removing nose > > from numpy.testing, which is invisible for scipy --- because > > numpy.testing just shields it all. > > I'm actually advocating for abandoning numpy.testing (ie. the test runner > part --- the assert functions are fine). I think this is what other numpy- > dependent projects that use pytest do.
Making numpy.testing test runners > work with pytest sounds like a complication that's not really necessary, > and adding up more things to maintain. > Is there still a pytest equivalent to running the test from inside an interpreter?

    import scipy.stats
    scipy.stats.test()

I always liked this as a user (when I'm not already in a testing/debugging shell). Josef > > If we are going to port the test suite to work with pytest, I would go > all the way. By "vanilla" pytest, I mean whatever pytest supports out of > the box, without inventing a new framework with custom conventions. > > > This of course leaves two elephants in the room: > > > > - backwards compat for older numpy versions. A straightforward solution > > is to make a separate package, numpy-testing and mechanically import > > from it instead of numpy.testing (or make the latter import from the > > former). > > - yield based test generators. > > I think the first issue goes away if we abandon numpy.testing --- the > nose-dependent parts will be retained unchanged and maybe deprecated in the > long run, and the nose-independent parts kept. > > The second issue probably doesn't --- these tests would have to be > rewritten with the corresponding pytest-isms (fixtures afaics) in any > case. > > Pauli > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Mon Feb 27 18:08:34 2017 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Feb 2017 15:08:34 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration.
In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Mon, Feb 27, 2017 at 2:32 PM, Charles R Harris wrote: > > > > On Mon, Feb 27, 2017 at 2:35 PM, Robert Kern wrote: >> >> On Mon, Feb 27, 2017 at 1:03 PM, Pauli Virtanen wrote: >> > >> > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: >> > [clip] >> > > A few scattered thoughts about the project: >> > > >> > > - This is not exactly a transition to pytest as such. The goal is to >> > > allow an alternative test runner, with minimal changes to the test >> > > suite. >> > >> > I'd suggest we should consider actually dropping nose, if we make the >> > suite pytest compatible. >> > >> > I think retaining nose compatibility in the numpy/scipy test suites does >> > not bring advantages --- if we are planning to retain it, I would like to >> > understand why. Supporting two systems is more complicated, makes it >> > harder to add new tests, and if the other one is not regularly used, >> > it'll probably break often. >> > >> > I think we should avoid temptation to build compatibility layers, or >> > custom test runners --- there's already complication in numpy.testing >> > that IIRC originates from working around issues in nose, and I think not >> > many remember how it works in depth. >> > >> > I think the needs for Scipy and Numpy are not demanding, so sticking with >> > "vanilla" pytest features and using only its native test runner would >> > sound best. The main requirement is numerical assert functions, but AFAIK >> > the ones in numpy.testing are pytest compatible (and independent of the >> > nose machinery). >> >> If we're migrating test runners, I'd rather drop all dependencies and target `python -m unittest discover` as the lowest common denominator rather than target a fuzzy "vanilla" subset of pytest in particular. > > I'd certainly expect to make full use of the pytest features, why use it otherwise? There is a reason that both nose and pytest extended unittest. 
The only advantage I see to unittest is that it is less likely to become abandon ware. Well, that's definitely not what Pauli was advocating, and my response was intending to clarify his position. >> This may well be the technical effect of what you are describing, but I think it's worth explicitly stating that focus. > > I don't think that us where we were headed. Can you make the case for unittest? There are two places where nose and pytest provide features; they are each a *test runner* and also a *test framework*. As a test runner, they provide features for discovering tests (essentially, they let you easily point to a subset of tests to run) and reporting the test results in a variety of user-friendly ways. Test discovery functionality works with plain-old-TestCases, and pretty much any test runner works with the plain-old-TestCases. These features face the user who is *running* the tests. The other place they provide features is as test frameworks. This is where things like the generator tests and additional fixtures come in (i.e. setup_class, setup_module, etc.). These require you to write your tests in framework-specific ways. Those written for nose don't necessarily work with those written for pytest and vice versa. And any other plain test runner is right out. These features face the developer who is *writing* the tests. Test discovery was the main reason we started using nose over plain unittest. At that time (and incidentally, when nose and pytest began), `python -m unittest discover` didn't exist. To run unit tests, you had to manually collect the TestCases into a TestSuite and write your own logic for configuring any subsets from user input. It was a pain in the ass for large hierarchical packages like numpy and scipy. The test framework features of nose like generator tests were nice bonuses, but were not the main motivating factor. [Source: me; it was Jarrod and I who made the decision to use nose at a numpy sprint in Berkeley.] 
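The runner/framework split described above can be seen in a small example. This is an editorial sketch, not SciPy code: a test written against only the stdlib `unittest` framework, which any of the runners under discussion can collect and run unchanged.

```python
# Illustrative: the framework features come only from the stdlib, so
# the choice of runner -- `python -m unittest discover`, nosetests, or
# pytest -- is left entirely to the person running the tests.
import unittest

class TestClip(unittest.TestCase):
    def setUp(self):
        # stdlib fixture, honored by unittest, nose, and pytest alike
        self.values = [-2, 0, 5, 11]

    def test_clip_range(self):
        clipped = [min(max(v, 0), 10) for v in self.values]
        self.assertEqual(clipped, [0, 0, 5, 10])
```

Any of the three runners pointed at the containing directory would discover and run this test; only the selection options and reporting differ between them.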
If we're going to make a switch, I'd rather give up those framework features (we don't use all that many) in order to get agnosticism on the test runner front. Being agnostic to the test runner lets us optimize for multiple use cases simultaneously. That is, the best test runner for "run the whole suite and record every detail with coverage under time constraints under Travis CI" is often different from what a developer wants for "run just the one test module for the thing I'm working on and report to me only what breaks as fast as possible with minimal clutter". -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Feb 28 04:31:26 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 28 Feb 2017 22:31:26 +1300 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Tue, Feb 28, 2017 at 12:08 PM, Robert Kern wrote: > On Mon, Feb 27, 2017 at 2:32 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > > > > > > > > On Mon, Feb 27, 2017 at 2:35 PM, Robert Kern > wrote: > >> > >> On Mon, Feb 27, 2017 at 1:03 PM, Pauli Virtanen wrote: > >> > > >> > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: > >> > [clip] > >> > > A few scattered thoughts about the project: > >> > > > >> > > - This is not exactly a transition to pytest as such. The goal is to > >> > > allow an alternative test runner, with minimal changes to the test > >> > > suite. > >> > > >> > I'd suggest we should consider actually dropping nose, if we make the > >> > suite pytest compatible. > >> > > >> > I think retaining nose compatibility in the numpy/scipy test suites > does > >> > not bring advantages --- if we are planning to retain it, I would > like to > >> > understand why. 
Supporting two systems is more complicated, makes it > >> > harder to add new tests, and if the other one is not regularly used, > >> > it'll probably break often. > >> > > >> > I think we should avoid temptation to build compatibility layers, or > >> > custom test runners --- there's already complication in numpy.testing > >> > that IIRC originates from working around issues in nose, and I think > not > >> > many remember how it works in depth. > >> > > >> > I think the needs for Scipy and Numpy are not demanding, so sticking > with > >> > "vanilla" pytest features and using only its native test runner would > >> > sound best. The main requirement is numerical assert functions, but > AFAIK > >> > the ones in numpy.testing are pytest compatible (and independent of > the > >> > nose machinery). > >> > >> If we're migrating test runners, I'd rather drop all dependencies and > target `python -m unittest discover` as the lowest common denominator > rather than target a fuzzy "vanilla" subset of pytest in particular. > > > > I'd certainly expect to make full use of the pytest features, why use it > otherwise? There is a reason that both nose and pytest extended unittest. > The only advantage I see to unittest is that it is less likely to become > abandon ware. > > Well, that's definitely not what Pauli was advocating, and my response was > intending to clarify his position. > > >> This may well be the technical effect of what you are describing, but I > think it's worth explicitly stating that focus. > > > > I don't think that us where we were headed. Can you make the case for > unittest? > > There are two places where nose and pytest provide features; they are each > a *test runner* and also a *test framework*. As a test runner, they provide > features for discovering tests (essentially, they let you easily point to a > subset of tests to run) and reporting the test results in a variety of > user-friendly ways. 
Test discovery functionality works with > plain-old-TestCases, and pretty much any test runner works with the > plain-old-TestCases. These features face the user who is *running* the > tests. > > The other place they provide features is as test frameworks. This is where > things like the generator tests and additional fixtures come in (i.e. > setup_class, setup_module, etc.). These require you to write your tests in > framework-specific ways. Those written for nose don't necessarily work with > those written for pytest and vice versa. And any other plain test runner is > right out. These features face the developer who is *writing* the tests. > > Test discovery was the main reason we started using nose over plain > unittest. At that time (and incidentally, when nose and pytest began), > `python -m unittest discover` didn't exist. > As Josef said, "import scipy; scipy.test()" must work (always useful, but especially important on Windows). > To run unit tests, you had to manually collect the TestCases into a > TestSuite and write your own logic for configuring any subsets from user > input. It was a pain in the ass for large hierarchical packages like numpy > and scipy. The test framework features of nose like generator tests were > nice bonuses, but were not the main motivating factor. [Source: me; it was > Jarrod and I who made the decision to use nose at a numpy sprint in > Berkeley.] > > If we're going to make a switch, I'd rather give up those framework > features (we don't use all that many) in order to get agnosticism on the > test runner front. Being agnostic to the test runner lets us optimize for > multiple use cases simultaneously. That is, the best test runner for "run > the whole suite and record every detail with coverage under time > constraints under Travis CI" is often different from what a developer wants > for "run just the one test module for the thing I'm working on and report > to me only what breaks as fast as possible with minimal clutter". 
I don't believe we can live with just plain unittest; we want to move to a better maintained framework and gain some features in this GSoC project, not go backwards in functionality. Some other issues:

- unittest itself isn't going to gain any form of parametric testing any time soon (https://bugs.python.org/issue7897) and we use those extensively
- we need to keep test functions (i.e. not methods of a TestCase subclass) working
- we need something like numpy.testing.NoseTester to get us the right behavior for warnings, print versions/paths, etc.
- there are decorators like assert_raises_regex, setastest, etc. that we want
- we want to gain a test timer: https://github.com/scipy/scipy/issues/6989

Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From cimrman3 at ntc.zcu.cz Tue Feb 28 05:51:05 2017 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 28 Feb 2017 11:51:05 +0100 Subject: [SciPy-Dev] ANN: SfePy 2017.1 Message-ID: <8f48385c-0507-1820-c8d4-8758c3b14c11@ntc.zcu.cz> I am pleased to announce release 2017.1 of SfePy.

Description
-----------

SfePy (simple finite elements in Python) is software for solving systems of coupled partial differential equations by the finite element method or by isogeometric analysis (limited support). It is distributed under the new BSD license.

Home page: http://sfepy.org
Mailing list: http://groups.google.com/group/sfepy-devel
Git (source) repository, issue tracker: https://github.com/sfepy/sfepy

Highlights of this release
--------------------------

- spline-box parametrization of an arbitrary field
- conda-forge recipe (thanks to Daniel Wheeler)
- fixes for Python 3.6

For full release notes see http://docs.sfepy.org/doc/release_notes.html#id1 (rather long and technical).
Cheers, Robert Cimrman

---

Contributors to this release in alphabetical order:

Siwei Chen
Robert Cimrman
Jan Heczko
Vladimir Lukes
Matyas Novak

From hyzyla at gmail.com Tue Feb 28 06:38:00 2017 From: hyzyla at gmail.com (=?UTF-8?B?0ITQstCz0LXQvdGW0Lkg0JPQuNC30LjQu9Cw?=) Date: Tue, 28 Feb 2017 13:38:00 +0200 Subject: [SciPy-Dev] GSoC17 - Question about candidacy Message-ID: Dear developers, I'm Yevhenii Hyzyla and I'm an undergraduate student from Taras Shevchenko National University of Kyiv in Ukraine. I don't have any experience contributing to open source, but I want to start with SciPy. I have good knowledge of Python (some small pet projects) and sufficient C (a university course). Also, I have experience working with the SciPy library and the SciPy stack in general. After seeing the wiki page with ideas, I think I could develop scipy.diff, because I have good knowledge of derivatives and how to calculate them using numerical or symbolic methods (I took university courses in mathematical analysis and numerical analysis). And my hobbies are math and Python. 1) If I make some patches to scipy, what are the chances they will be accepted? 2) How many students apply to scipy each year? Could I get feedback about my candidacy from the developers? Thank you for reading! Yevhenii -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Tue Feb 28 14:01:25 2017 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Feb 2017 11:01:25 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration.
In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Tue, Feb 28, 2017 at 1:31 AM, Ralf Gommers wrote: > > On Tue, Feb 28, 2017 at 12:08 PM, Robert Kern wrote: >> >> On Mon, Feb 27, 2017 at 2:32 PM, Charles R Harris < charlesr.harris at gmail.com> wrote: >> > >> > >> > >> > On Mon, Feb 27, 2017 at 2:35 PM, Robert Kern wrote: >> >> >> >> On Mon, Feb 27, 2017 at 1:03 PM, Pauli Virtanen wrote: >> >> > >> >> > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: >> >> > [clip] >> >> > > A few scattered thoughts about the project: >> >> > > >> >> > > - This is not exactly a transition to pytest as such. The goal is to >> >> > > allow an alternative test runner, with minimal changes to the test >> >> > > suite. >> >> > >> >> > I'd suggest we should consider actually dropping nose, if we make the >> >> > suite pytest compatible. >> >> > >> >> > I think retaining nose compatibility in the numpy/scipy test suites does >> >> > not bring advantages --- if we are planning to retain it, I would like to >> >> > understand why. Supporting two systems is more complicated, makes it >> >> > harder to add new tests, and if the other one is not regularly used, >> >> > it'll probably break often. >> >> > >> >> > I think we should avoid temptation to build compatibility layers, or >> >> > custom test runners --- there's already complication in numpy.testing >> >> > that IIRC originates from working around issues in nose, and I think not >> >> > many remember how it works in depth. >> >> > >> >> > I think the needs for Scipy and Numpy are not demanding, so sticking with >> >> > "vanilla" pytest features and using only its native test runner would >> >> > sound best. The main requirement is numerical assert functions, but AFAIK >> >> > the ones in numpy.testing are pytest compatible (and independent of the >> >> > nose machinery). 
>> >> >> >> If we're migrating test runners, I'd rather drop all dependencies and target `python -m unittest discover` as the lowest common denominator rather than target a fuzzy "vanilla" subset of pytest in particular. >> > >> > I'd certainly expect to make full use of the pytest features, why use it otherwise? There is a reason that both nose and pytest extended unittest. The only advantage I see to unittest is that it is less likely to become abandon ware. >> >> Well, that's definitely not what Pauli was advocating, and my response was intending to clarify his position. >> >> >> This may well be the technical effect of what you are describing, but I think it's worth explicitly stating that focus. >> > >> > I don't think that us where we were headed. Can you make the case for unittest? >> >> There are two places where nose and pytest provide features; they are each a *test runner* and also a *test framework*. As a test runner, they provide features for discovering tests (essentially, they let you easily point to a subset of tests to run) and reporting the test results in a variety of user-friendly ways. Test discovery functionality works with plain-old-TestCases, and pretty much any test runner works with the plain-old-TestCases. These features face the user who is *running* the tests. >> >> The other place they provide features is as test frameworks. This is where things like the generator tests and additional fixtures come in (i.e. setup_class, setup_module, etc.). These require you to write your tests in framework-specific ways. Those written for nose don't necessarily work with those written for pytest and vice versa. And any other plain test runner is right out. These features face the developer who is *writing* the tests. >> >> Test discovery was the main reason we started using nose over plain unittest. At that time (and incidentally, when nose and pytest began), `python -m unittest discover` didn't exist. 
> > As Josef said, "import scipy; scipy.test()" must work (always useful, but especially important on Windows). Okay. And? >> To run unit tests, you had to manually collect the TestCases into a TestSuite and write your own logic for configuring any subsets from user input. It was a pain in the ass for large hierarchical packages like numpy and scipy. The test framework features of nose like generator tests were nice bonuses, but were not the main motivating factor. [Source: me; it was Jarrod and I who made the decision to use nose at a numpy sprint in Berkeley.] >> >> If we're going to make a switch, I'd rather give up those framework features (we don't use all that many) in order to get agnosticism on the test runner front. Being agnostic to the test runner lets us optimize for multiple use cases simultaneously. That is, the best test runner for "run the whole suite and record every detail with coverage under time constraints under Travis CI" is often different from what a developer wants for "run just the one test module for the thing I'm working on and report to me only what breaks as fast as possible with minimal clutter". > > I don't believe we can live with just plain unittest, we want to move to a better maintained framework and gain some features in this GSoC project, not go backwards in functionality. Ah, okay. The way the project abstract is written, this is not clear. It reads as if you want pytest as a mere additional possible test runner alongside nose. Instead, it seems that you want to migrate away from nose and to pytest, wholesale. I'm fine with that. But get Pauli on board with going beyond "vanilla" pytest and rewrite the abstract. Drop the bit about ideally running the test suite with both nose and pytest. If you're looking to gain functionality, there's no point, just maintenance headaches. If you want some runner-agnosticism, then unittest is the layer to do it on. 
Maintaining runner-agnosticism for just two runners (only one of which is maintained) is a wasted opportunity. That's my main contention. > Some other issues: > - unittest itself isn't going to gain any form of parametric testing any time soon (https://bugs.python.org/issue7897) and we use those extensively > - we need to keep test functions (i.e. not methods of a TestCase subclass) working > - we need something like numpy.testing.NoseTester to get us the right behavior for warnings, print versions/paths, etc. About the same work to accomplish with unittest as pytest. Despite the name, there isn't much nose-or-fancy-framework-specific in there. > - there's decorators like assert_raises_regex, setastest, etc. that we want Just for the record, things like assert_raises_regex are runner/framework-agnostic, and setastest doesn't matter if you are using TestCases. ;-) > - we want to gain a test timer: https://github.com/scipy/scipy/issues/6989 Also for the record, this is just a runner feature and has no real bearing on how the tests are written. A suite of plain-old-TestCases can be run by `pytest --durations` just fine. I'm gonna miss nose-progressive. https://pypi.python.org/pypi/nose-progressive/ -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Tue Feb 28 14:30:05 2017 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 28 Feb 2017 11:30:05 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. 
In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: Hi, On Tue, Feb 28, 2017 at 11:01 AM, Robert Kern wrote: > On Tue, Feb 28, 2017 at 1:31 AM, Ralf Gommers > wrote: >> >> On Tue, Feb 28, 2017 at 12:08 PM, Robert Kern >> wrote: >>> >>> On Mon, Feb 27, 2017 at 2:32 PM, Charles R Harris >>> wrote: >>> > >>> > >>> > >>> > On Mon, Feb 27, 2017 at 2:35 PM, Robert Kern >>> > wrote: >>> >> >>> >> On Mon, Feb 27, 2017 at 1:03 PM, Pauli Virtanen wrote: >>> >> > >>> >> > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: >>> >> > [clip] >>> >> > > A few scattered thoughts about the project: >>> >> > > >>> >> > > - This is not exactly a transition to pytest as such. The goal is >>> >> > > to >>> >> > > allow an alternative test runner, with minimal changes to the test >>> >> > > suite. >>> >> > >>> >> > I'd suggest we should consider actually dropping nose, if we make >>> >> > the >>> >> > suite pytest compatible. >>> >> > >>> >> > I think retaining nose compatibility in the numpy/scipy test suites >>> >> > does >>> >> > not bring advantages --- if we are planning to retain it, I would >>> >> > like to >>> >> > understand why. Supporting two systems is more complicated, makes it >>> >> > harder to add new tests, and if the other one is not regularly used, >>> >> > it'll probably break often. >>> >> > >>> >> > I think we should avoid temptation to build compatibility layers, or >>> >> > custom test runners --- there's already complication in >>> >> > numpy.testing >>> >> > that IIRC originates from working around issues in nose, and I think >>> >> > not >>> >> > many remember how it works in depth. >>> >> > >>> >> > I think the needs for Scipy and Numpy are not demanding, so sticking >>> >> > with >>> >> > "vanilla" pytest features and using only its native test runner >>> >> > would >>> >> > sound best. 
The main requirement is numerical assert functions, but >>> >> > AFAIK >>> >> > the ones in numpy.testing are pytest compatible (and independent of >>> >> > the >>> >> > nose machinery). >>> >> >>> >> If we're migrating test runners, I'd rather drop all dependencies and >>> >> target `python -m unittest discover` as the lowest common denominator rather >>> >> than target a fuzzy "vanilla" subset of pytest in particular. >>> > >>> > I'd certainly expect to make full use of the pytest features, why use >>> > it otherwise? There is a reason that both nose and pytest extended unittest. >>> > The only advantage I see to unittest is that it is less likely to become >>> > abandon ware. >>> >>> Well, that's definitely not what Pauli was advocating, and my response >>> was intending to clarify his position. >>> >>> >> This may well be the technical effect of what you are describing, but >>> >> I think it's worth explicitly stating that focus. >>> > >>> > I don't think that us where we were headed. Can you make the case for >>> > unittest? >>> >>> There are two places where nose and pytest provide features; they are >>> each a *test runner* and also a *test framework*. As a test runner, they >>> provide features for discovering tests (essentially, they let you easily >>> point to a subset of tests to run) and reporting the test results in a >>> variety of user-friendly ways. Test discovery functionality works with >>> plain-old-TestCases, and pretty much any test runner works with the >>> plain-old-TestCases. These features face the user who is *running* the >>> tests. >>> >>> The other place they provide features is as test frameworks. This is >>> where things like the generator tests and additional fixtures come in (i.e. >>> setup_class, setup_module, etc.). These require you to write your tests in >>> framework-specific ways. Those written for nose don't necessarily work with >>> those written for pytest and vice versa. And any other plain test runner is >>> right out. 
These features face the developer who is *writing* the tests. >>> >>> Test discovery was the main reason we started using nose over plain >>> unittest. At that time (and incidentally, when nose and pytest began), >>> `python -m unittest discover` didn't exist. >> >> As Josef said, "import scipy; scipy.test()" must work (always useful, but >> especially important on Windows). > > Okay. And? > >>> To run unit tests, you had to manually collect the TestCases into a >>> TestSuite and write your own logic for configuring any subsets from user >>> input. It was a pain in the ass for large hierarchical packages like numpy >>> and scipy. The test framework features of nose like generator tests were >>> nice bonuses, but were not the main motivating factor. [Source: me; it was >>> Jarrod and I who made the decision to use nose at a numpy sprint in >>> Berkeley.] >>> >>> If we're going to make a switch, I'd rather give up those framework >>> features (we don't use all that many) in order to get agnosticism on the >>> test runner front. Being agnostic to the test runner lets us optimize for >>> multiple use cases simultaneously. That is, the best test runner for "run >>> the whole suite and record every detail with coverage under time constraints >>> under Travis CI" is often different from what a developer wants for "run >>> just the one test module for the thing I'm working on and report to me only >>> what breaks as fast as possible with minimal clutter". >> >> I don't believe we can live with just plain unittest, we want to move to a >> better maintained framework and gain some features in this GSoC project, not >> go backwards in functionality. > > Ah, okay. The way the project abstract is written, this is not clear. It > reads as if you want pytest as a mere additional possible test runner > alongside nose. Instead, it seems that you want to migrate away from nose > and to pytest, wholesale. I'm fine with that. 
But get Pauli on board with > going beyond "vanilla" pytest and rewrite the abstract. Drop the bit about > ideally running the test suite with both nose and pytest. If you're looking > to gain functionality, there's no point, just maintenance headaches. If you > want some runner-agnosticism, then unittest is the layer to do it on. > Maintaining runner-agnosticism for just two runners (only one of which is > maintained) is a wasted opportunity. That's my main contention. That seems reasonable. Maybe we have a problem here, in that we might prefer a big-bang switch from nose to pytest, but the GSoC format has a strong requirement for gradual progress over many weeks. That may mean some serious drag from keeping a long-lived parallel development branch up to date. At least it will require some thought. Matplotlib and scikit-image just migrated to pytest. I've Cc'ed Thomas Caswell and Stefan van der Walt, hoping they have some experience to share about the migration. See you, Matthew From stefanv at berkeley.edu Tue Feb 28 14:42:27 2017 From: stefanv at berkeley.edu (Stefan van der Walt) Date: Tue, 28 Feb 2017 11:42:27 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: <1488310947.1417234.895910200.35C22A41@webmail.messagingengine.com> Hi all, On Tue, Feb 28, 2017, at 11:30, Matthew Brett wrote: > Matplotlib and scikit-image just migrated to pytest. I've Cc'ed > Thomas Caswell and Stefan van der Walt, hoping they have some > experience to share about the migration. We migrated scikit-image to pytest as the easiest option outside of nose (which I don't think is maintained any longer). Robert's suggested solution, of using only vanilla unittest seems ideal, if it can be achieved easily; pytest should be able to gather and execute those tests just fine. W.r.t. 
GSoC, this project is probably not suitable in scope (it's a weekend's worth of effort, maybe two). Best regards St&#233;fan From robert.kern at gmail.com Tue Feb 28 15:06:59 2017 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Feb 2017 12:06:59 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: <1488310947.1417234.895910200.35C22A41@webmail.messagingengine.com> References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> <1488310947.1417234.895910200.35C22A41@webmail.messagingengine.com> Message-ID: On Tue, Feb 28, 2017 at 11:42 AM, Stefan van der Walt wrote: > > Hi all, > > On Tue, Feb 28, 2017, at 11:30, Matthew Brett wrote: > > Matplotlib and scikit-image just migrated to pytest. I've Cc'ed > > Thomas Caswell and Stefan van der Walt, hoping they have some > > experience to share about the migration. > > We migrated scikit-image to pytest as the easiest option outside of nose > (which I don't think is maintained any longer). Robert's suggested > solution, of using only vanilla unittest seems ideal, if it can be > achieved easily; pytest should be able to gather and execute those tests > just fine. > > W.r.t. GSoC, this project is probably not suitable in scope (it's a > weekend's worth of effort, maybe two). Ensuring that the test suite runs under pytest doesn't take much work, sure, but numpy has a number of infrastructure pieces built around nose (e.g. scipy.test(), etc.) that would need to be reimplemented to use pytest instead. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefanv at berkeley.edu Tue Feb 28 15:11:08 2017 From: stefanv at berkeley.edu (Stefan van der Walt) Date: Tue, 28 Feb 2017 12:11:08 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration.
In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> <1488310947.1417234.895910200.35C22A41@webmail.messagingengine.com> Message-ID: <1488312668.1424392.895946736.43F9120C@webmail.messagingengine.com> On Tue, Feb 28, 2017, at 12:06, Robert Kern wrote: > Ensuring that the test suite runs under pytest doesn't take much work, > sure, but numpy has a number of infrastructure pieces built around > nose (e.g. scipy.test(), etc.) that would need to be reimplemented to > use pytest instead. Still, we're talking about three months: https://developers.google.com/open-source/gsoc/timeline Perhaps the project can have a related, extended goal defined to catch any slack time. St&#233;fan -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Tue Feb 28 15:16:20 2017 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Feb 2017 12:16:20 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: <1488312668.1424392.895946736.43F9120C@webmail.messagingengine.com> References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> <1488310947.1417234.895910200.35C22A41@webmail.messagingengine.com> <1488312668.1424392.895946736.43F9120C@webmail.messagingengine.com> Message-ID: On Tue, Feb 28, 2017 at 12:11 PM, Stefan van der Walt wrote: > > On Tue, Feb 28, 2017, at 12:06, Robert Kern wrote: > >> Ensuring that the test suite runs under pytest doesn't take much work, sure, but numpy has a number of infrastructure pieces built around nose (e.g. scipy.test(), etc.) that would need to be reimplemented to use pytest instead. > > Still, we're talking about three months: https://developers.google.com/open-source/gsoc/timeline > > Perhaps the project can have a related, extended goal defined to catch any slack time. Implement nose-progressive in pytest, so I'm not sad!
;-) https://pypi.python.org/pypi/nose-progressive/ -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefanv at berkeley.edu Tue Feb 28 15:21:07 2017 From: stefanv at berkeley.edu (Stefan van der Walt) Date: Tue, 28 Feb 2017 12:21:07 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> <1488310947.1417234.895910200.35C22A41@webmail.messagingengine.com> <1488312668.1424392.895946736.43F9120C@webmail.messagingengine.com> Message-ID: <1488313267.1426804.895957504.75C61206@webmail.messagingengine.com> On Tue, Feb 28, 2017, at 12:16, Robert Kern wrote: > On Tue, Feb 28, 2017 at 12:11 PM, Stefan van der Walt > wrote: > > > > On Tue, Feb 28, 2017, at 12:06, Robert Kern wrote: > > > >> Ensuring that the test suite runs under pytest doesn't take much > >> work, sure, but numpy has a number of infrastructure pieces built > >> around nose (e.g. scipy.test(), etc.) that would need to be > >> reimplemented to use pytest instead. > > > > Still, we're talking about three months: > > https://developers.google.com/open-source/gsoc/timeline > > > > Perhaps the project can have a related, extended goal defined to > > catch any slack time. > > Implement nose-progressive in pytest, so I'm not sad! ;-) > > https://pypi.python.org/pypi/nose-progressive/ That's an excellent suggestion (even if it were tongue in cheek, which I hope it weren't). pytest-sugar looks promising already, but we may be able to improve it: https://pivotfinland.com/pytest-sugar/img/video.gif St&#233;fan -------------- next part -------------- An HTML attachment was scrubbed... URL: From haberland at ucla.edu Tue Feb 28 18:01:56 2017 From: haberland at ucla.edu (Matt Haberland) Date: Tue, 28 Feb 2017 15:01:56 -0800 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down?
In-Reply-To: <2ad3a609-d29d-205e-ccfe-5151c9202261@crans.org> References: <20170222070745.GI3818748@phare.normalesup.org> <2ad3a609-d29d-205e-ccfe-5151c9202261@crans.org> Message-ID: Based on positive feedback, I've gone ahead and completed most of the finishing touches: documentation, PEP8 compliance, etc., for the "interior-point" method of linprog. As this is my first contribution to SciPy, I would appreciate a real-time conversation with somebody regarding some remaining questions. Would an experienced contributor, like whoever might review the forthcoming pull request, be willing to schedule a time to chat? Thanks! Matt

--------

P.S. Example questions:
- How strict does PEP8 compliance need to be? For instance, do I really need to replace lambda functions (defined within functions) with regular functions? (There are a few other things autopep8 couldn't fix that I want to ask about.)
- The code currently spits out warnings using the standard warnings module. Is that OK, or is there some other standard for SciPy? Does there need to be an option to disable these, or can we let the user suppress them otherwise if desired?
- I have several non-essential feature questions like "Do I need to implement a callback interface like the simplex method does, or can that be left for an update?" Some of these are technical, but each can be explained with about one sentence of background information. I'm just looking for opinions from anyone willing to think about it for 10 seconds.

P.P.S. Re: Pierre's comment: regarding lack of accuracy, you mean when linprog fails to find a solution when it exists, or reports that it has found a solution when it hasn't? Agreed, I think the interior point method would help fix that. And its presolve subroutine could help the simplex code avoid some of those problems.
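For concreteness, the problems under discussion are standard-form LPs of the kind below, with bounds x >= 0 implied by linprog's defaults. This is a generic illustrative example (not one of the problems from the bug reports above), solved with linprog's default method; the thread proposes exposing the new solver via the `method` keyword.

```python
import numpy as np
from scipy.optimize import linprog

# Minimize c @ x  subject to  A_ub @ x <= b_ub  and  x >= 0 (default bounds),
# i.e. maximize x0 + 2*x1 over the polytope {x0 + x1 <= 4, x0 + 3*x1 <= 6}.
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0],
                 [1.0, 3.0]])
b_ub = np.array([4.0, 6.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # default method
print(res.x, res.fun)  # optimum at the vertex x = [3, 1], with fun = -5
```

A simplex-type method lands exactly on that vertex; an interior-point method approaches it to within a user-specified tolerance, which is the accuracy trade-off discussed above.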
However, because the simplex method traverses vertices to numerical precision, its solutions (when it finds them) will typically be more "exact", whereas interior point methods terminate when a user-specified tolerance is met. Along those lines, interior point methods can return solutions at the boundary of the polytope but not at a vertex, and so they do not always identify an optimal basis, which some applications require. There are "crossover" methods to do this from an interior point solution but I haven't looked into those yet. For that reason, and to allow time for any bugs with the IP code to surface, should "simplex" remain the default, with the linprog documentation stating that "interior-point" could be faster? On Thu, Feb 23, 2017 at 6:49 AM, Pierre Haessig wrote: > Hi, > > Le 23/02/2017 &#224; 01:44, Matt Haberland a &#233;crit : > > > > Please let me know if you'd like to see the interior point code > > included in SciPy, and I'll integrate with linprog and figure out the > > submission process. (If you are familiar with compiling from source on > > Windows, please shoot me an email. After pushing through a few > > roadblocks, I'm stuck with a very vague error message.) > I haven't looked at your detailed results, only your email. Still, the > fact you checked both the speed and the accuracy looks very convincing > to me (I've been bitten by the lack of accuracy of Scipy linprog in the > past). > > In terms of API, my only suggestion would be to keep one single linprog > routine, with a switch (between IP and simplex). Given your > demonstration of the superiority of your algo, it should be the default.
> > best, > Pierre > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... URL: