From PeterBell10 at live.co.uk Fri Aug 2 15:24:34 2019 From: PeterBell10 at live.co.uk (Peter Bell) Date: Fri, 2 Aug 2019 19:24:34 +0000 Subject: [SciPy-Dev] Using OpenMP in SciPy Message-ID:

Hello all, I'd like to revisit a discussion started a while ago in gh-10239: Can we build and use OpenMP in a portable way? andyfaff's comment on the issue pointed out a number of portability issues that he's experienced that put him off OpenMP. Summarising:

1. macOS's Apple clang doesn't support OpenMP, and supporting OpenMP requires distributing a runtime with your library.
2. On macOS, the OpenMP version crashes if MKL numpy is installed. This is a general problem because several different OpenMP runtimes exist.
3. OpenMP doesn't play nicely with multiprocessing.Pool

To expand on that 3rd point, some OpenMP runtimes aren't fork-safe. Most notably, this includes gcc's libgomp. Upon entering the first OpenMP parallel region, the runtime initializes a thread pool which won't be rebuilt in the child after fork. This means that any parallel regions in the child will deadlock. Single-threaded OpenMP loops seem to be safe, though.

Eric Larson has also recently had a chat with an sklearn maintainer who has been using OpenMP in the wild for ~6 months. A few of the key take-aways from his discussion were:

* Packaging OpenMP code has many pitfalls but they think they've found a solution that works for them.
* They use joblib with the "loky" backend to avoid multiprocessing and the associated forking issues.
* They had issues related to incompatibility with macOS Accelerate for BLAS/LAPACK, but SciPy has already dropped support for Accelerate.
* Some macOS users still needed the option to compile the library without OpenMP, although this might have also been Accelerate-related.

So, given these issues, are we willing to use OpenMP in SciPy?

Peter
-------------- next part -------------- An HTML attachment was scrubbed...
URL:

From PeterBell10 at live.co.uk Fri Aug 2 15:38:41 2019 From: PeterBell10 at live.co.uk (Peter Bell) Date: Fri, 2 Aug 2019 19:38:41 +0000 Subject: [SciPy-Dev] Using OpenMP in SciPy In-Reply-To: References: Message-ID:

Apologies, the format stripping made my previous email quite hard to read, so here it is again in plain text.

Hello all, I'd like to revisit a discussion started a while ago in gh-10239: Can we build and use OpenMP in a portable way? andyfaff's comment on the issue pointed out a number of portability issues that he's experienced that put him off OpenMP. Summarising:

1. macOS's Apple clang doesn't support OpenMP, and supporting OpenMP requires distributing a runtime with your library.
2. On macOS, the OpenMP version crashes if MKL numpy is installed. This is a general problem because several different OpenMP runtimes exist.
3. OpenMP doesn't play nicely with multiprocessing.Pool

To expand on that third point, some OpenMP runtimes aren't fork-safe. Most notably, this includes gcc's libgomp. Upon entering the first OpenMP parallel region, the runtime initializes a thread pool which won't be rebuilt in the child after fork. This means that any parallel regions in the child will deadlock. Single-threaded OpenMP loops seem to be safe, though.

Eric Larson has also recently had a chat with an sklearn maintainer who has been using OpenMP in the wild for ~6 months. A few of the key take-aways from his discussion were:

* Packaging OpenMP code has many pitfalls but they think they've found a solution that works for them.
* They use joblib with the "loky" backend to avoid multiprocessing and the associated forking issues.
* They had issues related to incompatibility with macOS Accelerate for BLAS/LAPACK, but SciPy has already dropped support for Accelerate.
* Some macOS users still needed the option to compile the library without OpenMP, although this might have also been Accelerate-related.
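[Editorial aside: the fork-safety problem described above can be illustrated from Python with an at-fork hook, the Python-level analogue of the pthread_atfork idea raised later in the thread. This is a minimal sketch, assuming a POSIX platform where the "fork" start method is available; note that libgomp only reads OMP_NUM_THREADS at runtime start-up, so this is a best-effort guard, not a guaranteed fix for an already-initialized thread pool.]

```python
import multiprocessing
import os

def _limit_threads_in_child():
    # Best-effort guard: ask any OpenMP runtime initialized *after* the
    # fork to stay single-threaded. It cannot repair a libgomp pool that
    # was already initialized in the parent before forking.
    os.environ["OMP_NUM_THREADS"] = "1"

# Runs in every forked child, immediately after the fork (Python 3.7+).
os.register_at_fork(after_in_child=_limit_threads_in_child)

def report_threads(_):
    # Each worker reports what the hook set for it.
    return os.environ.get("OMP_NUM_THREADS")

if __name__ == "__main__":
    ctx = multiprocessing.get_context("fork")  # explicit, POSIX-only
    with ctx.Pool(2) as pool:
        print(pool.map(report_threads, range(2)))
```
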
So, given these issues, are we willing to use OpenMP in SciPy?

Peter

From pav at iki.fi Sat Aug 3 08:09:39 2019 From: pav at iki.fi (Pauli Virtanen) Date: Sat, 03 Aug 2019 15:09:39 +0300 Subject: [SciPy-Dev] Using OpenMP in SciPy In-Reply-To: References: Message-ID: <006effd5078e452ee85476665fc5d6f34a4c9f12.camel@iki.fi>

Hi,

On Fri, 2019-08-02 at 19:38 +0000, Peter Bell wrote: [clip]
> 3. OpenMP doesn't play nicely with multiprocessing.Pool
>
> To expand on that third point, some OpenMP runtimes aren't fork-safe.
> Most notably, this includes gcc's libgomp. Upon entering the first
> OpenMP parallel region, the runtime initializes a thread pool which
> won't be rebuilt in the child after fork. This means that any
> parallel regions in the child will deadlock. Single-threaded OpenMP
> loops seem to be safe, though.

At first sight, this looks like a showstopper. As I understand, there's no workaround (e.g. even with pthread_atfork setting a scipy-global flag that forces #threads to 1 in any subsequent calls, it would still freeze)? multiprocessing is one of the most common parallelization schemes with Python, and breaking that sounds quite painful.

I would be careful with shipping wheels built with OpenMP that break user code using multiprocessing. Environment flags might be used to mitigate, but they should probably fail safe and default to not using OpenMP. Since we don't have control over the user code, which has already been written, I think loky et al. are not really a solution.

Pauli

From josh.craig.wilson at gmail.com Sat Aug 3 20:21:49 2019 From: josh.craig.wilson at gmail.com (Joshua Wilson) Date: Sat, 3 Aug 2019 17:21:49 -0700 Subject: [SciPy-Dev] Issues backlog Message-ID:

Hey all,

The SciPy repo currently has 1,242 open issues. The oldest open issue is from April 25th, 2013.
There is definitely some signal in those old issues but, as the project has evolved quite a bit since then, also quite a bit of noise (e.g. feature requests that one person cared about many years ago). We could spend a lot of time going through the old issues and closing the ones that are no longer relevant, but that's going to take a lot of time and I do not believe it is a high-leverage way to spend our limited developer time. Instead I propose that we simply close issues over a certain cutoff date. More concretely, we:

- Determine a cutoff date beyond which we will close issues
- Announce a date when we will close the old issues
- During that time, if anyone wants to preserve an old issue they can open a new, updated issue, or perhaps adjust the SciPy roadmap if something is particularly important.

Anyway, I am sure this will be controversial, so looking forward to hearing everyone's thoughts!

- Josh

From Dieter at Werthmuller.org Sun Aug 4 01:06:29 2019 From: Dieter at Werthmuller.org (=?UTF-8?Q?Dieter_Werthm=c3=bcller?=) Date: Sun, 4 Aug 2019 07:06:29 +0200 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: Message-ID: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org>

I think this is a very good idea, and it should be applied to both issues and PRs. However, instead of age as a criterion I would use inactivity (e.g., if there is an old issue from 2013 that still has active discussion every year, then it should not be closed). Various projects do this, and it is an ideal case for a bot, e.g.,

- https://github.com/bstriner/github-bot-close-inactive-issues
- https://github.com/probot/stale

Dieter

On 04/08/2019 02:21, Joshua Wilson wrote:
> Hey all,
>
> The SciPy repo currently has 1,242 open issues. The oldest open issue
> is from April 25th, 2013. There is definitely some signal in those old
> issues, but, as the project has evolved quite a bit from then, also
> quite a bit of noise (e.g. feature requests that one person cared
> about many years ago).
>
> We could spend a lot of time going through the old issues and closing
> the ones that are no longer relevant, but that's going to take a lot
> of time and I do not believe it is a high-leverage way to spend our
> limited developer time. Instead I propose that we simply close issues
> over a certain cutoff date. More concretely, we:
>
> - Determine a cutoff date beyond which we will close issues
> - Announce a date when we will close the old issues
> - During that time, if anyone wants to preserve an old issue they can
> open a new, updated issue, or perhaps adjust the SciPy roadmap if
> something is particularly important.
>
> Anyway, I am sure this will be controversial, so looking forward to
> hearing everyone's thoughts!
>
> - Josh
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>

From jeremie.du-boisberranger at inria.fr Sun Aug 4 07:34:03 2019 From: jeremie.du-boisberranger at inria.fr (Jeremie du Boisberranger) Date: Sun, 4 Aug 2019 13:34:03 +0200 Subject: [SciPy-Dev] Using OpenMP in SciPy In-Reply-To: References: Message-ID: <1ebd6149-8c1c-61c7-8d39-5526e6e0ec4d@inria.fr>

Hi everyone, here's some feedback from the recent use of OpenMP in sklearn.

> 1. macOS's Apple clang doesn't support OpenMP and supporting OpenMP requires distributing a runtime with your library.

We ship libgomp in the wheels for macOS users. For users who want to build from source, they need to install libomp and set a few environment variables (Apple clang can support OpenMP when passed the right preprocessor flags). Here are the instructions to build with OpenMP support on macOS.

> 2. On macOS, the OpenMP version crashes if MKL numpy is installed. This is a general problem because several different OpenMP runtimes exist.

This issue also exists on linux. The crash comes from Intel's libiomp, which errors when it tries to load while another OpenMP runtime is already loaded.
A workaround exists by setting the KMP_DUPLICATE_LIB_OK environment variable, which is what we do in the init of sklearn. I think it's fine as long as we don't do anything too fancy, such as trying to dynamically manage the threadpool of one lib while inside a parallel region of another lib.

> 3. OpenMP doesn't play nicely with multiprocessing.Pool

I thought that was fixed in python 3.4 with the forkserver start method. In sklearn we now only support 3.5+.

> Some macOS users still needed the option to compile the library without OpenMP. Although, this might have also been Accelerate related.

I don't know the reason why some users needed to build without OpenMP support. In sklearn we only use OpenMP in cython through prange. We added the possibility to build without OpenMP support via an environment variable (but we don't provide wheels without OpenMP). It ensures there's a way to build sklearn without triggering any OpenMP issues, because the support was added recently and users haven't found all possible ways to break it yet :)

Jérémie

On 02/08/2019 21:24, Peter Bell wrote:
> Hello all,
> I'd like to revisit a discussion started a while ago in gh-10239:
> Can we build and use OpenMP in a portable way?
> andyfaff's comment on the issue pointed out a number of portability
> issues that he's experienced that put him off OpenMP. Summarising:
>
> 1. macOS's Apple clang doesn't support OpenMP and supporting OpenMP
> requires distributing a runtime with your library.
> 2. On macOS, the OpenMP version crashes if MKL numpy is installed.
> This is a general problem because several different OpenMP
> runtimes exist.
> 3. OpenMP doesn't play nicely with multiprocessing.Pool
>
> To expand on that 3rd point, some OpenMP runtimes aren't fork-safe.
> Most notably, this includes gcc's libgomp. Upon entering the first
> OpenMP parallel region, the runtime initializes a thread pool which
> won't be rebuilt in the child after fork.
This means that any parallel
> regions in the child will deadlock. Single-threaded OpenMP loops seem
> to be safe, though.
> Eric Larson has also recently had a chat with an sklearn maintainer
> who has been using OpenMP in the wild for ~6 months. A few of the
> key take-aways from his discussion were:
>
> * Packaging OpenMP code has many pitfalls but they think they've
> found a solution that works for them.
> * They use joblib with the "loky" backend to avoid multiprocessing
> and the associated forking issues.
> * They had issues related to incompatibility with macOS Accelerate
> for BLAS/LAPACK but SciPy has already dropped support for
> Accelerate.
> * Some macOS users still needed the option to compile the library
> without OpenMP. Although, this might have also been Accelerate
> related.
>
> So, given these issues, are we willing to use OpenMP in SciPy?
> Peter

From pav at iki.fi Sun Aug 4 10:29:13 2019 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 04 Aug 2019 17:29:13 +0300 Subject: [SciPy-Dev] Using OpenMP in SciPy In-Reply-To: <1ebd6149-8c1c-61c7-8d39-5526e6e0ec4d@inria.fr> References: <1ebd6149-8c1c-61c7-8d39-5526e6e0ec4d@inria.fr> Message-ID: <1acc88c2cb5fcdb9e5175e9989b1d0dddd6f1dd6.camel@iki.fi>

On Sun, 2019-08-04 at 13:34 +0200, Jeremie du Boisberranger wrote: [clip]
> > 3. OpenMP doesn't play nicely with multiprocessing.Pool
>
> I thought that was fixed in python 3.4 with the forkserver start
> method. In sklearn we now only support 3.5+.

The `forkserver` start method is not the default; the default is `fork` everywhere except on Windows (which uses `spawn`). There are some semantic differences in `fork` vs `forkserver`; strictly correct code should not rely on these, but there is also not-strictly-correct code out there.
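[Editorial aside: opting into `forkserver` explicitly, as discussed above, can be sketched as follows. This is a minimal illustration assuming a Unix platform (the method is unavailable on Windows); the function names are hypothetical, not any library's API.]

```python
import multiprocessing as mp

def square(x):
    return x * x

def parallel_squares(values):
    # 'forkserver' launches workers from a fresh server process, so
    # they never inherit an already-initialized OpenMP thread pool
    # from the parent -- at the cost of fork's inherit-everything
    # semantics that some user code relies on.
    ctx = mp.get_context("forkserver")
    with ctx.Pool(2) as pool:
        return pool.map(square, values)

if __name__ == "__main__":
    print(parallel_squares([1, 2, 3]))  # [1, 4, 9]
```
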
Pauli

From PeterBell10 at live.co.uk Sun Aug 4 12:44:00 2019 From: PeterBell10 at live.co.uk (Peter Bell) Date: Sun, 4 Aug 2019 16:44:00 +0000 Subject: [SciPy-Dev] Using OpenMP in SciPy In-Reply-To: <006effd5078e452ee85476665fc5d6f34a4c9f12.camel@iki.fi> References: <006effd5078e452ee85476665fc5d6f34a4c9f12.camel@iki.fi> Message-ID:

>> 3. OpenMP doesn't play nicely with multiprocessing.Pool
>>
>> To expand on that third point, some OpenMP runtimes aren't fork-safe.
>> Most notably, this includes gcc's libgomp. Upon entering the first
>> OpenMP parallel region, the runtime initializes a thread pool which
>> won't be rebuilt in the child after fork. This means that any parallel
>> regions in the child will deadlock. Single-threaded OpenMP loops seem
>> to be safe, though.

> At first sight, this looks like a showstopper. As I understand, there's no workaround (e.g. even with pthread_atfork setting a scipy-global flag that forces #threads to 1 in any subsequent calls, it would still freeze)?

I've tested it out, and running with num_threads(1) in the child appears to work fine, at least with g++. Relying on this not changing in the future might be dangerous, though.

Peter

From stefanv at berkeley.edu Sun Aug 4 16:55:45 2019 From: stefanv at berkeley.edu (Stefan van der Walt) Date: Sun, 04 Aug 2019 13:55:45 -0700 Subject: [SciPy-Dev] Issues backlog In-Reply-To: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> Message-ID: <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com>

On Sat, Aug 3, 2019, at 22:22, Dieter Werthmüller wrote:
> I think this is a very good idea, and it should be applied to issues and
> PRs. However, instead of age as a criterion I would use inactivity (e.g.,
> if there is an old issue from 2013 that still has active discussion every
> year, then it should not be closed).

Agreed, the last modified age may be a better measure.
That way we don't need to recreate issues we want to keep open; we can simply leave a comment, add a label, or edit the description.

But this method for closing issues can also be infuriating to users: how many times have you come across a project where an issue was described in detail with debugging info, only to be closed by a bot due to inactivity? Perhaps that can be addressed by setting the inactivity-time-to-closure to a high enough duration, perhaps 3 years.

If this method is enacted, I suggest an email to the list once a month with issues that were automatically closed. That may at least provide an opportunity for those interested to go back and "save" any important ones they care about.

Best regards,
Stéfan

From tyler.je.reddy at gmail.com Sun Aug 4 17:02:17 2019 From: tyler.je.reddy at gmail.com (Tyler Reddy) Date: Sun, 4 Aug 2019 15:02:17 -0600 Subject: [SciPy-Dev] Issues backlog In-Reply-To: <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID:

I think we've had this discussion about using bots to auto-close issues in SciPy a few times now. I've never been a fan of that and don't really tend to worry too much about having a large number of issues open, though I'm not hugely bothered if we do go that route. I don't typically try to go through them systematically, but instead use labels to filter as appropriate. It may be a little off-putting if an important issue is both neglected long-term and dismissed by a bot, as noted.

There's nothing stopping devs or teams working on SciPy from filtering the issues / setting their priorities as they see fit, irrespective of the number of open issues. Indeed, if and when GitHub improves its metrics / project-management tools and so on, the large number of open issues may be navigated with greater ease and mined for relevant data.
On Sun, 4 Aug 2019 at 14:56, Stefan van der Walt wrote:

> On Sat, Aug 3, 2019, at 22:22, Dieter Werthmüller wrote:
> > I think this is a very good idea, and it should be applied to issues and
> > PRs. However, instead of age as a criterion I would use inactivity (e.g.,
> > if there is an old issue from 2013 that still has active discussion every
> > year, then it should not be closed).
>
> Agreed, the last modified age may be a better measure. That way we don't
> need to recreate issues we want to keep open; we can simply leave a comment,
> add a label, or edit the description.
>
> But this method for closing issues can also be infuriating to users: how
> many times have you come across a project where an issue was described in
> detail with debugging info, only to be closed by a bot due to inactivity?
> Perhaps that can be addressed by setting the inactivity-time-to-closure to
> a high enough duration, perhaps 3 years.
>
> If this method is enacted, I suggest an email to the list once a month
> with issues that were automatically closed. That may at least provide an
> opportunity for those interested to go back and "save" any important ones
> they care about.
>
> Best regards,
> Stéfan
From stefanv at berkeley.edu Sun Aug 4 17:54:33 2019 From: stefanv at berkeley.edu (Stefan van der Walt) Date: Sun, 04 Aug 2019 14:54:33 -0700 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID: <396315bf-d4d9-4e1e-8d3a-5b95a9fe6edc@www.fastmail.com>

On Sun, Aug 4, 2019, at 14:02, Tyler Reddy wrote:
> There's nothing stopping devs or teams working on SciPy from filtering the issues / setting their priorities as they see fit, irrespective of the number of open issues. Indeed, if and when GitHub improves its metrics / project-management tools and so on, the large number of open issues may be navigated with greater ease and mined for relevant data.

Also, if I understood correctly, GitHub now has triage privileges, which means we can have larger groups of volunteers help to organize issues without needing to be core contributors.

Stéfan

From andyfaff at gmail.com Sun Aug 4 21:16:42 2019 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 5 Aug 2019 11:16:42 +1000 Subject: [SciPy-Dev] Using OpenMP in SciPy In-Reply-To: References: <006effd5078e452ee85476665fc5d6f34a4c9f12.camel@iki.fi> Message-ID:

Because this topic is relevant to a personal project of mine, which stands to gain from using OpenMP, I'm investigating this again. I'll write notes on this at https://gist.github.com/andyfaff/084005bee32aee83d6b59e843278ab3e . At the moment the notes only cover how to build the OpenMP library on macOS.

From matthew.m.mccormick at gmail.com Mon Aug 5 06:28:02 2019 From: matthew.m.mccormick at gmail.com (Matthew McCormick) Date: Mon, 5 Aug 2019 06:28:02 -0400 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID:

On Sun, Aug 4, 2019 at 5:03 PM Tyler Reddy wrote:
> I think we've had this discussion about using bots to auto-close issues in
> SciPy a few times now. I've never been a fan of that and don't really tend to
> worry too much about having a large number of issues open, though I'm not
> hugely bothered if we do go that route.
>
> I don't typically try to go through them systematically, but instead use
> labels to filter as appropriate. It may be a little off-putting if an
> important issue is both neglected long-term and dismissed by a bot, as
> noted.

For what it is worth, I have had very good experiences with Stale Probot: https://github.com/probot/stale

It makes a comment and adds a label when there has not been activity on an issue. After a period of time, it will close the issue (although it can always be re-opened), with a friendly comment.
In practice, if someone cares about or intends to address an issue, it remains open. It does a great job of removing clutter from the issue tracker. It is also configurable.

Hope this helps,
Matt

From bennet at umich.edu Mon Aug 5 07:41:08 2019 From: bennet at umich.edu (Bennet Fauber) Date: Mon, 5 Aug 2019 07:41:08 -0400 Subject: [SciPy-Dev] Issues backlog In-Reply-To: <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID:

While I think that having auto-closure by a bot is probably not a good thing, neither is having issues that haven't been touched or acknowledged for years. All of the developers are volunteers, so time, interest, and energy are allocated not so much by what's been reported as by what people can and want to get to.

Stéfan's idea of issuing a report to the developers of which issues got closed as they get closed (once per month?), to make sure that someone sees what is being done and has a chance to redress it, is a good one; would it perhaps be better to have the bot instead send a message saying "These issues will be closed automatically _next_ month unless something is done to them" as a way of encouraging review prior to closure? Perhaps even include the issuer on the mail?

That might be combined with some reasonable throttling rules, so that you aren't deluged with review messages right away. That would also make it possible to 'pilot' the scheme and see the reaction from the issuers prior to committing to the course completely.

This seems like a thorny issue. Good luck coming up with something that works!

Best, -- bennet

On Sun, Aug 4, 2019 at 4:56 PM Stefan van der Walt wrote:
>
> On Sat, Aug 3, 2019, at 22:22, Dieter Werthmüller wrote:
> > I think this is a very good idea, and it should be applied to issues and
> > PRs.
However, instead of age as a criterion I would use inactivity (e.g.,
> > if there is an old issue from 2013 that still has active discussion every
> > year, then it should not be closed).
>
> Agreed, the last modified age may be a better measure. That way we don't need to recreate issues we want to keep open; we can simply leave a comment, add a label, or edit the description.
>
> But this method for closing issues can also be infuriating to users: how many times have you come across a project where an issue was described in detail with debugging info, only to be closed by a bot due to inactivity? Perhaps that can be addressed by setting the inactivity-time-to-closure to a high enough duration, perhaps 3 years.
>
> If this method is enacted, I suggest an email to the list once a month with issues that were automatically closed. That may at least provide an opportunity for those interested to go back and "save" any important ones they care about.
>
> Best regards,
> Stéfan

From ralf.gommers at gmail.com Mon Aug 5 13:39:53 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 5 Aug 2019 10:39:53 -0700 Subject: [SciPy-Dev] Issues backlog In-Reply-To: <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID:

On Sun, Aug 4, 2019 at 1:56 PM Stefan van der Walt wrote:

> On Sat, Aug 3, 2019, at 22:22, Dieter Werthmüller wrote:
> > I think this is a very good idea, and it should be applied to issues and
> > PRs. However, instead of age as a criterion I would use inactivity (e.g.,
> > if there is an old issue from 2013 that still has active discussion every
> > year, then it should not be closed).
>
> Agreed, the last modified age may be a better measure.
That way we don't
> need to recreate issues we want to keep open; we can simply leave a comment,
> add a label, or edit the description.
>
> But this method for closing issues can also be infuriating to users: how
> many times have you come across a project where an issue was described in
> detail with debugging info, only to be closed by a bot due to inactivity?
> Perhaps that can be addressed by setting the inactivity-time-to-closure to
> a high enough duration, perhaps 3 years.

I agree with the infuriating part: each time I've had an experience like that with another project (e.g. pip), it has been extremely annoying, to the extent that I decided to never contribute again. Issues closed that were valid, PRs closed that didn't get reviewed, etc. It is a good way to tell contributors that their contributions aren't valued, and/or to lose useful information.

In general, valid bug reports should simply not be closed imho. The exception is probably new feature requests. About 300 of the 1,200 open issues have the "enhancement" label. Looking through the older enhancements shows that many can be closed. However, even there, there's useful content in some. I'd much rather go through them and close by hand than have some bot do it. This has multiple advantages:
- doesn't anger contributors
- keeps the useful ones
- is probably _less_ work (at 3 minutes per issue one could triage the 200 issues that haven't been touched in the last 2 years in a single day), whereas choosing and maintaining a bot is easily going to take someone that much time.

> If this method is enacted, I suggest an email to the list once a month
> with issues that were automatically closed. That may at least provide an
> opportunity for those interested to go back and "save" any important ones
> they care about.

This is actually even more work. It asks all maintainers and mailing list contributors to triage with such an email, so many people will be doing duplicate work.
Cheers,
Ralf

From pav at iki.fi Mon Aug 5 13:46:24 2019 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 05 Aug 2019 20:46:24 +0300 Subject: [SciPy-Dev] Issues backlog (and PRs) In-Reply-To: References: Message-ID:

Hi,

On Sat, 2019-08-03 at 17:21 -0700, Joshua Wilson wrote:
> The SciPy repo currently has 1,242 open issues. The oldest open issue
> is from April 25th, 2013. There is definitely some signal in those
> old issues, but, as the project has evolved quite a bit from then, also
> quite a bit of noise (e.g. feature requests that one person cared
> about many years ago).
>
> We could spend a lot of time going through the old issues and closing
> the ones that are no longer relevant, but that's going to take a lot
> of time and I do not believe it is a high-leverage way to spend our
> limited developer time. Instead I propose that we simply close issues
> over a certain cutoff date. More concretely, we:
>
> - Determine a cutoff date beyond which we will close issues
> - Announce a date when we will close the old issues
> - During that time, if anyone wants to preserve an old issue they can
> open a new, updated issue, or perhaps adjust the SciPy roadmap if
> something is particularly important.
>
> Anyway, I am sure this will be controversial, so looking forward to
> hearing everyone's thoughts!

I'm not sure I like an auto-close bot myself --- a bug is a bug. The "defect" labels I think have been assigned somewhat conservatively and correctly, and probably not many of those have in reality been invalidated. There's of course the problem that some old issues are open because their scope is big, and they would need lots of work to address.

With the "enhancement" etc. labels the situation may be a bit different, as some of those probably are specific to someone's needs at a specific time.
A bot just tagging old issues with "stale" probably would be fine, as it's simple to filter the old ones out ("-label:stale").

***

We have a somewhat similar issue with old PRs --- there are quite a few where the original submitter has been MIA for years. For many of them, there's usually some reason why they were not finished (i.e. some roadblock was encountered, the approach wasn't quite right, polish is missing, etc.). I'd like to suggest the following:

* Old and stale? Do you want to do it yourself? -> Do it yourself (possibly from scratch).
* Old and stale? Is there something useful? -> Add the `needs-champion` label and close. Say it's been closed due to lack of activity, but if someone wants to they can pick it up.
* Just old and stale? -> As above, but don't add `needs-champion`.

This probably should not be blindly applied, e.g. to cases where the problem is just that nobody got around to reviewing at the time, and the PR then got buried.

I'd also like to recommend that reviewers use the PR workflow features:

* Did you complete reviewing a PR (so it's waiting for changes)? -> Tag it with 'needs-work'.
* Did you complete re-reviewing an updated PR (so it's waiting for changes again)? -> Remove the 'needs-work' label, and then re-add it. (A line will then appear at the bottom saying you removed and re-added the tag.) Alternatively, use Github's review system --- but when you have completed re-reviewing an updated PR, add a new "Changes requested" review comment instead of only adding new comments to the discussion. (Using the tags may be less hassle.)

This will help e.g. my custom status-tracking script to keep up: https://pav.iki.fi/scipy-needs-work/ Last time I looked, Github's web UI workflow tracking still couldn't list PRs that are awaiting reviewer response, so I guess I'm sticking with this script until the situation improves.
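[Editorial aside: the tag-but-don't-close policy suggested above is simple enough to sketch in a few lines. This is a hypothetical helper; the issue-dict shape and the 3-year threshold are illustrative assumptions from the thread, not any real bot's API.]

```python
from datetime import datetime, timedelta

# Hypothetical threshold; the thread floats ~3 years of inactivity.
STALE_AFTER = timedelta(days=3 * 365)

def issues_to_tag(issues, now):
    """Pick issues that should get a 'stale' label -- tagging only,
    never auto-closing, so "-label:stale" can filter them out.

    Each issue is a dict with assumed keys: 'number', 'updated_at'
    (a datetime of last activity), and 'labels' (a set of strings).
    """
    stale = []
    for issue in issues:
        if "stale" in issue["labels"]:
            continue  # already tagged; nothing to do
        if now - issue["updated_at"] > STALE_AFTER:
            stale.append(issue["number"])
    return stale

if __name__ == "__main__":
    now = datetime(2019, 8, 5)
    demo = [
        {"number": 1, "updated_at": datetime(2013, 4, 25), "labels": set()},
        {"number": 2, "updated_at": datetime(2019, 1, 1), "labels": set()},
    ]
    print(issues_to_tag(demo, now))  # only the 2013-era issue qualifies
```
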
Pauli From haberland at ucla.edu Mon Aug 5 14:27:57 2019 From: haberland at ucla.edu (Matt Haberland) Date: Mon, 5 Aug 2019 11:27:57 -0700 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID: > I'd much rather go through them and close by hand than have some bot do it. Instead of automatically closing, how about a person or bot proposing an "issue of the day"? I did a survey of some old issues in preparation for the CZI proposal, and I got the sense that some of them just needed to find the right set of eyes. Showing the same issue to many of us at the same time could help with that. On Mon, Aug 5, 2019 at 10:41 AM Ralf Gommers wrote: > > > On Sun, Aug 4, 2019 at 1:56 PM Stefan van der Walt > wrote: > >> On Sat, Aug 3, 2019, at 22:22, Dieter Werthm?ller wrote: >> > I think this is a very good idea, and it should be applied to issues and >> > PRs. However, instead of age as a criteria I would use inactivity (e.g., >> > if there is an old issue from 2013 that has activity discussions every >> > year then it should not be closed). >> >> Agreed, the last modified age may be a better measure. That way we don't >> need to recreate issues we want to keep open, we can simply leave a comment >> / add a label / edit the description. >> >> But this method for closing issues can also be infuriating to users: how >> many times have you come across a project where an issue was described in >> detail with debugging info, only to be closed by a bot due to inactivity? >> Perhaps that can be addressed by setting the inactivity-time-to-closure to >> a high enough duration, perhaps 3 years. >> > > I agree with the infuriating part: each time I've had an experience like > that with another project (e.g. pip), it has been extremely annoying. To > the extent I decided to never contribute again. 
Issues closed that were > valid, PRs closed that didn't get reviewed, etc. It is a good way to tell > contributors that their contributions aren't valued, and/or lose useful > information. > > In general, valid bug reports should simply not be closed imho. The > exception is probably new feature requests. About 300 of the 1200 open > issues have the "enhancement" label. Looking through the older enhancements > shows that many can be closed. However even there, there's useful content > in some. I'd much rather go through them and close by hand than have some > bot do it. This has multiple advantages: > - doesn't anger contributors > - keeps the useful ones > - is probably _less_ work (at 3 minutes per issue one could triage the 200 > issues that haven't been touched in the last 2 years in a single day), > choosing and maintaining a bot is easily going to take someone that much > time. > > > >> If this method is enacted, I suggest an email to the list once a month >> with issues that were automatically closed. That may at least provide an >> opportunity for those interested to go back and "save" any important ones >> they care about. >> > > This is actually even more work. It asks all maintainers and mailing list > contributors to triage with such an email, so many people will be doing > duplicate work. > > Cheers, > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 6617A Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at gmail.com Mon Aug 5 21:25:20 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 5 Aug 2019 18:25:20 -0700 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID: On Mon, Aug 5, 2019 at 11:28 AM Matt Haberland wrote: > > I'd much rather go through them and close by hand than have some bot do > it. > > Instead of automatically closing, how about a person or bot proposing an > "issue of the day"? > This has the same issue imho, it's just more noise. We've tried this in the past (IIRC with docstrings) and it doesn't really work. > I did a survey of some old issues in preparation for the CZI proposal, and > I got the sense that some of them just needed to find the right set of > eyes. > Indeed. Or better: just any pair of eyes with some experience of SciPy usage/development. I'd be quite unhappy with more random pings or unfriendly bots. However, if we want to make a dent in this, I'm happy to volunteer to triage 50 or 100 issues over the next few weeks. If some more people do that, the "outdated issues" issue can be solved without too much trouble. Also note, there's a second way to do this: consolidating issues. It's a little more work, but can be quite effective. For example, search for "mannwhitneyu" in the issue tracker. It shows 9 open issues for that one function. I bet (almost) all of them are still valid bug reports. If someone would spend 30 minutes or so to write a summary, we could bring that down from 9 open issues to 1. Cheers, Ralf Showing the same issue to many of us at the same time could help with that. > > > On Mon, Aug 5, 2019 at 10:41 AM Ralf Gommers > wrote: > >> >> >> On Sun, Aug 4, 2019 at 1:56 PM Stefan van der Walt >> wrote: >> >>> On Sat, Aug 3, 2019, at 22:22, Dieter Werthm?ller wrote: >>> > I think this is a very good idea, and it should be applied to issues >>> and >>> > PRs. 
However, instead of age as a criteria I would use inactivity >>> (e.g., >>> > if there is an old issue from 2013 that has activity discussions every >>> > year then it should not be closed). >>> >>> Agreed, the last modified age may be a better measure. That way we >>> don't need to recreate issues we want to keep open, we can simply leave a >>> comment / add a label / edit the description. >>> >>> But this method for closing issues can also be infuriating to users: how >>> many times have you come across a project where an issue was described in >>> detail with debugging info, only to be closed by a bot due to inactivity? >>> Perhaps that can be addressed by setting the inactivity-time-to-closure to >>> a high enough duration, perhaps 3 years. >>> >> >> I agree with the infuriating part: each time I've had an experience like >> that with another project (e.g. pip), it has been extremely annoying. To >> the extent I decided to never contribute again. Issues closed that were >> valid, PRs closed that didn't get reviewed, etc. It is a good way to tell >> contributors that their contributions aren't valued, and/or lose useful >> information. >> >> In general, valid bug reports should simply not be closed imho. The >> exception is probably new feature requests. About 300 of the 1200 open >> issues have the "enhancement" label. Looking through the older enhancements >> shows that many can be closed. However even there, there's useful content >> in some. I'd much rather go through them and close by hand than have some >> bot do it. This has multiple advantages: >> - doesn't anger contributors >> - keeps the useful ones >> - is probably _less_ work (at 3 minutes per issue one could triage the >> 200 issues that haven't been touched in the last 2 years in a single day), >> choosing and maintaining a bot is easily going to take someone that much >> time. 
>> >> >> >>> If this method is enacted, I suggest an email to the list once a month >>> with issues that were automatically closed. That may at least provide an >>> opportunity for those interested to go back and "save" any important ones >>> they care about. >>> >> >> This is actually even more work. It asks all maintainers and mailing list >> contributors to triage with such an email, so many people will be doing >> duplicate work. >> >> Cheers, >> Ralf >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > > > -- > Matt Haberland > Assistant Adjunct Professor in the Program in Computing > Department of Mathematics > 6617A Math Sciences Building, UCLA > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Mon Aug 5 22:23:07 2019 From: andyfaff at gmail.com (Andrew Nelson) Date: Tue, 6 Aug 2019 12:23:07 +1000 Subject: [SciPy-Dev] tools/win32 and tools/scipy-macosx-installer Message-ID: Is there any remaining need for these directories? Do we still make windows and OSX installer packages? -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at gmail.com Mon Aug 5 22:27:34 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 5 Aug 2019 19:27:34 -0700 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID: On Mon, Aug 5, 2019 at 10:39 AM Ralf Gommers wrote: > > > On Sun, Aug 4, 2019 at 1:56 PM Stefan van der Walt > wrote: > >> On Sat, Aug 3, 2019, at 22:22, Dieter Werthm?ller wrote: >> > I think this is a very good idea, and it should be applied to issues and >> > PRs. However, instead of age as a criteria I would use inactivity (e.g., >> > if there is an old issue from 2013 that has activity discussions every >> > year then it should not be closed). >> >> Agreed, the last modified age may be a better measure. That way we don't >> need to recreate issues we want to keep open, we can simply leave a comment >> / add a label / edit the description. >> >> But this method for closing issues can also be infuriating to users: how >> many times have you come across a project where an issue was described in >> detail with debugging info, only to be closed by a bot due to inactivity? >> Perhaps that can be addressed by setting the inactivity-time-to-closure to >> a high enough duration, perhaps 3 years. >> > > I agree with the infuriating part: each time I've had an experience like > that with another project (e.g. pip), it has been extremely annoying. To > the extent I decided to never contribute again. Issues closed that were > valid, PRs closed that didn't get reviewed, etc. It is a good way to tell > contributors that their contributions aren't valued, and/or lose useful > information. > > In general, valid bug reports should simply not be closed imho. The > exception is probably new feature requests. About 300 of the 1200 open > issues have the "enhancement" label. Looking through the older enhancements > shows that many can be closed. 
However even there, there's useful content > in some. I'd much rather go through them and close by hand than have some > bot do it. This has multiple advantages: > - doesn't anger contributors > - keeps the useful ones > - is probably _less_ work (at 3 minutes per issue one could triage the 200 > issues that haven't been touched in the last 2 years in a single day), > choosing and maintaining a bot is easily going to take someone that much > time. > To validate this, I just reserved 30 minutes to go through enhancement issues, starting from the oldest ones. Results: closed 10 (with rationales), kept 1 open, pinged the relevant person on 2 to check whether the issue could be closed. Full list: gh-788: 2 min gh-818: 3 min gh-857: 3 min relevant, leaving open gh-881: 2 min gh-885: 2 min gh-983: 2 min, added question on leaving open or not gh-1002: 3 min gh-1005: 2 min gh-1089: 1 min gh-1175: 2 min, added question on leaving open or not gh-1219: 2 min gh-1335: 1 min gh-1338: 3 min So my 3 minutes per issue was a conservative estimate. Newer issues could require more time, but on the other hand I just looked at all issues in order rather than picking the ones only from modules I'm more familiar with. YMMV, but imho we can do a major cleanup of the issue list fairly easily. Cheers, Ralf > > >> If this method is enacted, I suggest an email to the list once a month >> with issues that were automatically closed. That may at least provide an >> opportunity for those interested to go back and "save" any important ones >> they care about. >> > > This is actually even more work. It asks all maintainers and mailing list > contributors to triage with such an email, so many people will be doing > duplicate work. > > Cheers, > Ralf > > -------------- next part -------------- An HTML attachment was scrubbed... 
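For what it's worth, the per-issue times in the triage list above bear out the "3 minutes per issue was a conservative estimate" remark; a quick check:

```python
# Per-issue triage times (minutes) from the 13 issues listed above.
times = [2, 3, 3, 2, 2, 2, 3, 2, 1, 2, 2, 1, 3]
avg = sum(times) / len(times)
print(f"{len(times)} issues, {sum(times)} min total, {avg:.1f} min/issue")
# -> 13 issues, 28 min total, 2.2 min/issue
# At the conservative 3 min/issue, the ~200 issues untouched for 2 years
# would take about 600 minutes, i.e. one long working day.
```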
URL: From ralf.gommers at gmail.com Mon Aug 5 22:28:57 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 5 Aug 2019 19:28:57 -0700 Subject: [SciPy-Dev] tools/win32 and tools/scipy-macosx-installer In-Reply-To: References: Message-ID: On Mon, Aug 5, 2019 at 7:23 PM Andrew Nelson wrote: > Is there any remaining need for these directories? Do we still make > windows and OSX installer packages? > No, we haven't done so in years. Cleanup PR very welcome:) Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From larson.eric.d at gmail.com Mon Aug 5 23:35:46 2019 From: larson.eric.d at gmail.com (Eric Larson) Date: Mon, 5 Aug 2019 23:35:46 -0400 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID: Thanks for working through those issues (and more at this very moment it appears!). I'll try to make time for this sort of thing as well. Initially I was overall in favor of the auto-closing-after-a-warning bot, but the continued discussion here has persuaded me that manual intervention might actually be better (higher SNR and more efficient) in the end. Eric On Mon, Aug 5, 2019 at 10:28 PM Ralf Gommers wrote: > > > On Mon, Aug 5, 2019 at 10:39 AM Ralf Gommers > wrote: > >> >> >> On Sun, Aug 4, 2019 at 1:56 PM Stefan van der Walt >> wrote: >> >>> On Sat, Aug 3, 2019, at 22:22, Dieter Werthm?ller wrote: >>> > I think this is a very good idea, and it should be applied to issues >>> and >>> > PRs. However, instead of age as a criteria I would use inactivity >>> (e.g., >>> > if there is an old issue from 2013 that has activity discussions every >>> > year then it should not be closed). >>> >>> Agreed, the last modified age may be a better measure. That way we >>> don't need to recreate issues we want to keep open, we can simply leave a >>> comment / add a label / edit the description. 
>>> >>> But this method for closing issues can also be infuriating to users: how >>> many times have you come across a project where an issue was described in >>> detail with debugging info, only to be closed by a bot due to inactivity? >>> Perhaps that can be addressed by setting the inactivity-time-to-closure to >>> a high enough duration, perhaps 3 years. >>> >> >> I agree with the infuriating part: each time I've had an experience like >> that with another project (e.g. pip), it has been extremely annoying. To >> the extent I decided to never contribute again. Issues closed that were >> valid, PRs closed that didn't get reviewed, etc. It is a good way to tell >> contributors that their contributions aren't valued, and/or lose useful >> information. >> >> In general, valid bug reports should simply not be closed imho. The >> exception is probably new feature requests. About 300 of the 1200 open >> issues have the "enhancement" label. Looking through the older enhancements >> shows that many can be closed. However even there, there's useful content >> in some. I'd much rather go through them and close by hand than have some >> bot do it. This has multiple advantages: >> - doesn't anger contributors >> - keeps the useful ones >> - is probably _less_ work (at 3 minutes per issue one could triage the >> 200 issues that haven't been touched in the last 2 years in a single day), >> choosing and maintaining a bot is easily going to take someone that much >> time. >> > > To validate this, I just reserved 30 minutes to go through enhancement > issues, starting from the oldest ones. Results: closed 10 (with > rationales), kept 1 open, pinged the relevant person on 2 to check whether > the issue could be closed. 
Full list: > > gh-788: 2 min > gh-818: 3 min > gh-857: 3 min relevant, leaving open > gh-881: 2 min > gh-885: 2 min > gh-983: 2 min, added question on leaving open or not > gh-1002: 3 min > gh-1005: 2 min > gh-1089: 1 min > gh-1175: 2 min, added question on leaving open or not > gh-1219: 2 min > gh-1335: 1 min > gh-1338: 3 min > > So my 3 minutes per issue was a conservative estimate. Newer issues could > require more time, but on the other hand I just looked at all issues in > order rather than picking the ones only from modules I'm more familiar > with. YMMV, but imho we can do a major cleanup of the issue list fairly > easily. > > Cheers, > Ralf > > > > >> >> >>> If this method is enacted, I suggest an email to the list once a month >>> with issues that were automatically closed. That may at least provide an >>> opportunity for those interested to go back and "save" any important ones >>> they care about. >>> >> >> This is actually even more work. It asks all maintainers and mailing list >> contributors to triage with such an email, so many people will be doing >> duplicate work. >> >> Cheers, >> Ralf >> >> _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Aug 5 23:55:53 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 5 Aug 2019 20:55:53 -0700 Subject: [SciPy-Dev] Issues backlog In-Reply-To: References: <5cca10c7-44a9-00d3-2965-de685ff2d4ed@Werthmuller.org> <67515dca-9ce2-4222-8da0-448756cbdead@www.fastmail.com> Message-ID: On Mon, Aug 5, 2019 at 8:36 PM Eric Larson wrote: > Thanks for working through those issues (and more at this very moment it > appears!). > yeah, it feels kind of productive:) down from 1242 issues 2 days ago to 1185 issues right now. 
and I believe there's enough useful info in some of the older issues that it's not easily possible to close them automatically. I'll try to make time for this sort of thing as well. > thanks! Ralf > Initially I was overall in favor of the auto-closing-after-a-warning bot, > but the continued discussion here has persuaded me that manual intervention > might actually be better (higher SNR and more efficient) in the end. > > Eric > > > On Mon, Aug 5, 2019 at 10:28 PM Ralf Gommers > wrote: > >> >> >> On Mon, Aug 5, 2019 at 10:39 AM Ralf Gommers >> wrote: >> >>> >>> >>> On Sun, Aug 4, 2019 at 1:56 PM Stefan van der Walt >>> wrote: >>> >>>> On Sat, Aug 3, 2019, at 22:22, Dieter Werthm?ller wrote: >>>> > I think this is a very good idea, and it should be applied to issues >>>> and >>>> > PRs. However, instead of age as a criteria I would use inactivity >>>> (e.g., >>>> > if there is an old issue from 2013 that has activity discussions every >>>> > year then it should not be closed). >>>> >>>> Agreed, the last modified age may be a better measure. That way we >>>> don't need to recreate issues we want to keep open, we can simply leave a >>>> comment / add a label / edit the description. >>>> >>>> But this method for closing issues can also be infuriating to users: >>>> how many times have you come across a project where an issue was described >>>> in detail with debugging info, only to be closed by a bot due to >>>> inactivity? Perhaps that can be addressed by setting the >>>> inactivity-time-to-closure to a high enough duration, perhaps 3 years. >>>> >>> >>> I agree with the infuriating part: each time I've had an experience like >>> that with another project (e.g. pip), it has been extremely annoying. To >>> the extent I decided to never contribute again. Issues closed that were >>> valid, PRs closed that didn't get reviewed, etc. It is a good way to tell >>> contributors that their contributions aren't valued, and/or lose useful >>> information. 
>>> >>> In general, valid bug reports should simply not be closed imho. The >>> exception is probably new feature requests. About 300 of the 1200 open >>> issues have the "enhancement" label. Looking through the older enhancements >>> shows that many can be closed. However even there, there's useful content >>> in some. I'd much rather go through them and close by hand than have some >>> bot do it. This has multiple advantages: >>> - doesn't anger contributors >>> - keeps the useful ones >>> - is probably _less_ work (at 3 minutes per issue one could triage the >>> 200 issues that haven't been touched in the last 2 years in a single day), >>> choosing and maintaining a bot is easily going to take someone that much >>> time. >>> >> >> To validate this, I just reserved 30 minutes to go through enhancement >> issues, starting from the oldest ones. Results: closed 10 (with >> rationales), kept 1 open, pinged the relevant person on 2 to check whether >> the issue could be closed. Full list: >> >> gh-788: 2 min >> gh-818: 3 min >> gh-857: 3 min relevant, leaving open >> gh-881: 2 min >> gh-885: 2 min >> gh-983: 2 min, added question on leaving open or not >> gh-1002: 3 min >> gh-1005: 2 min >> gh-1089: 1 min >> gh-1175: 2 min, added question on leaving open or not >> gh-1219: 2 min >> gh-1335: 1 min >> gh-1338: 3 min >> >> So my 3 minutes per issue was a conservative estimate. Newer issues could >> require more time, but on the other hand I just looked at all issues in >> order rather than picking the ones only from modules I'm more familiar >> with. YMMV, but imho we can do a major cleanup of the issue list fairly >> easily. >> >> Cheers, >> Ralf >> >> >> >> >>> >>> >>>> If this method is enacted, I suggest an email to the list once a month >>>> with issues that were automatically closed. That may at least provide an >>>> opportunity for those interested to go back and "save" any important ones >>>> they care about. >>>> >>> >>> This is actually even more work. 
It asks all maintainers and mailing >>> list contributors to triage with such an email, so many people will be >>> doing duplicate work. >>> >>> Cheers, >>> Ralf >>> >>> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Aug 6 19:46:36 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 6 Aug 2019 16:46:36 -0700 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon Message-ID: Hi all, Google has announced the Season of Docs participants for this year [1]. We had a lot of excellent candidates and had to make some hard choices. We applied for extra slots, but unfortunately didn't win the lottery for those; we got one slot for NumPy and one for SciPy. We chose the projects of Anne for NumPy and Maja for SciPy: Anne Bonner, "Making "The Basics" a Little More Basic: Improving the Introductory NumPy Sections" [2] Maja Gwozdz, "User-oriented documentation and thorough restructuring" [3] That's not all though. There was some space left in the budget of the NumPy BIDS grant, and Stéfan has reserved that so we can accept more writers and provide them the same mentoring and funding as they would have gotten through GSoD. We could only start the conversations about that once Google made its decisions, so a further announcement will follow. However, we already have one extra project confirmed, from Brandon: Brandon David, "Improve the documentation of scipy.stats" (project details to be published). I will send out a poll to find a good time for everyone for a kickoff call.
Our intent is to build a documentation team with multiple writers and mentors interacting and able to help each other out. And all of this will also interact with the numpy.org website redesign and the people putting energy into that:) I'm very happy to welcome Anne, Maja and Brandon! Cheers, Ralf [1] https://developers.google.com/season-of-docs/docs/participants/ [2] https://developers.google.com/season-of-docs/docs/participants/project-numpy [3] https://developers.google.com/season-of-docs/docs/participants/project-scipy -------------- next part -------------- An HTML attachment was scrubbed... URL: From ilhanpolat at gmail.com Tue Aug 6 21:33:56 2019 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Wed, 7 Aug 2019 03:33:56 +0200 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: Great news, welcome all! On Wed, Aug 7, 2019 at 1:47 AM Ralf Gommers wrote: > Hi all, > > Google has announced the Season of Docs participants for this year [1]. We > had a lot of excellent candidates and had to make some hard choices. We > applied for extra slots, but unfortunately didn't win the lottery for > those; we got one slot for NumPy and one for SciPy. We chose the projects > of Anne for NumPy and Maja for SciPy: > > Anne Bonner, "Making "The Basics" a Little More Basic: Improving the > Introductory NumPy Sections" [2] > > Maja Gwozdz, "User-oriented documentation and thorough restructuring" [3] > > That's not all though. There was some space left in the budget of the > NumPy BIDS grant, and St?fan has reserved that so we can accept more > writers and provide them the same mentoring and funding as they would have > gotten through GSoD. We could only start the conversations about that once > Google made its decisions, so a further announcement will follow. However, > we already have one extra project confirmed, from Brandon: > > Brandon David, "Improve the documentation of scipy.stats" (project details > to be published). 
> > I will send out a poll to find a good time for everyone for a kickoff > call. Our intent is to build a documentation team with multiple writers and > mentors interacting and able to help each other out. And all of this will > also interact with the numpy.org website redesign and the people putting > energy into that:) > > I'm very happy to welcome Anne, Maja and Brandon! > > Cheers, > Ralf > > > [1] https://developers.google.com/season-of-docs/docs/participants/ > [2] > https://developers.google.com/season-of-docs/docs/participants/project-numpy > [3] > https://developers.google.com/season-of-docs/docs/participants/project-scipy > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefanv at berkeley.edu Wed Aug 7 03:42:05 2019 From: stefanv at berkeley.edu (Stefan van der Walt) Date: Wed, 07 Aug 2019 00:42:05 -0700 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: <08923eaf-a987-47e9-b28b-81216df4e92c@www.fastmail.com> On Tue, Aug 6, 2019, at 16:47, Ralf Gommers wrote: > Google has announced the Season of Docs participants for this year [1]. We had a lot of excellent candidates and had to make some hard choices. We applied for extra slots, but unfortunately didn't win the lottery for those; we got one slot for NumPy and one for SciPy. We chose the projects of Anne for NumPy and Maja for SciPy: > > Anne Bonner, "Making "The Basics" a Little More Basic: Improving the Introductory NumPy Sections" [2] > > Maja Gwozdz, "User-oriented documentation and thorough restructuring" [3] Fantastic and exciting news; welcome aboard to our new team members! This is such tremendously important work, and it has been languishing for *way* too long. I look forward to working with you. 
Best regards, Stéfan -------------- next part -------------- An HTML attachment was scrubbed... URL: From angeline.burrell at nrl.navy.mil Wed Aug 7 14:23:30 2019 From: angeline.burrell at nrl.navy.mil (Burrell, Angeline) Date: Wed, 7 Aug 2019 18:23:30 +0000 Subject: [SciPy-Dev] Proposed new feature: nan insensitive circular statistics Message-ID: <6540F43D-2270-4274-8233-84D77EC861C5@nrl.navy.mil> I have NaN insensitive versions of the SciPy circular mean and standard deviation routines that I would like to contribute to the scipy.stats subpackage. Since the hacking guidelines recommend discussing new contributions on this mailing list, I'd like to get the communal go-ahead before proceeding to integrate the routines and unit tests into scipy. For consistency, I would also add a NaN insensitive version of the circular variance routine. Cheers, Angeline ----------------------------------------------------- Dr. Angeline G. Burrell [she/her/hers] Research Physicist, Bldg. 209 Naval Research Laboratory (NRL) 4555 Overlook Ave SW Washington, DC 20375 (P) 202-404-4065 ----------------------------------------------------- From ralf.gommers at gmail.com Wed Aug 7 21:03:15 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 7 Aug 2019 18:03:15 -0700 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: On Tue, Aug 6, 2019 at 4:46 PM Ralf Gommers wrote: > Hi all, > > Google has announced the Season of Docs participants for this year [1]. We > had a lot of excellent candidates and had to make some hard choices. We > applied for extra slots, but unfortunately didn't win the lottery for > those; we got one slot for NumPy and one for SciPy. We chose the projects > of Anne for NumPy and Maja for SciPy: > > Anne Bonner, "Making "The Basics" a Little More Basic: Improving the > Introductory NumPy Sections" [2] > > Maja Gwozdz, "User-oriented documentation and thorough restructuring" [3] > > That's not all though.
There was some space left in the budget of the > NumPy BIDS grant, and St?fan has reserved that so we can accept more > writers and provide them the same mentoring and funding as they would have > gotten through GSoD. We could only start the conversations about that once > Google made its decisions, so a further announcement will follow. However, > we already have one extra project confirmed, from Brandon: > > Brandon David, "Improve the documentation of scipy.stats" (project details > to be published). > Happy to announce that we have a fourth participant: Shekhar Rajak, "numpy.org redesign and high level documentation restructuring for end user focus" Welcome Shekhar! I will send out a poll to find a good time for everyone for a kickoff call. > Our intent is to build a documentation team with multiple writers and > mentors interacting and able to help each other out. And all of this will > also interact with the numpy.org website redesign and the people putting > energy into that:) > Here is the poll link: https://doodle.com/poll/skgbk74gsg8zpziu. I hope we can find a time that works for everyone - we're split over all US timezones, Europe and India. So it's going to be early morning or late evening somewhere. Sending this out in public, so anyone who wants to participate is welcome to join. I've Bcc'd all participants and mentors, to make sure they see this. Cheers, Ralf > > I'm very happy to welcome Anne, Maja and Brandon! > > Cheers, > Ralf > > > [1] https://developers.google.com/season-of-docs/docs/participants/ > [2] > https://developers.google.com/season-of-docs/docs/participants/project-numpy > [3] > https://developers.google.com/season-of-docs/docs/participants/project-scipy > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From shekharrajak.1994 at gmail.com Thu Aug 8 02:55:59 2019 From: shekharrajak.1994 at gmail.com (Shekhar Rajak) Date: Thu, 8 Aug 2019 06:55:59 +0000 (UTC) Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: <2133160200.2039445.1565247359284@mail.yahoo.com> Thanks for the opportunity. I have marked my preferred time in doodle poll link. Looking forward to talking with the team and excited to explore it further. Regards, Shekhar Prasad Rajak, Contact: +918142478937, Blog | Github | Twitter, Skype: shekhar.rajak1 On Thursday, 8 August 2019, 06:33:26 am GMT+5:30, Ralf Gommers wrote: On Tue, Aug 6, 2019 at 4:46 PM Ralf Gommers wrote: Hi all, Google has announced the Season of Docs participants for this year [1]. We had a lot of excellent candidates and had to make some hard choices. We applied for extra slots, but unfortunately didn't win the lottery for those; we got one slot for NumPy and one for SciPy. We chose the projects of Anne for NumPy and Maja for SciPy: Anne Bonner, "Making "The Basics" a Little More Basic: Improving the Introductory NumPy Sections" [2] Maja Gwozdz, "User-oriented documentation and thorough restructuring" [3] That's not all though. There was some space left in the budget of the NumPy BIDS grant, and Stéfan has reserved that so we can accept more writers and provide them the same mentoring and funding as they would have gotten through GSoD. We could only start the conversations about that once Google made its decisions, so a further announcement will follow. However, we already have one extra project confirmed, from Brandon: Brandon David, "Improve the documentation of scipy.stats" (project details to be published). Happy to announce that we have a fourth participant: Shekhar Rajak, "numpy.org redesign and high level documentation restructuring for end user focus" Welcome Shekhar! I will send out a poll to find a good time for everyone for a kickoff call.
Our intent is to build a documentation team with multiple writers and mentors interacting and able to help each other out. And all of this will also interact with the numpy.org website redesign and the people putting energy into that:) Here is the poll link: https://doodle.com/poll/skgbk74gsg8zpziu. I hope we can find a time that works for everyone - we're split over all US timezones, Europe and India. So it's going to be early morning or late evening somewhere. Sending this out in public, so anyone who wants to participate is welcome to join. I've Bcc'd all participants and mentors, to make sure they see this. Cheers, Ralf I'm very happy to welcome Anne, Maja and Brandon! Cheers, Ralf [1] https://developers.google.com/season-of-docs/docs/participants/ [2] https://developers.google.com/season-of-docs/docs/participants/project-numpy [3] https://developers.google.com/season-of-docs/docs/participants/project-scipy -------------- next part -------------- An HTML attachment was scrubbed... URL: From bennet at umich.edu Thu Aug 8 08:20:44 2019 From: bennet at umich.edu (Bennet Fauber) Date: Thu, 8 Aug 2019 08:20:44 -0400 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: I can offer some time for alpha or beta reading and proofreading. I have experience as a proofreader, copy editor (American Statistical Association), and mathematical typesetter (Wiley, Academic Press, Addison-Wesley, et al.). I've taught statistical software workshops, (very) introductory python, have a Software Carpentry instructor certificate, and work daily with people who are finding themselves needing technical and scientific computing but who don't have strong backgrounds. Hopefully that would be good context for early review of a couple of the projects. I will add my name to the poll, if that's OK?
-- bennet On Wed, Aug 7, 2019 at 9:03 PM Ralf Gommers wrote: > > > > On Tue, Aug 6, 2019 at 4:46 PM Ralf Gommers wrote: >> >> Hi all, >> >> Google has announced the Season of Docs participants for this year [1]. We had a lot of excellent candidates and had to make some hard choices. We applied for extra slots, but unfortunately didn't win the lottery for those; we got one slot for NumPy and one for SciPy. We chose the projects of Anne for NumPy and Maja for SciPy: >> >> Anne Bonner, "Making "The Basics" a Little More Basic: Improving the Introductory NumPy Sections" [2] >> >> Maja Gwozdz, "User-oriented documentation and thorough restructuring" [3] >> >> That's not all though. There was some space left in the budget of the NumPy BIDS grant, and Stéfan has reserved that so we can accept more writers and provide them the same mentoring and funding as they would have gotten through GSoD. We could only start the conversations about that once Google made its decisions, so a further announcement will follow. However, we already have one extra project confirmed, from Brandon: >> >> Brandon David, "Improve the documentation of scipy.stats" (project details to be published). > > > Happy to announce that we have a fourth participant: > > Shekhar Rajak, "numpy.org redesign and high level documentation restructuring for end user focus" > > Welcome Shekhar! > >> I will send out a poll to find a good time for everyone for a kickoff call. Our intent is to build a documentation team with multiple writers and mentors interacting and able to help each other out. And all of this will also interact with the numpy.org website redesign and the people putting energy into that:) > > > Here is the poll link: https://doodle.com/poll/skgbk74gsg8zpziu. I hope we can find a time that works for everyone - we're split over all US timezones, Europe and India. So it's going to be early morning or late evening somewhere.
> > Sending this out in public, so anyone who wants to participate is welcome to join. I've Bcc'd all participants and mentors, to make sure they see this. > > Cheers, > Ralf > > >> >> >> I'm very happy to welcome Anne, Maja and Brandon! >> >> Cheers, >> Ralf >> >> >> [1] https://developers.google.com/season-of-docs/docs/participants/ >> [2] https://developers.google.com/season-of-docs/docs/participants/project-numpy >> [3] https://developers.google.com/season-of-docs/docs/participants/project-scipy > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev From s.denaxas at gmail.com Thu Aug 8 08:32:22 2019 From: s.denaxas at gmail.com (Spiros Denaxas) Date: Thu, 8 Aug 2019 13:32:22 +0100 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: Hello - I would also be happy to help. best Spiros On Thu, Aug 8, 2019 at 1:21 PM Bennet Fauber wrote: > I can offer some time for alpha or beta reading and proofreading. > > I have experience as a proofreader, copy editor (American Statistical > Association), and mathematical typesetter (Wiley, Academic Press, > Addison-Wesley, et al.). I've taught statistical software workshops, > (very) introductory python, have a Software Carpentry instructor > certificate, and work daily with people who are finding themselves > needing technical and scientific computing but who don't have strong > backgrounds. Hopefully that would be good context for early review of > a couple of the projects. > > I will add my name to the poll, if that's OK? > > -- bennet > > On Wed, Aug 7, 2019 at 9:03 PM Ralf Gommers > wrote: > > > > > > > > On Tue, Aug 6, 2019 at 4:46 PM Ralf Gommers > wrote: > >> > >> Hi all, > >> > >> Google has announced the Season of Docs participants for this year [1]. > We had a lot of excellent candidates and had to make some hard choices. 
We > applied for extra slots, but unfortunately didn't win the lottery for > those; we got one slot for NumPy and one for SciPy. We chose the projects > of Anne for NumPy and Maja for SciPy: > >> > >> Anne Bonner, "Making "The Basics" a Little More Basic: Improving the > Introductory NumPy Sections" [2] > >> > >> Maja Gwozdz, "User-oriented documentation and thorough restructuring" > [3] > >> > >> That's not all though. There was some space left in the budget of the > NumPy BIDS grant, and Stéfan has reserved that so we can accept more > writers and provide them the same mentoring and funding as they would have > gotten through GSoD. We could only start the conversations about that once > Google made its decisions, so a further announcement will follow. However, > we already have one extra project confirmed, from Brandon: > >> > >> Brandon David, "Improve the documentation of scipy.stats" (project > details to be published). > > > > > > Happy to announce that we have a fourth participant: > > > > Shekhar Rajak, "numpy.org redesign and high level documentation > restructuring for end user focus" > > > > Welcome Shekhar! > > > >> I will send out a poll to find a good time for everyone for a kickoff > call. Our intent is to build a documentation team with multiple writers and > mentors interacting and able to help each other out. And all of this will > also interact with the numpy.org website redesign and the people putting > energy into that:) > > > > > > Here is the poll link: https://doodle.com/poll/skgbk74gsg8zpziu. I hope > we can find a time that works for everyone - we're split over all US > timezones, Europe and India. So it's going to be early morning or late > evening somewhere. > > > > Sending this out in public, so anyone who wants to participate is > welcome to join. I've Bcc'd all participants and mentors, to make sure they > see this. > > > > Cheers, > > Ralf > > > > > >> > >> > >> I'm very happy to welcome Anne, Maja and Brandon!
> >> > >> Cheers, > >> Ralf > >> > >> > >> [1] https://developers.google.com/season-of-docs/docs/participants/ > >> [2] > https://developers.google.com/season-of-docs/docs/participants/project-numpy > >> [3] > https://developers.google.com/season-of-docs/docs/participants/project-scipy > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at python.org > > https://mail.python.org/mailman/listinfo/scipy-dev > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Thu Aug 8 13:08:55 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 8 Aug 2019 10:08:55 -0700 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: On Thu, Aug 8, 2019 at 5:33 AM Spiros Denaxas wrote: > Hello - I would also be happy to help. > Thanks Spiros! All help is very welcome:) Cheers, Ralf > best > Spiros > > On Thu, Aug 8, 2019 at 1:21 PM Bennet Fauber wrote: > >> I can offer some time for alpha or beta reading and proofreading. >> >> I have experience as a proofreader, copy editor (American Statistical >> Association), and mathematical typesetter (Wiley, Academic Press, >> Addison-Wesley, et al.). I've taught statistical software workshops, >> (very) introductory python, have a Software Carpentry instructor >> certificate, and work daily with people who are finding themselves >> needing technical and scientific computing but who don't have strong >> backgrounds. Hopefully that would be good context for early review of >> a couple of the projects. >> >> I will add my name to the poll, if that's OK? 
>> >> -- bennet >> >> On Wed, Aug 7, 2019 at 9:03 PM Ralf Gommers >> wrote: >> > >> > >> > On Tue, Aug 6, 2019 at 4:46 PM Ralf Gommers >> wrote: >> >> >> >> Hi all, >> >> >> >> Google has announced the Season of Docs participants for this year >> [1]. We had a lot of excellent candidates and had to make some hard >> choices. We applied for extra slots, but unfortunately didn't win the >> lottery for those; we got one slot for NumPy and one for SciPy. We chose >> the projects of Anne for NumPy and Maja for SciPy: >> >> >> >> Anne Bonner, "Making "The Basics" a Little More Basic: Improving the >> Introductory NumPy Sections" [2] >> >> >> >> Maja Gwozdz, "User-oriented documentation and thorough restructuring" >> [3] >> >> >> >> That's not all though. There was some space left in the budget of the >> NumPy BIDS grant, and Stéfan has reserved that so we can accept more >> writers and provide them the same mentoring and funding as they would have >> gotten through GSoD. We could only start the conversations about that once >> Google made its decisions, so a further announcement will follow. However, >> we already have one extra project confirmed, from Brandon: >> >> >> >> Brandon David, "Improve the documentation of scipy.stats" (project >> details to be published). >> > >> > >> > Happy to announce that we have a fourth participant: >> > >> > Shekhar Rajak, "numpy.org redesign and high level documentation >> restructuring for end user focus" >> > >> > Welcome Shekhar! >> > >> >> I will send out a poll to find a good time for everyone for a kickoff >> call. Our intent is to build a documentation team with multiple writers and >> mentors interacting and able to help each other out. And all of this will >> also interact with the numpy.org website redesign and the people putting >> energy into that:) >> > >> > >> > Here is the poll link: https://doodle.com/poll/skgbk74gsg8zpziu.
I >> hope we can find a time that works for everyone - we're split over all US >> timezones, Europe and India. So it's going to be early morning or late >> evening somewhere. >> > >> > Sending this out in public, so anyone who wants to participate is >> welcome to join. I've Bcc'd all participants and mentors, to make sure they >> see this. >> > >> > Cheers, >> > Ralf >> > >> > >> >> >> >> >> >> I'm very happy to welcome Anne, Maja and Brandon! >> >> >> >> Cheers, >> >> Ralf >> >> >> >> >> >> [1] https://developers.google.com/season-of-docs/docs/participants/ >> >> [2] >> https://developers.google.com/season-of-docs/docs/participants/project-numpy >> >> [3] >> https://developers.google.com/season-of-docs/docs/participants/project-scipy >> > >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at python.org >> > https://mail.python.org/mailman/listinfo/scipy-dev >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Thu Aug 8 21:45:04 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 8 Aug 2019 18:45:04 -0700 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon In-Reply-To: References: Message-ID: On Wed, Aug 7, 2019 at 6:03 PM Ralf Gommers wrote: > > > On Tue, Aug 6, 2019 at 4:46 PM Ralf Gommers > wrote: > > I will send out a poll to find a good time for everyone for a kickoff >> call. Our intent is to build a documentation team with multiple writers and >> mentors interacting and able to help each other out. 
And all of this will >> also interact with the numpy.org website redesign and the people putting >> energy into that:) >> > > Here is the poll link: https://doodle.com/poll/skgbk74gsg8zpziu. I hope > we can find a time that works for everyone - we're split over all US > timezones, Europe and India. So it's going to be early morning or late > evening somewhere. > > Sending this out in public, so anyone who wants to participate is welcome > to join. I've Bcc'd all participants and mentors, to make sure they see > this. > That worked out pretty well; all participants, mentors and a few more people can make the meeting on 13 Aug at 3pm UTC. Sent out an invite, and meeting notes doc with Hangouts link can be found at https://hackmd.io/oB_boakvRqKR-_2jRV-Qjg Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From tyler.je.reddy at gmail.com Fri Aug 9 00:06:38 2019 From: tyler.je.reddy at gmail.com (Tyler Reddy) Date: Thu, 8 Aug 2019 22:06:38 -0600 Subject: [SciPy-Dev] ANN: SciPy 1.3.1 Message-ID: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA256 Hi all, On behalf of the SciPy development team I'm pleased to announce the release of SciPy 1.3.1, which is a bug fix release. Sources and binary wheels can be found at: https://pypi.org/project/scipy/ and at: https://github.com/scipy/scipy/releases/tag/v1.3.1 One of a few ways to install this release with pip: pip install scipy==1.3.1 ========================== SciPy 1.3.1 Release Notes ========================== SciPy 1.3.1 is a bug-fix release with no new features compared to 1.3.0. Authors ======= * Matt Haberland * Geordie McBain * Yu Feng * Evgeni Burovski * Sturla Molden * Tapasweni Pathak * Eric Larson * Peter Bell * Carlos Ramos Carreño + * Ralf Gommers * David Hagen * Antony Lee * Ayappan P * Tyler Reddy * Pauli Virtanen A total of 15 people contributed to this release. People with a "+" by their names contributed a patch for the first time.
This list of names is automatically generated, and may not be fully complete. Issues closed for 1.3.1 ------------------------------- * `#5040 `__: BUG: Empty data handling of (c)KDTrees * `#9901 `__: lsoda fails to detect stiff problem when called from solve_ivp * `#10206 `__: sparse matrices indexing with scipy 1.3 * `#10232 `__: Exception in loadarff with quoted nominal attributes in scipy... * `#10292 `__: DOC/REL: Some sections of the release notes are not nested correctly. * `#10303 `__: BUG: optimize: `linprog` failing TestLinprogSimplexBland::test_unbounded_below_no_presolve_corrected * `#10376 `__: TST: Travis CI fails (with pytest 5.0 ?) * `#10384 `__: CircleCI doc build failing on new warnings * `#10398 `__: Scipy 1.3.0 build broken in AIX * `#10501 `__: BUG: scipy.spatial.HalfspaceIntersection works incorrectly * `#10514 `__: BUG: cKDTree GIL handling is incorrect * `#10535 `__: TST: master branch CI failures * `#10572 `__: BUG: ckdtree query_ball_point errors on discontiguous input * `#10597 `__: BUG: No warning on PchipInterpolator changing from bernstein base to local power base Pull requests for 1.3.1 ------------------------------ * `#10071 `__: DOC: reconstruct SuperLU permutation matrices avoiding SparseEfficiencyWarning * `#10196 `__: Fewer checks on xdata for curve_fit. 
* `#10207 `__: BUG: Compressed matrix indexing should return a scalar * `#10233 `__: Fix for ARFF reader regression (#10232) * `#10306 `__: BUG: optimize: Fix for 10303 * `#10309 `__: BUG: Pass jac=None directly to lsoda * `#10377 `__: TST, MAINT: adjustments for pytest 5.0 * `#10379 `__: BUG: sparse: set writeability to be forward-compatible with numpy>=1.17 * `#10426 `__: MAINT: Fix doc build bugs * `#10431 `__: Update numpy version for AIX * `#10457 `__: BUG: Allow ckdtree to accept empty data input * `#10503 `__: BUG: spatial/qhull: get HalfspaceIntersection.dual_points from the correct array * `#10516 `__: BUG: Use nogil contexts in cKDTree * `#10520 `__: DOC: Proper .rst formatting for deprecated features and Backwards incompatible changes * `#10540 `__: MAINT: Fix Travis and Circle * `#10573 `__: BUG: Fix query_ball_point with discontiguous input * `#10600 `__: BUG: interpolate: fix broken conversions between PPoly/BPoly objects Checksums ========= MD5 ~~~ 818dc6325a4511d656582ff2946eed80 scipy-1.3.1-cp35-cp35m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 83a11b3127b19e71353ee2a04c4be20c scipy-1.3.1-cp35-cp35m-manylinux1_i686.whl 49c52e00706b47b7311171fe37b9efac scipy-1.3.1-cp35-cp35m-manylinux1_x86_64.whl 015b3d443e2a9e4e664a50af64a7f5b6 scipy-1.3.1-cp35-cp35m-win32.whl 43cf62be72bf7b8e42a1a0fad6570e22 scipy-1.3.1-cp35-cp35m-win_amd64.whl 08d697cdeeb2a4121bbeca8d8d756da9 scipy-1.3.1-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 25f6364aa052213d4f504bc96031e431 scipy-1.3.1-cp36-cp36m-manylinux1_i686.whl 2fb8c8c5c17dd7d811165b59d070ef4a scipy-1.3.1-cp36-cp36m-manylinux1_x86_64.whl 6559f4a0438d849cac85ea57f1baa3ba scipy-1.3.1-cp36-cp36m-win32.whl 8164a4832b3b5e948135b92f91d6e8fa scipy-1.3.1-cp36-cp36m-win_amd64.whl 1054925a3d2130f803b27a30e8779282 
scipy-1.3.1-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl cf90a33975bfcd08f7275bbc885785cf scipy-1.3.1-cp37-cp37m-manylinux1_i686.whl a9f6dbda23d81ff544d5e0debcc78320 scipy-1.3.1-cp37-cp37m-manylinux1_x86_64.whl 701657c017eb5164582035232bde5769 scipy-1.3.1-cp37-cp37m-win32.whl f7d8824ff9193c34b10017a1d67ce9fe scipy-1.3.1-cp37-cp37m-win_amd64.whl 69db58ceb4b4c3ff3f3ea816e4e426b9 scipy-1.3.1.tar.gz 66e95ade5399a9a336c1b2b78edb2d3a scipy-1.3.1.tar.xz 62ebcbc144342800c5e6b1c3ba86ba0d scipy-1.3.1.zip SHA256 ~~~~~~ 3ae3692616975d3c10aca6d574d6b4ff95568768d4525f76222fb60f142075b9 scipy-1.3.1-cp35-cp35m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 7ccfa44a08226825126c4ef0027aa46a38c928a10f0a8a8483c80dd9f9a0ad44 scipy-1.3.1-cp35-cp35m-manylinux1_i686.whl cbc0611699e420774e945f6a4e2830f7ca2b3ee3483fca1aa659100049487dd5 scipy-1.3.1-cp35-cp35m-manylinux1_x86_64.whl 435d19f80b4dcf67dc090cc04fde2c5c8a70b3372e64f6a9c58c5b806abfa5a8 scipy-1.3.1-cp35-cp35m-win32.whl 243b04730d7223d2b844bda9500310eecc9eda0cba9ceaf0cde1839f8287dfa8 scipy-1.3.1-cp35-cp35m-win_amd64.whl 46a5e55850cfe02332998b3aef481d33f1efee1960fe6cfee0202c7dd6fc21ab scipy-1.3.1-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl dd3b52e00f93fd1c86f2d78243dfb0d02743c94dd1d34ffea10055438e63b99d scipy-1.3.1-cp36-cp36m-manylinux1_i686.whl 75b513c462e58eeca82b22fc00f0d1875a37b12913eee9d979233349fce5c8b2 scipy-1.3.1-cp36-cp36m-manylinux1_x86_64.whl 396eb4cdad421f846a1498299474f0a3752921229388f91f60dc3eda55a00488 scipy-1.3.1-cp36-cp36m-win32.whl a81da2fe32f4eab8b60d56ad43e44d93d392da228a77e229e59b51508a00299c scipy-1.3.1-cp36-cp36m-win_amd64.whl 0baa64bf42592032f6f6445a07144e355ca876b177f47ad8d0612901c9375bef scipy-1.3.1-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 
d02d813ec9958ed63b390ded463163685af6025cb2e9a226ec2c477df90c6957 scipy-1.3.1-cp37-cp37m-manylinux1_i686.whl 89dd6a6d329e3f693d1204d5562dd63af0fd7a17854ced17f9cbc37d5b853c8d scipy-1.3.1-cp37-cp37m-manylinux1_x86_64.whl ac37eb652248e2d7cbbfd89619dce5ecfd27d657e714ed049d82f19b162e8d45 scipy-1.3.1-cp37-cp37m-win32.whl a9d606d11eb2eec7ef893eb825017fbb6eef1e1d0b98a5b7fc11446ebeb2b9b1 scipy-1.3.1-cp37-cp37m-win_amd64.whl 2643cfb46d97b7797d1dbdb6f3c23fe3402904e3c90e6facfe6a9b98d808c1b5 scipy-1.3.1.tar.gz 326ffdad79f113659ed0bca80f5d0ed5e28b2e967b438bb1f647d0738073a92e scipy-1.3.1.tar.xz 47dd0ff4a9f17d97e9b68b363c54111f11b73a388cac55069d6e88a602d20552 scipy-1.3.1.zip -----BEGIN PGP SIGNATURE----- Version: GnuPG v2 iQIcBAEBCAAGBQJdTOWrAAoJELD/41ZX0J71Y1UP/ivoDqq6BDDdIB45Oo+XIH6h oZ3c5np+ui/jo/SIF82vM7W/7ZfTEf/BZEKM/ypqO8vk/EltF6jJb8pwyte8Mdic 5hlAQEbLLez2hipGvIIe3K2kS7OFWWorrZWcd+xuZxn9xV3BK4wDXNndpUW61xBF ElKI+HqfwZ6BLSOpDE9UuMWfIQ9wVACr+TtZSmJsQ1A55o9pcMR5HC0HrTBT40Sj SPQM1ECOJw2yNzijaYhEUvPIZsFIHjLqtAVkk+4HG675TtLw0NyqZIw9jhUax6vj rdkj1tSChtSEXE7D2AfZ446bHT+KsxHW6gMXhI96SFi0w698+h2UErZ7uqbnBkQO OAyzewcUp9yqgSf9pciR6WNiRKvPW8yW+l4XJGqERZDePQ1U4d1Ou9yx1efAkb6t j4SxpziM4n3ib3TKzRwQ574EJKKOK5TNYHAwH0BIqN4Ace3aN3D/XNojmelUh8gH d1TklvMGj54pwZjHnqeJMQG+vXbJdI2TXN5qUjFUcsghGvhKPWhl+Un9qoi0+/ZX nBOEXylKF1MkdV1vd4xLS5gFuyCf7EPnJl9vKMHnrkoUuCoxLM2YtAzjgiUiTjK2 lJAxppWZuNMgTcBsDgg6rtHiLcgl44YehPquV1pvs5DrxQzdq5kVwGu7yU2XRSbf QIsVcdAujT8MNGOvoJae =G7vC -----END PGP SIGNATURE----- -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From PeterBell10 at live.co.uk Fri Aug 9 11:59:33 2019 From: PeterBell10 at live.co.uk (Peter Bell) Date: Fri, 9 Aug 2019 15:59:33 +0000 Subject: [SciPy-Dev] Using OpenMP in SciPy In-Reply-To: <006effd5078e452ee85476665fc5d6f34a4c9f12.camel@iki.fi> References: <006effd5078e452ee85476665fc5d6f34a4c9f12.camel@iki.fi> Message-ID: Since it looks like depending on OpenMP isn't going to be an option, I've migrated pypocketfft over to a custom thread pool that I've written using the C++11 threading library [1]. This only replaces the minimal set of OpenMP functionality used by pypocketfft which is basically `#pragma omp parallel` and `omp_get_thread_num()`. The key part is that on posix platforms it uses pthread_atfork to shutdown the thread pool in preparation for the fork and re-initialize it afterwards in both the parent and child process. In the corresponding scipy.fft PR (gh-10614) I have tests verifying that the thread pool can safely be mixed with python multiprocessing. Peter [1]: https://gitlab.mpcdf.mpg.de/mtr/pypocketfft/merge_requests/23 -----Original Message----- From: SciPy-Dev On Behalf Of Pauli Virtanen Sent: 03 August 2019 13:10 To: scipy-dev at python.org Subject: Re: [SciPy-Dev] Using OpenMP in SciPy Hi, pe, 2019-08-02 kello 19:38 +0000, Peter Bell kirjoitti: [clip] > 3. openmp doesn't play nicely with multiprocessing.Pool > > To expand on that third point, some openmp runtimes aren't fork-safe. > Most notably, this includes gcc's libgomp. Upon entering the first > openmp parallel region, the runtime initializes a thread pool which > won't be rebuilt in the child after fork. This means that any parallel > regions in the child will deadlock. Single threaded openmp loops seem > to be safe though. At first sight, this looks like a showstopper. As I understand, there's no workaround (e.g. even with pthread_atfork setting a scipy-global flag that forces #threads to 1 in any subsequent calls, it would still freeze)?
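For illustration, the shutdown-before-fork / re-initialize-after pattern Peter describes for pypocketfft's C++ pool can be sketched in pure Python with `os.register_at_fork` (Python 3.7+, POSIX only). `ForkSafePool` below is a hypothetical toy class, not SciPy's or pypocketfft's actual implementation:

```python
import concurrent.futures
import os

class ForkSafePool:
    """Toy thread pool that survives os.fork(): it is shut down just
    before a fork and rebuilt lazily afterwards in parent and child."""

    def __init__(self, workers=2):
        self._workers = workers
        self._pool = None
        if hasattr(os, 'register_at_fork'):  # POSIX, Python >= 3.7
            os.register_at_fork(before=self._shutdown,
                                after_in_parent=self._reset,
                                after_in_child=self._reset)

    def _ensure(self):
        # Create the underlying executor on first use (and after forks).
        if self._pool is None:
            self._pool = concurrent.futures.ThreadPoolExecutor(self._workers)
        return self._pool

    def _shutdown(self):
        # Runs in the 'before' fork handler: wait for workers, drop the pool.
        if self._pool is not None:
            self._pool.shutdown(wait=True)
            self._pool = None

    def _reset(self):
        # In parent and child alike, force a fresh executor on next use.
        self._pool = None

    def map(self, fn, items):
        return list(self._ensure().map(fn, items))

pool = ForkSafePool()
print(pool.map(lambda x: x * x, [1, 2, 3]))  # [1, 4, 9]
```

Because the child never inherits live worker threads, later use of multiprocessing in user code cannot deadlock on this pool's state, which is the property the tests in gh-10614 check for the real implementation.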
multiprocessing is one of the most common parallelization schemes with Python, and breaking that sounds quite painful. I would be careful with shipping wheels built with OpenMP that breaks user code using multiprocessing. Environment flags might be used to mitigate, but probably should fail-safe and default to not using OpenMP. Since we don't have control over the user code, which has already been written, loky et al. I think are not really a solution. Pauli _______________________________________________ SciPy-Dev mailing list SciPy-Dev at python.org https://mail.python.org/mailman/listinfo/scipy-dev From josephgama at yahoo.com Sun Aug 11 08:01:02 2019 From: josephgama at yahoo.com (Joseph Gama) Date: Sun, 11 Aug 2019 12:01:02 +0000 (UTC) Subject: [SciPy-Dev] Matrix identification methods References: <472352712.2840127.1565524862170.ref@mail.yahoo.com> Message-ID: <472352712.2840127.1565524862170@mail.yahoo.com> Hi, Just submitted a PR with these methods: is_matrix_hermitian, is_matrix_symmetric, is_matrix_skew_symmetric, is_matrix_nonsingular, is_matrix_singular, is_matrix_idempotent, is_matrix_positive_definite, is_matrix_positive_semidefinite, is_matrix_negative_definite, is_matrix_negative_semidefinite, is_matrix_indefinite. Feedback is welcome. :) https://github.com/tuxcell/scipy/tree/helperfuncs -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Aug 13 13:40:09 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 13 Aug 2019 10:40:09 -0700 Subject: [SciPy-Dev] triage team Message-ID: Hi all, Now that GitHub has introduced triage permissions [1] and we've got Season of Docs starting, I have created a triage team [2] for the SciPy org. This allows labelling, closing/reopening, and assigning issues and PRs. I've invited a few people already, if anyone else would like an invite please let me know.
Cheers, Ralf [1] https://github.blog/changelog/2019-05-23-triage-and-maintain-roles-beta/ [2] https://github.com/orgs/scipy/teams/triage -------------- next part -------------- An HTML attachment was scrubbed... URL: From shekharrajak.1994 at gmail.com Tue Aug 13 14:04:49 2019 From: shekharrajak.1994 at gmail.com (Shekhar Rajak) Date: Tue, 13 Aug 2019 18:04:49 +0000 (UTC) Subject: [SciPy-Dev] triage team In-Reply-To: References: Message-ID: <1573520149.3497962.1565719489563@mail.yahoo.com> Hello, I can see that SciPy and NumPy use the same sphinx theme for documentation. We will be discussing a better documentation site for NumPy (that could be used for SciPy as well). So I just want to be updated with the changes that team will be doing in this Season of Docs. If possible kindly invite me as well. My Github username is Shekharrajak. Thanks and regards, Shekhar Prasad Rajak, Contact: +918142478937, Blog | Github | Twitter, Skype: shekhar.rajak1 On Tuesday, 13 August 2019, 11:10:58 pm GMT+5:30, Ralf Gommers wrote: Hi all, Now that GitHub has introduced triage permissions [1] and we've got Season of Docs starting, I have created a triage team [2] for the SciPy org. This allows labelling, closing/reopening, and assigning issues and PRs. I've invited a few people already, if anyone else would like an invite please let me know. Cheers, Ralf [1] https://github.blog/changelog/2019-05-23-triage-and-maintain-roles-beta/ [2] https://github.com/orgs/scipy/teams/triage _______________________________________________ SciPy-Dev mailing list SciPy-Dev at python.org https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed...
URL: From angeline.burrell at nrl.navy.mil Wed Aug 14 13:49:49 2019 From: angeline.burrell at nrl.navy.mil (Burrell, Angeline) Date: Wed, 14 Aug 2019 17:49:49 +0000 Subject: [SciPy-Dev] NaN insensitive circular statistics Message-ID: <6DEB95FA-2CCA-4C32-9CD5-D77585D92375@contoso.com> Second attempt for a response for a new feature to scipy.stats, since I was alerted that my first email was going into spam folders. I do hope to hear back (positively or negatively) from the community. I have NaN insensitive versions of the SciPy circular mean and standard deviation routines that I would like to contribute to the scipy.stats subpackage. Since the hacking guidelines recommend discussing new contributions on this mailing list, I'd like to get the communal go-ahead before proceeding to integrate the routines and unit tests into scipy. For consistency, I would also add a NaN insensitive version of the circular variance routine. Cheers, Angeline ----------------------------------------------------- Dr. Angeline G. Burrell [she/her/hers] Research Physicist, Bldg. 209 Naval Research Laboratory (NRL) 4555 Overlook Ave SW Washington, DC 20375 (P) 202-404-4065 ----------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Aug 14 21:01:35 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 14 Aug 2019 18:01:35 -0700 Subject: [SciPy-Dev] triage team In-Reply-To: <1573520149.3497962.1565719489563@mail.yahoo.com> References: <1573520149.3497962.1565719489563@mail.yahoo.com> Message-ID: On Tue, Aug 13, 2019 at 11:05 AM Shekhar Rajak wrote: > Hello, > > I can see that SciPy and NumPy use the same sphinx theme for > documentation. We will be discussing a better documentation site for NumPy > (that could be used for SciPy as well). > Yes, we've got a good amount of energy for improvements in this area! 
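The NaN-insensitive circular mean Angeline proposes above amounts to: map the samples onto the unit circle, drop the NaNs, sum the sine and cosine components, and take the arctangent of the resulting vector. A minimal sketch, where the name `nan_circmean` and its signature are illustrative and not the actual contributed code:

```python
import numpy as np

def nan_circmean(samples, high=2 * np.pi, low=0):
    """Circular mean of ``samples`` in [low, high), ignoring NaNs (sketch)."""
    samples = np.asarray(samples, dtype=float)
    samples = samples[~np.isnan(samples)]             # drop NaNs up front
    ang = (samples - low) * 2 * np.pi / (high - low)  # map onto [0, 2*pi)
    s, c = np.sin(ang).sum(), np.cos(ang).sum()
    res = np.arctan2(s, c) % (2 * np.pi)              # mean angle in [0, 2*pi)
    return res * (high - low) / (2 * np.pi) + low     # map back to [low, high)

print(nan_circmean([0.1, 0.2, np.nan]))  # ~0.15, the NaN is ignored
```

This mirrors the vector-averaging approach of `scipy.stats.circmean`, with the extra NaN filtering step applied before averaging.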
Cheers, Ralf > > So I just want to be updated with the changes that team will be doing in > this Season of Docs. If possible kindly invite me as well. My Github > username is Shekharrajak. > done! Cheers, Ralf > Thanks and regards, > Shekhar Prasad Rajak, > Contact : +918142478937 > Blog | Github > | Twitter > > Skype: shekhar.rajak1 > > > > On Tuesday, 13 August 2019, 11:10:58 pm GMT+5:30, Ralf Gommers < > ralf.gommers at gmail.com> wrote: > > > Hi all, > > Now that GitHub has introduced triage permissions [1] and we've got Season > of Docs starting, I have created a triage team [2] for the SciPy org. This > allows labelling, closing/reopening, and assigning issues and PRs. I've invited a > few people already, if anyone else would like an invite please let me know. > > Cheers, > Ralf > > > [1] > https://github.blog/changelog/2019-05-23-triage-and-maintain-roles-beta/ > [2] https://github.com/orgs/scipy/teams/triage > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jfoxrabinovitz at gmail.com Wed Aug 14 23:31:54 2019 From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz) Date: Wed, 14 Aug 2019 23:31:54 -0400 Subject: [SciPy-Dev] Matrix identification methods In-Reply-To: <472352712.2840127.1565524862170@mail.yahoo.com> References: <472352712.2840127.1565524862170.ref@mail.yahoo.com> <472352712.2840127.1565524862170@mail.yahoo.com> Message-ID: It would be helpful to note the PR number.
- Joe On Sun, Aug 11, 2019, 8:02 AM Joseph Gama wrote: > Hi, > > Just submitted a PR with these methods: > is_matrix_hermitian > is_matrix_symmetric > is_matrix_skew_symmetric > is_matrix_nonsingular > is_matrix_singular > is_matrix_idempotent > is_matrix_positive_definite > is_matrix_positive_semidefinite > is_matrix_negative_definite > is_matrix_negative_semidefinite > is_matrix_indefinite > > Feedback is welcome. :) > > https://github.com/tuxcell/scipy/tree/helperfuncs > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josephgama at yahoo.com Thu Aug 15 14:56:04 2019 From: josephgama at yahoo.com (Joseph Gama) Date: Thu, 15 Aug 2019 18:56:04 +0000 (UTC) Subject: [SciPy-Dev] Matrix identification methods In-Reply-To: References: <472352712.2840127.1565524862170.ref@mail.yahoo.com> <472352712.2840127.1565524862170@mail.yahoo.com> Message-ID: <1053451617.4183320.1565895364509@mail.yahoo.com> Hi, Thank you for letting me know! :) Matrix identification methods and test #10639 Tuxcell On Thursday, August 15, 2019, 05:32:07 AM GMT+2, Joseph Fox-Rabinovitz wrote: It would be helpful to note the PR number. - Joe On Sun, Aug 11, 2019, 8:02 AM Joseph Gama wrote: Hi, Just submitted a PR with these methods: is_matrix_hermitian, is_matrix_symmetric, is_matrix_skew_symmetric, is_matrix_nonsingular, is_matrix_singular, is_matrix_idempotent, is_matrix_positive_definite, is_matrix_positive_semidefinite, is_matrix_negative_definite, is_matrix_negative_semidefinite, is_matrix_indefinite. Feedback is welcome. :) https://github.com/tuxcell/scipy/tree/helperfuncs _______________________________________________ SciPy-Dev mailing list SciPy-Dev at python.org https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From josephgama at yahoo.com Fri Aug 16 15:52:20 2019 From: josephgama at yahoo.com (Joseph Gama) Date: Fri, 16 Aug 2019 19:52:20 +0000 (UTC) Subject: [SciPy-Dev] Cholesky decomposition with full pivot References: <374399703.290575.1565985140964.ref@mail.yahoo.com> Message-ID: <374399703.290575.1565985140964@mail.yahoo.com> Hello, I recently added wrappers for the LAPACK routines pstrf and pstf2, with the purpose of adding full-pivoting capabilities to the Cholesky decomposition, because pivoting allows the use of positive semidefinite matrices. R has this feature and it is useful in many areas. All feedback is welcome! :) Tuxcell

P.S.: my proposed changes ("is_hermitian" is currently in PR Matrix identification methods and test by tuxcell · Pull Request #10639 · scipy/scipy):

def _cholesky(a, lower=False, overwrite_a=False, clean=True,
              check_finite=True, full_pivot=False, pivot_tol=-1):
    """Common code for cholesky() and cho_factor()."""
    a1 = asarray_chkfinite(a) if check_finite else asarray(a)
    a1 = atleast_2d(a1)

    # Dimension check
    if a1.ndim != 2:
        raise ValueError('Input array needs to be 2 dimensional but received '
                         'a {}d-array.'.format(a1.ndim))
    # Squareness check
    if a1.shape[0] != a1.shape[1]:
        raise ValueError('Input array is expected to be square but has '
                         'the shape: {}.'.format(a1.shape))

    # Quick return for square empty array
    if a1.size == 0:
        return a1.copy(), lower

    if not is_hermitian(a1):
        raise LinAlgError("Expected symmetric or hermitian matrix")

    overwrite_a = overwrite_a or _datacopied(a1, a)

    # if the pivot flag is false, return the result
    if not full_pivot:
        potrf, = get_lapack_funcs(('potrf',), (a1,))
        c, info = potrf(a1, lower=lower, overwrite_a=overwrite_a, clean=clean)
        if info > 0:
            raise LinAlgError("%d-th leading minor of the array is not "
                              "positive definite" % info)
        if info < 0:
            raise ValueError('LAPACK reported an illegal value in {}-th '
                             'argument on entry to "POTRF".'.format(-info))
        return c, lower
    else:
        # if the pivot flag is true, return the result plus rank and pivot
        pstrf, = get_lapack_funcs(('pstrf',), (a1,))
        c, pivot, rank, info = pstrf(a1, lower=lower, overwrite_a=overwrite_a,
                                     clean=clean, tol=pivot_tol)
        if info > 0:
            if rank == 0:
                raise LinAlgError("%d-th leading minor of the array is not "
                                  "positive semidefinite" % info)
            else:
                raise LinAlgError("The array is rank deficient with "
                                  "computed rank %d" % info)
        if info < 0:
            raise ValueError('LAPACK reported an illegal value in {}-th '
                             'argument on entry to "PSTRF".'.format(-info))
        return c, lower, rank, pivot


def cholesky(a, lower=False, overwrite_a=False, check_finite=True,
             full_pivot=False, pivot_tol=-1):
    """
    Compute the Cholesky decomposition of a matrix.

    Returns the Cholesky decomposition, :math:`A = L L^*` or
    :math:`A = U^* U` of a Hermitian positive-definite matrix A.

    Parameters
    ----------
    a : (M, M) array_like
        Matrix to be decomposed
    lower : bool, optional
        Whether to compute the upper- or lower-triangular Cholesky
        factorization.  Default is upper-triangular.
    overwrite_a : bool, optional
        Whether to overwrite data in `a` (may improve performance).
    check_finite : bool, optional
        Whether to check that the input matrix contains only finite numbers.
        Disabling may give a performance gain, but may result in problems
        (crashes, non-termination) if the inputs do contain infinities or NaNs.
    full_pivot : bool, optional
        Whether to use full pivoting or not
    pivot_tol : float, optional
        Tolerance for the pivot; if < 0 then tolerance = N*U*MAX( A(K,K) )

    Returns
    -------
    c : (M, M) ndarray
        Upper- or lower-triangular Cholesky factor of `a`.

    Raises
    ------
    LinAlgError : if decomposition fails.

    Examples
    --------
    >>> from scipy.linalg import cholesky
    >>> a = np.array([[1,-2j],[2j,5]])
    >>> L = cholesky(a, lower=True)
    >>> L
    array([[ 1.+0.j,  0.+0.j],
           [ 0.+2.j,  1.+0.j]])
    >>> L @ L.T.conj()
    array([[ 1.+0.j,  0.-2.j],
           [ 0.+2.j,  5.+0.j]])

    """
    if not full_pivot:
        c, lower = _cholesky(a, lower=lower, overwrite_a=overwrite_a,
                             clean=True, check_finite=check_finite)
        return c
    else:
        c, lower, rank_bn, piv = _cholesky(a, lower=lower,
                                           overwrite_a=overwrite_a,
                                           clean=True,
                                           check_finite=check_finite,
                                           full_pivot=full_pivot,
                                           pivot_tol=pivot_tol)
        return c, rank_bn, piv

-------------- next part -------------- An HTML attachment was scrubbed... URL: From christoph.baumgarten at gmail.com Sun Aug 18 04:53:55 2019 From: christoph.baumgarten at gmail.com (Christoph Baumgarten) Date: Sun, 18 Aug 2019 10:53:55 +0200 Subject: [SciPy-Dev] NaN insensitive circular statistics In-Reply-To: References: Message-ID: Hi, thanks for your proposal. The methods in stats often have a keyword nan_policy that allows omitting NaNs or raising an error. There are some inconsistencies though (you can find the issues on Github). What kind of treatment do you propose for circular mean / std? Christoph 1. NaN insensitive circular statistics (Burrell, Angeline) > 2. Re: triage team (Ralf Gommers) > 3. 
Re: Matrix identification methods (Joseph Fox-Rabinovitz) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Wed, 14 Aug 2019 17:49:49 +0000 > From: "Burrell, Angeline" > To: "scipy-dev at python.org" > Subject: [SciPy-Dev] NaN insensitive circular statistics > Message-ID: <6DEB95FA-2CCA-4C32-9CD5-D77585D92375 at contoso.com> > Content-Type: text/plain; charset="utf-8" > > Second attempt for a response for a new feature to scipy.stats, since I > was alerted that my first email was going into spam folders. I do hope to > hear back (positively or negatively) from the community. > > I have NaN insensitive versions of the SciPy circular mean and standard > deviation routines that I would like to contribute to the scipy.stats > subpackage. Since the hacking guidelines recommend discussing new > contributions on this mailing list, I'd like to get the communal go-ahead > before proceeding to integrate the routines and unit tests into scipy. For > consistency, I would also add a NaN insensitive version of the circular > variance routine. > > Cheers, > Angeline > > ----------------------------------------------------- > Dr. Angeline G. Burrell [she/her/hers] > Research Physicist, Bldg. 209 > Naval Research Laboratory (NRL) > 4555 Overlook Ave SW > Washington, DC 20375 > (P) 202-404-4065 > ----------------------------------------------------- > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Sun Aug 18 21:36:50 2019 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 19 Aug 2019 11:36:50 +1000 Subject: [SciPy-Dev] callable jacobians for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg, trust-krylov, trust-exact and trust-constr. Message-ID: Clarification question to close GH9042. Which of the minimizer methods (CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg, trust-krylov,trust-exact and trust-constr): - require a *callable* jacobian? 
- are satisfied with a jacobian estimated by finite differences? -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Aug 18 23:28:24 2019 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 18 Aug 2019 20:28:24 -0700 Subject: [SciPy-Dev] Season of Docs - welcome Anne, Maja, Brandon, Shekhar, Christina In-Reply-To: References: Message-ID: Hi all, Happy to announce that we have a fifth participant: Christina Lee, "SciPy documentation: Design, Usability and Content". Welcome Christina! I expect that this is the final announcement. Really enjoying the momentum that's already building up around documentation and website, and looking forward to the next couple of months! Cheers, Ralf On Wed, Aug 7, 2019 at 6:03 PM Ralf Gommers wrote: > > > On Tue, Aug 6, 2019 at 4:46 PM Ralf Gommers > wrote: > >> Hi all, >> >> Google has announced the Season of Docs participants for this year [1]. >> We had a lot of excellent candidates and had to make some hard choices. We >> applied for extra slots, but unfortunately didn't win the lottery for >> those; we got one slot for NumPy and one for SciPy. We chose the projects >> of Anne for NumPy and Maja for SciPy: >> >> Anne Bonner, "Making "The Basics" a Little More Basic: Improving the >> Introductory NumPy Sections" [2] >> >> Maja Gwozdz, "User-oriented documentation and thorough restructuring" [3] >> >> That's not all though. There was some space left in the budget of the >> NumPy BIDS grant, and Stéfan has reserved that so we can accept more >> writers and provide them the same mentoring and funding as they would have >> gotten through GSoD. We could only start the conversations about that once >> Google made its decisions, so a further announcement will follow. 
However, >> we already have one extra project confirmed, from Brandon: >> >> Brandon David, "Improve the documentation of scipy.stats" (project >> details to be published). >> > > Happy to announce that we have a fourth participant: > > Shekhar Rajak, "numpy.org redesign and high level documentation > restructuring for end user focus" > > Welcome Shekhar! > > I will send out a poll to find a good time for everyone for a kickoff >> call. Our intent is to build a documentation team with multiple writers and >> mentors interacting and able to help each other out. And all of this will >> also interact with the numpy.org website redesign and the people putting >> energy into that:) >> > > Here is the poll link: https://doodle.com/poll/skgbk74gsg8zpziu. I hope > we can find a time that works for everyone - we're split over all US > timezones, Europe and India. So it's going to be early morning or late > evening somewhere. > > Sending this out in public, so anyone who wants to participate is welcome > to join. I've Bcc'd all participants and mentors, to make sure they see > this. > > Cheers, > Ralf > > > >> >> I'm very happy to welcome Anne, Maja and Brandon! >> >> Cheers, >> Ralf >> >> >> [1] https://developers.google.com/season-of-docs/docs/participants/ >> [2] >> https://developers.google.com/season-of-docs/docs/participants/project-numpy >> [3] >> https://developers.google.com/season-of-docs/docs/participants/project-scipy >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Mon Aug 19 03:00:51 2019 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 19 Aug 2019 17:00:51 +1000 Subject: [SciPy-Dev] Rationalisation of finite differences for minimize Message-ID: Dear devs, In order to fully address https://github.com/scipy/scipy/issues/6026 I propose the following behaviour for optimize.minimize. For minimize methods that use a jacobian: 1) if a callable jacobian is supplied simply use that. 
(no change) 2) if `jac is True`, then `fun` is assumed to return function and gradient. (no change) 3) if `jac is None` or `bool(jac)` evaluates to `False`, then use forward differences with an absolute step of epsilon. This behaviour ensures back compatibility is kept for those methods that already accept an `epsilon` or `eps` keyword. 4) have an additional `finite_diff_rel_step` keyword. For further information see `optimize._numdiff.approx_derivative`. This would be changed behaviour for some methods. trust-constr already has this keyword (which is why I suggested it). This keyword would only be added to those that already have an `epsilon` or `eps` keyword. 5) if `jac in ['2-point', '3-point', 'cs']`, in this case the finite differences would not use an absolute step size, but would use either forward differences/central differences/complex step for gradient calculation. The step size would be that specified by `finite_diff_rel_step`. This would only work for those methods that now possess the `finite_diff_rel_step` keyword. Gradient calculation would be done by `_differentiable_functions.ScalarFunction` which eventually calls `_numdiff.approx_derivative`. This approach is already used by trust-constr. `ScalarFunction` caches function evaluations. 6) there are some methods that only work if points 1 or 2 are satisfied (i.e. `callable(jac) or jac is True`). These methods are: trust-krylov, trust-ncg, trust-exact, dogleg, Newton-CG (I hope I have them all). They raise an exception if a callable gradient function is not provided by the user. It's unclear to me if these methods can use a gradient calculated by finite differences. If they can, then I propose to add the ability to use finite differences with points 1, 2, 4, 5 (not 3) above. I need feedback on this point from subject experts. 7) Possibly remove `approx_jacobian` from slsqp.py. This should be do-able by `approx_derivative` instead. 8) Fix up documentation for these options. 
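To make the relative-step machinery in points 4 and 5 concrete, here is a small sketch (illustrative only) using the private `optimize._numdiff.approx_derivative` helper mentioned above, with the `rosen` test function standing in for a user objective:

```python
import numpy as np
from scipy.optimize import rosen, rosen_der
from scipy.optimize._numdiff import approx_derivative

x0 = np.array([1.3, 0.7, 0.8])

# '2-point' = forward differences, '3-point' = central differences;
# both choose a relative step per component when rel_step is not given.
g2 = approx_derivative(rosen, x0, method='2-point')
g3 = approx_derivative(rosen, x0, method='3-point')
exact = rosen_der(x0)  # analytic gradient for comparison

err2 = np.max(np.abs(g2 - exact))  # forward: O(h) truncation error
err3 = np.max(np.abs(g3 - exact))  # central: O(h^2), noticeably smaller
```

The central-difference variant costs twice the function evaluations but balances truncation and round-off error much better, which is the motivation for exposing '3-point' (and 'cs') at the `minimize` level.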
Enhancements -------------------- a) the better tested `approx_derivative` is now used instead of `approx_fprime`. b) relative step sizes are much preferred to absolute step size as the truncation and round off errors are balanced. c) there is hopefully a more consistent interface to jac specification d) gradients calculated by central-differences are more accurate than 2-point calculation, it's nice to have the option to use that (at the expense of function evaluations). e) it should be possible to reduce the number of function evaluations by a small amount because ScalarFunction caches values. f) Bounds can be supplied to ScalarFunction/approx_derivative. If a parameter value is at a boundary, then either of those two functions should not take a step outside the boundary whilst evaluating the gradient. Drawbacks --------------- g) Technically this meets back compatibility according to documentation. However, some users may be using jac=`2-point`, which at the moment may be converted to an absolute step+forward difference calculation. The documentation doesn't say that an absolute step is being used internally. The proposed behaviour would move it to a relative step. Thus users would get a slightly changed implementation. Backwards compatibility is only assured by setting `jac=False` or `jac=None`. h) minimize is a complex beast with lots of interlocking behaviour, I don't think this will increase complexity by an appreciable amount, but there's definitely a possibility someone is upset because behaviour is changed slightly. I'd appreciate feedback on this, especially why gradient functions are required (point 6) for some methods. Hopefully it's not too controversial. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From serge.guelton at telecom-bretagne.eu Mon Aug 19 09:13:44 2019 From: serge.guelton at telecom-bretagne.eu (Serge Guelton) Date: Mon, 19 Aug 2019 15:13:44 +0200 Subject: [SciPy-Dev] Pythran 0.9.3 - Hañv Message-ID: <20190819131344.GA6939@sguelton.remote.csb> Hi folks, I just released version 0.9.3 of the Pythran package. Short reminder: Pythran is an ahead-of-time compiler for scientific Python, with a focus on high-level numerical kernels, parallelism and vectorisation. Here is a simple kernel example, with a pythran annotation. Note that the kernel is still Python-compatible (from https://stackoverflow.com/questions/57199248/):

import numpy as np

#pythran export col_sum(int[:,:] or float[:,:], int[:])
def col_sum(data, idx):
    return data.T[idx].sum(0)

The Pythran package is available on PyPI, Github and Conda https://pypi.org/project/pythran/ https://anaconda.org/conda-forge/pythran https://github.com/serge-sans-paille/pythran The interested reader can have a look at the changelog for details https://pythran.readthedocs.io/en/latest/Changelog.html Long story short: bug fixes and better 32bit arch support. Plus (Thanks to Miro Hrončok), pythran is now available on Fedora \o/ Huge thanks to all contributors and bug reporters: Jean Laroche Yann Diorcet DWesl Miro Hrončok Piotr Bartmann Jochen Schröder Sylwester Arabas Marti Bosch rorroiga Pierre Augier Anubhab Haldar nbecker From angeline.burrell at nrl.navy.mil Mon Aug 19 16:54:25 2019 From: angeline.burrell at nrl.navy.mil (Burrell, Angeline) Date: Mon, 19 Aug 2019 20:54:25 +0000 Subject: [SciPy-Dev] NaN insensitive circular statistics Message-ID: <1C0454B6-5476-4904-9FAC-B32D30382CDF@nrl.navy.mil> Hi Christoph, Thank you for your response. The circ methods (circmean, circstd, and circvar) do not currently use the nan_policy keyword, and I wasn't able to find any pull requests addressing this on the GitHub page. 
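As a rough illustration (not the code I intend to submit, and the helper name `circmean_omit` exists purely for this example), the 'omit' behaviour amounts to masking NaNs before the existing circular-mean calculation:

```python
import numpy as np
from scipy.stats import circmean

def circmean_omit(samples, high=2*np.pi, low=0):
    # Sketch of nan_policy='omit': drop NaNs, then reuse the existing
    # circmean. The real change would live in the shared helper so that
    # circmean, circstd, and circvar all handle NaN consistently.
    samples = np.asarray(samples, dtype=float)
    return circmean(samples[~np.isnan(samples)], high=high, low=low)

angles = np.array([0.1, 2*np.pi - 0.1, np.nan])
# circmean(angles) propagates the NaN, while circmean_omit(angles)
# averages the two remaining angles (giving 0, up to wrapping by 2*pi).
```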
I propose to add the nan_policy keyword option to circmean, circstd, and circvar. This keyword would then be passed to _circfuncs_commons, which currently tests the input samples and returns the samples as an array of angles in radians so that the circular statistics can be calculated. This makes it a sensible place to identify or omit any NaN (if desired). This way, if nan_policy is 'propagate', the circ methods would behave as they currently do, 'raise' would cause _circfuncs_commons to throw an error, and 'omit' would remove all the NaN values before testing the sample size. Cheers, Angeline From haberland at ucla.edu Wed Aug 21 13:12:26 2019 From: haberland at ucla.edu (Matt Haberland) Date: Wed, 21 Aug 2019 10:12:26 -0700 Subject: [SciPy-Dev] Rationalisation of finite differences for minimize In-Reply-To: References: Message-ID: 1, 2, and 5-8 look good to me. 3) if `jac is None` or `bool(jac)` evaluates to `False`, then use forward > differences with an absolute step of epsilon. This behaviour ensures back > compatibility is kept for those methods that already accept an `epsilon` or > `eps` keyword. > I would suggest that if `eps` is explicitly defined by the user, then the value should be respected, but if it is not explicitly defined and `bool(jac)` is `False`, we might upgrade the derivative approximation to approx_derivative. If this is deemed a real backwards compatibility issue, never mind. > 4) have an additional `finite_diff_rel_step` keyword. For further > information see `optimize._numdiff.approx_derivative`. This would be > changed behaviour for some methods. trust-constr already has this keyword > (which is why I suggested it). This keyword would only be added to those > that already have an `epsilon` or `eps` keyword. > What will happen if the user specifies both? Rather than having a new argument, I might have suggested that we use a provided `eps` as a relative step size if `jac` is one of the allowed strings. 
Then again, if `finite_diff_rel_step` is already used by `trust-constr`, it makes sense to use it. 6) there are some methods that only work if points 1 or 2 are satisfied > (i.e. `callable(jac) or jac is True`). These methods are: trust-krylov, > trust-ncg, trust-exact, dogleg, Newton-CG (I hope I have them all). They > raise an exception if a callable gradient function is not provided by the > user. > It's unclear to me if these methods can use a gradient calculated by > finite differences. If they can, then I propose to add the ability to use > finite differences with points 1, 2, 4, 5 (not 3) above. I need feedback on > this point from subject experts. > If you don't hear back from the experts, I'd say it's worth a shot to try adding this ability to use finite difference approximations to these algorithms. Some algorithms are more robust to derivative approximation errors than others, sure, but I've not heard of an algorithm that requires its derivatives to be accurate to same machine precision as the function evaluation. Drawbacks > --------------- > g) Technically this meets back compatibility according to documentation. > However, some users may be using jac=`2-point`, which at the moment may be > converted to an absolute step+forward difference calculation. The > documentation doesn't say that an absolute step is being used internally. > The proposed behaviour would move it to a relative step. Thus users would > get a slightly changed implementation. Backwards compatibility is only > assured by setting `jac=False` or `jac=None`. > (If we think the suggestion for 3 above is OK, then backwards compatibility is assured by setting `jac=False` or `jac=None` AND explicitly defining `eps`) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From juanlu001 at gmail.com Thu Aug 22 10:44:37 2019 From: juanlu001 at gmail.com (Juan Luis Cano) Date: Thu, 22 Aug 2019 16:44:37 +0200 Subject: [SciPy-Dev] Sprints at EuroSciPy Message-ID: Hi all, There are some free slots for the EuroSciPy sprints and the organizers encouraged us to propose more: https://www.euroscipy.org/2019/program.html I wonder if there are other people attending EuroSciPy interested in sprinting, what topics could we choose (general bug triaging? high priority defects? a specific sub package needing some love? some work on numba-scipy?), and if people not attending (especially core developers) would be open to participate remotely, perhaps answering questions on IRC (or Matrix, see https://riot.im/app/#/room/#freenode_#scipy:matrix.org). I have a general interest in scipy.integrate, scipy.optimize and scipy.signal but I'm not an expert in anything. What I'd like is to see a SciPy sprint at EuroSciPy to, you know, honor the name of the conference :) If there's interest, I will propose it to the organizers. Best, -- Juan Luis Cano -------------- next part -------------- An HTML attachment was scrubbed... URL: From ilhanpolat at gmail.com Thu Aug 22 13:35:33 2019 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Thu, 22 Aug 2019 19:35:33 +0200 Subject: [SciPy-Dev] Sprints at EuroSciPy In-Reply-To: References: Message-ID: I was planning to attend for the maintainers summit but didn't hear back from them on Twitter and I couldn't get into contact with the organizers which is probably an error on my side. Then I was thinking about presenting something but then I decided not to, since like last year, this is kind of a PyData-variant rather than Scipy, or a "Euroscipy.stats" if you will. Just glancing through, I think I've spotted only 3-4 out of "a lot > 30" that are not necessarily data/ML related. Having said all that, we need all the love there is out there :) Especially about adding examples to documentation. 
Since you mention scipy.signal have a look at https://github.com/scipy/scipy/issues/7168 On Thu, Aug 22, 2019 at 4:45 PM Juan Luis Cano wrote: > Hi all, > > There are some free slots for the EuroSciPy sprints and the organizers > encouraged us to propose more: > > https://www.euroscipy.org/2019/program.html > > I wonder if there are other people attending EuroSciPy interested in > sprinting, what topics could we choose (general bug triaging? high priority > defects? a specific sub package needing some love? some work on > numba-scipy?), and if people not attending (especially core developers) > would be open to participate remotely, perhaps answering questions on IRC > (or Matrix, see https://riot.im/app/#/room/#freenode_#scipy:matrix.org). > > I have a general interest in scipy.integrate, scipy.optimize and > scipy.signal but I'm not an expert in anything. What I'd like is to see a > SciPy sprint at EuroSciPy to, you know, honor the name of the conference :) > If there's interest, I will propose it to the organizers. > > Best, > > -- > Juan Luis Cano > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Mon Aug 26 21:25:12 2019 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 26 Aug 2019 19:25:12 -0600 Subject: [SciPy-Dev] NumPy 1.17.1 released Message-ID: Hi All, On behalf of the NumPy team I am pleased to announce that NumPy 1.17.1 has been released. This release contains a number of fixes for bugs reported against NumPy 1.17.0 along with a few documentation and build improvements. The Python versions supported are 3.5-3.7, note that Python 2.7 has been dropped. Python 3.8b3 should work with the released source packages, but there are no future guarantees. 
Downstream developers should use Cython >= 0.29.13 for Python 3.8 support and OpenBLAS >= 3.7 to avoid wrong results on the Skylake architecture. The NumPy wheels on PyPI are built from the OpenBLAS development branch in order to avoid those problems. Wheels for this release can be downloaded from PyPI; source archives and release notes are available from Github. *Contributors* A total of 17 people contributed to this release. People with a "+" by their names contributed a patch for the first time. - Alexander Jung + - Allan Haldane - Charles Harris - Eric Wieser - Giuseppe Cuccu + - Hiroyuki V. Yamazaki - Jérémie du Boisberranger - Kmol Yuan + - Matti Picus - Max Bolingbroke + - Maxwell Aladago + - Oleksandr Pavlyk - Peter Andreas Entschev - Sergei Lebedev - Seth Troisi + - Vladimir Pershin + - Warren Weckesser *Pull requests merged* A total of 24 pull requests were merged for this release. - gh-14156: TST: Allow fuss in testing strided/non-strided exp/log loops - gh-14157: BUG: avx2_scalef_ps must be static - gh-14158: BUG: Remove stray print that causes a SystemError on python 3.7. - gh-14159: BUG: Fix DeprecationWarning in python 3.8. - gh-14160: BLD: Add missing gcd/lcm definitions to npy_math.h - gh-14161: DOC, BUILD: cleanups and fix (again) 'build dist' - gh-14166: TST: Add 3.8-dev to travisCI testing. - gh-14194: BUG: Remove the broken clip wrapper (Backport) - gh-14198: DOC: Fix hermitian argument docs in svd. - gh-14199: MAINT: Workaround for Intel compiler bug leading to failing test - gh-14200: TST: Clean up of test_pocketfft.py - gh-14201: BUG: Make advanced indexing result on read-only subclass writeable... - gh-14236: BUG: Fixed default BitGenerator name - gh-14237: ENH: add c-imported modules for freeze analysis in np.random - gh-14296: TST: Pin pytest version to 5.0.1 - gh-14301: BUG: Fix leak in the f2py-generated module init and `PyMem_Del`... 
- gh-14302: BUG: Fix formatting error in exception message - gh-14307: MAINT: random: Match type of SeedSequence.pool_size to DEFAULT_POOL_SIZE. - gh-14308: BUG: Fix numpy.random bug in platform detection - gh-14309: ENH: Enable huge pages in all Linux builds - gh-14330: BUG: Fix segfault in `random.permutation(x)` when x is a string. - gh-14338: BUG: don't fail when lexsorting some empty arrays (#14228) - gh-14339: BUG: Fix misuse of .names and .fields in various places (backport... - gh-14345: BUG: fix behavior of structured_to_unstructured on non-trivial... - gh-14350: REL: Prepare 1.17.1 release Cheers, Charles Harris -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Tue Aug 27 22:43:42 2019 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 27 Aug 2019 20:43:42 -0600 Subject: [SciPy-Dev] NumPy 1.16.5 released Message-ID: Hi All, On behalf of the NumPy team I am pleased to announce that NumPy 1.16.5 has been released. This release fixes bugs reported against the 1.16.4 release and backports several enhancements from master that seem appropriate for the LTS release series that is the last to support Python 2.7. Downstream developers should use Cython >= 0.29.2 and OpenBLAS >= 3.7 to avoid wrong results on the Skylake architecture. The NumPy wheels on PyPI are built from the OpenBLAS development branch in order to avoid those problems. Wheels for this release can be downloaded from PyPI , source archives and release notes are available from Github . *Contributors* A total of 18 people contributed to this release. People with a "+" by their names contributed a patch for the first time. 
- Alexander Shadchin - Allan Haldane - Bruce Merry + - Charles Harris - Colin Snyder + - Dan Allan + - Emile + - Eric Wieser - Grey Baker + - Maksim Shabunin + - Marten van Kerkwijk - Matti Picus - Peter Andreas Entschev + - Ralf Gommers - Richard Harris + - Sebastian Berg - Sergei Lebedev + - Stephan Hoyer *Pull requests merged* A total of 23 pull requests were merged for this release. - gh-13742: ENH: Add project URLs to setup.py - gh-13823: TEST, ENH: fix tests and ctypes code for PyPy - gh-13845: BUG: use npy_intp instead of int for indexing array - gh-13867: TST: Ignore DeprecationWarning during nose imports - gh-13905: BUG: Fix use-after-free in boolean indexing - gh-13933: MAINT/BUG/DOC: Fix errors in _add_newdocs - gh-13984: BUG: fix byte order reversal for datetime64[ns] - gh-13994: MAINT,BUG: Use nbytes to also catch empty descr during allocation - gh-14042: BUG: np.array cleared errors occured in PyMemoryView_FromObject - gh-14043: BUG: Fixes for Undefined Behavior Sanitizer (UBSan) errors. - gh-14044: BUG: ensure that casting to/from structured is properly checked. - gh-14045: MAINT: fix histogram*d dispatchers - gh-14046: BUG: further fixup to histogram2d dispatcher. - gh-14052: BUG: Replace contextlib.suppress for Python 2.7 - gh-14056: BUG: fix compilation of 3rd party modules with Py_LIMITED_API... - gh-14057: BUG: Fix memory leak in dtype from dict contructor - gh-14058: DOC: Document array_function at a higher level. - gh-14084: BUG, DOC: add new recfunctions to `__all__` - gh-14162: BUG: Remove stray print that causes a SystemError on python 3.7 - gh-14297: TST: Pin pytest version to 5.0.1. - gh-14322: ENH: Enable huge pages in all Linux builds - gh-14346: BUG: fix behavior of structured_to_unstructured on non-trivial... - gh-14382: REL: Prepare for the NumPy 1.16.5 release. Cheers, Charles Harris -------------- next part -------------- An HTML attachment was scrubbed... URL: