From andyfaff at gmail.com  Tue Aug  2 02:40:43 2016
From: andyfaff at gmail.com (Andrew Nelson)
Date: Tue, 2 Aug 2016 16:40:43 +1000
Subject: [SciPy-Dev] Changing termination conditions for differential evolution
Message-ID:

The issue at https://github.com/scipy/scipy/issues/6382 discusses termination
of the differential evolution solver. At the moment termination occurs when:

np.std(population_energies) / np.abs(np.mean(population_energies)) < tol.

However, if the mean of the population_energies is 0, then this unnecessarily
extends the number of iterations that the solver carries out.

I propose to change this behaviour to: if `tol` is a tuple (rtol, atol), then
terminate when

np.std(population_energies) <= (atol + rtol * np.abs(np.mean(population_energies))).

However, if `tol` is a single float then the existing behaviour will be
honoured.

Thoughts, comments?

_____________________________________
Dr. Andrew Nelson
_____________________________________
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ralf.gommers at gmail.com  Wed Aug  3 03:48:12 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Wed, 3 Aug 2016 09:48:12 +0200
Subject: [SciPy-Dev] Policy for minimum numpy version
In-Reply-To:
References:
Message-ID:

On Wed, Jul 27, 2016 at 7:52 PM, Eric Quintero wrote:

> Hi All,
>
> In gh-6058, I've run into a test failure because a numpy function was not
> available in version 1.7 (the function being np.fft.rfftfreq). This is easy
> enough to work around, but I wonder when we should decide to move up the
> minimum required numpy version.

There's no conclusion here yet, but the other responses to this email were
examples of pain points with 1.8 and that the cost of upgrading numpy has
gone down.

In general I think we bump the minimum version if the costs are starting to
outweigh the benefits. That's probably the case for 1.7 now. Looking back,
we typically have supported 4 numpy versions with a scipy release.
Right now we're at 5 (1.7 - 1.11), and numpy 1.12 will likely be out before
the next scipy release.

So my conclusion: +1 for bumping the minimum numpy version to 1.8.2.

Ralf

> Debian stable currently offers numpy 1.8.2. Ubuntu 12.04 LTS (which is EOL
> next April) is stuck on 1.6.1, so they're already left behind. Ubuntu 14.04
> LTS has 1.8.2.
>
> Has anyone run into compatibility issues with 1.7 before? In
> _lib._numpy_compat, there is another 1.7 workaround in place for
> np.testing.assert_warns.
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From davidmenhur at gmail.com  Wed Aug  3 04:32:05 2016
From: davidmenhur at gmail.com (Daπid)
Date: Wed, 3 Aug 2016 10:32:05 +0200
Subject: [SciPy-Dev] Policy for minimum numpy version
In-Reply-To:
References:
Message-ID:

On 3 August 2016 at 09:48, Ralf Gommers wrote:
>
> There's no conclusion here yet, but the other responses to this email were
> examples of pain points with 1.8 and that the cost of upgrading numpy has
> gone down.
>
> In general I think we bump the minimum version if the costs are starting to
> outweigh the benefits. That's probably the case for 1.7 now. Looking back,
> we typically have supported 4 numpy versions with a scipy release. Right now
> we're at 5 (1.7 - 1.11), and numpy 1.12 will likely be out before the next
> scipy release.

What is the use case that requires a numpy that is 2 years and 9 months old
and the latest, bleeding edge, scipy? The deprecation cycle is two versions,
so I think it should be enough to support the last two numpy versions at the
time of the previous scipy release. So, scipy 0.19 would support numpy from
1.10, which was released in October last year, and people would have had a
year to adapt. Note also that pip's default behaviour is to upgrade every
dependency.

Side question: do we check the other way around? Will a new numpy work with
an old scipy? Say, numpy 1.11 with scipy 0.13 (as old as 1.8)?

/David.
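[Editorial note: the termination change Andrew proposes at the top of this digest can be sketched as follows. The function name and the tuple handling are illustrative only, not scipy's actual implementation.]

```python
import numpy as np

def should_terminate(population_energies, tol):
    """Sketch of the proposed differential evolution convergence test.

    `tol` may be a single float (existing behaviour) or an
    (rtol, atol) tuple (proposed behaviour).
    """
    std = np.std(population_energies)
    mean = np.abs(np.mean(population_energies))
    if np.ndim(tol) == 0:
        # Existing behaviour: purely relative criterion.
        # Note that this can never fire when the mean energy is 0.
        return std / mean < tol
    rtol, atol = tol
    # Proposed behaviour: mixed absolute/relative criterion.
    return std <= atol + rtol * mean
```

With an (rtol, atol) tuple, a population whose energies straddle zero can still terminate on the absolute part of the criterion, which is exactly the case the single-float rule mishandles.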
From ralf.gommers at gmail.com  Wed Aug  3 06:48:41 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Wed, 3 Aug 2016 12:48:41 +0200
Subject: [SciPy-Dev] Policy for minimum numpy version
In-Reply-To:
References:
Message-ID:

On Wed, Aug 3, 2016 at 10:32 AM, Daπid wrote:
> On 3 August 2016 at 09:48, Ralf Gommers wrote:
> >
> > There's no conclusion here yet, but the other responses to this email were
> > examples of pain points with 1.8 and that the cost of upgrading numpy has
> > gone down.
> >
> > In general I think we bump the minimum version if the costs are starting
> > to outweigh the benefits. That's probably the case for 1.7 now. Looking
> > back, we typically have supported 4 numpy versions with a scipy release.
> > Right now we're at 5 (1.7 - 1.11), and numpy 1.12 will likely be out
> > before the next scipy release.
>
> What is the use case that requires a numpy that is 2 years and 9 months old
> and the latest, bleeding edge, scipy?

See the responses to the last time you asked this, those are still valid:
https://mail.scipy.org/pipermail/scipy-dev/2014-December/020266.html :)

> The deprecation cycle is two versions, so I think it should be enough to
> support the last two numpy versions at the time of the previous scipy
> release. So, scipy 0.19 would support numpy from 1.10, which was released
> in October last year, and people would have had a year to adapt. Note also
> that pip's default behaviour is to upgrade every dependency.

That pip behavior is completely braindead, and there is agreement by the
pip/packaging crowd that it needs to be changed to a more minimalist
upgrade strategy. The debate is just about how to introduce the change.

> Side question: do we check the other way around? Will a new numpy work
> with an old scipy? Say, numpy 1.11 with scipy 0.13 (as old as 1.8)?
Yes it should, except for the few things that needed to be adapted in newer
scipy versions for removals of deprecated features and unintentional
backwards compat breaks. Older scipy binaries will also work with a newer
numpy.

Ralf
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From charlesr.harris at gmail.com  Wed Aug  3 16:09:32 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Wed, 3 Aug 2016 14:09:32 -0600
Subject: [SciPy-Dev] Numpy 1.11.2
Message-ID:

Hi All,

I would like to release Numpy 1.11.2rc1 this weekend. It will contain a few
small fixes and enhancements for windows and the last Scipy release. If
there are any pending PRs that you think should go in or be backported for
this release, please speak up.

Chuck
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From vm.rod25 at gmail.com  Thu Aug  4 12:42:48 2016
From: vm.rod25 at gmail.com (Victor Rodriguez)
Date: Thu, 4 Aug 2016 11:42:48 -0500
Subject: [SciPy-Dev] Scipy linked to openblas
Message-ID:

Hi,

I am part of the Clear Linux (clearlinux.org) OS project. I wonder how to
link scipy to openblas. I have read in some places that if numpy is linked
to openblas, scipy gets linked by default. However, I wonder if there is
any specific way to tell scipy to be linked against the openblas library.

This is our config log:

gcc -pthread /tmp/tmp2f_4k1/tmp/tmp2f_4k1/source.o -L/usr/lib64 -lopenblas -o /tmp/tmp2f_4k1/a.out
  libraries openblas not found in ['', '']
  Runtime library openblas was not found. Ignoring
  FOUND:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/lib64']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
  FOUND:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/lib64']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
Running from scipy source directory.
Splitting linalg.interpolative Fortran source files
blas_opt_info:
blas_mkl_info:
  libraries mkl,vml,guide not found in ['/usr/lib64', '/usr/lib', '/usr/lib/']
  NOT AVAILABLE
openblas_info:
  libraries openblas not found in ['', '']
  Runtime library openblas was not found. Ignoring
  libraries gfortran not found in ['', '']
  Runtime library gfortran was not found. Ignoring
  FOUND:
    libraries = ['openblas', 'gfortran', 'openblas', 'gfortran']
    library_dirs = ['/usr/lib64']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
  FOUND:
    libraries = ['openblas', 'gfortran', 'openblas', 'gfortran']
    library_dirs = ['/usr/lib64']
    language = c
    define_macros = [('HAVE_CBLAS', None)]

This is due to the fact that our openblas is built with AVX support, which
boosts the performance of math functions from numpy and R:

https://01.org/blogs/2016/improving-python-performance-scientific-tools-and-libraries

Thanks for the help

Victor Rodriguez

From ralf.gommers at gmail.com  Thu Aug  4 14:51:36 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Thu, 4 Aug 2016 20:51:36 +0200
Subject: [SciPy-Dev] Scipy linked to openblas
In-Reply-To:
References:
Message-ID:

On Thu, Aug 4, 2016 at 6:42 PM, Victor Rodriguez wrote:

> Hi,
>
> I am part of the Clear Linux (clearlinux.org) OS project. I wonder how to
> link scipy to openblas. I have read in some places that if numpy is linked
> to openblas, scipy gets linked by default. However, I wonder if there is
> any specific way to tell scipy to be linked against the openblas library.

Rename site.cfg.example to site.cfg and uncomment and edit the OpenBLAS
section to point to the right paths. Here:
https://github.com/scipy/scipy/blob/master/site.cfg.example#L103

Ralf

> This is our config log:
>
> gcc -pthread /tmp/tmp2f_4k1/tmp/tmp2f_4k1/source.o -L/usr/lib64 -lopenblas -o /tmp/tmp2f_4k1/a.out
>   libraries openblas not found in ['', '']
>   Runtime library openblas was not found.
Ignoring > FOUND: > libraries = ['openblas', 'openblas'] > library_dirs = ['/usr/lib64'] > language = c > define_macros = [('HAVE_CBLAS', None)] > FOUND: > libraries = ['openblas', 'openblas'] > library_dirs = ['/usr/lib64'] > language = c > define_macros = [('HAVE_CBLAS', None)] > Running from scipy source directory. > Splitting linalg.interpolative Fortran source files > blas_opt_info: > blas_mkl_info: > libraries mkl,vml,guide not found in ['/usr/lib64', '/usr/lib', > '/usr/lib/'] > NOT AVAILABLE > openblas_info: > libraries openblas not found in ['', ''] > Runtime library openblas was not found. Ignoring > libraries gfortran not found in ['', ''] > Runtime library gfortran was not found. Ignoring > FOUND: > libraries = ['openblas', 'gfortran', 'openblas', 'gfortran'] > library_dirs = ['/usr/lib64'] > language = c > define_macros = [('HAVE_CBLAS', None)] > FOUND: > libraries = ['openblas', 'gfortran', 'openblas', 'gfortran'] > library_dirs = ['/usr/lib64'] > language = c > define_macros = [('HAVE_CBLAS', None)] > > > This is due to the fact that our openblas is linked to avx technology > and it boost the performance of math funtions form numpy and R: > > https://01.org/blogs/2016/improving-python-performance- > scientific-tools-and-libraries > > Thanks for the help > > Victor Rodriguez > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vm.rod25 at gmail.com Fri Aug 5 09:39:00 2016 From: vm.rod25 at gmail.com (Victor Rodriguez) Date: Fri, 5 Aug 2016 08:39:00 -0500 Subject: [SciPy-Dev] Scipy linked to openblas In-Reply-To: References: Message-ID: thanks I will try this :) On Thu, Aug 4, 2016 at 1:51 PM, Ralf Gommers wrote: > > > On Thu, Aug 4, 2016 at 6:42 PM, Victor Rodriguez wrote: >> >> Hi >> >> I am part of the Clear Linux (clearlinux.org) OS project . 
I wonder >> how to link scipy to openblas. I have read in some parts that if numpy >> is linked to openblas by default scipy gets linked. However I wonder >> if there is any specific way to tell scipy to be linked agains >> openblas library. > > > Rename site.cfg.example to site.cfg and uncomment and edit the OpenBLAS > section to point to the right paths. Here: > https://github.com/scipy/scipy/blob/master/site.cfg.example#L103 > > Ralf > >> >> >> >> This is our config log >> >> gcc -pthread /tmp/tmp2f_4k1/tmp/tmp2f_4k1/source.o -L/usr/lib64 >> -lopenblas -o /tmp/tmp2f_4k1/a.out >> libraries openblas not found in ['', ''] >> Runtime library openblas was not found. Ignoring >> FOUND: >> libraries = ['openblas', 'openblas'] >> library_dirs = ['/usr/lib64'] >> language = c >> define_macros = [('HAVE_CBLAS', None)] >> FOUND: >> libraries = ['openblas', 'openblas'] >> library_dirs = ['/usr/lib64'] >> language = c >> define_macros = [('HAVE_CBLAS', None)] >> Running from scipy source directory. >> Splitting linalg.interpolative Fortran source files >> blas_opt_info: >> blas_mkl_info: >> libraries mkl,vml,guide not found in ['/usr/lib64', '/usr/lib', >> '/usr/lib/'] >> NOT AVAILABLE >> openblas_info: >> libraries openblas not found in ['', ''] >> Runtime library openblas was not found. Ignoring >> libraries gfortran not found in ['', ''] >> Runtime library gfortran was not found. 
Ignoring >> FOUND: >> libraries = ['openblas', 'gfortran', 'openblas', 'gfortran'] >> library_dirs = ['/usr/lib64'] >> language = c >> define_macros = [('HAVE_CBLAS', None)] >> FOUND: >> libraries = ['openblas', 'gfortran', 'openblas', 'gfortran'] >> library_dirs = ['/usr/lib64'] >> language = c >> define_macros = [('HAVE_CBLAS', None)] >> >> >> This is due to the fact that our openblas is linked to avx technology >> and it boost the performance of math funtions form numpy and R: >> >> >> https://01.org/blogs/2016/improving-python-performance-scientific-tools-and-libraries >> >> Thanks for the help >> >> Victor Rodriguez >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > From davidmenhur at gmail.com Fri Aug 5 10:06:32 2016 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Fri, 5 Aug 2016 16:06:32 +0200 Subject: [SciPy-Dev] Policy for minimum numpy version In-Reply-To: References: Message-ID: On 3 August 2016 at 12:48, Ralf Gommers wrote: > On Wed, Aug 3, 2016 at 10:32 AM, Da?id wrote: >> >> On 3 August 2016 at 09:48, Ralf Gommers wrote: >> > >> > There's no conclusion here yet, but the other responses to this email >> > were >> > examples of pain points with 1.8 and that the cost of upgrading numpy >> > has >> > gone down. >> > >> > In general I think we bump the minimum version if the costs are starting >> > to >> > outweigh the benefits. That's probably the case for 1.7 now. Looking >> > back, >> > we typically have supported 4 numpy versions with a scipy release. Right >> > now >> > we're at 5 (1.7 - 1.11), and numpy 1.12 will likely be out before the >> > next >> > scipy release. 
>> >> What is the use case that requires an 2 years and 9 months old numpy >> and the latest, bleeding edge, scipy? > > > See the responses to the last time you asked this, those are still valid: > https://mail.scipy.org/pipermail/scipy-dev/2014-December/020266.html :) I totally forgot I ever said that, sorry. Thank you for bearing with my fish memory. > A lot more packages depend on numpy than on scipy, so upgrading numpy can have way more impact. I see. I guess this is mostly relevant as a part of a large package repository, where there is a higher chance of finding outdated packages. On the other hand, numpy is pretty backwards compatible, except for long deprecation cycles, so there is usually plenty of time to upgrade (one year of deprecation, plus one year of scipy catching up in my scheme). All the backwards incompatible changes I can think of so far required fairly minor adjustments (for example, random_integers -> randint), unless I have forgotten something (fish memory). Or, of course, the codebase is huge, but I have no experience there. > And there may be institutes/companies that ship a fixed version of numpy that needs to be supported for a long time. My question in this case remains, if they ship an ancient numpy, why would they need the latest scipy? If the component is mission critical and upgrading is a risk, I'd expect the same would be true for scipy. Most other cases would be covered by using virtualenvs or anaconda. And I would add one reason to encourage people to keep up to date: deprecation cycles work under the assumption that developers try every version, and use the warnings to adapt their code. If someone were to upgrade directly from numpy 1.8 to 1.12, the safety net is circumvented, and they may find unexpected changes in behaviour. /David. 
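[Editorial note: a minimal illustration of the kind of minimum-version gate this thread is about, using numpy's own `NumpyVersion` helper (available since numpy 1.9). The function name is illustrative; scipy performs the equivalent check in its setup/import machinery, and the exact minimum being discussed here is 1.8.2.]

```python
import numpy as np
from numpy.lib import NumpyVersion

def check_min_numpy(min_version='1.8.2'):
    """Raise ImportError if the installed numpy is older than min_version.

    NumpyVersion handles pre-release and dev suffixes correctly, unlike
    naive string comparison (e.g. '1.10.0' vs '1.9.0').
    """
    if NumpyVersion(np.__version__) < min_version:
        raise ImportError("numpy >= %s is required (found %s)"
                          % (min_version, np.__version__))
    return True
```

Plain string comparison would wrongly conclude `'1.10.0' < '1.8.2'`, which is why the dedicated helper matters for version policies like the one discussed above.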
From prabhu at aero.iitb.ac.in Mon Aug 8 02:38:06 2016 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Mon, 8 Aug 2016 12:08:06 +0530 Subject: [SciPy-Dev] [ANN] Mayavi-4.5.0 release Message-ID: Hello, We are pleased to announce Mayavi-4.5.0. Mayavi is a general purpose, cross-platform Python package for 2-D and 3-D scientific data visualization. Mayavi integrates seamlessly with numpy and provides a convenient Pythonic wrapper for the VTK API. It provides a high-level visualization API that sits on top of the powerful VTK (http://www.vtk.org) library. It provides a stand-alone UI to help visualize your data. Mayavi is easy to extend and embed in your own dialogs and UIs. For more information see here: http://docs.enthought.com/mayavi/mayavi/index.html New features ------------- - Jupyter notebook support: Adds basic support for displaying Mayavi images or interactive x3d scenes. For more information see here http://docs.enthought.com/mayavi/mayavi/tips.html#using-mayavi-in-jupyter-notebooks - Add support for recording movies and animating timesteps. - Support for the new matplotlib colorschemes. - This release improves the experimental Python 3 support from the previous release. This release should work with VTK-5.x, VTK-6.x, and 7.x. For more details on the full set of changes see: http://docs.enthought.com/mayavi/mayavi/auto/changes.html#mayavi-4-5-0 More than 50 pull requests were merged since the last release. We are thankful to Gregory R. Lee, Ioannis Tziakos, Kit Yan Choi, Patrick Snape, Prabhu Ramachandran, Ryan Pepper, SiggyF, Stefano Borini, and daytonb for their contributions towards this release. Contributions from Ioannis Tziakos, Kit Yan Choi and Stefano Borini are funded and supported by the SimPhoNy project, an EU-project funded by the 7th Framework Programme (Project number 604005) under the call NMP.2013.1.4-1. 
cheers, Mayavi Developers From josef.pktd at gmail.com Tue Aug 9 19:03:14 2016 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 9 Aug 2016 19:03:14 -0400 Subject: [SciPy-Dev] change in 0.8 to fmin Message-ID: Was there a change related to starting values or stepsize in the latest release in Nelder-Mead fmin? I only saw changes related to convergence criteria in browsing the scipy.optimize history. I'm getting a test failure in statsmodels where the solution has the opposite sign from before. https://github.com/statsmodels/statsmodels/issues/3128#issuecomment-238673265 AFAICS, flipping the sign to negative is still a valid solution. The parameters correspond to the standard deviation and only the square (covariance matrix) is used. However, the starting values are 0.1 and all previous versions of scipy produce positive parameters, i.e. positive standard deviations. The optimization for the test case is pretty deep inside one of the models, and it's not easy to figure out what might be going on. (There are two more statsmodels test failures related to scipy optimize, but I haven't looked at the details yet.) Josef From pav at iki.fi Tue Aug 9 19:27:53 2016 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 9 Aug 2016 23:27:53 +0000 (UTC) Subject: [SciPy-Dev] change in 0.8 to fmin References: Message-ID: Tue, 09 Aug 2016 19:03:14 -0400, josef.pktd kirjoitti: > Was there a change related to starting values or stepsize in the latest > release in Nelder-Mead fmin? > I only saw changes related to convergence criteria in browsing the > scipy.optimize history. I don't think so. Option to specify the initial simplex was also added, but this should not have changed the default one. Indeed, I don't see any change between 0.17.0 and 0.18.0 in examples. If your function modifies the input vector (if yes, probably a bad idea), that may change things, since copying is not necessarily managed the same way. 
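[Editorial note: the initial-simplex option Pauli mentions (added in scipy 0.18) can be used as below. This is a small sketch on a simple quadratic, not taken from the statsmodels code under discussion.]

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Simple quadratic with minimum at (1, -2).
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

x0 = np.zeros(2)
# Explicit starting simplex: (N+1) x N array of vertices.  When
# 'initial_simplex' is given, it replaces the default simplex that
# Nelder-Mead would otherwise build around x0.
sim = np.array([[0.0, 0.0],
                [0.5, 0.0],
                [0.0, 0.5]])
res = minimize(f, x0, method='Nelder-Mead',
               options={'initial_simplex': sim,
                        'xatol': 1e-8, 'fatol': 1e-8})
```

Pinning the initial simplex explicitly is also a handy way to rule out start-configuration differences when debugging result changes across scipy versions, as in the sign-flip issue above.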
From josef.pktd at gmail.com  Tue Aug  9 20:02:29 2016
From: josef.pktd at gmail.com (josef.pktd at gmail.com)
Date: Tue, 9 Aug 2016 20:02:29 -0400
Subject: [SciPy-Dev] change in 0.8 to fmin
In-Reply-To:
References:
Message-ID:

On Tue, Aug 9, 2016 at 7:27 PM, Pauli Virtanen wrote:
> Tue, 09 Aug 2016 19:03:14 -0400, josef.pktd wrote:
>> Was there a change related to starting values or stepsize in the latest
>> release in Nelder-Mead fmin?
>> I only saw changes related to convergence criteria in browsing the
>> scipy.optimize history.
>
> I don't think so. Option to specify the initial simplex was also added,
> but this should not have changed the default one. Indeed, I don't see any
> change between 0.17.0 and 0.18.0 in examples.
>
> If your function modifies the input vector (if yes, probably a bad idea),
> that may change things, since copying is not necessarily managed the same
> way.

Thanks, I'll treat the sign as indeterminate then, and fix the unit test.

I'm not very familiar with that code, but I don't think there is any
modification of the input vector. The parameters are put into a larger
array that has just a few unknown or unconstrained values.

Josef

> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at scipy.org
> https://mail.scipy.org/mailman/listinfo/scipy-dev

From ericq at caltech.edu  Wed Aug 10 13:05:24 2016
From: ericq at caltech.edu (Eric Quintero)
Date: Wed, 10 Aug 2016 10:05:24 -0700
Subject: [SciPy-Dev] Policy for minimum numpy version
In-Reply-To:
References:
Message-ID: <66FC12B4-F364-47F2-AE61-9158F95A437D@caltech.edu>

So far, it seems no-one has argued against bumping the minimum version to
1.8. If this is the necessary level of consensus, what steps need to happen
to make this transition?

-Eric Q.
> On Aug 5, 2016, at 7:06 AM, Da?id wrote: > > On 3 August 2016 at 12:48, Ralf Gommers wrote: >> On Wed, Aug 3, 2016 at 10:32 AM, Da?id wrote: >>> >>> On 3 August 2016 at 09:48, Ralf Gommers wrote: >>>> >>>> There's no conclusion here yet, but the other responses to this email >>>> were >>>> examples of pain points with 1.8 and that the cost of upgrading numpy >>>> has >>>> gone down. >>>> >>>> In general I think we bump the minimum version if the costs are starting >>>> to >>>> outweigh the benefits. That's probably the case for 1.7 now. Looking >>>> back, >>>> we typically have supported 4 numpy versions with a scipy release. Right >>>> now >>>> we're at 5 (1.7 - 1.11), and numpy 1.12 will likely be out before the >>>> next >>>> scipy release. >>> >>> What is the use case that requires an 2 years and 9 months old numpy >>> and the latest, bleeding edge, scipy? >> >> >> See the responses to the last time you asked this, those are still valid: >> https://mail.scipy.org/pipermail/scipy-dev/2014-December/020266.html :) > > I totally forgot I ever said that, sorry. Thank you for bearing with > my fish memory. > >> A lot more packages depend on numpy than on scipy, so upgrading numpy can have way more impact. > > I see. I guess this is mostly relevant as a part of a large package > repository, where there is a higher chance of finding outdated > packages. On the other hand, numpy is pretty backwards compatible, > except for long deprecation cycles, so there is usually plenty of time > to upgrade (one year of deprecation, plus one year of scipy catching > up in my scheme). > > All the backwards incompatible changes I can think of so far required > fairly minor adjustments (for example, random_integers -> randint), > unless I have forgotten something (fish memory). Or, of course, the > codebase is huge, but I have no experience there. > >> And there may be institutes/companies that ship a fixed version of numpy that needs to be supported for a long time. 
> > My question in this case remains, if they ship an ancient numpy, why > would they need the latest scipy? If the component is mission critical > and upgrading is a risk, I'd expect the same would be true for scipy. > Most other cases would be covered by using virtualenvs or anaconda. > > > > And I would add one reason to encourage people to keep up to date: > deprecation cycles work under the assumption that developers try every > version, and use the warnings to adapt their code. If someone were to > upgrade directly from numpy 1.8 to 1.12, the safety net is > circumvented, and they may find unexpected changes in behaviour. > > /David. > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev From ralf.gommers at gmail.com Wed Aug 10 17:07:36 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 11 Aug 2016 09:07:36 +1200 Subject: [SciPy-Dev] Policy for minimum numpy version In-Reply-To: <66FC12B4-F364-47F2-AE61-9158F95A437D@caltech.edu> References: <66FC12B4-F364-47F2-AE61-9158F95A437D@caltech.edu> Message-ID: On Thu, Aug 11, 2016 at 5:05 AM, Eric Quintero wrote: > So far, it seems no-one has argued against bumping the minimum version to > 1.8. If this is the necessary level of consensus, I think it is. > what steps need to happen to make this transition? > A PR with changes to setup.py, scipy/__init__.py, pavement.py and doc/release/0.19.0-notes.rst for changing 1.7.1/1.7.2 to 1.8.2 at least. And then in the same or another PR all the special-casing and tests skipped for 1.7.x can be removed. > -Eric Q. 
> > On Aug 5, 2016, at 7:06 AM, Da?id wrote: > > > > On 3 August 2016 at 12:48, Ralf Gommers wrote: > >> On Wed, Aug 3, 2016 at 10:32 AM, Da?id wrote: > >>> > >>> On 3 August 2016 at 09:48, Ralf Gommers > wrote: > >>>> > >>>> There's no conclusion here yet, but the other responses to this email > >>>> were > >>>> examples of pain points with 1.8 and that the cost of upgrading numpy > >>>> has > >>>> gone down. > >>>> > >>>> In general I think we bump the minimum version if the costs are > starting > >>>> to > >>>> outweigh the benefits. That's probably the case for 1.7 now. Looking > >>>> back, > >>>> we typically have supported 4 numpy versions with a scipy release. > Right > >>>> now > >>>> we're at 5 (1.7 - 1.11), and numpy 1.12 will likely be out before the > >>>> next > >>>> scipy release. > >>> > >>> What is the use case that requires an 2 years and 9 months old numpy > >>> and the latest, bleeding edge, scipy? > >> > >> > >> See the responses to the last time you asked this, those are still > valid: > >> https://mail.scipy.org/pipermail/scipy-dev/2014-December/020266.html :) > > > > I totally forgot I ever said that, sorry. Thank you for bearing with > > my fish memory. > No worries. > > > >> A lot more packages depend on numpy than on scipy, so upgrading numpy > can have way more impact. > > > > I see. I guess this is mostly relevant as a part of a large package > > repository, where there is a higher chance of finding outdated > > packages. On the other hand, numpy is pretty backwards compatible, > > except for long deprecation cycles, so there is usually plenty of time > > to upgrade (one year of deprecation, plus one year of scipy catching > > up in my scheme). > One year is not a lot at all for deployed server installs, companies shipping qualified sets of packages, etc. And even occasional users won't be updating actively. For people reading this mailing list a year is a lot, but those are not average users. 
> All the backwards incompatible changes I can think of so far required
> fairly minor adjustments (for example, random_integers -> randint),
> unless I have forgotten something (fish memory). Or, of course, the
> codebase is huge, but I have no experience there.
>
>> And there may be institutes/companies that ship a fixed version of
>> numpy that needs to be supported for a long time.
>
> My question in this case remains, if they ship an ancient numpy, why
> would they need the latest scipy?

They may need a specific feature or bugfix.

> If the component is mission critical and upgrading is a risk, I'd expect
> the same would be true for scipy.

Sure, but any extra package to upgrade is extra work. And work is
time/money. This really comes down to putting the pain somewhere, either
with some devs that run into an issue with older numpy when making a PR, or
with a very hard to estimate (but certainly much larger) group of users.

> Most other cases would be covered by using virtualenvs or anaconda.
>
> And I would add one reason to encourage people to keep up to date:
> deprecation cycles work under the assumption that developers try every
> version, and use the warnings to adapt their code. If someone were to
> upgrade directly from numpy 1.8 to 1.12, the safety net is circumvented,
> and they may find unexpected changes in behaviour.

Most (ideally all) of those should be exceptions or things like shape
changes, not silent changes in numerical values. Anyway, again this is a
cost/benefit trade-off.

Ralf
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jor at informatik.uni-kiel.de  Mon Aug 15 05:35:30 2016
From: jor at informatik.uni-kiel.de (Joscha Reimer)
Date: Mon, 15 Aug 2016 11:35:30 +0200
Subject: [SciPy-Dev] scipy.sparse: add save and load functions for sparse matrices
Message-ID:

Hello,

I would like to propose new save and load functionality for sparse matrices
in SciPy.

So far, the scipy.io.savemat/loadmat functions make it possible to save and
load sparse matrices in MATLAB file format (versions 4 and 5). However, this
has some serious drawbacks.

Big (sparse) matrices cannot be stored in a mat file (versions 4 and 5),
since at most 2^31 bytes per variable are supported. Besides, sparse
matrices are always stored in csc format in a mat file, so the original
matrix format is not preserved. If another matrix format is used, it has to
be converted to csc before saving and back to the original format after
loading. For large matrices this can take a lot of time. In addition, the
indices in a mat file must be sorted, which can take a lot of additional
time. Since the sparse matrices are always stored in csc format, the
advantages of other matrix formats regarding disk consumption cannot be
exploited. For example, some suitable block matrices can be stored with much
less disk consumption in bsr format than in csc format.

I propose to store the data arrays of the sparse matrices directly, together
with the matrix format, in one file, using NumPy's savez and savez_compressed
functions. The reconstruction while loading is then possible without much
effort. This can be done easily for the csc, csr, bsr, dia and coo formats.
(The remaining dok and lil formats should only be used for constructing
sparse matrices anyway, and then be converted to another matrix format.)
This would make it possible to store big sparse matrices and to benefit from
the advantages of the different matrix formats.

A pull request (for the csc, csr and bsr matrix formats) is here:
https://github.com/scipy/scipy/pull/6394

Best regards,
Joscha Reimer
-------------- next part --------------
A non-text attachment was scrubbed...
Name: smime.p7s Type: application/pkcs7-signature Size: 4263 bytes Desc: S/MIME Cryptographic Signature URL: From ralf.gommers at gmail.com Wed Aug 17 06:35:23 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 17 Aug 2016 22:35:23 +1200 Subject: [SciPy-Dev] Scipy citations and article(s) Message-ID: Hi all, On GitHub and off-list there has been some discussion on making Scipy releases citable and producing one or more papers. So here's a short summary for those who haven't seen that yet. Here's a proposal to use Zenodo to generate a DOI for every new Scipy release: https://github.com/scipy/scipy/issues/6446. Looks like everyone is in favor so far. Furthermore it looks like we want to produce at least an "abstract paper" for a new release, for example in Journal of Open Source Software ( http://joss.theoj.org). The main objective there being to provide a way for academics to receive the credit they deserve for contributing by being a co-author on a peer-reviewed paper that can be cited. Evgeni suggested to start a new repo for this under the scipy org, which makes sense to me. The name I propose is scipy-articles. Better ideas welcome of course. Barring objections I'll create that in a day or two. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Wed Aug 17 07:16:20 2016 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 17 Aug 2016 12:16:20 +0100 Subject: [SciPy-Dev] Scipy citations and article(s) In-Reply-To: References: Message-ID: On Wed, Aug 17, 2016 at 11:35 AM, Ralf Gommers wrote: > > Hi all, > > On GitHub and off-list there has been some discussion on making Scipy releases citable and producing one or more papers. So here's a short summary for those who haven't seen that yet. +1 to everything. Thanks! -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL:
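[Editorial note: Joscha's savez-based scipy.sparse save/load proposal earlier in this digest can be sketched as follows. The helper names are hypothetical; the actual API in gh-6394 may differ, and only the compressed csc/csr/bsr formats are handled here.]

```python
import numpy as np
import scipy.sparse as sp

def save_sparse(filename, matrix):
    """Store a compressed-format sparse matrix with np.savez_compressed.

    The data arrays and the format string are stored directly, so no
    format conversion or index sorting is needed.
    """
    if matrix.format not in ('csc', 'csr', 'bsr'):
        raise ValueError("only csc/csr/bsr are handled in this sketch")
    np.savez_compressed(filename, format=matrix.format,
                        shape=np.asarray(matrix.shape),
                        data=matrix.data, indices=matrix.indices,
                        indptr=matrix.indptr)

def load_sparse(filename):
    """Rebuild the matrix in its original format from the .npz file."""
    constructors = {'csc': sp.csc_matrix, 'csr': sp.csr_matrix,
                    'bsr': sp.bsr_matrix}
    with np.load(filename) as f:
        cls = constructors[str(f['format'])]
        return cls((f['data'], f['indices'], f['indptr']),
                   shape=tuple(f['shape']))
```

Because the raw data/indices/indptr arrays are written as-is, the matrix format round-trips exactly, which is the property the MATLAB-file route cannot offer.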