From cimrman3 at ntc.zcu.cz Tue Dec 1 05:31:17 2015 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 1 Dec 2015 11:31:17 +0100 Subject: [SciPy-Dev] ANN: SfePy 2015.4 Message-ID: <565D76F5.8060201@ntc.zcu.cz> I am pleased to announce release 2015.4 of SfePy. Description ----------- SfePy (simple finite elements in Python) is a software for solving systems of coupled partial differential equations by the finite element method or by the isogeometric analysis (preliminary support). It is distributed under the new BSD license. Home page: http://sfepy.org Mailing list: http://groups.google.com/group/sfepy-devel Git (source) repository, issue tracker, wiki: http://github.com/sfepy Highlights of this release -------------------------- - basic support for restart files - new type of linear combination boundary conditions - balloon inflation example For full release notes see http://docs.sfepy.org/doc/release_notes.html#id1 (rather long and technical). Best regards, Robert Cimrman on behalf of the SfePy development team --- Contributors to this release in alphabetical order: Robert Cimrman Grant Stephens From lists at hilboll.de Fri Dec 4 02:27:51 2015 From: lists at hilboll.de (Andreas Hilboll) Date: Fri, 4 Dec 2015 08:27:51 +0100 Subject: [SciPy-Dev] conference.scipy.org Certificate expired Message-ID: <56614077.1060405@hilboll.de> Good morning, I just noticed that the SSL certificate on https://conference.scipy.org/ expired some three months ago. Cheers, Andreas From archibald at astron.nl Fri Dec 4 08:47:33 2015 From: archibald at astron.nl (Anne Archibald) Date: Fri, 04 Dec 2015 13:47:33 +0000 Subject: [SciPy-Dev] dense output option in integrate.ode In-Reply-To: References: Message-ID: Dense output is actually already available from some of the integrators - I believe vode in particular. It is limited to evaluation within the last step, so the most reliable way to use it is to allow the adaptive stepper to take a single step then evaluate within it. For many purposes, for example stopping criteria, evaluation within the last step is cheap and sufficient, and many solver implementations provide an interface for it. But it would occasionally be useful to have a solution object that can be evaluated anywhere - preferably including one or more derivatives. Anne On Mon, Nov 23, 2015, 12:43 David Mikolas wrote: > OK, Jim Martin has started this and I'm trying to add momentum and any > help that I can. I'm already using this feature (access to interpolation > coefficients) by installing an environment with his changes, so I can run > the integration flat out, yet get results at arbitrary density with nearly > the same precision. > > I'm new to SciPy-dev and mail lists in general, appreciate any/all > suggestions. Thanks! > > David > > On Mon, Nov 23, 2015 at 4:28 PM, Evgeni Burovski < > evgeny.burovskiy at gmail.com> wrote: > >> Hi David, >> >> This sounds like a useful feature. >> I guess a pull request would be welcome. (even though I'm not a user, and >> I won't be able to review it). >> >> Evgeni >> 23.11.2015 10:47 ???????????? "David Mikolas" >> ???????: >> >> One way that dense output for dopri5 and dop853 can be very useful is if >>> the integration is expensive/long and you don't want to repeat it, but you >>> want to obtain results at new time points at a later date, or even iterate >>> on it - for example, find the time of closest approac. This is done by >>> saving the interpolation coefficients. I put a simple example here, though >>> there is no saving to disk yet. 
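For reference, a crude way to get something similar today with only the public interface (no access to dopri5's own dense-output coefficients) is to drive an integrator that supports single-step mode, as Anne describes for vode above, and interpolate the accepted steps afterwards. The ODE, tolerances and interpolation kind below are only placeholders:

import numpy as np
from scipy.integrate import ode
from scipy.interpolate import interp1d

def rhs(t, y):
    return -0.5 * y

r = ode(rhs).set_integrator('vode', atol=1e-10, rtol=1e-10)
r.set_initial_value([1.0], 0.0)

ts, ys = [r.t], [r.y.copy()]
while r.successful() and r.t < 10.0:
    r.integrate(10.0, step=True)    # advance by a single adaptive step
    ts.append(r.t)
    ys.append(r.y.copy())

dense = interp1d(ts, np.array(ys), axis=0, kind='cubic')
print(dense(3.14159))               # evaluate anywhere inside the integrated range

This re-interpolates rather than reusing the solver's internal coefficients, so it is less accurate than the dense output being discussed, but it needs no changes to scipy.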
>>> >>> http://pastebin.com/e6qNjbL9 dendop_test_v00.py >>> >>> I wonder if this could be developed into an option to return an >>> interpolator object or function. If I can help let me know. >>> >>> On Sat, Nov 14, 2015 at 12:19 AM, Evgeni Burovski < >>> evgeny.burovskiy at gmail.com> wrote: >>> >>>> Hi David, Hi Jim, >>>> >>>> > I am new to SciPy-dev. Will the dense output option in >>>> scipy.integrate.ode >>>> > become available in 0.17.0? This is a feature already available in the >>>> > original FORTAN, but wasn't implemented in the wrapper. >>>> >>>> If the feature is sent as a pull request against the scipy master >>>> branch, the PR is reviewed by the maintainers of the integrate package >>>> and merged into master before the release split, then yes, it would be >>>> available in 0.17.0. >>>> >>>> So far I do not see any progress towards it. >>>> >>>> >>>> > I wrote these dense output extensions that you listed: >>>> > >>>> > >>>> https://github.com/jddmartin/scipy/tree/dense_output_from_dopri5_and_dop853 >>>> > https://github.com/jddmartin/dense_output_example_usage >>>> > >>>> > but I didn't issue any pull request to scipy. I sent this message to >>>> the >>>> > scipy developers list: >>>> > http://article.gmane.org/gmane.comp.python.scientific.devel/19635/ >>>> > explaining the changes. I was hoping for some feedback before >>>> issuing a >>>> > pull request. >>>> >>>> Ah, I see that the email likely fell through the cracks back in April. >>>> Sorry about that. >>>> You might want to ping that email thread once more or send a pull >>>> request on github (or both). >>>> >>>> >>>> >>>> Cheers, >>>> >>>> Evgeni >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at scipy.org >>>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>>> >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From andy.terrel at gmail.com Fri Dec 4 12:12:38 2015 From: andy.terrel at gmail.com (Andy Ray Terrel) Date: Fri, 4 Dec 2015 09:12:38 -0800 Subject: [SciPy-Dev] conference.scipy.org Certificate expired In-Reply-To: <56614077.1060405@hilboll.de> References: <56614077.1060405@hilboll.de> Message-ID: Hmm I'm not sure there is a reason to have https on that domain any more. Didrik told me he would look into it, but the only reason we had it there was for the conference registration pieces that have been outsourced now. What page are you hitting that needs https? -- Andy On Thu, Dec 3, 2015 at 11:27 PM, Andreas Hilboll wrote: > Good morning, > > I just noticed that the SSL certificate on https://conference.scipy.org/ > expired some three months ago. > > Cheers, > Andreas > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
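For anyone who wants to check this locally, the certificate dates can be read with a stock openssl one-liner (the hostname below is the one reported above):

echo | openssl s_client -connect conference.scipy.org:443 -servername conference.scipy.org 2>/dev/null | openssl x509 -noout -dates

which prints the notBefore/notAfter fields of the served certificate.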
URL: From lists at hilboll.de Fri Dec 4 12:15:08 2015 From: lists at hilboll.de (Andreas Hilboll) Date: Fri, 4 Dec 2015 18:15:08 +0100 Subject: [SciPy-Dev] conference.scipy.org Certificate expired In-Reply-To: References: <56614077.1060405@hilboll.de> Message-ID: <5661CA1C.60209@hilboll.de> On 04.12.2015 18:12, Andy Ray Terrel wrote: > Hmm I'm not sure there is a reason to have https on that domain any > more. Didrik told me he would look into it, but the only reason we had > it there was for the conference registration pieces that have been > outsourced now. > > What page are you hitting that needs https? None. I use HTTPS by default, or had the address with HTTPS in my browser history. And then I saw the error. From charlesr.harris at gmail.com Mon Dec 7 20:41:06 2015 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 7 Dec 2015 18:41:06 -0700 Subject: [SciPy-Dev] Numpy 1.10.2rc2 released Message-ID: Hi All, I'm pleased to announce the release of Numpy 1.10.2rc2. After two months of stomping bugs I think the house is clean and we are almost ready to put it up for sale. However, bugs are persistent and may show up at anytime, so please inspect and test thoroughly. Windows binaries and source releases can be found at the usual place on Sourceforge . If there are no reports of problems in the next week I plan to release the final. Further bug squashing will be left to the 1.11 release except possibly for regressions. The release notes give more detail on the changes. *bon app?tit,* Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From davidmenhur at gmail.com Fri Dec 11 07:52:53 2015 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Fri, 11 Dec 2015 13:52:53 +0100 Subject: [SciPy-Dev] [SciPy-User] pip install scipy: error adding symbols: bad value In-Reply-To: References: Message-ID: Moving this to Scipy-dev On 10 December 2015 at 20:52, Charles R Harris wrote: > > On Thu, Dec 10, 2015 at 2:41 AM, Da?id wrote: > >> >> On 9 December 2015 at 17:11, David Cournapeau wrote: >> >>> You may have CFLAGS/CXXFLAGS defined, which override the distutils >>> flags. Unset those (or append `-fPIC` to them) >> >> >> I figured this one out. The culprit is actually FFLAGS. You can set >> CFLAGS and CXXFLAGS freely. >> >> I noticed that Numpy compiled just fine, but produced errors with f2py; >> and the errors in Scipy's installation arise when gfortran is invoked. >> Also, last week I was building some FORTRAN, and I set the corresponding >> flags; hence the problem. Unsetting FFLAGS fixed the issue (but setting it >> to the empty string didn't!). >> >> I confirm that distutils is just appending my CFLAGS to the preset ones, >> which seems to me a sane behaviour. I think f2py should do the same. >> > > Please open an numpy issue for this. > > https://github.com/numpy/numpy/issues/6809 Pauli Virtanen pointed out that that has been considered a feature ( https://github.com/numpy/numpy/issues/1171): "The reason we need to override these flags is because Fortran compilers change. We don't always have the most up-to-date set of flags that a particular Fortran compiler needs to link a Python extension. The presence of incorrect flags completely prevents the extension from being built correctly. Consequently, we need a way to completely override the flags. In the current incarnation of numpy.distutils, this is LDFLAGS and friends." (Robert Kern, 2007). That was 7 years ago, are FORTRAN compilers still that unstable? 
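Either way, the behaviour reported above boils down to roughly this (flag values are only illustrative; a missing -fPIC is what typically produces the "error adding symbols: bad value" link failure in the subject line):

# CFLAGS/CXXFLAGS are appended to the flags numpy.distutils already uses:
CFLAGS="-march=native" pip install scipy

# FFLAGS replaces the Fortran flags wholesale, so either restate essentials
# such as -fPIC by hand, or unset the variable before building:
unset FFLAGS
pip install scipy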
If that is the case, I think a workaround would be to print a warning before overwriting, printing the minimum set of flags that we think are required, so in case it fails, the user has a bit more information. Another option I believe would behave more as expected is to append FFLAGS, and overwrite NUMPY_FFLAGS (or bikesheded equivalent) instead. This scheme would also allow to overwrite C(XX)FLAGS, that are currently not overwritten (distutils/setuptools allowing). Thanks, /David. -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Sun Dec 13 19:37:53 2015 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 14 Dec 2015 11:37:53 +1100 Subject: [SciPy-Dev] diifferential_evolution - moving core loop of solve method into a generator function Message-ID: Dear devs, I'd like to submit a PR to extract the population evolving loop in the DifferentialEvolutionSolver.solve method (currently ~L500 in _differentialevolution.py), and put it into a generator function (called evolve). This means that the population evolution would occur by the solve method advancing the evolve generator repeatedly. This would not appear to have any use if one is calling the differential_evolution function. However, if one uses the DifferentialEvolutionSolver class then you could: 1) Inspect the population after every evolved generation (if you call the evolve generator directly, instead of through solve). You can then apply your own convergence criterion to the optimization. 2) it breaks up the solve method into smaller pieces. 3) for me it's a little more pythonic. Comments, questions? _____________________________________ Dr. Andrew Nelson _____________________________________ From freddyrietdijk at fridh.nl Mon Dec 14 09:28:35 2015 From: freddyrietdijk at fridh.nl (Freddy Rietdijk) Date: Mon, 14 Dec 2015 15:28:35 +0100 Subject: [SciPy-Dev] Fwd: Imsave and imwrite In-Reply-To: References: Message-ID: On February the 5th the package `imageio` [1] was announced. It's a lightweight library requiring only `numpy`. I think we should recommend users to use this package instead. [1] http://imageio.github.io/ On Mon, Nov 9, 2015 at 9:41 PM, Warren Weckesser wrote: > > > On Sat, Nov 7, 2015 at 4:32 PM, Ralf Gommers > wrote: > >> >> >> On Wed, Nov 4, 2015 at 12:31 AM, Alan Wu wrote: >> >>> I moved imsave into imwrite and wrapped imsave to call imwrite for >>> backwards compatibility, >>> in order to be consistent in naming read/write and with other packages >>> like matlab and opencv. >>> >> >>> I would like to submit it and related test as a pull request. >>> >> >> Alan submitted this PR at https://github.com/scipy/scipy/pull/5458 >> >> On the PR I commented already that the name change doesn't really warrant >> a deprecation imho. Both "imsave" and "imwrite" are pretty clear. The >> latter may be slightly more precise and consistent with OpenCV and Matlab, >> but on the other hand Matplotlib also uses "imsave". >> >> However: we do want to get rid of scipy.misc. At least I do. So the >> useful stuff in there (not that much) needs to move. We recently discussed >> "imread" on a PR, and that has two versions - one in misc and one in >> ndimage. Warren suggested that the most logical place for it would actually >> be scipy.io. Which makes sense. >> >> If we add an image read and save function to scipy.io, then we're free >> to choose the names of those functions. So: >> - long-term, do we want to keep an imread and an imsave/write function? 
>> - if so, what's the preferred name? >> >> > > I prefer the read/write verb pairing over read/save, so I'd be fine with > moving `imread` and `imsave` from `misc` to `io` and renaming `imsave` to > `imwrite`. (The `misc` versions would have the usual deprecation cycle, of > course.) > > For comparison, the other I/O functions in `scipy.io` that use either > read/write or load/save are (sorry if the following doesn't come out > properly formatted in your email viewer): > > Format Input Output > WAV wavfile.read wavfile.write > Matrix market mmread mmwrite > IDL readsav > Matlab files loadmat savemat > ARFF arff.loadarff > > > Warren > > P.S. As long as we're on the subject of writing image files, I'll put in a > plug for the package "numpngw" for writing numpy arrays to PNG files, > including animated PNG: https://github.com/WarrenWeckesser/numpngw > > > >> >> My previous emails messed up on my phone after sending. I'm sending >>> this message again >>> on my desktop as plain text, hopefully with better formatting. Is >>> there a way to remove the previous >>> posted messages? >>> >> >> There's no way to remove posted messages. >> >> Cheers, >> Ralf >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From aidan at brightcloud.com Mon Dec 14 13:13:20 2015 From: aidan at brightcloud.com (Aidan Macdonald) Date: Mon, 14 Dec 2015 10:13:20 -0800 Subject: [SciPy-Dev] diifferential_evolution - moving core loop of solve method into a generator function In-Reply-To: References: Message-ID: On Sun, Dec 13, 2015 at 4:37 PM, Andrew Nelson wrote: > Dear devs, > I'd like to submit a PR to extract the population evolving loop in the > DifferentialEvolutionSolver.solve method (currently ~L500 in > _differentialevolution.py), and put it into a generator function > (called evolve). This means that the population evolution would occur > by the solve method advancing the evolve generator repeatedly. > > This would not appear to have any use if one is calling the > differential_evolution function. However, if one uses the > DifferentialEvolutionSolver class then you could: > > 1) Inspect the population after every evolved generation (if you call > the evolve generator directly, instead of through solve). You can then > apply your own convergence criterion to the optimization. > 2) it breaks up the solve method into smaller pieces. > Could this perhaps allow for parallelization? Last I checked, Differential Evolution wasn't parallelized in Scipy. I wrote my own version because I needed MPI Parallelization. 3) for me it's a little more pythonic. > > Comments, questions? > I support this change, and especially if I would be able to parallelize my optimization via MPI. > _____________________________________ > Dr. Andrew Nelson > > > _____________________________________ > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
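To make the proposal concrete, driving the solver could look roughly like the sketch below. The evolve() generator is the proposed (not yet existing) piece, the solver class currently lives in the private scipy.optimize._differentialevolution module, and the attribute names are taken from the present implementation:

import numpy as np
from scipy.optimize import rosen
from scipy.optimize._differentialevolution import DifferentialEvolutionSolver

solver = DifferentialEvolutionSolver(rosen, bounds=[(-3, 3), (-3, 3)])
evolver = solver.evolve()            # proposed generator

for generation in range(200):
    next(evolver)                    # advance the population by one generation
    # apply a custom convergence criterion instead of the built-in one
    if np.std(solver.population_energies) < 1e-8:
        break

print(solver.x)                      # best candidate found so far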
URL: From charlesr.harris at gmail.com Mon Dec 14 16:24:20 2015 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 14 Dec 2015 14:24:20 -0700 Subject: [SciPy-Dev] NumPy 1.10.2 release Message-ID: Hi All, I'm pleased to announce the release of Numpy 1.10.2. This release should take care of the bugs discovered in the 1.10.1 release, some of them severe. Upgrading is strongly advised if you are currently using 1.10.1. Windows binaries and source releases can be found at the usual place on Sourceforge . The sources are also available from PyPi . Cheers, Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Tue Dec 15 18:38:39 2015 From: andyfaff at gmail.com (Andrew Nelson) Date: Wed, 16 Dec 2015 10:38:39 +1100 Subject: [SciPy-Dev] diifferential_evolution - moving core loop of solve method into a generator function In-Reply-To: References: Message-ID: > I support this change, and especially if I would be able to parallelize my > optimization via MPI. It's not going to enable parallelisation of the the optimization, that's the subject of another (currently stalled) PR. From robert.kern at gmail.com Thu Dec 17 10:18:54 2015 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 17 Dec 2015 15:18:54 +0000 Subject: [SciPy-Dev] tempita In-Reply-To: References: Message-ID: On 2015-12-17 09:36, Sturla Molden wrote: > SciPy also relies on this 'implementation detail'. > > Perhaps SciPy should provide its own copy of tempita? It falls back correctly. https://github.com/scipy/scipy/blob/master/tools/cythonize.py#L84-L92 -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From rlucente at pipeline.com Fri Dec 18 07:09:06 2015 From: rlucente at pipeline.com (Robert Lucente - Pipeline) Date: Fri, 18 Dec 2015 07:09:06 -0500 Subject: [SciPy-Dev] Math Optimization - Kaggle Competition: Santa's Stolen Sleigh Message-ID: <0a3d01d1398c$e50accf0$af2066d0$@pipeline.com> I realize that this is a little off topic but hopefully some people will find the problem at least interesting or learn from the discussion or actually participate https://www.kaggle.com/c/santas-stolen-sleigh/details/evaluation Your goal is to minimize total weighted reindeer weariness: 1. Weighted Reindeer Weariness = (distance traveled) * (weights carried for that segment) 2. All sleighs start from north pole (Lat=90, Long=0), then head to each gift in the order that a user gives, and then head back to north pole 3. Sleighs have a base weight = 10 4. Each sleigh has a weight limit = 1000 (excluding the sleigh base weight) Mathematically, weighted reindeer weariness is calculated by . -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastian at sipsolutions.net Fri Dec 18 16:10:55 2015 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Fri, 18 Dec 2015 22:10:55 +0100 Subject: [SciPy-Dev] Introducing outer/orthonongal indexing to numpy Message-ID: <1450473055.24953.46.camel@sipsolutions.net> Hello all, sorry for cross posting (discussion should go to the numpy list). But I would like to get a bit of discussion on the introduction of (mostly) two new ways to index numpy arrays. This would also define a way for code working with different array-likes, some of which implement outer indexing (i.e. xray and dask I believe), to avoid ambiguity. 
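For plain integer indices, the outer behaviour those libraries implement can already be spelled in current NumPy with np.ix_, which is a useful point of reference for the proposal:

>>> import numpy as np
>>> arr = np.arange(25).reshape((5, 5))
>>> arr[[0, 1], [1, 2]]              # current "fancy" indexing
array([1, 7])
>>> arr[np.ix_([0, 1], [1, 2])]      # outer/orthogonal indexing
array([[1, 2],
       [6, 7]])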
The new methods are (names up for discussion): 1. arr.oindex[...] 2. arr.vindex[...] The difference beeing that `oindex` will return outer/orthogonal type indexing, while `vindex` would be a (hopefully) less confusing variant of "fancy" indexing. The biggest reason for introducing this is to provide `oindex` for situations such as: >>> arr = np.arange(25).reshape((5, 5)) >>> arr[[0, 1], [1, 2]] array([1, 7]) >>> # While most might expect the result to be: >>> arr.oindex[[0, 1], [1, 2]] array([[1, 2], [6, 7]]) To provide backwards compatibility the current plan is to also introduce `arr.legacy_index[...]` or similar, with the (long term) plan to force the users to explicitly choose `oindex`, `vindex`, or `legacy_index` if the indexing operation is otherwise not well defined. There are still some open questions for me regarding, for example: * the exact time line (should we start deprecation immediately, etc.) * the handling of boolean indexing arrays * questions that might crop up about other array-likes/subclasses * Are there indexing needs that we are forgetting but are related? More details the current status of my NEP, which has a lot of examples, can be found at: https://github.com/numpy/numpy/pull/6256/files?short_path=01e4dd9#diff-01e4dd9d2ecf994b24e5883f98f789e6 and comments about are very welcome. There is a fully functional implementation available at https://github.com/numpy/numpy/pull/6075 and you can test it using (after cloning numpy): git fetch upstream pull/6075/head:pr-6075 && git checkout pr-6075; python runtests.py --ipython # Inside ipython (too see the deprecations): import warnings; warnings.simplefilter("always") My current hope for going forward is to get clear feedback of what is wanted, for the naming and generally from third party module people, so that we can polish up the NEP and the community can accept it. With good feedback, I think we may be able to get the new attributes into 1.11. So if you are interested in teaching and have suggestions for the names, or have thoughts about subclasses, or... please share your thoughts! :) Regards, Sebastian -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From philip.deboer at gmail.com Fri Dec 18 23:37:03 2015 From: philip.deboer at gmail.com (Philip DeBoer) Date: Fri, 18 Dec 2015 23:37:03 -0500 Subject: [SciPy-Dev] build errors Message-ID: Hi, I would like to test some new functions before submitting a pull request to incorporate them into scipy. I have the latest scipy/master, and my understanding is that I should run >python runtests.py -v. This seems to rebuild scipy first, then run tests. I assume I need to do this before >python setup.py build_ext -i . The runtests.py is failing to import sgegv_, specfun, and doccer. The first of these is a deprecated function which I am trying to fix; I haven't looked at the latter two yet. My python is 2.7.3. I built the latest numpy and atlas (3.10.2; lapack 3.6). My in-place build is finding these, but I cannot import scipy.stats or scipy.linalg due to this issue. This may be related to recently deprecated functions, but I thought that was fixed on scipy/master. If you could provide me with some pointers I would appreciate it. Thanks, Philip -------------- next part -------------- An HTML attachment was scrubbed... 
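For reference, the two usual ways to build and test a SciPy development checkout are roughly the following (use one or the other, not both; the submodule name is arbitrary):

# Option 1: let runtests.py manage a separate build directory and run tests from it
python runtests.py -v -s linalg

# Option 2: build the extension modules in-place and import from the source tree
python setup.py build_ext --inplace
python -c "import scipy.linalg, scipy.stats; print(scipy.__version__)"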
URL: From ralf.gommers at gmail.com Sat Dec 19 04:52:00 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 19 Dec 2015 10:52:00 +0100 Subject: [SciPy-Dev] build errors In-Reply-To: References: Message-ID: On Sat, Dec 19, 2015 at 5:37 AM, Philip DeBoer wrote: > Hi, I would like to test some new functions before submitting a pull > request to incorporate them into scipy. > > I have the latest scipy/master, and my understanding is that I should run > >python runtests.py -v. This seems to rebuild scipy first, then run tests. > I assume I need to do this before >python setup.py build_ext -i . > You want to use either runtests.py or the in-place build, not both (see http://docs.scipy.org/doc/numpy-dev/dev/development_environment.html). > The runtests.py is failing to import sgegv_, specfun, and doccer. The > first of these is a deprecated function which I am trying to fix; I haven't > looked at the latter two yet. > The missing *gegv functions are fixed by https://github.com/scipy/scipy/pull/5576, can you see if that fixes the issue for you? If so, a comment on that PR would be useful. > > My python is 2.7.3. I built the latest numpy and atlas (3.10.2; lapack > 3.6). My in-place build is finding these, but I cannot import scipy.stats > or scipy.linalg due to this issue. This may be related to recently > deprecated functions, but I thought that was fixed on scipy/master. > > If you could provide me with some pointers I would appreciate it. > If the above things don't fix everything yet - maybe the specfun issue - then please send us the relevant part of your build log. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Mon Dec 21 03:30:22 2015 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Mon, 21 Dec 2015 08:30:22 +0000 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release Message-ID: Hi, Here's a proposal to follow the numpy lead (https://mail.scipy.org/pipermail/numpy-discussion/2015-December/074422.html) and stop providing the official binaries for 32-bit windows. The reasons are basically the same as for numpy: these binaries contain BLAS/LAPACK which is necessarily lower quality than what's available from Christoph Gohlke/Canopy/Anaconda; they do not and cannot support python 3.5; the build toolchain is currently broken and to the best of my knowledge nobody knows how to fix it. With regards to the scipy 0.17.0 release, win32 problems are responsible for delaying the release for almost a month (OK, the original schedule was too optimistic, but still). The alternatives, as I see them, are: * stop providing the windows binaries (until a MinGW-64 based toolchain is ready, but this is separate) and make the release. * release 0.17.0 with only python 3.4 binaries available (this is the only python version I can build the binaries which do not segfault immediately :-)) * delay the release with no ETA and no clear path forward. Thoughts? Evgeni From mskhan_iut at yahoo.com Mon Dec 21 10:50:47 2015 From: mskhan_iut at yahoo.com (Shakir Khan) Date: Mon, 21 Dec 2015 15:50:47 +0000 (UTC) Subject: [SciPy-Dev] Enquiry References: <123755637.1838467.1450713047211.JavaMail.yahoo.ref@mail.yahoo.com> Message-ID: <123755637.1838467.1450713047211.JavaMail.yahoo@mail.yahoo.com> Hello I would like to ask the following thing in my post: "Hi, I am looking for Fast Gauss Transform algorithm in python. There are a number of?matlab (mex) function available but I am in need of pure python code. Has anyone any idea? 
Thanks in advance." Regards,?Md Shakir Khan? DMT Student | EIT Digital Master SchoolT:?+31 6 53113203E:?M.S.Khan at student.tudelft.nl | mskha at kth.se?? -------------- next part -------------- An HTML attachment was scrubbed... URL: From sturla.molden at gmail.com Mon Dec 21 22:20:39 2015 From: sturla.molden at gmail.com (Sturla Molden) Date: Tue, 22 Dec 2015 04:20:39 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: Message-ID: I would be in favor of temporarily dropping Windows binary installers and wheels. I would be in favor of just distributing the source code, on any platform, including Windows, Linux and Mac. SciPy is even worse off than NumPy with respect to Windows, because it also needs a Fortran compiler. Currently the only build tools that really works is Intel compilers and MKL. This requires a non-free license for the binary distributions. The official Python 2.7 installer is tied to an outdated version of MSVC. Anaconda, Canopy et al. do not have this limitation. They can build their own Python with a current version of Visual Studio. Installing a full SciPy stack and taking care of all dependencies requires something like Anaconda or Enthought Canopy anyway. I am not sure PyPI is sufficient. This is not just a Windows problem, though. Python and SciPy on MacOSX has many of the same issues, and we are stuck with an outdated gcc based toolchain to make "fat" binaries. Also it encourages abuse of the system Python; on El Capitan the system Python has a special protection layer preventing package installation. We should not encourage anyone to install something into it. Users should be adviced to use tools like Anaconda, and it is roughly the same situation as we have on Windows. On Linux the maintainers of the distributions can build and ship whatever packages they want, so that is not our problem. Sturla On 21/12/15 09:30, Evgeni Burovski wrote: > Hi, > > Here's a proposal to follow the numpy lead > (https://mail.scipy.org/pipermail/numpy-discussion/2015-December/074422.html) > and stop providing the official binaries for 32-bit windows. > > The reasons are basically the same as for numpy: these binaries > contain BLAS/LAPACK which is necessarily lower quality than what's > available from Christoph Gohlke/Canopy/Anaconda; they do not and > cannot support python 3.5; the build toolchain is currently broken and > to the best of my knowledge nobody knows how to fix it. > > With regards to the scipy 0.17.0 release, win32 problems are > responsible for delaying the release for almost a month (OK, the > original schedule was too optimistic, but still). > > The alternatives, as I see them, are: > * stop providing the windows binaries (until a MinGW-64 based > toolchain is ready, but this is separate) and make the release. > * release 0.17.0 with only python 3.4 binaries available (this is the > only python version I can build the binaries which do not segfault > immediately :-)) > * delay the release with no ETA and no clear path forward. > > Thoughts? > > Evgeni > From ralf.gommers at gmail.com Tue Dec 22 01:01:53 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 22 Dec 2015 07:01:53 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: Message-ID: On Tue, Dec 22, 2015 at 4:20 AM, Sturla Molden wrote: > I would be in favor of temporarily dropping Windows binary installers and > wheels. 
I would be in favor of just distributing the source code, on any > platform, including Windows, Linux and Mac. > There's no good reason that I can see to drop OS X wheels - they don't cause us any issues at the moment. > SciPy is even worse off than NumPy with respect to Windows, because it > also needs a Fortran compiler. Currently the only build tools that really > works is Intel compilers and MKL. This requires a non-free license for the > binary distributions. > > The official Python 2.7 installer is tied to an outdated version of MSVC. > Anaconda, Canopy et al. do not have this limitation. They can build their > own Python with a current version of Visual Studio. > > Installing a full SciPy stack and taking care of all dependencies requires > something like Anaconda or Enthought Canopy anyway. I am not sure PyPI is > sufficient. This is not just a Windows problem, though. > > Python and SciPy on MacOSX has many of the same issues, and we are stuck > with an outdated gcc based toolchain to make "fat" binaries. Also it > encourages abuse of the system Python; on El Capitan the system Python has > a special protection layer preventing package installation. We should not > encourage anyone to install something into it. We don't encourage that. The main advice is to use Anaconda/Canopy ( http://scipy.org/install.html). If people want to manage their own installs, we recommend python.org Python or Homebrew/Macports/Fink ( http://scipy.org/scipylib/building/macosx.html#python) - anything but system Python basically. > Users should be adviced to use tools like Anaconda, and it is roughly the > same situation as we have on Windows. > > On Linux the maintainers of the distributions can build and ship whatever > packages they want, so that is not our problem. > > Sturla > > > > On 21/12/15 09:30, Evgeni Burovski wrote: > >> Hi, >> >> Here's a proposal to follow the numpy lead >> ( >> https://mail.scipy.org/pipermail/numpy-discussion/2015-December/074422.html >> ) >> and stop providing the official binaries for 32-bit windows. >> > +1 > The reasons are basically the same as for numpy: these binaries >> contain BLAS/LAPACK which is necessarily lower quality than what's >> available from Christoph Gohlke/Canopy/Anaconda; they do not and >> cannot support python 3.5; the build toolchain is currently broken and >> to the best of my knowledge nobody knows how to fix it. >> >> With regards to the scipy 0.17.0 release, win32 problems are >> responsible for delaying the release for almost a month (OK, the >> original schedule was too optimistic, but still). >> >> The alternatives, as I see them, are: >> * stop providing the windows binaries (until a MinGW-64 based >> toolchain is ready, but this is separate) and make the release. >> * release 0.17.0 with only python 3.4 binaries available (this is the >> only python version I can build the binaries which do not segfault >> immediately :-)) >> > I would be in favor of dropping them all, and clearly communicating that. Keeping only Python 3.4 32-bit would probably be more confusing than helpful. Ralf > * delay the release with no ETA and no clear path forward. >> >> Thoughts? >> >> Evgeni >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gael.varoquaux at normalesup.org Tue Dec 22 02:45:27 2015 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 22 Dec 2015 08:45:27 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: Message-ID: <20151222074527.GI2408540@phare.normalesup.org> I think that we need to keep in mind that scipy is a library, and thus can plug into a larger ecosystem. If we can provide eggs to ease fitting in an existing environment, we should. I can picture eggs being useful in a continuous integration framework for instance. To only consider full blown distributions as our distribution mechanism raises the danger of cattering only for one type of users: scientists (and data scientists), and to forget "normal" Python developers, for which scipy can be dragged in as a minor dependency in a projet. Of course, this is a solution to avoid delaying the release, but it should rather be considered as a stopgap solution than something satisfactory as part of the bigger picture. My 2 cents, Ga?l From njs at pobox.com Tue Dec 22 02:53:12 2015 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 21 Dec 2015 23:53:12 -0800 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: Message-ID: On Mon, Dec 21, 2015 at 7:20 PM, Sturla Molden wrote: > The official Python 2.7 installer is tied to an outdated version of MSVC. > Anaconda, Canopy et al. do not have this limitation. They can build their > own Python with a current version of Visual Studio. FYI, this is wrong -- the distributors all use an outdated MSVC for their Python 2.7 builds, because they want to be ABI-compatible with the official Python.org builds. This is crucial because they want their users to be able to download existing wheels/eggs (and the wheel/egg compatibility tagging system has baked in knowledge that "py27-on-windows" means MSVC 2008), and to be able to use weirdo internal extensions that were compiled with some other python 2.7 distribution. (The only exception I know of is msys2's python 2.7, which is built with mingw-w64. This makes a mess, though, and python.org has specifically refused to accept patches to even allow upstream cpython to be built this way -- this build basically only exists so msys2 gdb can link to it and run python gdb plugins.) -n -- Nathaniel J. Smith -- http://vorpus.org From cournape at gmail.com Tue Dec 22 07:01:13 2015 From: cournape at gmail.com (David Cournapeau) Date: Tue, 22 Dec 2015 12:01:13 +0000 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: Message-ID: On Tue, Dec 22, 2015 at 7:53 AM, Nathaniel Smith wrote: > On Mon, Dec 21, 2015 at 7:20 PM, Sturla Molden > wrote: > > The official Python 2.7 installer is tied to an outdated version of MSVC. > > Anaconda, Canopy et al. do not have this limitation. They can build their > > own Python with a current version of Visual Studio. > > FYI, this is wrong -- the distributors all use an outdated MSVC for > their Python 2.7 builds, because they want to be ABI-compatible with > the official Python.org builds. > Indeed, that's how (and why) we do it at Enthought. David -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Tue Dec 22 08:36:56 2015 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Tue, 22 Dec 2015 13:36:56 +0000 Subject: [SciPy-Dev] scipy 0.17.x branched Message-ID: Hi, maintenance/0.17.x has been branched and master is now for 0.18-dev. 
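For anyone who wants to base backport fixes on the new branch (assuming the main scipy repository is configured as the "upstream" remote in your clone):

git fetch upstream
git checkout -b maintenance/0.17.x upstream/maintenance/0.17.x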
I plan to tag and cut the source-only release candidate tomorrow. (Let's keep the discussion of the windows binaries in the other email thread.) Evgeni From rlucente at pipeline.com Tue Dec 22 11:36:51 2015 From: rlucente at pipeline.com (Robert Lucente) Date: Tue, 22 Dec 2015 11:36:51 -0500 (GMT-05:00) Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release Message-ID: <27764709.1450802212310.JavaMail.root@mswamui-bichon.atl.sa.earthlink.net> >Intel compilers If we approach Intel about getting their compilers for free for SciPy, they might agree to it? From ralf.gommers at gmail.com Tue Dec 22 11:58:57 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 22 Dec 2015 17:58:57 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: <20151222074527.GI2408540@phare.normalesup.org> References: <20151222074527.GI2408540@phare.normalesup.org> Message-ID: On Tue, Dec 22, 2015 at 8:45 AM, Gael Varoquaux < gael.varoquaux at normalesup.org> wrote: > I think that we need to keep in mind that scipy is a library, and thus > can plug into a larger ecosystem. If we can provide eggs to ease fitting > in an existing environment, we should. I can picture eggs being useful in > a continuous integration framework for instance. > This doesn't really help; if we can build working eggs we can also build working .exe installers. (and also: eggs are the past, there's no point in spending time on them). > To only consider full blown distributions as our distribution mechanism > raises the danger of cattering only for one type of users: scientists > (and data scientists), and to forget "normal" Python developers, for > which scipy can be dragged in as a minor dependency in a projet. > Note that the majority of Windows users probably uses 64-bit Python by now - we don't cater to those users at all anyway. > Of course, this is a solution to avoid delaying the release, but it > should rather be considered as a stopgap solution than something > satisfactory as part of the bigger picture. > Of course, I agree with that. The bigger picture is: move to a decent compiler, and provide 32-bit and 64-bit wheels on PyPi. Exe installers with a GUI are not in the bigger picture in the future. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Dec 22 12:16:38 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 22 Dec 2015 18:16:38 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: <27764709.1450802212310.JavaMail.root@mswamui-bichon.atl.sa.earthlink.net> References: <27764709.1450802212310.JavaMail.root@mswamui-bichon.atl.sa.earthlink.net> Message-ID: On Tue, Dec 22, 2015 at 5:36 PM, Robert Lucente wrote: > >Intel compilers > If we approach Intel about getting their compilers for free for SciPy, > they might agree to it? > We tried that already, the final answer seems to be no. Also, just getting them for Scipy wouldn't necessary help all other open source projects or people with personal/private projects. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cournape at gmail.com Tue Dec 22 12:30:10 2015 From: cournape at gmail.com (David Cournapeau) Date: Tue, 22 Dec 2015 17:30:10 +0000 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: <20151222074527.GI2408540@phare.normalesup.org> Message-ID: On Tue, Dec 22, 2015 at 4:58 PM, Ralf Gommers wrote: > > > On Tue, Dec 22, 2015 at 8:45 AM, Gael Varoquaux < > gael.varoquaux at normalesup.org> wrote: > >> I think that we need to keep in mind that scipy is a library, and thus >> can plug into a larger ecosystem. If we can provide eggs to ease fitting >> in an existing environment, we should. I can picture eggs being useful in >> a continuous integration framework for instance. >> > > This doesn't really help; if we can build working eggs we can also build > working .exe installers. (and also: eggs are the past, there's no point in > spending time on them). > > >> To only consider full blown distributions as our distribution mechanism >> raises the danger of cattering only for one type of users: scientists >> (and data scientists), and to forget "normal" Python developers, for >> which scipy can be dragged in as a minor dependency in a projet. >> > > Note that the majority of Windows users probably uses 64-bit Python by now > - we don't cater to those users at all anyway. > > >> Of course, this is a solution to avoid delaying the release, but it >> should rather be considered as a stopgap solution than something >> satisfactory as part of the bigger picture. >> > > Of course, I agree with that. The bigger picture is: move to a decent > compiler, and provide 32-bit and 64-bit wheels on PyPi. Exe installers with > a GUI are not in the bigger picture in the future. > Indeed. Our distribution approach needs to adapt to the changing ecosystem. When those "superpack" installers were conceived 7-8 years ago, crashing numpy (because of lack of SSE) on windows was happening all the time, windows 64 bits was rare, and most people wanted to only get numpy and scipy installed. easy_install was horrible, and scikits learn, image, pandas did not exist. Today, the sustaining solution is to have wheels that people can build without too much trouble. David > > Cheers, > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gael.varoquaux at normalesup.org Tue Dec 22 12:50:39 2015 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 22 Dec 2015 18:50:39 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: <20151222074527.GI2408540@phare.normalesup.org> Message-ID: <20151222175039.GK2408540@phare.normalesup.org> On Tue, Dec 22, 2015 at 05:58:57PM +0100, Ralf Gommers wrote: > I think that we need to keep in mind that scipy is a library, and thus > can plug into a larger ecosystem. If we can provide eggs to ease fitting > in an existing environment, we should. I can picture eggs being useful in > a continuous integration framework for instance. > (and also: eggs are the past, there's no point in spending time on > them). ? Yes, indeed. I had wheels in mind, sorry. > Of course, this is a solution to avoid delaying the release, but it > should rather be considered as a stopgap solution than something > satisfactory as part of the bigger picture. > Of course, I agree with that. 
The bigger picture is: move to a decent > compiler, and provide 32-bit and 64-bit wheels on PyPi. Right. The point that I wanted to make was that it is convenient for us to think that our painful distribution problems are solved by distributions such as anaconda, but we shouldn't forget about the usecases we are leaving behind. I have been surprised to see that in large corporations, laptops are still often installed under Windows 32. This is of course a bit nonsensical. I guess that these usecases should be covered by distributions. Ga?l From gael.varoquaux at normalesup.org Tue Dec 22 12:51:24 2015 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 22 Dec 2015 18:51:24 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: <20151222074527.GI2408540@phare.normalesup.org> Message-ID: <20151222175124.GL2408540@phare.normalesup.org> On Tue, Dec 22, 2015 at 05:30:10PM +0000, David Cournapeau wrote: > Today, the sustaining solution is to have wheels that people can build without > too much trouble. +1! Ga?l From njs at pobox.com Tue Dec 22 13:28:25 2015 From: njs at pobox.com (Nathaniel Smith) Date: Tue, 22 Dec 2015 10:28:25 -0800 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: <27764709.1450802212310.JavaMail.root@mswamui-bichon.atl.sa.earthlink.net> Message-ID: On Tue, Dec 22, 2015 at 9:16 AM, Ralf Gommers wrote: > > > On Tue, Dec 22, 2015 at 5:36 PM, Robert Lucente > wrote: >> >> >Intel compilers >> If we approach Intel about getting their compilers for free for SciPy, >> they might agree to it? > > > We tried that already, the final answer seems to be no. Possibly I miscommunicated something here... to make sure it's clear: I'm pretty sure that Intel would be happy to give us a zero cost copy of their compilers; our problem is with the license, which imposes several restrictions that we can't accept as an open-source project. Basically our binaries would not be BSD-licensed, and also our release manager would have to take on unbounded personal financial liability in case anyone ever sued Intel. (And we asked if they could fix the license, and on *that* they said no.) > Also, just getting them for Scipy wouldn't necessary help all other open > source projects or people with personal/private projects. Also this, right. -n -- Nathaniel J. Smith -- http://vorpus.org From matthew.brett at gmail.com Tue Dec 22 13:29:07 2015 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 22 Dec 2015 18:29:07 +0000 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: <20151222175124.GL2408540@phare.normalesup.org> References: <20151222074527.GI2408540@phare.normalesup.org> <20151222175124.GL2408540@phare.normalesup.org> Message-ID: On Tue, Dec 22, 2015 at 5:51 PM, Gael Varoquaux wrote: > On Tue, Dec 22, 2015 at 05:30:10PM +0000, David Cournapeau wrote: >> Today, the sustaining solution is to have wheels that people can build without >> too much trouble. > > +1! +1 from me too, Matthew From evgeny.burovskiy at gmail.com Wed Dec 23 07:34:12 2015 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 23 Dec 2015 12:34:12 +0000 Subject: [SciPy-Dev] ANN: first release candidate for scipy 0.17.0 Message-ID: Hi, I'm pleased to announce the availability of the first release candidate for Scipy 0.17.0. Please try this rc and report any issues on Github tracker or scipy-dev mailing list. 
Source tarballs and full release notes are available from Github Releases: https://github.com/scipy/scipy/releases/tag/v0.17.0rc1 Please note that this is a source-only release. We do not provide win32 installers for this release. See the email thread starting from for the rationale and discussion. The updated release schedule is as follows: 10 Jan 2016: rc2 (if needed) 17 Jan 2016: final release. Thanks to everyone who contributed to this release! Evgeni Below is a part of the release notes. ========================== SciPy 0.17.0 Release Notes ========================== .. note:: Scipy 0.17.0 is not released yet! .. contents:: SciPy 0.17.0 is the culmination of 6 months of hard work. It contains many new features, numerous bug-fixes, improved test coverage and better documentation. There have been a number of deprecations and API changes in this release, which are documented below. All users are encouraged to upgrade to this release, as there are a large number of bug-fixes and optimizations. Moreover, our development attention will now shift to bug-fix releases on the 0.17.x branch, and on adding new features on the master branch. This release requires Python 2.6, 2.7 or 3.2-3.5 and NumPy 1.6.2 or greater. Release highlights: - New functions for linear and nonlinear least squares optimization with constraints: `scipy.optimize.lsq_linear` and `scipy.optimize.least_squares` - Support for fitting with bounds in `scipy.optimize.curve_fit`. - Significant improvements to `scipy.stats`, providing many functions with better handing of inputs which have NaNs or are empty, improved documentation, and consistent behavior between `scipy.stats` and `scipy.stats.mstats`. - Significant performance improvements and new functionality in `scipy.spatial.cKDTree`. New features ============ `scipy.cluster` improvements - ---------------------------- A new function `scipy.cluster.hierarchy.cut_tree`, which determines a cut tree from a linkage matrix, was added. `scipy.io` improvements - ----------------------- `scipy.io.mmwrite` gained support for symmetric sparse matrices. `scipy.io.netcdf` gained support for masking and scaling data based on data attributes. `scipy.optimize` improvements - ----------------------------- Linear assignment problem solver ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ `scipy.optimize.linear_sum_assignment` is a new function for solving the linear sum assignment problem. It uses the Hungarian algorithm (Kuhn-Munkres). Least squares optimization ~~~~~~~~~~~~~~~~~~~~~~~~~~ A new function for *nonlinear* least squares optimization with constraints was added: `scipy.optimize.least_squares`. It provides several methods: Levenberg-Marquardt for unconstrained problems, and two trust-region methods for constrained ones. Furthermore it provides different loss functions. New trust-region methods also handle sparse Jacobians. A new function for *linear* least squares optimization with constraints was added: `scipy.optimize.lsq_linear`. It provides a trust-region method as well as an implementation of the Bounded-Variable Least-Squares (BVLS) algorithm. `scipy.optimize.curve_fit` now supports fitting with bounds. `scipy.signal` improvements - -------------------------- A ``mode`` keyword was added to `scipy.signal.spectrogram`, to let it return other spectrograms than power spectral density. 
`scipy.stats` improvements - -------------------------- Many functions in `scipy.stats` have gained a ``nan_policy`` keyword, which allows specifying how to treat input with NaNs in them: propagate the NaNs, raise an error, or omit the NaNs. Many functions in `scipy.stats` have been improved to correctly handle input arrays that are empty or contain infs/nans. A number of functions with the same name in `scipy.stats` and `scipy.stats.mstats` were changed to have matching signature and behavior. See `gh-5474 `__ for details. `scipy.stats.binom_test` and `scipy.stats.mannwhitneyu` gained a keyword ``alternative``, which allows specifying the hypothesis to test for. Eventually all hypothesis testing functions will get this keyword. For methods of many continuous distributions, complex input is now accepted. Matrix normal distribution has been implemented as `scipy.stats.matrix_normal`. `scipy.sparse` improvements - -------------------------- The `axis` keyword was added to sparse norms, `scipy.sparse.linalg.norm`. `scipy.spatial` improvements - ---------------------------- `scipy.spatial.cKDTree` was partly rewritten for improved performance and several new features were added to it: - - the ``query_ball_point`` method became significantly faster - - ``query`` and ``query_ball_point`` gained an ``n_jobs`` keyword for parallel execution - - build and query methods now release the GIL - - full pickling support - - support for periodic spaces - - the ``sparse_distance_matrix`` method can now return and sparse matrix type `scipy.interpolate` improvements - -------------------------------- Out-of-bounds behavior of `scipy.interpolate.interp1d` has been improved. Use a two-element tuple for the ``fill_value`` argument to specify separate fill values for input above and below the interpolation range. Linear and nearest interpolation kinds of `scipy.interpolate.interp1d` support extrapolation via the ``fill_value="extrapolate"`` keyword. `scipy.linalg` improvements - --------------------------- The default algorithm for `scipy.linalg.leastsq` has been changed to use LAPACK's function ``*gelsd``. Users wanting to get the previous behavior can use a new keyword ``lapack_driver="gelss"`` (allowed values are "gelss", "gelsd" and "gelsy"). ``scipy.sparse`` matrices and linear operators now support the matmul (``@``) operator when available (Python 3.5+). See [PEP 465](http://legacy.python.org/dev/peps/pep-0465/) A new function `scipy.linalg.ordqz`, for QZ decomposition with reordering, has been added. Deprecated features =================== ``scipy.stats.histogram`` is deprecated in favor of ``np.histogram``, which is faster and provides the same functionality. ``scipy.stats.threshold`` and ``scipy.mstats.threshold`` are deprecated in favor of ``np.clip``. See issue #617 for details. ``scipy.stats.ss`` is deprecated. This is a support function, not meant to be exposed to the user. Also, the name is unclear. See issue #663 for details. ``scipy.stats.square_of_sums`` is deprecated. This too is a support function not meant to be exposed to the user. See issues #665 and #663 for details. ``scipy.stats.f_value``, ``scipy.stats.f_value_multivariate``, ``scipy.stats.f_value_wilks_lambda``, and ``scipy.mstats.f_value_wilks_lambda`` are deprecated. These are related to ANOVA, for which ``scipy.stats`` provides quite limited functionality and these functions are not very useful standalone. See issues #660 and #650 for details. ``scipy.stats.chisqprob`` is deprecated. This is an alias. 
``stats.chi2.sf`` should be used instead. ``scipy.stats.betai`` is deprecated. This is an alias for ``special.betainc`` which should be used instead. Backwards incompatible changes ============================== The functions ``stats.trim1`` and ``stats.trimboth`` now make sure the elements trimmed are the lowest and/or highest, depending on the case. Slicing without at least partial sorting was previously done, but didn't make sense for unsorted input. When ``variable_names`` is set to an empty list, ``scipy.io.loadmat`` now correctly returns no values instead of all the contents of the MAT file. Element-wise multiplication of sparse matrices now returns a sparse result in all cases. Previously, multiplying a sparse matrix with a dense matrix or array would return a dense matrix. The function ``misc.lena`` has been removed due to license incompatibility. The constructor for ``sparse.coo_matrix`` no longer accepts ``(None, (m,n))`` to construct an all-zero matrix of shape ``(m,n)``. This functionality was deprecated since at least 2007 and was already broken in the previous SciPy release. Use ``coo_matrix((m,n))`` instead. The Cython wrappers in ``linalg.cython_lapack`` for the LAPACK routines ``*gegs``, ``*gegv``, ``*gelsx``, ``*geqpf``, ``*ggsvd``, ``*ggsvp``, ``*lahrd``, ``*latzm``, ``*tzrqf`` have been removed since these routines are not present in the new LAPACK 3.6.0 release. With the exception of the routines ``*ggsvd`` and ``*ggsvp``, these were all deprecated in favor of routines that are currently present in our Cython LAPACK wrappers. Because the LAPACK ``*gegv`` routines were removed in LAPACK 3.6.0. The corresponding Python wrappers in ``scipy.linalg.lapack`` are now deprecated and will be removed in a future release. The source files for these routines have been temporarily included as a part of ``scipy.linalg`` so that SciPy can be built against LAPACK versions that do not provide these deprecated routines. Other changes ============= Html and pdf documentation of development versions of Scipy is now automatically rebuilt after every merged pull request. `scipy.constants` is updated to the CODATA 2014 recommended values. Usage of `scipy.fftpack` functions within Scipy has been changed in such a way that `PyFFTW `__ can easily replace `scipy.fftpack` functions (with improved performance). See `gh-5295 `__ for details. The ``imread`` functions in `scipy.misc` and `scipy.ndimage` were unified, for which a ``mode`` argument was added to `scipy.misc.imread`. Also, bugs for 1-bit and indexed RGB image formats were fixed. ``runtests.py``, the development script to build and test Scipy, now allows building in parallel with ``--parallel``. Authors ======= * @cel4 + * @chemelnucfin + * @endolith * @mamrehn + * @tosh1ki + * Joshua L. Adelman + * Anne Archibald * Herv? Audren + * Vincent Barrielle + * Bruno Beltran + * Sumit Binnani + * Joseph Jon Booker * Olga Botvinnik + * Michael Boyle + * Matthew Brett * Zaz Brown + * Lars Buitinck * Pete Bunch + * Evgeni Burovski * CJ Carey * Ien Cheng + * Cody + * Jaime Fernandez del Rio * Ales Erjavec + * Abraham Escalante * Yves-R?mi Van Eycke + * Yu Feng + * Eric Firing * Francis T. O'Donovan + * Andr? Gaul * Christoph Gohlke * Ralf Gommers * Alex Griffing * Alexander Grigorievskiy * Charles Harris * J?rn Hees + * Ian Henriksen * David Men?ndez Hurtado * Gert-Ludwig Ingold * Aakash Jain + * Rohit Jamuar + * Jan Schl?ter * Johannes Ball? 
* Luke Zoltan Kelley + * Jason King + * Andreas Kopecky + * Eric Larson * Denis Laxalde * Antony Lee * Gregory R. Lee * Josh Levy-Kramer + * Sam Lewis + * Fran?ois Magimel + * Mart?n Gait?n + * Sam Mason + * Andreas Mayer * Nikolay Mayorov * Damon McDougall + * Robert McGibbon * Sturla Molden * Will Monroe + * Eric Moore * Maniteja Nandana * Vikram Natarajan + * Andrew Nelson * Marti Nito + * Behzad Nouri + * Daisuke Oyama + * Giorgio Patrini + * Fabian Paul + * Christoph Paulik + * Mad Physicist + * Irvin Probst * Sebastian Pucilowski + * Ted Pudlik + * Eric Quintero * Yoav Ram + * Joscha Reimer + * Juha Remes * Frederik Rietdijk + * R?my L?one + * Christian Sachs + * Skipper Seabold * Sebastian Skoup? + * Alex Seewald + * Andreas Sorge + * Bernardo Sulzbach + * Julian Taylor * Louis Tiao + * Utkarsh Upadhyay + * Jacob Vanderplas * Gael Varoquaux + * Pauli Virtanen * Fredrik Wallner + * Stefan van der Walt * James Webber + * Warren Weckesser * Raphael Wettinger + * Josh Wilson + * Nat Wilson + * Peter Yin + A total of 101 people contributed to this release. People with a "+" by their names contributed a patch for the first time. This list of names is automatically generated, and may not be fully complete. From sturla.molden at gmail.com Wed Dec 23 17:35:19 2015 From: sturla.molden at gmail.com (Sturla Molden) Date: Wed, 23 Dec 2015 23:35:19 +0100 Subject: [SciPy-Dev] win32 binaries and scipy 0.17.0 release In-Reply-To: References: <27764709.1450802212310.JavaMail.root@mswamui-bichon.atl.sa.earthlink.net> Message-ID: On 22/12/15 18:16, Ralf Gommers wrote: > We tried that already, the final answer seems to be no. IIRC we got permission to distribute SciPy with MKL (not sure about using ifort) several years ago, but we decided to not use it because it would taint the license of the binaries. Sturla From matthew.brett at gmail.com Thu Dec 24 07:14:08 2015 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 24 Dec 2015 12:14:08 +0000 Subject: [SciPy-Dev] ANN: first release candidate for scipy 0.17.0 In-Reply-To: References: Message-ID: Hi, On Wed, Dec 23, 2015 at 12:34 PM, Evgeni Burovski wrote: > Hi, > > I'm pleased to announce the availability of the first release > candidate for Scipy 0.17.0. Please try this rc and report any issues > on Github tracker or scipy-dev mailing list. > Source tarballs and full release notes are available from Github > Releases: https://github.com/scipy/scipy/releases/tag/v0.17.0rc1 > > Please note that this is a source-only release. We do not provide > win32 installers for this release. See the email thread starting from > > for the rationale and discussion. > > The updated release schedule is as follows: > 10 Jan 2016: rc2 (if needed) > 17 Jan 2016: final release. > > Thanks to everyone who contributed to this release! Thanks for this. Wheels for rc1 at wheels.scipy.org, built via https://travis-ci.org/MacPython/scipy-wheels pip install --trusted-host wheels.scipy.org -f http://wheels.scipy.org --pre scipy Cheers, Matthew From tpudlik at gmail.com Thu Dec 24 11:32:44 2015 From: tpudlik at gmail.com (Ted Pudlik) Date: Thu, 24 Dec 2015 16:32:44 +0000 Subject: [SciPy-Dev] Updating special function docs from AMOS source files Message-ID: Hello, I noticed that the SciPy reference documentation for some special functions is minimal (e.g., scipy.special.jv ), although the AMOS source files which these functions wrap contain extensive comments on the algorithms used and their limitations, including references (e.g., zbesj.f ). 
Would it be appropriate to update the reference documentation using the information in the source files? (Are there any copyright issues?) Best, Ted -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Thu Dec 24 11:49:48 2015 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 24 Dec 2015 16:49:48 +0000 Subject: [SciPy-Dev] Updating special function docs from AMOS source files In-Reply-To: References: Message-ID: On Thu, Dec 24, 2015 at 4:32 PM, Ted Pudlik wrote: > > Hello, > > I noticed that the SciPy reference documentation for some special functions is minimal (e.g., scipy.special.jv), although the AMOS source files which these functions wrap contain extensive comments on the algorithms used and their limitations, including references (e.g., zbesj.f). Would it be appropriate to update the reference documentation using the information in the source files? (Are there any copyright issues?) Sounds like a good idea! If the comments are already checked into the repo with the code, then there is no problem copying it over to the Python-level docstrings. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Dec 30 09:00:57 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 30 Dec 2015 15:00:57 +0100 Subject: [SciPy-Dev] dropping numpy 1.6.2 support Message-ID: Hi all, It's been a year since we dropped support for numpy 1.5.1 and decided to keep supporting numpy 1.6.2 for a while longer. There are regularly PRs that need to be adapted because they fail with numpy 1.6.2, latest example is https://github.com/scipy/scipy/pull/5641. So proposal: drop numpy 1.6.2, and make numpy 1.7.2 the lowest supported version. This gives us 4 supported numpy versions, which is enough for users. Other consideration: - Ubuntu 14.04 (latest LTS version) is on 1.8.1 [2] - Debian stable is on 1.8.2 [3] - Centos 7 is on 1.7.1 [4] So Centos is the only problematic one. We can either ignore Centos (they really should be using the latest bugfix version of a minor release) or we can check for >= 1.7.1 in scipy/__init__.py but test against 1.7.2 on TravisCI. For an impression of where we work around 1.6.2 things, try ``$ grin "'1.7.0'"`` Thoughts? Ralf [1] http://article.gmane.org/gmane.comp.python.scientific.devel/19283 [2] http://packages.ubuntu.com/search?keywords=numpy&searchon=names&suite=all§ion=all [3] https://packages.debian.org/search?keywords=numpy&searchon=names&suite=all§ion=all [4] http://mirror.centos.org/centos/7/os/x86_64/Packages/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From davidmenhur at gmail.com Wed Dec 30 09:14:02 2015 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Wed, 30 Dec 2015 15:14:02 +0100 Subject: [SciPy-Dev] dropping numpy 1.6.2 support In-Reply-To: References: Message-ID: On 30 December 2015 at 15:00, Ralf Gommers wrote: > > So Centos is the only problematic one. We can either ignore Centos (they > really should be using the latest bugfix version of a minor release) or we > can check for >= 1.7.1 in scipy/__init__.py but test against 1.7.2 on > TravisCI. In my opinion, this is not a factor. I don't think a hypothetical user capable of build and install Scipy, but that has to rely on whatever Numpy is provided in the repositories, is very likely. /David. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From evgeny.burovskiy at gmail.com Wed Dec 30 16:01:49 2015 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Thu, 31 Dec 2015 00:01:49 +0300 Subject: [SciPy-Dev] dropping numpy 1.6.2 support In-Reply-To: References: Message-ID: On Dec 30, 2015 5:01 PM, "Ralf Gommers" wrote: > > Hi all, > > It's been a year since we dropped support for numpy 1.5.1 and decided to keep supporting numpy 1.6.2 for a while longer. There are regularly PRs that need to be adapted because they fail with numpy 1.6.2, latest example is https://github.com/scipy/scipy/pull/5641. > > So proposal: drop numpy 1.6.2, and make numpy 1.7.2 the lowest supported version. This gives us 4 supported numpy versions, which is enough for users. Other consideration: > - Ubuntu 14.04 (latest LTS version) is on 1.8.1 [2] > - Debian stable is on 1.8.2 [3] > - Centos 7 is on 1.7.1 [4] > So Centos is the only problematic one. We can either ignore Centos (they really should be using the latest bugfix version of a minor release) or we can check for >= 1.7.1 in scipy/__init__.py but test against 1.7.2 on TravisCI. > > For an impression of where we work around 1.6.2 things, try ``$ grin "'1.7.0'"`` > > Thoughts? > > Ralf > > > [1] http://article.gmane.org/gmane.comp.python.scientific.devel/19283 > [2] http://packages.ubuntu.com/search?keywords=numpy&searchon=names&suite=all§ion=all > [3] https://packages.debian.org/search?keywords=numpy&searchon=names&suite=all§ion=all > [4] http://mirror.centos.org/centos/7/os/x86_64/Packages/ > > +1 for bumping it up to a hard lower bound of 1.7.1, tested/supported soft lower bound of 1.7.2. While we're here, I guess it makes sense to think about dropping Python 2.6, 3.2 and maybe even 3.3. Evgeni -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at gmail.com Wed Dec 30 21:33:58 2015 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Wed, 30 Dec 2015 21:33:58 -0500 Subject: [SciPy-Dev] dropping numpy 1.6.2 support In-Reply-To: References: Message-ID: On Wed, Dec 30, 2015 at 9:00 AM, Ralf Gommers wrote: > Hi all, > > It's been a year since we dropped support for numpy 1.5.1 and decided to > keep supporting numpy 1.6.2 for a while longer. There are regularly PRs > that need to be adapted because they fail with numpy 1.6.2, latest example > is https://github.com/scipy/scipy/pull/5641. > > So proposal: drop numpy 1.6.2, and make numpy 1.7.2 the lowest supported > version. > +1 Warren > This gives us 4 supported numpy versions, which is enough for users. Other > consideration: > - Ubuntu 14.04 (latest LTS version) is on 1.8.1 [2] > - Debian stable is on 1.8.2 [3] > - Centos 7 is on 1.7.1 [4] > So Centos is the only problematic one. We can either ignore Centos (they > really should be using the latest bugfix version of a minor release) or we > can check for >= 1.7.1 in scipy/__init__.py but test against 1.7.2 on > TravisCI. > > For an impression of where we work around 1.6.2 things, try ``$ grin > "'1.7.0'"`` > > Thoughts? 
> > Ralf > > > [1] http://article.gmane.org/gmane.comp.python.scientific.devel/19283 > [2] > http://packages.ubuntu.com/search?keywords=numpy&searchon=names&suite=all§ion=all > [3] > https://packages.debian.org/search?keywords=numpy&searchon=names&suite=all§ion=all > [4] http://mirror.centos.org/centos/7/os/x86_64/Packages/ > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sturla.molden at gmail.com Thu Dec 31 01:18:25 2015 From: sturla.molden at gmail.com (Sturla Molden) Date: Thu, 31 Dec 2015 07:18:25 +0100 Subject: [SciPy-Dev] dropping numpy 1.6.2 support In-Reply-To: References: Message-ID: On 30/12/15 22:01, Evgeni Burovski wrote: > While we're here, I guess it makes sense to think about dropping Python > 2.6, 3.2 and maybe even 3.3. I am +1 for this, at least for 2.6 and 3.2. Sturla From sturla.molden at gmail.com Thu Dec 31 01:47:15 2015 From: sturla.molden at gmail.com (Sturla Molden) Date: Thu, 31 Dec 2015 07:47:15 +0100 Subject: [SciPy-Dev] dropping numpy 1.6.2 support In-Reply-To: References: Message-ID: On 30/12/15 15:00, Ralf Gommers wrote: > It's been a year since we dropped support for numpy 1.5.1 and decided to > keep supporting numpy 1.6.2 for a while longer. There are regularly PRs > that need to be adapted because they fail with numpy 1.6.2, latest > example is https://github.com/scipy/scipy/pull/5641. > > So proposal: drop numpy 1.6.2, and make numpy 1.7.2 the lowest supported > version. +1 for this. Ceterum censeo Python 2.6 sustentationem esse delendam. From matthew.brett at gmail.com Thu Dec 31 02:10:10 2015 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 31 Dec 2015 07:10:10 +0000 Subject: [SciPy-Dev] dropping numpy 1.6.2 support In-Reply-To: References: Message-ID: Hi, On Thu, Dec 31, 2015 at 6:47 AM, Sturla Molden wrote: > On 30/12/15 15:00, Ralf Gommers wrote: > >> It's been a year since we dropped support for numpy 1.5.1 and decided to >> keep supporting numpy 1.6.2 for a while longer. There are regularly PRs >> that need to be adapted because they fail with numpy 1.6.2, latest >> example is https://github.com/scipy/scipy/pull/5641. >> >> So proposal: drop numpy 1.6.2, and make numpy 1.7.2 the lowest supported >> version. > > > +1 for this. I seem to remember that 1.7.1 is a fairly common minimum around the place. Is there something about 1.7.2 specifically that makes it easier to support than 1.7.1? Cheers, Matthew From ralf.gommers at gmail.com Thu Dec 31 02:17:55 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 31 Dec 2015 08:17:55 +0100 Subject: [SciPy-Dev] dropping Python 2.6, 3.2 and 3.3 support (was: dropping numpy 1.6.2 support) Message-ID: On Thu, Dec 31, 2015 at 7:18 AM, Sturla Molden wrote: > On 30/12/15 22:01, Evgeni Burovski wrote: > > While we're here, I guess it makes sense to think about dropping Python >> 2.6, 3.2 and maybe even 3.3. >> > > I am +1 for this, at least for 2.6 and 3.2. +1 for this, those versions (including 3.3) are pretty much just a waste of developer effort and CI resources by now. Changing the subject line - this is a separate decision I think with possibly other interested users than the numpy 1.6.2 decision. Ralf P.S. 
here's the thread in which numpy decided to drop support after numpy 1.11: http://article.gmane.org/gmane.comp.python.numeric.general/61987 -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Thu Dec 31 02:24:52 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 31 Dec 2015 08:24:52 +0100 Subject: [SciPy-Dev] dropping numpy 1.6.2 support In-Reply-To: References: Message-ID: On Thu, Dec 31, 2015 at 8:10 AM, Matthew Brett wrote: > Hi, > > On Thu, Dec 31, 2015 at 6:47 AM, Sturla Molden > wrote: > > On 30/12/15 15:00, Ralf Gommers wrote: > > > >> It's been a year since we dropped support for numpy 1.5.1 and decided to > >> keep supporting numpy 1.6.2 for a while longer. There are regularly PRs > >> that need to be adapted because they fail with numpy 1.6.2, latest > >> example is https://github.com/scipy/scipy/pull/5641. > >> > >> So proposal: drop numpy 1.6.2, and make numpy 1.7.2 the lowest supported > >> version. > > > > > > +1 for this. > > I seem to remember that 1.7.1 is a fairly common minimum around the > place. Is there something about 1.7.2 specifically that makes it > easier to support than 1.7.1? > There are quite a few relevant bug fixes in 1.7.2: https://github.com/numpy/numpy/blob/master/doc/release/1.7.2-notes.rst Also, we already have 1.7.2 as the lowest supported version for Python 3.x, lower versions simply didn't work (IIRC due to f2py): https://github.com/scipy/scipy/blob/master/pavement.py#L559 Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From tpudlik at gmail.com Thu Dec 31 13:19:47 2015 From: tpudlik at gmail.com (Ted Pudlik) Date: Thu, 31 Dec 2015 18:19:47 +0000 Subject: [SciPy-Dev] PR adding vectorized spherical Bessel functions Message-ID: Hello, I started a pull request providing vectorized implementations of four spherical Bessel functions (jn, yn, in, kn) and their derivatives. This is my first large pull request, so I would be grateful for any comments! Highlights: 1. Speedup of about an order of magnitude relative to `np.vectorize` (generally more, except for jn for real argument). 2. Recursion largely replaced with the relationship to cylindrical Bessel functions, which resolves accuracy issues for large argument (see gh-1641, gh-2165 for jn at large real argument, but similar problems existed for yn and kn at large complex argument). 3. More careful treatment of 0, infinity, and overflow. (E.g., sph_kn(125, 0.1) = inf, not 0; yn(5, 0) = -inf, not -10**300.) Best wishes, Ted -------------- next part -------------- An HTML attachment was scrubbed... URL:
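For readers unfamiliar with the relationship mentioned in point 2 above: the spherical Bessel function of the first kind satisfies j_n(x) = sqrt(pi / (2*x)) * J_{n+1/2}(x), and the other kinds obey analogous half-integer-order identities. A rough sketch of that idea using only functions already present in `scipy.special` (the helper name here is illustrative, not the interface proposed in the PR, and it ignores the x = 0 and overflow corner cases that the PR handles):

    import numpy as np
    from scipy import special

    def sph_jn_via_jv(n, x):
        # Spherical j_n(x) from the half-integer-order cylindrical J_{n+1/2}(x).
        # Illustration only: no special handling of x == 0 or overflow.
        x = np.asarray(x, dtype=float)
        return np.sqrt(np.pi / (2.0 * x)) * special.jv(n + 0.5, x)

    print(sph_jn_via_jv(5, 10.0))   # compare with the recursion-based spherical Bessel routines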