From ralf.gommers at gmail.com Wed Aug 1 01:53:28 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 31 Jul 2018 22:53:28 -0700 Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 177, Issue 21 In-Reply-To: References: Message-ID: On Thu, Jul 26, 2018 at 7:47 AM, Matt Newville wrote: > Hi Andrew, Dickson, All, > > Sorry for barging in rather late to this discussion, and for responding to > a digest, and if this appears to be too negative. > No worries Matt, and thanks for the insightful comments. > >> Date: Mon, 23 Jul 2018 23:36:37 -0400 >> From: dickson41152 >> To: Andrew Nelson >> Cc: scipy-dev >> Subject: Re: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex >> Algorithm >> Message-ID: >> <5xgSJzGhTqz3FgLpZlSQCqNBm-v5aytNC9kOBDasi_ >> Mus8NNwoCKjPVEmrVqOpIn2B22oT1r0jOc8cNWJaBxRicbjjkAClPDRUwCRdPgAXs=@ >> protonmail.com> >> >> Content-Type: text/plain; charset="utf-8" >> >> Thank you for your interest Andrew. >> >> > variable transformations to form unconstrained problems from bound >> constrained problems >> >> I am considering writing a new transformation function to calculate the >> equations on that web page that you linked to discussing variable >> transformations. If it is as straight-forward as I think it will be to >> write, then I may also do some comparisons between the "clipping" method >> and the sin() / arcsin() / sqrt() method you highlighted. >> >> As an aside, I was told informally about the existence of that method you >> referenced during a conversation I had last week, but I have been finding >> it very difficult to find any written discussion (i.e. actual analytical >> expressions) about that method via web searches before I made the >> proposals. So thank you very much for linking that web page - it is very >> useful information for my own personal work as well as the proposed Scipy >> modifications! 
>> >> > user-specified objective function values for points in the >> user-specified initial simplex >> >> I have indeed used a keyword in the manner you describe (and therefore >> similar in style to the initial_simplex keyword) - it seemed to me to be a >> "clean" way of implementing this modification and very straightforward to >> implement since it is so similar to the initial_simplex handling. >> >> > evaluation budget count check >> >> I am thinking of writing checks which will effectively skip the execution >> of the remainder of the solver algorithm if the budget has been met, before >> finally sorting the current simplex of evaluations and returning the best. >> My inclination is to avoid using StopIteration since it is an exception >> i.e. it suggests some kind of problem has occurred, whereas my proposed >> idea is to handle evaluations up to the exact budget as a feature. >> >> Anyway I haven't yet made an attempt on this modification, so my thoughts >> are tentative and may change - especially if it turns out that I am trying >> to write spaghetti. >> >> dickson > > > I think that adding bounds to Nelder-Mead Simplex is an interesting idea. > Using the Minuit-style bounds - using smooth functions to transform > infinite "internal" values to finite "external" values - would definitely > be an easy and stable way to do that. > > But, with all due respect, I also think it is the wrong approach ;). The > main point of the lmfit package that Andrew linked to is that the fitting > *algorithm* should not support bounds on variable values. I do understand > that other scipy.optimize algorithms support bounds, and yes, I do think > that is an unfortunate choice ;). Instead, the *parameters* should have > bounds (and perhaps other properties), and the algorithm should work on > these parameter objects. In this way one does not add bounds to > Nelder-Mead, and then add bounds to Powell's method, and then add bounds to > Levenberg-Marquardt, etc. 
Rather, one adds bounds to parameters, and has > the methods use these parameters instead of scalar values, and thus one adds > bounds to *all* methods in a consistent and transferable way. > > A few months ago Andrew suggested a class-based approach to implementing > fitting algorithms. I think this idea has a lot of merit to it. I also > believe that lmfit successfully demonstrates that having parameter objects > as described briefly above does better encapsulate the desired properties > (bounds, etc) *and* makes the code implementing the algorithms clearer and > simpler. > > In fact, I understand the original goals here to be: > 1. Allowing bound constraints > > 2. Allowing the user to specify initial objective function values for any > or all points in a user's initial simplex. > > 3. Checking the objective function evaluation count against the > user-specified evaluation budget at each function evaluation. > > I would point out that lmfit has a Nelder-Mead solver that already does #1 > and #3 and that (as with bounds), #3 may be best done not within the > solver, but in outer code that calls the solver, so that this may be > available in a consistent manner to multiple methods. > If #2 means to memoize each step in the process to avoid repeated > calculation, this does not currently exist within lmfit. > Such a goal might really be best done in the solver method itself. > It may also be doable using a per-iteration callback (which lmfit does have), > though I don't know that anyone has ever tried this. > We've never really figured out the scope of scipy.optimize very well. Clearly adding bounds to solvers is an often requested feature. However we don't want to reinvent the wheel completely. It may be useful to take this discussion as an opportunity to put something on the roadmap about what to add and where to let other packages like lmfit take over. 
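For concreteness, the Minuit-style bound transformation under discussion maps a finite interval onto the whole real line with sin/arcsin. A minimal sketch (the exact formulas here follow the commonly cited Minuit manual mapping and are an assumption, not taken from this thread):

```python
import math

def to_internal(x_ext, lo, hi):
    """Map a bounded 'external' value in [lo, hi] to an unbounded 'internal' one."""
    return math.asin(2.0 * (x_ext - lo) / (hi - lo) - 1.0)

def to_external(x_int, lo, hi):
    """Map an unbounded 'internal' value back into [lo, hi]."""
    return lo + (hi - lo) / 2.0 * (math.sin(x_int) + 1.0)

# Round trip recovers the value, and any internal value lands inside the bounds.
x = 3.7
assert abs(to_external(to_internal(x, 0.0, 10.0), 0.0, 10.0) - x) < 1e-12
assert 0.0 <= to_external(123.456, 0.0, 10.0) <= 10.0
```

A solver such as Nelder-Mead can then step freely over the internal values, while the objective function only ever sees in-bounds external values.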
Cheers, Ralf > I apologize if this appears to be an outsider yelling "you're doing it all > wrong". I rely heavily on scipy.optimize for my daily work, and would like > to be able to make these tools better but I am over-committed and cannot > contribute significantly to the scipy ecosystem myself other than try to > support lmfit as best I can. > > Cheers, > > --Matt > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From phillip.m.feldman at gmail.com Wed Aug 1 13:59:05 2018 From: phillip.m.feldman at gmail.com (Phillip Feldman) Date: Wed, 1 Aug 2018 10:59:05 -0700 Subject: [SciPy-Dev] [GSOC 2018 Project Thread]: Rotation formalism in 3 dimensions In-Reply-To: References: <163213d9b99.e31ddfbf140246.3867213885941953730@zoho.com> Message-ID: Has this project been completed? If so, is there a timeline for incorporating the changes into the main branch? On Mon, Jun 4, 2018 at 1:50 PM, Aditya Bharti wrote: > We do plan to create a tutorial with examples highlighting the major > functionality of the module and some common use cases, but its exact nature > and extent will depend upon how much time we have left after completing the > implementation (including docstrings and tests). > > Best, > Aditya > > On Mon, 4 Jun 2018 at 09:45, Phillip Feldman > wrote: > >> "Refer to docs for more details." Will there be user documentation >> beyond the doc strings in the various methods? 
>> >> On Tue, May 29, 2018 at 2:21 PM, Aditya Bharti >> wrote: >> >>> Hi all, >>> Continuing the work so far, the following have been implemented this >>> week: >>> >>> - `from_rotvec`, `as_rotvec`: Representing Euler angles as rotation >>> vectors, with appropriate Taylor series expansions for small angles >>> - `from_euler`: Initialization from Euler angles, along with a >>> string based axis sequence specification. Refer to docs for more details. >>> >>> As always, the project lives here >>> , >>> and my personal experiences can be found on the blog >>> . >>> >>> Thanks, >>> Aditya >>> >>> On Sun, 20 May 2018 at 13:28, Phillip Feldman < >>> phillip.m.feldman at gmail.com> wrote: >>> >>>> When you say "discrete cosine matrix", I think that you mean "direction >>>> cosine matrix" (see https://en.wikiversity.org/ >>>> wiki/PlanetPhysics/Direction_Cosine_Matrix). >>>> >>>> Phillip >>>> >>>> On Sat, May 19, 2018 at 12:23 PM, Aditya Bharti >>>> wrote: >>>> >>>>> Hi all, >>>>> So this concludes week 1 of GSoC 2018. I must say it was a great >>>>> learning experience and I invite you all to check out my account of this >>>>> week on the blog . This >>>>> email is more of a technical update. >>>>> >>>>> - So, the main `Rotation` class will live under a new sub module >>>>> `scipy.spatial.transform`. >>>>> - Conversion between quaternions and discrete cosine matrices was >>>>> implemented. >>>>> - The rotation class now supports `from_quaternion`, `from_dcm`, >>>>> `as_quaternion` and `as_dcm`, with support for multiple rotations in one >>>>> call. >>>>> >>>>> The project currently lives in my own fork of scipy here >>>>> . >>>>> Stay tuned for more updates! >>>>> >>>>> Best, >>>>> Aditya >>>>> >>>>> On Wed, 2 May 2018 at 21:03, Aditya Bharti >>>>> wrote: >>>>> >>>>>> Hi Nikolay, >>>>>> >>>>>> I've used Wordpress only once before, so I don't know much about it. >>>>>> From my limited experience, it is extremely customizable. 
You can customize >>>>>> everything from the look and feel to SEO characteristics. There are >>>>>> apparently a lot of WordPress plugins for these kinds of tasks. For this >>>>>> particular blog, PSF had already set up an account for me with a site on it. >>>>>> All I had to do was click on the 'New' button and open up the new post >>>>>> page. There's a field for a header and body text, with options for adding >>>>>> audio, video and hyperlinks. >>>>>> >>>>>> As regards the post itself, sure I'll expand it with a brief >>>>>> overview, motivation and an example. Note that the example will only show >>>>>> sample usage, not any internals. I plan to borrow heavily from my proposal >>>>>> for this purpose, I hope that's ok. >>>>>> >>>>>> Regards, >>>>>> Aditya >>>>>> >>>>>> On 2 May 2018 at 19:54, Nikolay Mayorov >>>>>> wrote: >>>>>> >>>>>>> Hi, Aditya! >>>>>>> >>>>>>> Glad that you set up the blog and good job on setting up the >>>>>>> documentation build as well. >>>>>>> >>>>>>> Curious, what is this blogging platform like? How do you create >>>>>>> posts in it? >>>>>>> >>>>>>> As for your first post: while not strictly necessary I think it >>>>>>> would be nice to see a more thorough introductory post with a brief >>>>>>> overview, motivation and/or an example. Do you want to work on it? 
>>>>>>> >>>>>>> Best, >>>>>>> Nikolay >>>>>>> >>>>>>> >>>>>>> _______________________________________________ >>>>>>> SciPy-Dev mailing list >>>>>>> SciPy-Dev at python.org >>>>>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>>>>> >>>>>>> >>>>>> >>>>> _______________________________________________ >>>>> SciPy-Dev mailing list >>>>> SciPy-Dev at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>>> >>>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >>> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From adibhar97 at gmail.com Thu Aug 2 01:41:01 2018 From: adibhar97 at gmail.com (Aditya Bharti) Date: Thu, 2 Aug 2018 11:11:01 +0530 Subject: [SciPy-Dev] [GSOC 2018 Project Thread]: Rotation formalism in 3 dimensions In-Reply-To: References: <163213d9b99.e31ddfbf140246.3867213885941953730@zoho.com> Message-ID: This project is very close to completion: 2 more PRs need to be finalised during the coming week. A major chunk of the module has already been merged into the master branch and the new `scipy.spatial.transform` submodule currently lives here . The main Rotation class is almost complete and production ready. Some changes to the spherical linear interpolation (Slerp) class are still pending, and the two outstanding pull requests implement Wahba's estimation and quaternion spline interpolation. 
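The `from_quaternion`/`as_dcm` conversion mentioned in these updates boils down to the standard quaternion-to-direction-cosine-matrix formula. A minimal pure-Python sketch of that formula (the helper name `quat_to_dcm` is hypothetical, not the scipy.spatial.transform API):

```python
import math

def quat_to_dcm(q):
    """Convert a unit quaternion (x, y, z, w), scalar last, to a 3x3 rotation matrix."""
    x, y, z, w = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ]

# A 90-degree rotation about the z-axis: q = (0, 0, sin(45 deg), cos(45 deg)).
s = math.sin(math.pi / 4)
dcm = quat_to_dcm((0.0, 0.0, s, s))
# Applying it to the x unit vector should give (approximately) the y unit vector.
v = [sum(dcm[i][j] * [1.0, 0.0, 0.0][j] for j in range(3)) for i in range(3)]
assert all(abs(a - b) < 1e-12 for a, b in zip(v, [0.0, 1.0, 0.0]))
```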
Best, Aditya On Wed, 1 Aug 2018 at 23:29, Phillip Feldman wrote: > Has this project been completed? If so, is there a timeline for > incorporating the changes into the main branch? > > On Mon, Jun 4, 2018 at 1:50 PM, Aditya Bharti wrote: > >> We do plan to create a tutorial with examples highlighting the major >> functionality of the module and some common use cases, but its exact nature >> and extent will depend upon how much time we have left after completing the >> implementation (including docstrings and tests). >> >> Best, >> Aditya >> >> On Mon, 4 Jun 2018 at 09:45, Phillip Feldman >> wrote: >> >>> "Refer to docs for more details." Will there be user documentation >>> beyond the doc strings in the various methods? >>> >>> On Tue, May 29, 2018 at 2:21 PM, Aditya Bharti >>> wrote: >>> >>>> Hi all, >>>> Continuing the work so far, the following have been implemented this >>>> week: >>>> >>>> - `from_rotvec`, `as_rotvec`: Representing Euler angles as rotation >>>> vectors, with appropriate Taylor series expansions for small angles >>>> - `from_euler`: Initialization from Euler angles, along with a >>>> string based axis sequence specification. Refer to docs for more details. >>>> >>>> As always, the project lives here >>>> , >>>> and my personal experiences can be found on the blog >>>> . >>>> >>>> Thanks, >>>> Aditya >>>> >>>> On Sun, 20 May 2018 at 13:28, Phillip Feldman < >>>> phillip.m.feldman at gmail.com> wrote: >>>> >>>>> When you say "discrete cosine matrix", I think that you mean >>>>> "direction cosine matrix" (see >>>>> https://en.wikiversity.org/wiki/PlanetPhysics/Direction_Cosine_Matrix >>>>> ). >>>>> >>>>> Phillip >>>>> >>>>> On Sat, May 19, 2018 at 12:23 PM, Aditya Bharti >>>>> wrote: >>>>> >>>>>> Hi all, >>>>>> So this concludes week 1 of GSoC 2018. I must say it was a great >>>>>> learning experience and I invite you all to check out my account of this >>>>>> week on the blog . >>>>>> This email is more of a technical update. 
>>>>>> >>>>>> - So, the main `Rotation` class will live under a new sub module >>>>>> `scipy.spatial.transform`. >>>>>> - Conversion between quaternions and discrete cosine matrices was >>>>>> implemented. >>>>>> - The rotation class now supports `from_quaternion`, `from_dcm`, >>>>>> `as_quaternion` and `as_dcm`, with support for multiple rotations in one >>>>>> call. >>>>>> >>>>>> The project currently lives in my own fork of scipy here >>>>>> . >>>>>> Stay tuned for more updates! >>>>>> >>>>>> Best, >>>>>> Aditya >>>>>> >>>>>> On Wed, 2 May 2018 at 21:03, Aditya Bharti >>>>>> wrote: >>>>>> >>>>>>> Hi Nikolay, >>>>>>> >>>>>>> I've used Wordpress only once before, so I don't know much about it. >>>>>>> From my limited experience, it is extremely customizable. You can customize >>>>>>> every thing from the look and feel to SEO characteristics. There are >>>>>>> apparently a lot of wordpress plugins for these kind of tasks. For this >>>>>>> particular blog, PSF had already setup an account for me with a site on it. >>>>>>> All I had to do was click on the 'New' button and open up the new post >>>>>>> page. There's a field for a header and body text, with options for adding >>>>>>> audio, video and hyperlinks. >>>>>>> >>>>>>> As regards to the post itself, sure I'll expand it with a brief >>>>>>> overview, motivation and an example. Note that the example will only show >>>>>>> sample usage, not any internals. I plan to borrow heavily from my proposal >>>>>>> for this purpose, I hope that's ok. >>>>>>> >>>>>>> Regards, >>>>>>> Aditya >>>>>>> >>>>>>> On 2 May 2018 at 19:54, Nikolay Mayorov >>>>>>> wrote: >>>>>>> >>>>>>>> Hi, Aditya! >>>>>>>> >>>>>>>> Glad that you set up the blog and good job on setting up the >>>>>>>> documentation build as well. >>>>>>>> >>>>>>>> Curious, what is this blogging platform like? How do you create >>>>>>>> posts in it? 
>>>>>>>> >>>>>>>> As for your first post: while not strictly necessary I think it >>>>>>>> would be nice to see a more thorough introductory post with a brief >>>>>>>> overview, motivation and/or an example. Do you want to work on it? >>>>>>>> >>>>>>>> Best, >>>>>>>> Nikolay >>>>>>>> >>>>>>>> >>>>>>>> _______________________________________________ >>>>>>>> SciPy-Dev mailing list >>>>>>>> SciPy-Dev at python.org >>>>>>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>>>>>> >>>>>>>> >>>>>>> >>>>>> _______________________________________________ >>>>>> SciPy-Dev mailing list >>>>>> SciPy-Dev at python.org >>>>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>>>> >>>>>> >>>>> _______________________________________________ >>>>> SciPy-Dev mailing list >>>>> SciPy-Dev at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>> >>>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ilhanpolat at gmail.com Thu Aug 2 14:37:07 2018 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Thu, 2 Aug 2018 20:37:07 +0200 Subject: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines Message-ID: Due to their historical evolution, there are certain LAPACK wrappers that are not standardized. 
Some work with minimum lwork variables instead of their optimal values. Also these routines often return quite big arrays during the lwork queries; to demonstrate: import numpy as np import scipy.linalg as la la.lapack.dgeqrf(a=np.random.rand(400, 400), lwork=-1) is a workspace size query (via lwork=-1). The current default size is "3*a.shape[0] + 1", hence 1201. However the optimal workspace size is 12800 on my machine. Therefore the mismatch is sometimes quite dramatic, especially in some other routines. Notice also that to obtain this number the routine actually returns a 400-long array tau and requires the input matrix to be transferred back and forth. Moreover, they can't be handled via the scipy.linalg.lapack._compute_lwork function. There are a few routines like this and I feel like they should be fixed and I'm willing to. However this means that their output signature is going to change, which implies backwards compatibility breaks. I tried to see whether we could deprecate them with new wrappers with modified names, but to be honest, that would create too many duplicates. On the other hand I don't have a feeling of how much breakage this would mean out there in the wild. Is this break an acceptable one or not? (well, none is acceptable preferably, but in despair...) Any other alternatives, thoughts are most welcome. best, ilhan -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Thu Aug 2 17:16:06 2018 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 2 Aug 2018 17:16:06 -0400 Subject: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines In-Reply-To: References: Message-ID: On Thu, Aug 2, 2018 at 2:37 PM, Ilhan Polat wrote: > Due to their historical evolution, there are certain LAPACK wrappers that are > not standardized. Some work with minimum lwork variables instead of their > optimal values. 
Also these routines often return quite big arrays during the > lwork queries, to demonstrate : > > import scipy.linalg as la > la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) > > is a workspace size query (via lwork=-1). The current default size is > "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is 12800 > on my machine. Therefore the mismatch is sometimes quite dramatic especially > in some other routines. Notice also that to obtain this number the routine > actually returns a 400-long array tau and requires the input matrix to be > transferred back and forth. Moreover, they can't be handled via > scipy.linalg.lapack._compute_lwork function. > > There are a few routines like this and I feel like they should be fixed and > I'm willing to. However this means that their output signature is going to > change which imply backwards compatibility breaks. I tried to see whether we > could deprecate them with new wrappers with modified names, but to be > honest, that would create too many duplicates. On the other hand I don't > have a feeling of how much break this would mean out there in the wild. > > Is this break an acceptable one or not? (well, none is acceptable > preferably, but in despair...) > > Any other alternatives, thoughts are most welcome. Is this only for the python functions or also for the cython wrappers to LAPACK? Binary incompatibilities are pretty painful. We just got rid of the compatibility with scipy's old cython wrapper code in statsmodels. Both distributing binaries and conditional compilation depending on the installed scipy versions is fragile and there doesn't seem to be good packaging support for it. 
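For readers unfamiliar with the convention under discussion: passing lwork=-1 asks a LAPACK routine to skip the computation and instead report its optimal workspace size. A toy pure-Python imitation of that protocol (no real LAPACK involved; the 32*n "optimum" is a made-up stand-in that happens to reproduce the 12800 figure from this thread):

```python
def toy_geqrf(n, lwork):
    """Imitate the LAPACK lwork convention: lwork=-1 is a workspace-size query."""
    optimal = 32 * n                     # stand-in for the blocked-algorithm optimum
    if lwork == -1:
        return None, optimal, 0          # (result, work[0], info): query only
    if lwork < max(3 * n + 1, 1):
        return None, optimal, -7         # info < 0 flags an illegal argument (toy convention)
    return "factorized", optimal, 0

# Two-step pattern: query first, then call again with the reported size.
_, opt, info = toy_geqrf(400, lwork=-1)
assert info == 0 and opt == 12800
result, _, info = toy_geqrf(400, lwork=opt)
assert info == 0 and result == "factorized"
```

The scipy.linalg.lapack._compute_lwork helper mentioned above wraps exactly this query-then-call dance for the wrappers that support it.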
Josef > > best, > ilhan > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > From ilhanpolat at gmail.com Thu Aug 2 18:04:55 2018 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Fri, 3 Aug 2018 00:04:55 +0200 Subject: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines In-Reply-To: References: Message-ID: Cython wrappers won't be affected. This would only affect the people accessing LAPACK routines manually via scipy.linalg.lapack. API directly. On Thu, Aug 2, 2018, 23:17 wrote: > On Thu, Aug 2, 2018 at 2:37 PM, Ilhan Polat wrote: > > Due their historical evolution, there are certain LAPACK wrappers that > are > > not standardized. Some work with minimum lwork variables instead of their > > optimal values. Also these routines often return quite big arrays during > the > > lwork queries, to demonstrate : > > > > import scipy.linalg as la > > la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) > > > > is a workspace size query (via lwork=-1). The current default size is > > "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is > 12800 > > on my machine. Therefore the mismatch is sometimes quite dramatic > especially > > in some other routines. Notice also that to obtain this number the > routine > > actually returns a 400-long array tau and requires the input matrix to be > > transferred back and forth. Moreover, they can't be handled via > > scipy.linalg.lapack._compute_lwork function. > > > > There are a few routines like this and I feel like they should be fixed > and > > I'm willing to. However this means that their output signature is going > to > > change which imply backwards compatibility breaks. I tried to see > whether we > > could deprecate them with new wrappers with modified names, but to be > > honest, that would create too many duplicates. 
On the other hand I don't > > have a feeling of how much break this would mean out there in the wild. > > > > Is this break an acceptable one or not? (well, none is acceptable > > preferably, but in despair...) > > > > Any other alternatives, thoughts are most welcome. > > Is this only for the python functions or also for the cython wrappers to > LAPACK? > > Binary incompatibilities are pretty painful. We just got rid of the > compatibility with scipy's old cython wrapper code in statsmodels. > Both distributing binaries and conditional compilation depending on > the installed scipy versions is fragile and there doesn't seem to be > good packaging support for it. > > Josef > > > > > > > best, > > ilhan > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at python.org > > https://mail.python.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ilhanpolat at gmail.com Thu Aug 2 18:08:47 2018 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Fri, 3 Aug 2018 00:08:47 +0200 Subject: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines In-Reply-To: References: Message-ID: Oh also the ones who use get_lapack_funcs machinery On Fri, Aug 3, 2018, 00:04 Ilhan Polat wrote: > Cython wrappers won't be affected. This would only affect the people > accessing LAPACK routines manually via scipy.linalg.lapack. API > directly. > > On Thu, Aug 2, 2018, 23:17 wrote: > >> On Thu, Aug 2, 2018 at 2:37 PM, Ilhan Polat wrote: >> > Due their historical evolution, there are certain LAPACK wrappers that >> are >> > not standardized. Some work with minimum lwork variables instead of >> their >> > optimal values. 
Also these routines often return quite big arrays >> during the >> > lwork queries, to demonstrate : >> > >> > import scipy.linalg as la >> > la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) >> > >> > is a workspace size query (via lwork=-1). The current default size is >> > "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is >> 12800 >> > on my machine. Therefore the mismatch is sometimes quite dramatic >> especially >> > in some other routines. Notice also that to obtain this number the >> routine >> > actually returns a 400-long array tau and requires the input matrix to >> be >> > transferred back and forth. Moreover, they can't be handled via >> > scipy.linalg.lapack._compute_lwork function. >> > >> > There are a few routines like this and I feel like they should be fixed >> and >> > I'm willing to. However this means that their output signature is going >> to >> > change which imply backwards compatibility breaks. I tried to see >> whether we >> > could deprecate them with new wrappers with modified names, but to be >> > honest, that would create too many duplicates. On the other hand I don't >> > have a feeling of how much break this would mean out there in the wild. >> > >> > Is this break an acceptable one or not? (well, none is acceptable >> > preferably, but in despair...) >> > >> > Any other alternatives, thoughts are most welcome. >> >> Is this only for the python functions or also for the cython wrappers to >> LAPACK? >> >> Binary incompatibilities are pretty painful. We just got rid of the >> compatibility with scipy's old cython wrapper code in statsmodels. >> Both distributing binaries and conditional compilation depending on >> the installed scipy versions is fragile and there doesn't seem to be >> good packaging support for it. 
>> >> Josef >> >> >> >> > >> > best, >> > ilhan >> > >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at python.org >> > https://mail.python.org/mailman/listinfo/scipy-dev >> > >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From christoph.baumgarten at gmail.com Fri Aug 3 02:43:02 2018 From: christoph.baumgarten at gmail.com (Christoph Baumgarten) Date: Fri, 3 Aug 2018 08:43:02 +0200 Subject: [SciPy-Dev] Review of PR 8293 - sampling of random variates Message-ID: PR 8293 introduces a method to sample random variates for more complex distributions to scipy.stats (such as hyperbolic distributions and the generalized inverse Gaussian (see PR 8681)). I think the PR is in good shape; it would be great if it could be merged soon, since I could then continue to work on adding new distributions. The PR has been open for 7 months now. If someone could continue the review, that would be great. Thanks! -------------- next part -------------- An HTML attachment was scrubbed... URL: From phillip.m.feldman at gmail.com Fri Aug 3 17:37:26 2018 From: phillip.m.feldman at gmail.com (Phillip Feldman) Date: Fri, 3 Aug 2018 14:37:26 -0700 Subject: [SciPy-Dev] Review of PR 8293 - sampling of random variates In-Reply-To: References: Message-ID: The ability to sample random variates from a distribution where only the density is available seems quite useful. Is there a reference that describes the method? Phillip On Thu, Aug 2, 2018 at 11:43 PM, Christoph Baumgarten < christoph.baumgarten at gmail.com> wrote: > PR 8293 introduces a method to sample random variates for more complex > distributions to scipy.stats (such as hyperbolic distributions and the > generalized inverse Gaussian (see PR 8681)). 
I think the PR is in good > shape, it would be great if it could be merged soon since i could then > continue to work on adding new distributions. The PR has been open for 7 > months now. If someone could continue the review, that would bei great. > Thanks! > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Fri Aug 3 21:25:50 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 3 Aug 2018 18:25:50 -0700 Subject: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines In-Reply-To: References: Message-ID: On Thu, Aug 2, 2018 at 11:37 AM, Ilhan Polat wrote: > Due their historical evolution, there are certain LAPACK wrappers that are > not standardized. Some work with minimum lwork variables instead of their > optimal values. Also these routines often return quite big arrays during > the lwork queries, to demonstrate : > > import scipy.linalg as la > la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) > > is a workspace size query (via lwork=-1). The current default size is > "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is 12800 > on my machine. Therefore the mismatch is sometimes quite dramatic > especially in some other routines. Notice also that to obtain this number > the routine actually returns a 400-long array tau and requires the input > matrix to be transferred back and forth. Moreover, they can't be handled > via scipy.linalg.lapack._compute_lwork function. > > There are a few routines like this and I feel like they should be fixed > and I'm willing to. However this means that their output signature is going > to change which imply backwards compatibility breaks. > What would the output change to? 
Currently it returns:

qr : rank-2 array('d') with bounds (m,n) and a storage
tau : rank-1 array('d') with bounds (MIN(m,n))
work : rank-1 array('d') with bounds (MAX(lwork,1))
info : int

Ralf > I tried to see whether we could deprecate them with new wrappers with > modified names, but to be honest, that would create too many duplicates. On > the other hand I don't have a feeling of how much break this would mean out > there in the wild. > > Is this break an acceptable one or not? (well, none is acceptable > preferably, but in despair...) > > Any other alternatives, thoughts are most welcome. > > best, > ilhan > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ilhanpolat at gmail.com Sat Aug 4 03:49:53 2018 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Sat, 4 Aug 2018 09:49:53 +0200 Subject: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines In-Reply-To: References: Message-ID: It will gain a dgeqrf_lwork function, as usual, to return the necessary workspace size as lwork, info = dgeqrf_lwork(m, n). Then the "work" variable will be removed from the dgeqrf signature and will be made hidden. In the example I gave before, the optimal size is 12800, and the work array returned is a 12800-long array for a 400x400 computation. On Sat, Aug 4, 2018 at 3:25 AM, Ralf Gommers wrote: > > > On Thu, Aug 2, 2018 at 11:37 AM, Ilhan Polat wrote: > >> Due to their historical evolution, there are certain LAPACK wrappers that >> are not standardized. Some work with minimum lwork variables instead of >> their optimal values. Also these routines often return quite big arrays >> during the lwork queries, to demonstrate : >> >> import scipy.linalg as la >> la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) >> >> is a workspace size query (via lwork=-1). 
The current default size is >> "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is 12800 >> on my machine. Therefore the mismatch is sometimes quite dramatic >> especially in some other routines. Notice also that to obtain this number >> the routine actually returns a 400-long array tau and requires the input >> matrix to be transferred back and forth. Moreover, they can't be handled >> via scipy.linalg.lapack._compute_lwork function. >> >> There are a few routines like this and I feel like they should be fixed >> and I'm willing to. However this means that their output signature is going >> to change which imply backwards compatibility breaks. >> > > What would the output change to? Currently it returns: > > qr : rank-2 array('d') with bounds (m,n) and a storage > tau : rank-1 array('d') with bounds (MIN(m,n)) > work : rank-1 array('d') with bounds (MAX(lwork,1)) > info : int > > Ralf > > >> I tried to see whether we could deprecate them with new wrappers with >> modified names, but to be honest, that would create too many duplicates. On >> the other hand I don't have a feeling of how much break this would mean out >> there in the wild. >> >> Is this break an acceptable one or not? (well, none is acceptable >> preferably, but in despair...) >> >> Any other alternatives, thoughts are most welcome. >> >> best, >> ilhan >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
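[Editorial note: for wrappers that follow the standard convention, the workspace query discussed above boils down to a tiny helper. Below is a simplified, self-contained stand-in for scipy.linalg.lapack._compute_lwork — illustrative only; the real helper differs in details, and the routines Ilhan mentions are precisely the ones this pattern cannot handle, because their lwork=-1 queries also materialize full-size tau/result arrays.]

```python
import numpy as np

def compute_lwork(routine, *args, **kwargs):
    """Query a LAPACK-style wrapper for its optimal workspace size by
    calling it with lwork=-1.  For well-behaved wrappers the returned
    tuple ends with (work, info), and work[0] holds the optimal lwork."""
    ret = routine(*args, lwork=-1, **kwargs)
    work, info = ret[-2], ret[-1]
    if info != 0:
        raise ValueError("LAPACK workspace query failed (info=%d)" % info)
    # complex-typed routines report the size in the real part
    return int(work[0].real)
```

A dgeqrf_lwork-style helper as proposed would give the non-standard routines this same cheap query path, instead of the current query that transfers the whole input matrix back and forth.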
URL: From ndbecker2 at gmail.com Sat Aug 4 07:44:06 2018 From: ndbecker2 at gmail.com (Neal Becker) Date: Sat, 4 Aug 2018 07:44:06 -0400 Subject: [SciPy-Dev] Review of PR 8293 - sampling of random variates In-Reply-To: References: Message-ID: Don't know about this particular code, but unuran is pretty comprehensive, and there's a good book from the author. On Fri, Aug 3, 2018, 5:37 PM Phillip Feldman wrote: > The ability to sample random variates from a distribution where only the > density is available seems quite useful. Is there a reference that > describes the method? > > Phillip > > On Thu, Aug 2, 2018 at 11:43 PM, Christoph Baumgarten < > christoph.baumgarten at gmail.com> wrote: > >> PR 8293 introduces a method to sample random variates for more complex >> distributions to scipy.stats (such as hyperbolic distributions and the >> generalized inverse gaussian (see PR 8681)). I think the PR is in good >> shape, it would be great if it could be merged soon since i could then >> continue to work on adding new distributions. The PR has been open for 7 >> months now. If someone could continue the review, that would bei great. >> Thanks! >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
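[Editorial note: the method behind PR 8293 is the ratio-of-uniforms method of Kinderman & Monahan, referenced in this thread: draw (u, v) uniformly from [0, umax] x [vmin, vmax] and accept x = v/u whenever u**2 <= f(v/u), where f may be an unnormalized density. A minimal self-contained sketch — not the PR's actual code; the bounding constants below are worked out for the unnormalized standard-normal density f(x) = exp(-x**2/2):]

```python
import numpy as np

def rvs_ratio_uniforms(pdf, umax, vmin, vmax, size, seed=None):
    """Ratio-of-uniforms sampling (Kinderman & Monahan): draw (u, v)
    uniformly on [0, umax] x [vmin, vmax] and accept x = v/u whenever
    u**2 <= pdf(v/u).  `pdf` may be an unnormalized density."""
    rng = np.random.default_rng(seed)
    out = np.empty(size)
    filled = 0
    while filled < size:
        u = rng.uniform(0.0, umax, size)
        v = rng.uniform(vmin, vmax, size)
        x = v / u
        accepted = x[u * u <= pdf(x)]
        take = min(size - filled, accepted.size)
        out[filled:filled + take] = accepted[:take]
        filled += take
    return out

# Standard normal via its unnormalized density f(x) = exp(-x**2/2):
# umax = sup sqrt(f) = 1, and vmax = -vmin = sup |x|*sqrt(f(x)) = sqrt(2/e)
f = lambda x: np.exp(-0.5 * x * x)
b = np.sqrt(2.0 / np.e)
samples = rvs_ratio_uniforms(f, 1.0, -b, b, size=50_000, seed=42)
```

For this target the acceptance region covers about 73% of the bounding rectangle, so only a few passes through the loop are needed.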
URL: From ralf.gommers at gmail.com Sat Aug 4 09:06:22 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 4 Aug 2018 06:06:22 -0700 Subject: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines In-Reply-To: References: Message-ID: On Sat, Aug 4, 2018 at 12:49 AM, Ilhan Polat wrote: > It will gain a dgeqrf_lwork function as usual to return the necessary > workspace size as > > lwork, info = dgeqrf_lwork(m,n) > > Then the "work" variable will be removed from dgeqrf signature and will > be made hidden. > > For example for the previous example I gave before, the optimal size is > 12800 and work array is returning an 12800-long array for a 400-400 array > computation. > Ah okay. Then the alternative is to just leave the work parameter, ignore it in the code if it's passed in (or give a warning/error) and document it as not being used. Right? If you're removing "work" from both the signature and the return value, that's a bigger change indeed, that can't be handled well that way. I'm not 100% sure, but I think I agree that a backwards incompatible change here will be better than introducing a bunch of new functions with worse names. We could introduce a Python wrapper for these to give a proper FutureWarning first. Cheers, Ralf > > > > > > > On Sat, Aug 4, 2018 at 3:25 AM, Ralf Gommers > wrote: > >> >> >> On Thu, Aug 2, 2018 at 11:37 AM, Ilhan Polat >> wrote: >> >>> Due their historical evolution, there are certain LAPACK wrappers that >>> are not standardized. Some work with minimum lwork variables instead of >>> their optimal values. Also these routines often return quite big arrays >>> during the lwork queries, to demonstrate : >>> >>> import scipy.linalg as la >>> la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) >>> >>> is a workspace size query (via lwork=-1). The current default size is >>> "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is 12800 >>> on my machine. 
Therefore the mismatch is sometimes quite dramatic >>> especially in some other routines. Notice also that to obtain this number >>> the routine actually returns a 400-long array tau and requires the input >>> matrix to be transferred back and forth. Moreover, they can't be handled >>> via scipy.linalg.lapack._compute_lwork function. >>> >>> There are a few routines like this and I feel like they should be fixed >>> and I'm willing to. However this means that their output signature is going >>> to change which imply backwards compatibility breaks. >>> >> >> What would the output change to? Currently it returns: >> >> qr : rank-2 array('d') with bounds (m,n) and a storage >> tau : rank-1 array('d') with bounds (MIN(m,n)) >> work : rank-1 array('d') with bounds (MAX(lwork,1)) >> info : int >> >> Ralf >> >> >>> I tried to see whether we could deprecate them with new wrappers with >>> modified names, but to be honest, that would create too many duplicates. On >>> the other hand I don't have a feeling of how much break this would mean out >>> there in the wild. >>> >>> Is this break an acceptable one or not? (well, none is acceptable >>> preferably, but in despair...) >>> >>> Any other alternatives, thoughts are most welcome. >>> >>> best, >>> ilhan >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From pav at iki.fi Sat Aug 4 10:00:36 2018 From: pav at iki.fi (Pauli Virtanen) Date: Sat, 04 Aug 2018 16:00:36 +0200 Subject: Re: [SciPy-Dev] Backwards Compatibility for low level LAPACK routines In-Reply-To: References: Message-ID: <5b464122e6ce8e5a7959f325e534f8d2543801d5.camel@iki.fi> On Sat, 2018-08-04 at 06:06 -0700, Ralf Gommers wrote: [clip] > Ah okay. Then the alternative is to just leave the work parameter, > ignore > it in the code if it's passed in (or give a warning/error) and > document it > as not being used. Right? > > If you're removing "work" from both the signature and the return > value, > that's a bigger change indeed, that can't be handled well that way. > I'm not > 100% sure, but I think I agree that a backwards incompatible change > here > will be better than introducing a bunch of new functions with worse > names. > > We could introduce a Python wrapper for these to give a proper > FutureWarning first. Or, perhaps you can leave the `work` return variable in, but make it a 1-element array? Its value can be filled in from the `callstatement`, cf. e.g. https://github.com/scipy/scipy/blob/master/scipy/linalg/flapack_gen.pyf.src#L89 The actual work array is then made an intent(hide,cache) variable. 
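[Editorial note: from the caller's side, this suggestion keeps the four-tuple unpacking working. A plain-Python mock of the idea — hypothetical: the real change would live in the .pyf file, and the 32*n size below is a placeholder, not what LAPACK would actually report:]

```python
import numpy as np

def dgeqrf_compat(a):
    """Mock of a wrapper whose real workspace is intent(hide,cache):
    allocated internally and never returned, while a 1-element `work`
    array still comes back carrying the optimal lwork, so existing
    (qr, tau, work, info) unpacking keeps working."""
    m, n = a.shape
    optimal_lwork = 32 * n               # placeholder for LAPACK's answer
    _work = np.empty(optimal_lwork)      # hidden workspace, used internally
    qr = np.asfortranarray(a, dtype=float)  # stand-in for the factorization
    tau = np.zeros(min(m, n))
    return qr, tau, np.array([float(optimal_lwork)]), 0

qr, tau, work, info = dgeqrf_compat(np.eye(4))
```

Old code unpacking four outputs still runs unchanged; only the shape of `work` differs from today's full-size array.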
Pauli From rlucente at pipeline.com Sat Aug 4 08:04:09 2018 From: rlucente at pipeline.com (rlucente at pipeline.com) Date: Sat, 4 Aug 2018 08:04:09 -0400 Subject: [SciPy-Dev] Review of PR 8293 - sampling of random variates In-Reply-To: References: Message-ID: <008001d42beb$40a7ba70$c1f72f50$@pipeline.com> Are you referring to http://statmath.wu-wien.ac.at/unuran/ Automatic Nonuniform Random Variate Generation By Wolfgang Hörmann, Josef Leydold, Gerhard Derflinger From: SciPy-Dev On Behalf Of Neal Becker Sent: Saturday, August 4, 2018 7:44 AM To: SciPy Developers List Subject: Re: [SciPy-Dev] Review of PR 8293 - sampling of random variates Don't know about this particular code, but unuran is pretty comprehensive, and there's a good book from the author. On Fri, Aug 3, 2018, 5:37 PM Phillip Feldman > wrote: The ability to sample random variates from a distribution where only the density is available seems quite useful. Is there a reference that describes the method? Phillip On Thu, Aug 2, 2018 at 11:43 PM, Christoph Baumgarten > wrote: PR 8293 introduces a method to sample random variates for more complex distributions to scipy.stats (such as hyperbolic distributions and the generalized inverse gaussian (see PR 8681)). I think the PR is in good shape, it would be great if it could be merged soon since i could then continue to work on adding new distributions. The PR has been open for 7 months now. If someone could continue the review, that would bei great. Thanks! _______________________________________________ SciPy-Dev mailing list SciPy-Dev at python.org https://mail.python.org/mailman/listinfo/scipy-dev _______________________________________________ SciPy-Dev mailing list SciPy-Dev at python.org https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ndbecker2 at gmail.com Sat Aug 4 21:42:12 2018 From: ndbecker2 at gmail.com (Neal Becker) Date: Sat, 4 Aug 2018 21:42:12 -0400 Subject: Re: [SciPy-Dev] Review of PR 8293 - sampling of random variates In-Reply-To: <008001d42beb$40a7ba70$c1f72f50$@pipeline.com> References: <008001d42beb$40a7ba70$c1f72f50$@pipeline.com> Message-ID: Yes On Sat, Aug 4, 2018, 9:29 PM wrote: > Are you referring to > > > > http://statmath.wu-wien.ac.at/unuran/ > > > > *Automatic Nonuniform Random Variate Generation* > > By Wolfgang Hörmann, Josef Leydold, Gerhard Derflinger > > > > > > *From:* SciPy-Dev *On > Behalf Of *Neal Becker > *Sent:* Saturday, August 4, 2018 7:44 AM > *To:* SciPy Developers List > *Subject:* Re: [SciPy-Dev] Review of PR 8293 - sampling of random variates > > > > Don't know about this particular code, but unuran is pretty comprehensive, > and there's a good book from the author. > > On Fri, Aug 3, 2018, 5:37 PM Phillip Feldman > wrote: > > The ability to sample random variates from a distribution where only the > density is available seems quite useful. Is there a reference that > describes the method? > > > > Phillip > > > > On Thu, Aug 2, 2018 at 11:43 PM, Christoph Baumgarten < > christoph.baumgarten at gmail.com> wrote: > > PR 8293 introduces a method to sample random variates for more complex > distributions to scipy.stats (such as hyperbolic distributions and the > generalized inverse gaussian (see PR 8681)). I think the PR is in good > shape, it would be great if it could be merged soon since i could then > continue to work on adding new distributions. The PR has been open for 7 > months now. If someone could continue the review, that would bei great. > Thanks! 
> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From christoph.baumgarten at gmail.com Sun Aug 5 02:59:33 2018 From: christoph.baumgarten at gmail.com (Christoph Baumgarten) Date: Sun, 5 Aug 2018 08:59:33 +0200 Subject: [SciPy-Dev] Review of PR 8293 - sampling of random variates Message-ID: A good source on the ratio of uniforms method is the book by Devroye: "Non-Uniform Random Variate Generation", or the original paper "Computer Generation of Random Variables Using the Ratio of Uniform Deviates" by Kinderman / Monahan. wrote on Sat, 4 Aug 2018, 18:03: > Send SciPy-Dev mailing list submissions to > scipy-dev at python.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.python.org/mailman/listinfo/scipy-dev > or, via email, send a message with subject or body 'help' to > scipy-dev-request at python.org > > You can reach the person managing the list at > scipy-dev-owner at python.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-Dev digest..." > > > Today's Topics: > > 1. Re: Review of PR 8293 - sampling of random variates (Neal Becker) > 2. Re: Backwards Compatibility for low level LAPACK routines > (Ralf Gommers) > 3. 
Re: Backwards Compatibility for low level LAPACK routines > (Pauli Virtanen) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Sat, 4 Aug 2018 07:44:06 -0400 > From: Neal Becker > To: SciPy Developers List > Subject: Re: [SciPy-Dev] Review of PR 8293 - sampling of random > variates > Message-ID: > < > CAG3t+pF_5GMXGDJUJF-Z8n2iRHvTSpwG6_b4y1dMWSqwKS_29w at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > Don't know about this particular code, but unuran is pretty comprehensive, > and there's a good book from the author. > > On Fri, Aug 3, 2018, 5:37 PM Phillip Feldman > wrote: > > > The ability to sample random variates from a distribution where only the > > density is available seems quite useful. Is there a reference that > > describes the method? > > > > Phillip > > > > On Thu, Aug 2, 2018 at 11:43 PM, Christoph Baumgarten < > > christoph.baumgarten at gmail.com> wrote: > > > >> PR 8293 introduces a method to sample random variates for more complex > >> distributions to scipy.stats (such as hyperbolic distributions and the > >> generalized inverse gaussian (see PR 8681)). I think the PR is in good > >> shape, it would be great if it could be merged soon since i could then > >> continue to work on adding new distributions. The PR has been open for 7 > >> months now. If someone could continue the review, that would bei great. > >> Thanks! > >> > >> _______________________________________________ > >> SciPy-Dev mailing list > >> SciPy-Dev at python.org > >> https://mail.python.org/mailman/listinfo/scipy-dev > >> > >> > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at python.org > > https://mail.python.org/mailman/listinfo/scipy-dev > > > -------------- next part -------------- > An HTML attachment was scrubbed... 
> URL: < > http://mail.python.org/pipermail/scipy-dev/attachments/20180804/73dd5df9/attachment-0001.html > > > > ------------------------------ > > Message: 2 > Date: Sat, 4 Aug 2018 06:06:22 -0700 > From: Ralf Gommers > To: SciPy Developers List > Subject: Re: [SciPy-Dev] Backwards Compatibility for low level LAPACK > routines > Message-ID: > < > CABL7CQgmuAF2V0Pyrtt5qEBPE9M1ostJL54fx8YVVTZbgRF7KQ at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > On Sat, Aug 4, 2018 at 12:49 AM, Ilhan Polat wrote: > > > It will gain a dgeqrf_lwork function as usual to return the necessary > > workspace size as > > > > lwork, info = dgeqrf_lwork(m,n) > > > > Then the "work" variable will be removed from dgeqrf signature and will > > be made hidden. > > > > For example for the previous example I gave before, the optimal size is > > 12800 and work array is returning an 12800-long array for a 400-400 array > > computation. > > > > Ah okay. Then the alternative is to just leave the work parameter, ignore > it in the code if it's passed in (or give a warning/error) and document it > as not being used. Right? > > If you're removing "work" from both the signature and the return value, > that's a bigger change indeed, that can't be handled well that way. I'm not > 100% sure, but I think I agree that a backwards incompatible change here > will be better than introducing a bunch of new functions with worse names. > > We could introduce a Python wrapper for these to give a proper > FutureWarning first. > > Cheers, > Ralf > > > > > > > > > > > > > > > > > On Sat, Aug 4, 2018 at 3:25 AM, Ralf Gommers > > wrote: > > > >> > >> > >> On Thu, Aug 2, 2018 at 11:37 AM, Ilhan Polat > >> wrote: > >> > >>> Due their historical evolution, there are certain LAPACK wrappers that > >>> are not standardized. Some work with minimum lwork variables instead of > >>> their optimal values. 
Also these routines often return quite big arrays > >>> during the lwork queries, to demonstrate : > >>> > >>> import scipy.linalg as la > >>> la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) > >>> > >>> is a workspace size query (via lwork=-1). The current default size is > >>> "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is > 12800 > >>> on my machine. Therefore the mismatch is sometimes quite dramatic > >>> especially in some other routines. Notice also that to obtain this > number > >>> the routine actually returns a 400-long array tau and requires the > input > >>> matrix to be transferred back and forth. Moreover, they can't be > handled > >>> via scipy.linalg.lapack._compute_lwork function. > >>> > >>> There are a few routines like this and I feel like they should be fixed > >>> and I'm willing to. However this means that their output signature is > going > >>> to change which imply backwards compatibility breaks. > >>> > >> > >> What would the output change to? Currently it returns: > >> > >> qr : rank-2 array('d') with bounds (m,n) and a storage > >> tau : rank-1 array('d') with bounds (MIN(m,n)) > >> work : rank-1 array('d') with bounds (MAX(lwork,1)) > >> info : int > >> > >> Ralf > >> > >> > >>> I tried to see whether we could deprecate them with new wrappers with > >>> modified names, but to be honest, that would create too many > duplicates. On > >>> the other hand I don't have a feeling of how much break this would > mean out > >>> there in the wild. > >>> > >>> Is this break an acceptable one or not? (well, none is acceptable > >>> preferably, but in despair...) > >>> > >>> Any other alternatives, thoughts are most welcome. 
> >>> > >>> best, > >>> ilhan > >>> > >>> _______________________________________________ > >>> SciPy-Dev mailing list > >>> SciPy-Dev at python.org > >>> https://mail.python.org/mailman/listinfo/scipy-dev > >>> > >>> > >> > >> _______________________________________________ > >> SciPy-Dev mailing list > >> SciPy-Dev at python.org > >> https://mail.python.org/mailman/listinfo/scipy-dev > >> > >> > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at python.org > > https://mail.python.org/mailman/listinfo/scipy-dev > > > > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: < > http://mail.python.org/pipermail/scipy-dev/attachments/20180804/d1e8615c/attachment-0001.html > > > > ------------------------------ > > Message: 3 > Date: Sat, 04 Aug 2018 16:00:36 +0200 > From: Pauli Virtanen > To: scipy-dev at python.org > Subject: Re: [SciPy-Dev] Backwards Compatibility for low level LAPACK > routines > Message-ID: <5b464122e6ce8e5a7959f325e534f8d2543801d5.camel at iki.fi> > Content-Type: text/plain; charset="UTF-8" > > la, 2018-08-04 kello 06:06 -0700, Ralf Gommers kirjoitti: > [clip] > > Ah okay. Then the alternative is to just leave the work parameter, > > ignore > > it in the code if it's passed in (or give a warning/error) and > > document it > > as not being used. Right? > > > > If you're removing "work" from both the signature and the return > > value, > > that's a bigger change indeed, that can't be handled well that way. > > I'm not > > 100% sure, but I think I agree that a backwards incompatible change > > here > > will be better than introducing a bunch of new functions with worse > > names. > > > > We could introduce a Python wrapper for these to give a proper > > FutureWarning first. > > Or, perhaps you can leave the `work` return variable in, but make it an > 1-element array? 
Its value can be filled in from the `callstatement`, > cf eg > > https://github.com/scipy/scipy/blob/master/scipy/linalg/flapack_gen.pyf.src#L89 > The actual work array is then made an intent(hide,cache) variable. > > Pauli > > > > > ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > > ------------------------------ > > End of SciPy-Dev Digest, Vol 178, Issue 6 > ***************************************** > -------------- next part -------------- An HTML attachment was scrubbed... URL: From phillip.m.feldman at gmail.com Sun Aug 5 03:13:34 2018 From: phillip.m.feldman at gmail.com (Phillip Feldman) Date: Sun, 5 Aug 2018 00:13:34 -0700 Subject: [SciPy-Dev] Review of PR 8293 - sampling of random variates In-Reply-To: References: Message-ID: Devroye's book is excellent. Thanks for reminding me of it! On Sat, Aug 4, 2018 at 11:59 PM, Christoph Baumgarten < christoph.baumgarten at gmail.com> wrote: > > A good source on the ratio of uniforms method is the book by Devroye: > "non-uniform random variate generation" or the original paper "Computer > Computer Generation of Random Variables using the ratio of uniform > deviates" by kinderman / monahan. > > schrieb am Sa., 4. Aug. 2018, 18:03: > >> Send SciPy-Dev mailing list submissions to >> scipy-dev at python.org >> >> To subscribe or unsubscribe via the World Wide Web, visit >> https://mail.python.org/mailman/listinfo/scipy-dev >> or, via email, send a message with subject or body 'help' to >> scipy-dev-request at python.org >> >> You can reach the person managing the list at >> scipy-dev-owner at python.org >> >> When replying, please edit your Subject line so it is more specific >> than "Re: Contents of SciPy-Dev digest..." >> >> >> Today's Topics: >> >> 1. Re: Review of PR 8293 - sampling of random variates (Neal Becker) >> 2. 
Re: Backwards Compatibility for low level LAPACK routines >> (Ralf Gommers) >> 3. Re: Backwards Compatibility for low level LAPACK routines >> (Pauli Virtanen) >> >> >> ---------------------------------------------------------------------- >> >> Message: 1 >> Date: Sat, 4 Aug 2018 07:44:06 -0400 >> From: Neal Becker >> To: SciPy Developers List >> Subject: Re: [SciPy-Dev] Review of PR 8293 - sampling of random >> variates >> Message-ID: >> > 29w at mail.gmail.com> >> Content-Type: text/plain; charset="utf-8" >> >> >> Don't know about this particular code, but unuran is pretty comprehensive, >> and there's a good book from the author. >> >> On Fri, Aug 3, 2018, 5:37 PM Phillip Feldman > > >> wrote: >> >> > The ability to sample random variates from a distribution where only the >> > density is available seems quite useful. Is there a reference that >> > describes the method? >> > >> > Phillip >> > >> > On Thu, Aug 2, 2018 at 11:43 PM, Christoph Baumgarten < >> > christoph.baumgarten at gmail.com> wrote: >> > >> >> PR 8293 introduces a method to sample random variates for more complex >> >> distributions to scipy.stats (such as hyperbolic distributions and the >> >> generalized inverse gaussian (see PR 8681)). I think the PR is in good >> >> shape, it would be great if it could be merged soon since i could then >> >> continue to work on adding new distributions. The PR has been open for >> 7 >> >> months now. If someone could continue the review, that would bei great. >> >> Thanks! >> >> >> >> _______________________________________________ >> >> SciPy-Dev mailing list >> >> SciPy-Dev at python.org >> >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> >> >> >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at python.org >> > https://mail.python.org/mailman/listinfo/scipy-dev >> > >> -------------- next part -------------- >> An HTML attachment was scrubbed... 
>> URL: > attachments/20180804/73dd5df9/attachment-0001.html> >> >> ------------------------------ >> >> Message: 2 >> Date: Sat, 4 Aug 2018 06:06:22 -0700 >> From: Ralf Gommers >> To: SciPy Developers List >> Subject: Re: [SciPy-Dev] Backwards Compatibility for low level LAPACK >> routines >> Message-ID: >> > gmail.com> >> Content-Type: text/plain; charset="utf-8" >> >> On Sat, Aug 4, 2018 at 12:49 AM, Ilhan Polat >> wrote: >> >> > It will gain a dgeqrf_lwork function as usual to return the necessary >> > workspace size as >> > >> > lwork, info = dgeqrf_lwork(m,n) >> > >> > Then the "work" variable will be removed from dgeqrf signature and will >> > be made hidden. >> > >> > For example for the previous example I gave before, the optimal size is >> > 12800 and work array is returning an 12800-long array for a 400-400 >> array >> > computation. >> > >> >> Ah okay. Then the alternative is to just leave the work parameter, ignore >> it in the code if it's passed in (or give a warning/error) and document it >> as not being used. Right? >> >> If you're removing "work" from both the signature and the return value, >> that's a bigger change indeed, that can't be handled well that way. I'm >> not >> 100% sure, but I think I agree that a backwards incompatible change here >> will be better than introducing a bunch of new functions with worse names. >> >> We could introduce a Python wrapper for these to give a proper >> FutureWarning first. >> >> Cheers, >> Ralf >> >> >> >> > >> > >> > >> > >> > >> > >> > On Sat, Aug 4, 2018 at 3:25 AM, Ralf Gommers >> > wrote: >> > >> >> >> >> >> >> On Thu, Aug 2, 2018 at 11:37 AM, Ilhan Polat >> >> wrote: >> >> >> >>> Due their historical evolution, there are certain LAPACK wrappers that >> >>> are not standardized. Some work with minimum lwork variables instead >> of >> >>> their optimal values. 
Also these routines often return quite big >> arrays >> >>> during the lwork queries, to demonstrate : >> >>> >> >>> import scipy.linalg as la >> >>> la.lapack.dgeqrf(a=np.random.rand(400,400), lwork=-1) >> >>> >> >>> is a workspace size query (via lwork=-1). The current default size is >> >>> "3*a.shape[0] + 1" hence 1201. However the optimal workspace size is >> 12800 >> >>> on my machine. Therefore the mismatch is sometimes quite dramatic >> >>> especially in some other routines. Notice also that to obtain this >> number >> >>> the routine actually returns a 400-long array tau and requires the >> input >> >>> matrix to be transferred back and forth. Moreover, they can't be >> handled >> >>> via scipy.linalg.lapack._compute_lwork function. >> >>> >> >>> There are a few routines like this and I feel like they should be >> fixed >> >>> and I'm willing to. However this means that their output signature is >> going >> >>> to change which imply backwards compatibility breaks. >> >>> >> >> >> >> What would the output change to? Currently it returns: >> >> >> >> qr : rank-2 array('d') with bounds (m,n) and a storage >> >> tau : rank-1 array('d') with bounds (MIN(m,n)) >> >> work : rank-1 array('d') with bounds (MAX(lwork,1)) >> >> info : int >> >> >> >> Ralf >> >> >> >> >> >>> I tried to see whether we could deprecate them with new wrappers with >> >>> modified names, but to be honest, that would create too many >> duplicates. On >> >>> the other hand I don't have a feeling of how much break this would >> mean out >> >>> there in the wild. >> >>> >> >>> Is this break an acceptable one or not? (well, none is acceptable >> >>> preferably, but in despair...) >> >>> >> >>> Any other alternatives, thoughts are most welcome. 
>> >>> >> >>> best, >> >>> ilhan >> >>> >> >>> _______________________________________________ >> >>> SciPy-Dev mailing list >> >>> SciPy-Dev at python.org >> >>> https://mail.python.org/mailman/listinfo/scipy-dev >> >>> >> >>> >> >> >> >> _______________________________________________ >> >> SciPy-Dev mailing list >> >> SciPy-Dev at python.org >> >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> >> >> >> > >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at python.org >> > https://mail.python.org/mailman/listinfo/scipy-dev >> > >> > >> -------------- next part -------------- >> An HTML attachment was scrubbed... >> URL: > attachments/20180804/d1e8615c/attachment-0001.html> >> >> ------------------------------ >> >> Message: 3 >> Date: Sat, 04 Aug 2018 16:00:36 +0200 >> From: Pauli Virtanen >> To: scipy-dev at python.org >> Subject: Re: [SciPy-Dev] Backwards Compatibility for low level LAPACK >> routines >> Message-ID: <5b464122e6ce8e5a7959f325e534f8d2543801d5.camel at iki.fi> >> Content-Type: text/plain; charset="UTF-8" >> >> la, 2018-08-04 kello 06:06 -0700, Ralf Gommers kirjoitti: >> [clip] >> > Ah okay. Then the alternative is to just leave the work parameter, >> > ignore >> > it in the code if it's passed in (or give a warning/error) and >> > document it >> > as not being used. Right? >> > >> > If you're removing "work" from both the signature and the return >> > value, >> > that's a bigger change indeed, that can't be handled well that way. >> > I'm not >> > 100% sure, but I think I agree that a backwards incompatible change >> > here >> > will be better than introducing a bunch of new functions with worse >> > names. >> > >> > We could introduce a Python wrapper for these to give a proper >> > FutureWarning first. >> >> Or, perhaps you can leave the `work` return variable in, but make it an >> 1-element array? 
Its value can be filled in from the `callstatement`, >> cf eg >> https://github.com/scipy/scipy/blob/master/scipy/ >> linalg/flapack_gen.pyf.src#L89 >> The actual work array is then made an intent(hide,cache) variable. >> >> Pauli >> >> >> >> >> ------------------------------ >> >> Subject: Digest Footer >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> >> ------------------------------ >> >> End of SciPy-Dev Digest, Vol 178, Issue 6 >> ***************************************** >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From magnunor at gmail.com Thu Aug 9 03:11:46 2018 From: magnunor at gmail.com (Magnus Nord) Date: Thu, 9 Aug 2018 09:11:46 +0200 Subject: [SciPy-Dev] scikit prefix for microscopy file I/O library? Message-ID: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> I'm one of the developers in the HyperSpy project, which is an open source python library for analysing multidimensional data, with a focus on electron microscopy. We're in the process of splitting the project into smaller pieces, to i) make the code a bit more manageable, and ii) make it easier for other projects to use specific functionalities without having to install a lot of extra dependencies. The current focus is moving the file reading/writing functionality to its own library, which includes readers for many types of proprietary electron microscopy file formats. The plan is to keep the GPLv3 license, and only depend on the more standard scientific python stack (numpy, ...). However, we've been having some issues in deciding on a name for the library, and someone suggested using a scikit- prefix (scikit-microscopyIO?, scikit-microscopy-io?). 
So I'm wondering if a scikit prefix would be appropriate for a library focused on file I/O of microscopy data? Discussion on this in the HyperSpy issue tracker: https://github.com/hyperspy/hyperspy/issues/1978 Magnus From sylvain.corlay at gmail.com Thu Aug 9 03:52:26 2018 From: sylvain.corlay at gmail.com (Sylvain Corlay) Date: Thu, 9 Aug 2018 09:52:26 +0200 Subject: [SciPy-Dev] scikit prefix for microscopy file I/O library? In-Reply-To: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> References: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> Message-ID: Hi Magnus, I am not sure if anyone "owns" the scikit prefix. Although I think that most people would expect a scikit-foobar to be BSD licensed like scipy, scikit-learn, scikit-image, scikit-optimize etc. So I would find it misleading for GPL-licensed software to adopt this naming pattern. Sylvain On Thu, Aug 9, 2018, 09:11 Magnus Nord wrote: > I'm one of the developers in the HyperSpy project, which is an open > source python library for analysing multidimensional data, with a focus > on electron microscopy. We're in the processes of splitting the project > into smaller pieces, to i) make the code a bit more manageable, ii) make > it easier for other projects to use specific functionalities without > having to install a lot of extra dependencies. > > The current focus is moving the file reading/writing functionality to > its own library, which includes readers for many types of proprietary > electron microscopy file formats. The plan is to keep the GPLv3 license, > and only depend on the more standard scientific python stack (numpy, ...). > > However, we've been having some issues in deciding on a name for the > library, and someone suggested using a scikit- prefix > (scikit-microscopyIO?, scikit-microscopy-io?). So I'm wondering if a > scikit prefix would be appropriate for a library focused on file I/O of > microscopy data? 
> > Discussion on this in the HyperSpy issue tracker: > https://github.com/hyperspy/hyperspy/issues/1978 > > Magnus > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From magnunor at gmail.com Thu Aug 9 05:41:02 2018 From: magnunor at gmail.com (Magnus Nord) Date: Thu, 9 Aug 2018 11:41:02 +0200 Subject: [SciPy-Dev] scikit prefix for microscopy file I/O library? In-Reply-To: References: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> Message-ID: <9788352a-462d-1a4c-0f32-9d3c484085e9@gmail.com> The scikit "label" is indeed fairly permissive. Taken from SciPy's scikits page (https://www.scipy.org/scikits.html): * The package is deemed too specialized to live in SciPy itself or * The package has a GPL (or similar) license which is incompatible with SciPy's BSD license or * The package is meant to be included in SciPy, but development is still in progress. So using GPLv3 should be in line with the scikit prefix intentions? Magnus On 08/09/2018 09:52 AM, Sylvain Corlay wrote: > Hi Magnus, > > I am not sure if anyone "owns" the scikit prefix. > > Although I think that most people would expect a scikit-foobar to be > BSD licensed like scipy, scikit-learn, scikit-image, scikit-optimize > etc. So I would find it misleading for GPL-licensed software to adopt > this naming pattern. > > Sylvain > > > On Thu, Aug 9, 2018, 09:11 Magnus Nord > wrote: > > I'm one of the developers in the HyperSpy project, which is an open > source python library for analysing multidimensional data, with a > focus > on electron microscopy. We're in the processes of splitting the > project > into smaller pieces, to i) make the code a bit more manageable, > ii) make > it easier for other projects to use specific functionalities without > having to install a lot of extra dependencies. 
> > The current focus is moving the file reading/writing functionality to > its own library, which includes readers for many types of proprietary > electron microscopy file formats. The plan is to keep the GPLv3 > license, > and only depend on the more standard scientific python stack > (numpy, ...). > > However, we've been having some issues in deciding on a name for the > library, and someone suggested using a scikit- prefix > (scikit-microscopyIO?, scikit-microscopy-io?). So I'm wondering if a > scikit prefix would be appropriate for a library focused on file > I/O of > microscopy data? > > Discussion on this in the HyperSpy issue tracker: > https://github.com/hyperspy/hyperspy/issues/1978 > > Magnus > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From sylvain.corlay at gmail.com Thu Aug 9 08:31:57 2018 From: sylvain.corlay at gmail.com (Sylvain Corlay) Date: Thu, 9 Aug 2018 14:31:57 +0200 Subject: [SciPy-Dev] scikit prefix for microscopy file I/O library? In-Reply-To: <9788352a-462d-1a4c-0f32-9d3c484085e9@gmail.com> References: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> <9788352a-462d-1a4c-0f32-9d3c484085e9@gmail.com> Message-ID: Their recommendation is BSD although they are ok with other OSI-approved licenses. I think that using the scikit prefix for GPL software breaks people's expectations about the python scientific stack and is therefore misleading but it is only my opinion. Was the choice of GPLv3 over BSD deliberate or just a stab in the dark? On Thu, Aug 9, 2018, 11:41 Magnus Nord wrote: > The scikit "label" is indeed fairly permissive. 
Taken from the Scipy's > scikit page (https://www.scipy.org/scikits.html): > > - The package is deemed too specialized to live in SciPy itself or > - The package has a GPL (or similar) license which is incompatible > with SciPy?s BSD license or > - The package is meant to be included in SciPy, but development is > still in progress. > > So using GPLv3 should be in line with the scikit prefix intentions? > > Magnus > > On 08/09/2018 09:52 AM, Sylvain Corlay wrote: > > Hi Magnus, > > I am not sure if anyone "owns" the scikit prefix. > > Although I think that most people would expect a scikit-foobar to be BSD > licensed like scipy, scikit-learn, scikit-image, scikit-optimize etc. So I > would find it misleading for GPL-licensed software to adopt this naming > pattern. > > Sylvain > > > On Thu, Aug 9, 2018, 09:11 Magnus Nord wrote: > >> I'm one of the developers in the HyperSpy project, which is an open >> source python library for analysing multidimensional data, with a focus >> on electron microscopy. We're in the processes of splitting the project >> into smaller pieces, to i) make the code a bit more manageable, ii) make >> it easier for other projects to use specific functionalities without >> having to install a lot of extra dependencies. >> >> The current focus is moving the file reading/writing functionality to >> its own library, which includes readers for many types of proprietary >> electron microscopy file formats. The plan is to keep the GPLv3 license, >> and only depend on the more standard scientific python stack (numpy, ...). >> >> However, we've been having some issues in deciding on a name for the >> library, and someone suggested using a scikit- prefix >> (scikit-microscopyIO?, scikit-microscopy-io?). So I'm wondering if a >> scikit prefix would be appropriate for a library focused on file I/O of >> microscopy data? 
>> >> Discussion on this in the HyperSpy issue tracker: >> https://github.com/hyperspy/hyperspy/issues/1978 >> >> Magnus >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > > > _______________________________________________ > SciPy-Dev mailing listSciPy-Dev at python.orghttps://mail.python.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From magnunor at gmail.com Thu Aug 9 08:48:07 2018 From: magnunor at gmail.com (Magnus Nord) Date: Thu, 9 Aug 2018 14:48:07 +0200 Subject: [SciPy-Dev] scikit prefix for microscopy file I/O library? In-Reply-To: References: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> <9788352a-462d-1a4c-0f32-9d3c484085e9@gmail.com> Message-ID: The reason for having this new library be GPLv3 is that HyperSpy itself is GPLv3, and most of the code will simply be copy-pasted from HyperSpy's IO-module into the new library. Magnus On 08/09/2018 02:31 PM, Sylvain Corlay wrote: > Their recommendation is BSD although they are ok with other > OSI-approved licenses. > > I think that using the scikit prefix for GPL software breaks people's > expectations about the python scientific stack and is therefore > misleading but it is only my opinion. > > Was the choice of GPLv3 over BSD deliberate or just a stab in the dark? > > On Thu, Aug 9, 2018, 11:41 Magnus Nord > wrote: > > The scikit "label" is indeed fairly permissive. 
Taken from the > Scipy's scikit page (https://www.scipy.org/scikits.html): > > * The package is deemed too specialized to live in SciPy itself or > * The package has a GPL (or similar) license which is > incompatible with SciPy?s BSD license or > * The package is meant to be included in SciPy, but development > is still in progress. > > So using GPLv3 should be in line with the scikit prefix intentions? > > Magnus > > On 08/09/2018 09:52 AM, Sylvain Corlay wrote: >> Hi Magnus, >> >> I am not sure if anyone "owns" the scikit prefix. >> >> Although I think that most people would expect a scikit-foobar to >> be BSD licensed like scipy, scikit-learn, scikit-image, >> scikit-optimize etc. So I would find it misleading for >> GPL-licensed software to adopt this naming pattern. >> >> Sylvain >> >> >> On Thu, Aug 9, 2018, 09:11 Magnus Nord > > wrote: >> >> I'm one of the developers in the HyperSpy project, which is >> an open >> source python library for analysing multidimensional data, >> with a focus >> on electron microscopy. We're in the processes of splitting >> the project >> into smaller pieces, to i) make the code a bit more >> manageable, ii) make >> it easier for other projects to use specific functionalities >> without >> having to install a lot of extra dependencies. >> >> The current focus is moving the file reading/writing >> functionality to >> its own library, which includes readers for many types of >> proprietary >> electron microscopy file formats. The plan is to keep the >> GPLv3 license, >> and only depend on the more standard scientific python stack >> (numpy, ...). >> >> However, we've been having some issues in deciding on a name >> for the >> library, and someone suggested using a scikit- prefix >> (scikit-microscopyIO?, scikit-microscopy-io?). So I'm >> wondering if a >> scikit prefix would be appropriate for a library focused on >> file I/O of >> microscopy data? 
>> >> Discussion on this in the HyperSpy issue tracker: >> https://github.com/hyperspy/hyperspy/issues/1978 >> >> Magnus >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From sylvain.corlay at gmail.com Thu Aug 9 08:54:09 2018 From: sylvain.corlay at gmail.com (Sylvain Corlay) Date: Thu, 9 Aug 2018 14:54:09 +0200 Subject: [SciPy-Dev] scikit prefix for microscopy file I/O library? In-Reply-To: References: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> <9788352a-462d-1a4c-0f32-9d3c484085e9@gmail.com> Message-ID: Arg too bad :) I guess that you are stuck with gpl unless all the copyright holders agree to change it... On Thu, Aug 9, 2018, 14:48 Magnus Nord wrote: > The reason for having this new library be GPLv3 is that HyperSpy itself is > GPLv3, and most of the code will simply be copy-pasted from HyperSpy's > IO-module into the new library. > > Magnus > > On 08/09/2018 02:31 PM, Sylvain Corlay wrote: > > Their recommendation is BSD although they are ok with other OSI-approved > licenses. > > I think that using the scikit prefix for GPL software breaks people's > expectations about the python scientific stack and is therefore misleading > but it is only my opinion. > > Was the choice of GPLv3 over BSD deliberate or just a stab in the dark? 
> > On Thu, Aug 9, 2018, 11:41 Magnus Nord wrote: > >> The scikit "label" is indeed fairly permissive. Taken from the Scipy's >> scikit page (https://www.scipy.org/scikits.html): >> >> - The package is deemed too specialized to live in SciPy itself or >> - The package has a GPL (or similar) license which is incompatible >> with SciPy?s BSD license or >> - The package is meant to be included in SciPy, but development is >> still in progress. >> >> So using GPLv3 should be in line with the scikit prefix intentions? >> >> Magnus >> >> On 08/09/2018 09:52 AM, Sylvain Corlay wrote: >> >> Hi Magnus, >> >> I am not sure if anyone "owns" the scikit prefix. >> >> Although I think that most people would expect a scikit-foobar to be BSD >> licensed like scipy, scikit-learn, scikit-image, scikit-optimize etc. So I >> would find it misleading for GPL-licensed software to adopt this naming >> pattern. >> >> Sylvain >> >> >> On Thu, Aug 9, 2018, 09:11 Magnus Nord wrote: >> >>> I'm one of the developers in the HyperSpy project, which is an open >>> source python library for analysing multidimensional data, with a focus >>> on electron microscopy. We're in the processes of splitting the project >>> into smaller pieces, to i) make the code a bit more manageable, ii) make >>> it easier for other projects to use specific functionalities without >>> having to install a lot of extra dependencies. >>> >>> The current focus is moving the file reading/writing functionality to >>> its own library, which includes readers for many types of proprietary >>> electron microscopy file formats. The plan is to keep the GPLv3 license, >>> and only depend on the more standard scientific python stack (numpy, >>> ...). >>> >>> However, we've been having some issues in deciding on a name for the >>> library, and someone suggested using a scikit- prefix >>> (scikit-microscopyIO?, scikit-microscopy-io?). 
So I'm wondering if a >>> scikit prefix would be appropriate for a library focused on file I/O of >>> microscopy data? >>> >>> Discussion on this in the HyperSpy issue tracker: >>> https://github.com/hyperspy/hyperspy/issues/1978 >>> >>> Magnus >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >> >> >> _______________________________________________ >> SciPy-Dev mailing listSciPy-Dev at python.orghttps://mail.python.org/mailman/listinfo/scipy-dev >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > > > _______________________________________________ > SciPy-Dev mailing listSciPy-Dev at python.orghttps://mail.python.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Thu Aug 9 09:12:06 2018 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 9 Aug 2018 09:12:06 -0400 Subject: [SciPy-Dev] scikit prefix for microscopy file I/O library? In-Reply-To: References: <3d9625b0-4ca8-44ef-4cb8-c1c3ca62e91a@gmail.com> <9788352a-462d-1a4c-0f32-9d3c484085e9@gmail.com> Message-ID: BSD-3 compatible licensed is preferred from a developer perspective because then we can also freely look at each others code, and possibly copy code or move parts into scipy. However, some packages are GPL because of a requirement from the underlying code. For example, scikit-sparse uses CHOLMOD which I would like to use from scipy, but it is GPL and would "infect" scipy's license if used/copied there. 
Some packages are overall GPL because of some included code, but additionally carry a more permissive license on parts of the code, if that code is reused without the GPL part. Josef On Thu, Aug 9, 2018 at 8:54 AM, Sylvain Corlay wrote: > Arg too bad :) > > I guess that you are stuck with gpl unless all the copyright holders agree > to change it... > > On Thu, Aug 9, 2018, 14:48 Magnus Nord wrote: > >> The reason for having this new library be GPLv3 is that HyperSpy itself >> is GPLv3, and most of the code will simply be copy-pasted from HyperSpy's >> IO-module into the new library. >> >> Magnus >> >> On 08/09/2018 02:31 PM, Sylvain Corlay wrote: >> >> Their recommendation is BSD although they are ok with other OSI-approved >> licenses. >> >> I think that using the scikit prefix for GPL software breaks people's >> expectations about the python scientific stack and is therefore misleading >> but it is only my opinion. >> >> Was the choice of GPLv3 over BSD deliberate or just a stab in the dark? >> >> On Thu, Aug 9, 2018, 11:41 Magnus Nord wrote: >> >>> The scikit "label" is indeed fairly permissive. Taken from the Scipy's >>> scikit page (https://www.scipy.org/scikits.html): >>> >>> - The package is deemed too specialized to live in SciPy itself or >>> - The package has a GPL (or similar) license which is incompatible >>> with SciPy's BSD license or >>> - The package is meant to be included in SciPy, but development is >>> still in progress. >>> >>> So using GPLv3 should be in line with the scikit prefix intentions? >>> >>> Magnus >>> >>> On 08/09/2018 09:52 AM, Sylvain Corlay wrote: >>> >>> Hi Magnus, >>> >>> I am not sure if anyone "owns" the scikit prefix. >>> >>> Although I think that most people would expect a scikit-foobar to be BSD >>> licensed like scipy, scikit-learn, scikit-image, scikit-optimize etc. So I >>> would find it misleading for GPL-licensed software to adopt this naming >>> pattern. 
>>> >>> Sylvain >>> >>> >>> On Thu, Aug 9, 2018, 09:11 Magnus Nord wrote: >>> >>>> I'm one of the developers in the HyperSpy project, which is an open >>>> source python library for analysing multidimensional data, with a focus >>>> on electron microscopy. We're in the processes of splitting the project >>>> into smaller pieces, to i) make the code a bit more manageable, ii) >>>> make >>>> it easier for other projects to use specific functionalities without >>>> having to install a lot of extra dependencies. >>>> >>>> The current focus is moving the file reading/writing functionality to >>>> its own library, which includes readers for many types of proprietary >>>> electron microscopy file formats. The plan is to keep the GPLv3 >>>> license, >>>> and only depend on the more standard scientific python stack (numpy, >>>> ...). >>>> >>>> However, we've been having some issues in deciding on a name for the >>>> library, and someone suggested using a scikit- prefix >>>> (scikit-microscopyIO?, scikit-microscopy-io?). So I'm wondering if a >>>> scikit prefix would be appropriate for a library focused on file I/O of >>>> microscopy data? 
>>>> >>>> Discussion on this in the HyperSpy issue tracker: >>>> https://github.com/hyperspy/hyperspy/issues/1978 >>>> >>>> Magnus >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>> >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing listSciPy-Dev at python.orghttps://mail.python.org/mailman/listinfo/scipy-dev >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >> >> >> _______________________________________________ >> SciPy-Dev mailing listSciPy-Dev at python.orghttps://mail.python.org/mailman/listinfo/scipy-dev >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Aug 11 14:09:02 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 11 Aug 2018 11:09:02 -0700 Subject: [SciPy-Dev] minor update to CoC Message-ID: Hi all, On the NumPy mailing list there was/is a long discussion about adopting a CoC. In particular, the same one as SciPy. People are in agreement on that but wanted one clarification - that the list of categories in the diversity statement was appended with a sentence saying that being in such a category does not excuse bad behavior. We would like to keep the SciPy and NumPy CoC's in sync, so that addition is being made at https://github.com/scipy/scipy/pull/9109 first. The exact addition is: ", to the extent that these do not conflict with this code of conduct." 
I plan to merge that PR soon, if there's any objection please comment. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ilhanpolat at gmail.com Sun Aug 12 06:46:46 2018 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Sun, 12 Aug 2018 12:46:46 +0200 Subject: [SciPy-Dev] minor update to CoC In-Reply-To: References: Message-ID: A bit recursive but looks good to me :) On Sat, Aug 11, 2018 at 8:09 PM Ralf Gommers wrote: > Hi all, > > On the NumPy mailing list there was/is a long discussion about adopting a > CoC. In particular, the same one as SciPy. People are in agreement on that > but wanted one clarification - that the list of categories in the diversity > statement was appended with a sentence saying that being in such a category > does not excuse bad behavior. > > We would like to keep the SciPy and NumPy CoC's in sync, so that addition > is being made at https://github.com/scipy/scipy/pull/9109 first. The > exact addition is: > > ", to the extent that these do not conflict with this code of conduct." > > I plan to merge that PR soon, if there's any objection please comment. > > Cheers, > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From xsaixosx at gmail.com Mon Aug 13 17:27:39 2018 From: xsaixosx at gmail.com (Saixoz) Date: Mon, 13 Aug 2018 23:27:39 +0200 Subject: [SciPy-Dev] 1D parabolic PDE solver Message-ID: Dear all, A recent task that I had to perform involved rewriting a large number of scripts and files from MATLAB to Python. In the process I noticed that Scipy lacks a PDE solver. When I tried to find alternatives, the options found were all unusable for various reasons. 
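The kind of solver described in this message, the method of lines (discretize in space with finite differences, then hand the resulting ODE system to solve_ivp's BDF integrator), can be sketched on a toy problem. The snippet below solves the plain heat equation u_t = u_xx with zero Dirichlet boundaries; it is only an illustration of the approach under those assumptions, not the proposed solver or its API:

```python
# Method-of-lines sketch: solve u_t = u_xx on [0, 1] with u(0,t) = u(1,t) = 0
# and u(x,0) = sin(pi*x), whose exact solution is exp(-pi^2 t) * sin(pi*x).
import numpy as np
from scipy.integrate import solve_ivp

n = 101
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
u0 = np.sin(np.pi * x)

def rhs(t, u):
    # Second-order central difference for u_xx on interior points;
    # boundary values stay fixed at zero (their time derivative is 0).
    dudt = np.zeros_like(u)
    dudt[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return dudt

sol = solve_ivp(rhs, (0.0, 0.1), u0, method="BDF", rtol=1e-6, atol=1e-8)
u_final = sol.y[:, -1]
exact = np.exp(-np.pi**2 * 0.1) * np.sin(np.pi * x)
print(np.max(np.abs(u_final - exact)))  # max abs error; small for this grid
```

Supporting pdepe-style coefficient functions c, f and s on top of this structure would amount to replacing the hard-coded right-hand side with user-supplied callbacks evaluated on the grid.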
I wrote a solver which solves slab-symmetric problems with a PDE form similar to the one MATLAB's pdepe solves, namely c(x,u,t,du/dx)(du/dt) = (d/dx)(f(x,u,t,du/dx)) + s(x,u,t,du/dx), using the method of lines approach, finite differences, and the BDF method provided by solve_ivp. I was wondering whether there would be some interest in adding the solver under scipy.integrate, to begin building a part of a collection. I feel that for very simple problems such as the ones that my solver can solve, users shouldn't need to turn to overly complex packages. If there is interest, I would need to clean up the code and add tests and stability checks to it; help/advice would be appreciated. I believe BDF already provides stability in the time grid, so I would only need to check stability of the space grid? Can someone confirm? I'd also like to move a large chunk of the code to Cython, since I wrote it in native Python and used Numba to speed it up, which still leaves it rather slow. I've worked in C and Python before but never in Cython; could someone provide me with a link to get started? As far as legal issues go, there should be no concern - I explicitly avoided looking at and using the MATLAB source code as a reference, since I wanted to leave publishing the solver under Scipy an option. I did use their solver to check some of the solutions that my solver produced, though. The solver was written while working at ETH Zurich, where my supervisor responded very enthusiastically to the possibility of providing the solver to the public under Scipy, so there should be no issues there either. All the best, Nicolas -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mikofski at berkeley.edu Tue Aug 14 02:32:37 2018 From: mikofski at berkeley.edu (Mark Alexander Mikofski) Date: Mon, 13 Aug 2018 23:32:37 -0700 Subject: [SciPy-Dev] Cython optimize root finders Message-ID: Hi, After some previous discussion, I have implemented or wrapped some root finders from optimize in Cython for users who need performance improvements or low level control. See https://github.com/scipy/scipy/pull/8431#issuecomment-412699499 I would appreciate anyone's comments. Thanks, Mark -------------- next part -------------- An HTML attachment was scrubbed... URL: From blairuk at gmail.com Tue Aug 14 10:45:34 2018 From: blairuk at gmail.com (Blair Azzopardi) Date: Tue, 14 Aug 2018 15:45:34 +0100 Subject: [SciPy-Dev] Improve Scipy's Stable PDF FFT Implementation Message-ID: Hi I've written a short paper exploring optimising our Stable probability density FFT implementation. In my original PR (7374) I introduced a method by Mitnik. However, there is a better approach by Wang that significantly improves accuracy. Wang's method can be generalized using standard techniques improving accuracy even more but at the cost of CPU time. I felt I should write to the dev mailing list as I'm not sure if implementing what I feel is a generalized approach might not be established enough within the academic community for inclusion in Scipy. Although, that said, Wang's approach is cited by several other papers so shouldn't be a problem. The paper can be found in my github repository below. I'd like to submit to Arxiv but can't without an endorsement; perhaps someone here might be willing to help there. https://github.com/bsdz/docs/blob/master/papers/Stable%20Probability%20Densities%20using%20FFTs%20and%20Newton-Cote.pdf So, I plan to raise at least another PR to replace the FFT implementation of Mitnik with Wang's. A couple of questions that come to mind: 1) Is it worth me just introducing the general solution that defaults to Wang's? 
2) Is it worth placing the general FFT characteristic function code into a separate module for approximating other integrals using FFT? If so, where would be a good place? Blair -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Tue Aug 14 19:53:08 2018 From: andyfaff at gmail.com (Andrew Nelson) Date: Wed, 15 Aug 2018 09:53:08 +1000 Subject: [SciPy-Dev] Compiling scipy on macOS with conda environment Message-ID: Hi all, I've been having issues compiling scipy/master on macOS in a Python 3.6 conda based environment (numpy 1.15). I'm getting a whole load of undefined symbols, and the tail end of the build log is below. I've managed to get around the problem by setting LDFLAGS (based on https://groups.google.com/a/continuum.io/forum/#!topic/conda/cwqTbwQAyW0): export LDFLAGS="$LDFLAGS -undefined dynamic_lookup -bundle" I have no idea why this works. Not sure why the problem comes about, but I thought people might be interested in knowing what the solution could be. /usr/local/bin/gfortran -Wall -g -Wl,-pie -Wl,-headerpad_max_install_names -Wl,-dead_strip_dylibs build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/_fftpackmodule.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/zfft.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/drfft.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/zrfft.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/zfftnd.o build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/src/dct.o build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/src/dst.o build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/fortranobject.o -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0 -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0/../../.. -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0/../../.. 
-Lbuild/temp.macosx-10.9-x86_64-3.6 -ldfftpack -lfftpack -lgfortran -o build/lib.macosx-10.9-x86_64-3.6/scipy/fftpack/_ fftpack.cpython-36m-darwin.so Undefined symbols for architecture x86_64: "_PyArg_ParseTupleAndKeywords", referenced from: _f2py_rout__fftpack_zfft in _fftpackmodule.o _f2py_rout__fftpack_drfft in _fftpackmodule.o _f2py_rout__fftpack_zrfft in _fftpackmodule.o _f2py_rout__fftpack_zfftnd in _fftpackmodule.o _f2py_rout__fftpack_destroy_zfft_cache in _fftpackmodule.o _f2py_rout__fftpack_destroy_zfftnd_cache in _fftpackmodule.o _f2py_rout__fftpack_destroy_drfft_cache in _fftpackmodule.o ... "_PyBytes_FromString", referenced from: _PyInit__fftpack in _fftpackmodule.o "_PyCapsule_GetPointer", referenced from: _PyInit__fftpack in _fftpackmodule.o _F2PyCapsule_AsVoidPtr in fortranobject.o "_PyCapsule_New", referenced from: _fortran_getattr in fortranobject.o _F2PyCapsule_FromVoidPtr in fortranobject.o "_PyCapsule_Type", referenced from: _PyInit__fftpack in _fftpackmodule.o _F2PyCapsule_Check in fortranobject.o "_PyComplex_Type", referenced from: _int_from_pyobj in _fftpackmodule.o "_PyDict_DelItemString", referenced from: _fortran_setattr in fortranobject.o "_PyDict_GetItemString", referenced from: _fortran_getattr in fortranobject.o "_PyDict_New", referenced from: _PyFortranObject_New in fortranobject.o _PyFortranObject_NewAsAttr in fortranobject.o _fortran_setattr in fortranobject.o "_PyDict_SetItemString", referenced from: _PyInit__fftpack in _fftpackmodule.o _F2PyDict_SetItemString in fortranobject.o _PyFortranObject_New in fortranobject.o _fortran_getattr in fortranobject.o _fortran_setattr in fortranobject.o "_PyErr_Clear", referenced from: _int_from_pyobj in _fftpackmodule.o _F2PyDict_SetItemString in fortranobject.o _fortran_getattr in fortranobject.o _fortran_repr in fortranobject.o _F2PyCapsule_FromVoidPtr in fortranobject.o _F2PyCapsule_AsVoidPtr in fortranobject.o "_PyErr_Format", referenced from: _PyInit__fftpack in _fftpackmodule.o 
_fortran_call in fortranobject.o _check_and_fix_dimensions in fortranobject.o "_PyErr_NewException", referenced from: _PyInit__fftpack in _fftpackmodule.o "_PyErr_NoMemory", referenced from: _fortran_getattr in fortranobject.o "_PyErr_Occurred", referenced from: _PyInit__fftpack in _fftpackmodule.o _f2py_rout__fftpack_zfft in _fftpackmodule.o _f2py_rout__fftpack_drfft in _fftpackmodule.o _f2py_rout__fftpack_zrfft in _fftpackmodule.o _f2py_rout__fftpack_zfftnd in _fftpackmodule.o _f2py_rout__fftpack_destroy_zfft_cache in _fftpackmodule.o _f2py_rout__fftpack_destroy_zfftnd_cache in _fftpackmodule.o ... "_PyErr_Print", referenced from: _PyInit__fftpack in _fftpackmodule.o _F2PyDict_SetItemString in fortranobject.o "_PyErr_SetString", referenced from: _PyInit__fftpack in _fftpackmodule.o _f2py_rout__fftpack_zfft in _fftpackmodule.o _f2py_rout__fftpack_drfft in _fftpackmodule.o _f2py_rout__fftpack_zrfft in _fftpackmodule.o _f2py_rout__fftpack_zfftnd in _fftpackmodule.o _f2py_rout__fftpack_cfft in _fftpackmodule.o _f2py_rout__fftpack_rfft in _fftpackmodule.o ... 
"_PyExc_AttributeError", referenced from: _PyInit__fftpack in _fftpackmodule.o _fortran_setattr in fortranobject.o "_PyExc_ImportError", referenced from: _PyInit__fftpack in _fftpackmodule.o "_PyExc_RuntimeError", referenced from: _PyInit__fftpack in _fftpackmodule.o _fortran_call in fortranobject.o "_PyExc_TypeError", referenced from: _fortran_call in fortranobject.o _array_from_pyobj in fortranobject.o "_PyExc_ValueError", referenced from: _array_from_pyobj in fortranobject.o _check_and_fix_dimensions in fortranobject.o "_PyImport_ImportModule", referenced from: _PyInit__fftpack in _fftpackmodule.o "_PyLong_AsLong", referenced from: _int_from_pyobj in _fftpackmodule.o "_PyMem_Free", referenced from: _fortran_dealloc in fortranobject.o _fortran_getattr in fortranobject.o "_PyMem_Malloc", referenced from: _fortran_getattr in fortranobject.o "_PyModule_Create2", referenced from: _PyInit__fftpack in _fftpackmodule.o "_PyModule_GetDict", referenced from: _PyInit__fftpack in _fftpackmodule.o "_PyNumber_Long", referenced from: _int_from_pyobj in _fftpackmodule.o "_PyOS_snprintf", referenced from: _fortran_getattr in fortranobject.o "_PyObject_GenericGetAttr", referenced from: _fortran_getattr in fortranobject.o "_PyObject_GetAttrString", referenced from: _PyInit__fftpack in _fftpackmodule.o _int_from_pyobj in _fftpackmodule.o _fortran_repr in fortranobject.o "_PySequence_Check", referenced from: _int_from_pyobj in _fftpackmodule.o "_PySequence_GetItem", referenced from: _int_from_pyobj in _fftpackmodule.o "_PyType_IsSubtype", referenced from: _int_from_pyobj in _fftpackmodule.o _array_from_pyobj in fortranobject.o "_PyType_Type", referenced from: _PyInit__fftpack in _fftpackmodule.o "_PyUnicode_Concat", referenced from: _fortran_getattr in fortranobject.o "_PyUnicode_FromFormat", referenced from: _fortran_repr in fortranobject.o "_PyUnicode_FromString", referenced from: _PyInit__fftpack in _fftpackmodule.o _fortran_getattr in fortranobject.o _fortran_repr in 
fortranobject.o "_PyUnicode_FromStringAndSize", referenced from: _fortran_getattr in fortranobject.o "_Py_BuildValue", referenced from: _f2py_rout__fftpack_zfft in _fftpackmodule.o _f2py_rout__fftpack_drfft in _fftpackmodule.o _f2py_rout__fftpack_zrfft in _fftpackmodule.o _f2py_rout__fftpack_zfftnd in _fftpackmodule.o _f2py_rout__fftpack_destroy_zfft_cache in _fftpackmodule.o _f2py_rout__fftpack_destroy_zfftnd_cache in _fftpackmodule.o _f2py_rout__fftpack_destroy_drfft_cache in _fftpackmodule.o ... "__PyObject_New", referenced from: _PyFortranObject_New in fortranobject.o _PyFortranObject_NewAsAttr in fortranobject.o "__Py_NoneStruct", referenced from: _f2py_rout__fftpack_zfft in _fftpackmodule.o _f2py_rout__fftpack_drfft in _fftpackmodule.o _f2py_rout__fftpack_zrfft in _fftpackmodule.o _f2py_rout__fftpack_zfftnd in _fftpackmodule.o _f2py_rout__fftpack_cfft in _fftpackmodule.o _f2py_rout__fftpack_rfft in _fftpackmodule.o _f2py_rout__fftpack_crfft in _fftpackmodule.o ... "_main", referenced from: implicit entry/start for main executable ld: symbol(s) not found for architecture x86_64 collect2: error: ld returned 1 exit status
error: Command "/usr/local/bin/gfortran -Wall -g -Wl,-pie -Wl,-headerpad_max_install_names -Wl,-dead_strip_dylibs build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/_fftpackmodule.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/zfft.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/drfft.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/zrfft.o build/temp.macosx-10.9-x86_64-3.6/scipy/fftpack/src/zfftnd.o build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/src/dct.o build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/src/dst.o build/temp.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/build/src.macosx-10.9-x86_64-3.6/scipy/fftpack/fortranobject.o -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0 -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0/../../.. -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0/../../.. -Lbuild/temp.macosx-10.9-x86_64-3.6 -ldfftpack -lfftpack -lgfortran -o build/lib.macosx-10.9-x86_64-3.6/scipy/fftpack/_fftpack.cpython-36m-darwin.so" failed with exit status 1

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ralf.gommers at gmail.com Wed Aug 15 00:52:37 2018
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Tue, 14 Aug 2018 21:52:37 -0700
Subject: [SciPy-Dev] Compiling scipy on macOS with conda environment
In-Reply-To: References: Message-ID:

On Tue, Aug 14, 2018 at 4:53 PM, Andrew Nelson wrote:

> Hi all,
> I've been having issues compiling scipy/master on macOS in a Python 3.6
> conda based environment (numpy 1.15).
> I'm getting a whole load of undefined symbols, and the tail end of the
> build log is below.
> > I've managed to get around the problem by setting LDFLAGS (based on > https://groups.google.com/a/continuum.io/forum/#!topic/conda/cwqTbwQAyW0): > > export LDFLAGS="$LDFLAGS -undefined dynamic_lookup -bundle" > > I have no idea why this works. Not sure why the problem comes about, but I > thought people might be interested in knowing what the solution could be. > It should be caused by LDFLAGS already being set in your environment. In numpy 1.16.0 this can be solved by setting NPY_DISTUTILS_APPEND_FLAGS=1 ( https://github.com/numpy/numpy/pull/11525). Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Wed Aug 15 00:59:55 2018 From: andyfaff at gmail.com (Andrew Nelson) Date: Wed, 15 Aug 2018 14:59:55 +1000 Subject: [SciPy-Dev] Compiling scipy on macOS with conda environment In-Reply-To: References: Message-ID: On Wed, 15 Aug 2018 at 14:53, Ralf Gommers wrote: > On Tue, Aug 14, 2018 at 4:53 PM, Andrew Nelson wrote: > >> Hi all, >> I've been having issues compiling scipy/master on macOS in a Python 3.6 >> conda based environment (numpy 1.15). >> I'm getting a whole load of undefined symbols, and the tail end of the >> build log is below. >> >> I've managed to get around the problem by setting LDFLAGS (based on >> https://groups.google.com/a/continuum.io/forum/#!topic/conda/cwqTbwQAyW0 >> ): >> >> export LDFLAGS="$LDFLAGS -undefined dynamic_lookup -bundle" >> >> I have no idea why this works. Not sure why the problem comes about, but >> I thought people might be interested in knowing what the solution could be. >> > > It should be caused by LDFLAGS already being set in your environment. In > numpy 1.16.0 this can be solved by setting NPY_DISTUTILS_APPEND_FLAGS=1 ( > https://github.com/numpy/numpy/pull/11525). > Good to know about that issue. I thought I was going crazy when trying to build scipy. -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From ndbecker2 at gmail.com Wed Aug 15 08:29:58 2018
From: ndbecker2 at gmail.com (Neal Becker)
Date: Wed, 15 Aug 2018 08:29:58 -0400
Subject: [SciPy-Dev] Universal Method to Sort Complex Information Found | Quanta Magazine
Message-ID:

https://www.quantamagazine.org/universal-method-to-sort-complex-information-found-20180813/

From andrea.karlova at gmail.com Fri Aug 17 02:54:15 2018
From: andrea.karlova at gmail.com (Andrea Karlova)
Date: Fri, 17 Aug 2018 07:54:15 +0100
Subject: [SciPy-Dev] Improve Scipy's Stable PDF FFT Implementation
In-Reply-To: References: Message-ID:

Blair,

I think it may be a good idea to finalise your note first and send it to an academic journal, because it is their task to give you feedback. Once you hear back from them, it is gonna be much easier to get approval for putting your preprint on arXiv. The main idea behind arXiv is to share preprints which are pretty much already in the publication process. For some journals, especially math ones, it takes years to get to the printed phase (my experience with good math journals is around 3 years on average).

Regarding your note, I think you may need to add more text and a literature overview, describe where your work is innovative, add a section on future work and discussion, and mainly formalise the body which you consider as the main results. Then just pick the journal and send it to them.

It may also be helpful to have the particular code for the results on GitHub, so that the results from the paper are easily replicable.
Regarding the overall development: I have recently received an email from the authors of the C library libstable: https://www.researchgate.net/publication/317292451_libstable_Fast_Parallel_and_High-Precision_Computation_of_a-Stable_Distributions_in_R_CC_and_MATLAB

They would be happy to collaborate on using their core library for SciPy, so that we would add wrappers and correct math mistakes in the code with their help, and they are happy to join and share their code with the SciPy community. I think this is a robust long-term solution.

Regards,
Andrea Karlova

On Tue, 14 Aug 2018 at 15:46, Blair Azzopardi wrote:

> Hi
>
> I've written a short paper exploring optimising our Stable probability
> density FFT implementation. In my original PR (7374) I introduced a method
> by Mitnik. However, there is a better approach by Wang that significantly
> improves accuracy. Wang's method can be generalized using standard
> techniques improving accuracy even more but at the cost of CPU time. I felt
> I should write to the dev mailing list as I'm not sure if implementing what
> I feel is a generalized approach might not be established enough within the
> academic community for inclusion in Scipy. Although, that said, Wang's
> approach is cited by several other papers so shouldn't be a problem.
>
> The paper can be found in my github repository below. I'd like to submit
> to Arxiv but can't without an endorsement; perhaps someone here might be
> willing to help there.
>
> https://github.com/bsdz/docs/blob/master/papers/Stable%20Probability%20Densities%20using%20FFTs%20and%20Newton-Cote.pdf
>
> So, I plan to raise at least another PR to replace the FFT implementation
> of Mitnik with Wang's. A couple of questions that come to mind:
>
> 1) Is it worth me just introducing the general solution that defaults to
> Wang's?
> 2) Is it worth placing the general FFT characteristic function code into a
> separate module for approximating other integrals using FFT?
If so, where > would be a good place? > > Blair > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From blairuk at gmail.com Fri Aug 17 09:14:33 2018 From: blairuk at gmail.com (Blair Azzopardi) Date: Fri, 17 Aug 2018 14:14:33 +0100 Subject: [SciPy-Dev] Improve Scipy's Stable PDF FFT Implementation In-Reply-To: References: Message-ID: On Fri, 17 Aug 2018, 07:55 Andrea Karlova, wrote: > Blair, > > I think it may be good idea to finalise your note first and send it to > academic journal, > because it is their task to give you a feedback. > Once you hear from them back, it is gonna be much easier to get approval > for putting your preprint on ArXiv. Main idea behind ArXiv is to share > preprints which are pretty much already in the publication process. For > some journals, especially math ones, it takes years to get to the printed > phase (my experience with good math journals is around 3 years in average). > > Regarding your note, I think you may need to add more text, literature > overview, describe where your work is innovative, add section on future > work and discussion and mainly formalise the body which you consider as a > main results. Then just pick the journal and send it to them. > > It maybe also helpful to have particular code regarding the results on > github, such that results from the paper can be easily replicable. 
> > Regarding the overall development: I have recently recieved email from the > authors of the C-library libstable: > https://www.researchgate.net/publication/317292451_libstable_Fast_Parallel_and_High-Precision_Computation_of_a-Stable_Distributions_in_R_CC_and_MATLAB > > They would be happy to collaborate on using their core library for SciPy, > so that we would add wrappers, corrected math mistakes in the code with > their help and they are happy to join and share their code with Scipy > community. I think this is a robust longterm solution. > > Regards, > Andrea Karlova > > On Tue, 14 Aug 2018 at 15:46, Blair Azzopardi wrote: > >> Hi >> >> I've written a short paper exploring optimising our Stable probability >> density FFT implementation. In my original PR (7374) I introduced a method >> by Mitnik. However, there is a better approach by Wang that significantly >> improves accuracy. Wang's method can be generalized using standard >> techniques improving accuracy even more but at the cost of CPU time. I felt >> I should write to the dev mailing list as I'm not sure if implementing what >> I feel is a generalized approach might not be established enough within the >> academic community for inclusion in Scipy. Although, that said, Wang's >> approach is cited by several other papers so shouldn't be a problem. >> >> The paper can be found in my github repository below. I'd like to submit >> to Arxiv but can't without an endorsement; perhaps someone here might be >> willing to help there. >> >> >> https://github.com/bsdz/docs/blob/master/papers/Stable%20Probability%20Densities%20using%20FFTs%20and%20Newton-Cote.pdf >> >> >> So, I plan to raise at least another PR to replace the FFT implementation >> of Mitnik with Wang's. A couple of questions that come to mind: >> >> 1) Is it worth me just introducing the general solution that defaults to >> Wang's? 
>> 2) Is it worth placing the general FFT characteristic function code into
>> a separate module for approximating other integrals using FFT? If so, where
>> would be a good place?
>>
>> Blair
>>
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at python.org
>> https://mail.python.org/mailman/listinfo/scipy-dev
>>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev

Thanks for your comments. Just to clarify, I didn't write the paper to be innovative or as work towards a PhD. It's just a summary of known available methods and a demonstration of how accurate they are compared to what we have already and how we could improve on what we already have. I'm happy to leave it in my GitHub repo.

The paper you linked to shows the same formulae we already have, although they describe it as Nolan's method rather than my label 'Zolotarev'. I added that implementation based on your concern that the existing DNI and FFT methods weren't good enough, if I recall correctly. Perhaps we should rename "zolotarev" to "nolan" in the code.

One advantage of the approach described in the link you sent is that it is in raw C and utilises parallel processing. We could perhaps achieve similar performance for our similar methods if we cythonize them; I'm not sure if we utilize multithreading anywhere in Scipy yet. Python has a few issues with multithreading and the GIL, although that doesn't stop us using threads in compiled C code. This might be an issue for portability.

The paper you link to doesn't really explore the FFT approach as much as Wang has, and if you read my investigation you'll see Wang's method achieves much better accuracy than 10e-6; furthermore, by just cranking up the Newton-Cotes degree we improve accuracy even more (10e-16) at the cost of CPU time.
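For readers following along, the characteristic-function inversion that all of these methods refine can be sketched in a few lines. This is only the plain FFT discretisation of f(x) = (1/2pi) * Int exp(-i*t*x) * phi(t) dt, with none of Wang's or the Newton-Cotes corrections; the helper name and grid sizes below are illustrative, and the check uses the Gaussian (alpha = 2) case where the exact density is known:

```python
import numpy as np

def pdf_via_fft(char_fn, n=256, t_max=8.0):
    """Approximate a PDF on a grid from its characteristic function phi
    via f(x) = (1/2pi) * integral of exp(-i*t*x) * phi(t) dt."""
    dt = 2.0 * t_max / n          # spacing of the t (frequency) grid
    dx = 2.0 * np.pi / (n * dt)   # induced spacing of the x grid
    k = np.arange(n)
    t = (k - n / 2) * dt
    x = (k - n / 2) * dx
    # The alternating signs absorb the half-grid shifts so that a plain
    # FFT evaluates the centred sum; n should be a multiple of 4 so the
    # leftover phase factor is +1.
    signs = (-1.0) ** k
    f = (signs * np.fft.fft(signs * char_fn(t))).real * dt / (2.0 * np.pi)
    return x, f

# Sanity check on the Gaussian case, phi(t) = exp(-t^2/2):
x, f = pdf_via_fft(lambda t: np.exp(-t ** 2 / 2))
exact = np.exp(-x ** 2 / 2) / np.sqrt(2.0 * np.pi)
print(np.max(np.abs(f - exact)))
```

Swapping a stable characteristic function in for the Gaussian one gives the uncorrected baseline that the quadrature-corrected schemes discussed above then improve on.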
It's well known that the FFT method is better suited for alpha>1; however, it's still useful for MLE of parameters.

Regarding the paper authors and libstable: in section 5 they state it's GPL v3 and depends on GSL (GNU Scientific Library), so it might require quite a lot of effort for them to relax the license to something BSD-compatible.

> -------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jfoxrabinovitz at gmail.com Mon Aug 20 15:40:52 2018
From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz)
Date: Mon, 20 Aug 2018 15:40:52 -0400
Subject: [SciPy-Dev] Proposal for some new fitting routines
Message-ID:

I have added a function for fitting exponential and power curves. This seems to be a fairly common problem (e.g., see the list of selections from Stack Exchange below). The algorithm I have implemented here is objectively better than the common solution of fitting to log(y) because it requires no up-front knowledge of the intercept, or any other estimations.

Assuming that this PR is acceptable, I would like to get some criticism on where I chose to put it in the directory structure. My thought was to create a separate module for such common non-iterative optimizations.

I am also unsure of whether my citation of the original work is formatted properly. I would like to make sure that the author gets the credit and publicity they deserve. The paper is mentioned (by the author) in answers to [1] and [2] below.
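The no-initial-guess idea discussed in the answers to [1] and [2] below can be sketched roughly as follows for the y = a + b*exp(c*x) case. This is only an illustration of the integral-equation trick, not necessarily the PR's actual code, and the function name is made up:

```python
import numpy as np

def fit_exp_offset(x, y):
    """Fit y ~ a + b*exp(c*x) with no starting guess: since
    y' = c*(y - a), integrating gives y = const - c*a*x + c*S(x),
    where S is the cumulative integral of y, so an ordinary linear
    regression of y on [1, x, S] recovers c directly."""
    # Cumulative trapezoid integral of y over x.
    S = np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(x))))
    design = np.column_stack([np.ones_like(x), x, S])
    c = np.linalg.lstsq(design, y, rcond=None)[0][2]
    # With c fixed, a and b follow from a plain linear fit.
    design2 = np.column_stack([np.ones_like(x), np.exp(c * x)])
    a, b = np.linalg.lstsq(design2, y, rcond=None)[0]
    return a, b, c

x = np.linspace(0.0, 10.0, 200)
y = 1.0 + 2.0 * np.exp(-0.5 * x)
a, b, c = fit_exp_offset(x, y)
print(a, b, c)  # approximately 1.0, 2.0, -0.5
```

The only approximation error on noise-free data comes from the trapezoid rule for S, so on a reasonably dense grid the recovered parameters are already close without any iteration.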
List of relevant stack exchange questions:

[1]: https://stackoverflow.com/questions/3938042/fitting-exponential-decay-with-no-initial-guessing
[2]: https://math.stackexchange.com/questions/1337601/fit-exponential-with-constant
[3]: https://math.stackexchange.com/questions/350754/fitting-exponential-curve-to-data
[4]: https://math.stackexchange.com/questions/1999069/fit-curve-exponential-maybe-to-experimental-data
[5]: https://mathematica.stackexchange.com/questions/87645/exponential-fit-to-data-is-of-low-quality
[6]: https://stackoverflow.com/questions/48099026/exponential-curve-fit-will-not-fit
[7]: https://stackoverflow.com/questions/50683768/exponent-curve-fitting-in-python
[8]: https://stackoverflow.com/questions/49565152/curve-fit-an-exponential-decay-function-in-python-using-given-data-points
[9]: https://stackoverflow.com/questions/21420792/exponential-curve-fitting-in-scipy

Regards,

- Joe

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From charlesr.harris at gmail.com Tue Aug 21 16:47:34 2018
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Tue, 21 Aug 2018 14:47:34 -0600
Subject: [SciPy-Dev] NumPy 1.15.1 released.
Message-ID:

Hi All,

On behalf of the NumPy team, I am pleased to announce the release of NumPy 1.15.1. This is a bugfix release for bugs and regressions reported following the 1.15.0 release. Noticeable fixes are

- The annoying but harmless RuntimeWarning that "numpy.dtype size changed" has been suppressed. The long standing suppression was lost in the transition to pytest.
- The update to Cython 0.28.3 exposed a problematic use of a gcc attribute used to prefer code size over speed in module initialization, possibly resulting in incorrect compiled code. This has been fixed in latest Cython but has been disabled here for safety.
- Support for big-endian and ARMv8 architectures has been improved.

The Python versions supported by this release are 2.7, 3.4-3.7.
The wheels are linked with OpenBLAS v0.3.0, which should fix some of the linalg problems reported for NumPy 1.14. Wheels for this release can be downloaded from PyPI; source archives are available from GitHub.

*Compatibility Note*

The NumPy 1.15.x OS X wheels released on PyPI no longer contain 32-bit binaries. That will also be the case in future releases. See #11625 for the related discussion. Those needing 32-bit support on the Mac should look elsewhere or build from source.

*Contributors*

A total of 7 people contributed to this release. People with a "+" by their names contributed a patch for the first time.

* Charles Harris
* Chris Billington
* Elliott Sales de Andrade +
* Eric Wieser
* Jeremy Manning +
* Matti Picus
* Ralf Gommers

*Pull requests merged*

A total of 24 pull requests were merged for this release.

* #11647: MAINT: Filter Cython warnings in ``__init__.py``
* #11648: BUG: Fix doc source links to unwrap decorators
* #11657: BUG: Ensure singleton dimensions are not dropped when converting...
* #11661: BUG: Warn on Nan in minimum,maximum for scalars
* #11665: BUG: cython sometimes emits invalid gcc attribute
* #11682: BUG: Fix regression in void_getitem
* #11698: BUG: Make matrix_power again work for object arrays.
* #11700: BUG: Add missing PyErr_NoMemory after failing malloc
* #11719: BUG: Fix undefined functions on big-endian systems.
* #11720: MAINT: Make einsum optimize default to False.
* #11746: BUG: Fix regression in loadtxt for bz2 text files in Python 2.
* #11757: BUG: Revert use of `console_scripts`.
* #11758: BUG: Fix Fortran kind detection for aarch64 & s390x.
* #11759: BUG: Fix printing of longdouble on ppc64le.
* #11760: BUG: Fixes for unicode field names in Python 2
* #11761: BUG: Increase required cython version on python 3.7
* #11763: BUG: check return value of _buffer_format_string
* #11775: MAINT: Make assert_array_compare more generic.
* #11776: TST: Fix urlopen stubbing.
* #11777: BUG: Fix regression in intersect1d.
* #11779: BUG: Fix test sensitive to platform byte order.
* #11781: BUG: Avoid signed overflow in histogram
* #11785: BUG: Fix pickle and memoryview for datetime64, timedelta64 scalars
* #11786: BUG: Deprecation triggers segfault

Cheers,

Charles Harris

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jfoxrabinovitz at gmail.com Tue Aug 21 22:12:49 2018
From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz)
Date: Tue, 21 Aug 2018 22:12:49 -0400
Subject: [SciPy-Dev] Proposal for some new fitting routines
In-Reply-To: References: Message-ID:

I had sent out an email about a PR proposing an exponential and a power fit function, but I did not specify the PR number. The PR is #9158.

Regards,

- Joe

On Mon, Aug 20, 2018, 3:40 PM Joseph Fox-Rabinovitz < jfoxrabinovitz at gmail.com> wrote:

> I have added a function for fitting exponential and power curves. This
> seems to
> be a fairly common problem (e.g., see the list of selections from stack
> exchange
> below). The algorithm I have implemented here is objectively better than
> the
> common solution of fitting to log(y) because it requires no up-front
> knowledge
> of the intercept, or any other estimations.
>
> Assuming that this PR is acceptable, I would like to get some criticism on
> where
> I chose to put it in the directory structure. My thought was to create a
> separate
> module for such common non-iterative optimizations.
>
> I am also unsure of whether my citation of the original work is formatted
> properly. I would like to make sure that the author gets the credit and
> publicity they
> deserve. The paper is mentioned (by the author) in answers to [1] and [2]
> below.
> > List of relevant stack exchange questions: > > [1]: > https://stackoverflow.com/questions/3938042/fitting-exponential-decay-with-no-initial-guessing > [2]: > https://math.stackexchange.com/questions/1337601/fit-exponential-with-constant > [3]: > https://math.stackexchange.com/questions/350754/fitting-exponential-curve-to-data > [4]: > https://math.stackexchange.com/questions/1999069/fit-curve-exponential-maybe-to-experimental-data > [5]: > https://mathematica.stackexchange.com/questions/87645/exponential-fit-to-data-is-of-low-quality > [6]: > https://stackoverflow.com/questions/48099026/exponential-curve-fit-will-not-fit > [7]: > https://stackoverflow.com/questions/50683768/exponent-curve-fitting-in-python > [8]: > https://stackoverflow.com/questions/49565152/curve-fit-an-exponential-decay-function-in-python-using-given-data-points > [9]: > https://stackoverflow.com/questions/21420792/exponential-curve-fitting-in-scipy > > Regards, > > - Joe > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Tue Aug 21 23:01:29 2018 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 21 Aug 2018 23:01:29 -0400 Subject: [SciPy-Dev] Proposal for some new fitting routines In-Reply-To: References: Message-ID: On Tue, Aug 21, 2018 at 10:12 PM, Joseph Fox-Rabinovitz < jfoxrabinovitz at gmail.com> wrote: > I had sent out an email about a PR proposing an exponential and a power > fit function, but I did not specify the PR number. The PR is #9158. > > Regards, > > - Joe > > > On Mon, Aug 20, 2018, 3:40 PM Joseph Fox-Rabinovitz < > jfoxrabinovitz at gmail.com> wrote: > >> I have added a function for fitting exponential and power curves. This >> seems to >> be a fairly common problem (e.g., see the list of selections from stack >> exchange >> below). 
The algorithm I have implemented here is objectively better than >> the >> common solution of fitting to log(y) because it requires no up-front >> knowledge >> of the intercept, or any other estimations. >> >> Assuming that this PR is acceptable, I would like to get some criticism >> on where >> I chose to put it in the directory structure. My thought was to create a >> separate >> module for such common non-iterative optimizations. >> >> I am also unsure of whether my citation of the original work is formatted >> properly. I would like to make sure that the author gets the credit and >> publicity they >> deserve. The paper is mentioned (by the author) in answers to [1] and [2] >> below. >> >> List of relevant stack exchange questions: >> >> [1]: https://stackoverflow.com/questions/3938042/fitting- >> exponential-decay-with-no-initial-guessing >> [2]: https://math.stackexchange.com/questions/1337601/fit- >> exponential-with-constant >> [3]: https://math.stackexchange.com/questions/350754/fitting- >> exponential-curve-to-data >> [4]: https://math.stackexchange.com/questions/1999069/fit- >> curve-exponential-maybe-to-experimental-data >> [5]: https://mathematica.stackexchange.com/questions/ >> 87645/exponential-fit-to-data-is-of-low-quality >> [6]: https://stackoverflow.com/questions/48099026/ >> exponential-curve-fit-will-not-fit >> [7]: https://stackoverflow.com/questions/50683768/exponent- >> curve-fitting-in-python >> [8]: https://stackoverflow.com/questions/49565152/curve-fit- >> an-exponential-decay-function-in-python-using-given-data-points >> [9]: https://stackoverflow.com/questions/21420792/ >> exponential-curve-fitting-in-scipy >> >> Regards, >> >> - Joe >> > The main question in my opinion is whether scipy should add functions in this area, this area essentially doesn't exist in scipy and is covered, for example, by lmfit. 
Except for the scipy.stats.distribution and the simple linear model, there are no fit or estimation methods for specific functions, AFAIK. ODR is related but also doesn't provide specific nonlinear functions. scipy.optimize has the optimization algorithms but not specific nonlinear functions that are estimated.

lmfit has built-in functions for nonlinear estimation
https://lmfit.github.io/lmfit-py/builtin_models.html
and there are some other less well known packages, AFAIR.

(There was also a discussion about adding similar predefined functions to statsmodels, but they were never added, and for those cases I usually refer to lmfit.)

My guess is that, unless there is a clear plan to expand in this area in scipy, pointing to lmfit would be the better approach. Nevertheless, there could also be demand for having something simple in scipy. E.g. scipy.stats is a bit "eclectic" and has some overlap with statsmodels, or the other way around. scipy.stats has simple linear regression, but we never added a full OLS (multiple linear regression) to scipy.stats because there would be too much code duplication, especially if there is feature creep and users want similar extended results as statsmodels provides.

numpy has polynomial fitting as the only fitting function, AFAIK.

Josef

> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From newville at cars.uchicago.edu Wed Aug 22 11:18:07 2018 From: newville at cars.uchicago.edu (Matt Newville) Date: Wed, 22 Aug 2018 10:18:07 -0500 Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 178, Issue 18 In-Reply-To: References: Message-ID: Hi All, On Tue, Aug 21, 2018 at 10:01 PM wrote: > Send SciPy-Dev mailing list submissions to > scipy-dev at python.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.python.org/mailman/listinfo/scipy-dev > or, via email, send a message with subject or body 'help' to > scipy-dev-request at python.org > > You can reach the person managing the list at > scipy-dev-owner at python.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-Dev digest..." > > > Today's Topics: > > 1. NumPy 1.15.1 released. (Charles R Harris) > 2. Proposal for some new fitting routines (Joseph Fox-Rabinovitz) > 3. Re: Proposal for some new fitting routines (josef.pktd at gmail.com) > > > Message: 2 > Date: Tue, 21 Aug 2018 22:12:49 -0400 > From: Joseph Fox-Rabinovitz > To: SciPy Developers List > Subject: [SciPy-Dev] Proposal for some new fitting routines > Message-ID: > < > CAAa1KPZ76rECAtBNF_cNMgpQD2eE1YiZgo9notpUSpGA-_yO6Q at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > I had sent out an email about a PR proposing an exponential and a power fit > function, but I did not specify the PR number. The PR is #9158. > > Regards, > > - Joe > > > On Mon, Aug 20, 2018, 3:40 PM Joseph Fox-Rabinovitz < > jfoxrabinovitz at gmail.com> wrote: > > > I have added a function for fitting exponential and power curves. This > > seems to > > be a fairly common problem (e.g., see the list of selections from stack > > exchange > > below). The algorithm I have implemented here is objectively better than > > the > > common solution of fitting to log(y) because it requires no up-front > > knowledge > > of the intercept, or any other estimations. 
> > Assuming that this PR is acceptable, I would like to get some criticism
> > on where I chose to put it in the directory structure. My thought was to
> > create a separate module for such common non-iterative optimizations.
> >
> > I am also unsure of whether my citation of the original work is formatted
> > properly. I would like to make sure that the author gets the credit and
> > publicity they deserve. The paper is mentioned (by the author) in answers
> > to [1] and [2] below.

My impression is that these are fine procedures to have readily available, but that they might not be a great fit for scipy.optimize since the basic algorithm is non-iterative. That is, the procedures seem closer to scipy.stats.linregress, and so it might make more sense having them in scipy.stats, perhaps calling the routines scipy.stats.expregress and .powregress. It would be very helpful to make the output a closer match to that of linregress, especially including the standard errors and correlation coefficients if possible.

Since the algorithms are non-iterative, I'm not sure it makes a lot of sense to have these become the exponential and power-law fit methods in lmfit (there are existing exponential and power-law models there already), but I would be inclined to use these algorithms (whether they go into scipy or not) to produce initial guesses for these models.

Hope that helps (and thanks for asking!),

--Matt Newville
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From jfoxrabinovitz at gmail.com Wed Aug 22 13:32:04 2018 From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz) Date: Wed, 22 Aug 2018 13:32:04 -0400 Subject: [SciPy-Dev] Proposal for some new fitting routines In-Reply-To: References: Message-ID: On Tue, Aug 21, 2018 at 11:02 PM wrote: > > > On Tue, Aug 21, 2018 at 10:12 PM, Joseph Fox-Rabinovitz < > jfoxrabinovitz at gmail.com> wrote: > >> I had sent out an email about a PR proposing an exponential and a power >> fit function, but I did not specify the PR number. The PR is #9158. >> >> Regards, >> >> - Joe >> >> >> On Mon, Aug 20, 2018, 3:40 PM Joseph Fox-Rabinovitz < >> jfoxrabinovitz at gmail.com> wrote: >> >>> I have added a function for fitting exponential and power curves. This >>> seems to >>> be a fairly common problem (e.g., see the list of selections from stack >>> exchange >>> below). The algorithm I have implemented here is objectively better than >>> the >>> common solution of fitting to log(y) because it requires no up-front >>> knowledge >>> of the intercept, or any other estimations. >>> >>> Assuming that this PR is acceptable, I would like to get some criticism >>> on where >>> I chose to put it in the directory structure. My thought was to create a >>> separate >>> module for such common non-iterative optimizations. >>> >>> I am also unsure of whether my citation of the original work is formatted >>> properly. I would like to make sure that the author gets the credit and >>> publicity they >>> deserve. The paper is mentioned (by the author) in answers to [1] and >>> [2] below. 
>>> >>> List of relevant stack exchange questions: >>> >>> [1]: >>> https://stackoverflow.com/questions/3938042/fitting-exponential-decay-with-no-initial-guessing >>> [2]: >>> https://math.stackexchange.com/questions/1337601/fit-exponential-with-constant >>> [3]: >>> https://math.stackexchange.com/questions/350754/fitting-exponential-curve-to-data >>> [4]: >>> https://math.stackexchange.com/questions/1999069/fit-curve-exponential-maybe-to-experimental-data >>> [5]: >>> https://mathematica.stackexchange.com/questions/87645/exponential-fit-to-data-is-of-low-quality >>> [6]: >>> https://stackoverflow.com/questions/48099026/exponential-curve-fit-will-not-fit >>> [7]: >>> https://stackoverflow.com/questions/50683768/exponent-curve-fitting-in-python >>> [8]: >>> https://stackoverflow.com/questions/49565152/curve-fit-an-exponential-decay-function-in-python-using-given-data-points >>> [9]: >>> https://stackoverflow.com/questions/21420792/exponential-curve-fitting-in-scipy >>> >>> Regards, >>> >>> - Joe >>> >> > > The main question in my opinion is whether scipy should add functions in > this area, this area essentially doesn't exist in scipy and is covered, for > example, by lmfit. > > Except for the scipy.stats.distribution and the simple linear model, there > are no fit or estimation methods for specific functions, AFAIK. ODR is > related but also doesn't provide specific nonlinear functions > scipy.optimize has the optimization algorithm but not specific nonlinear > functions that are estimated. > > lmfit has built in functions for nonlinear estimation > https://lmfit.github.io/lmfit-py/builtin_models.html > and there some other less well known packages, AFAIR. > > (There was also a discussion about adding similar predefined functions to > statsmodels, but they were never added and for those cases I usually refer > to lmfit.) > lmfit definitely looks like a good choice to put the functions. 
On the other hand, it offers much more advanced features than just a simple regression.

> My guess is that, unless there is a clear plan to expand in this area in
> scipy, then pointing to lmfit would be the better approach.
> Nevertheless, there could also be demand for having something simple in
> scipy. E.g. scipy.stats is a bit "eclectic" and has some overlap with
> statsmodels, or the other way around. scipy.stats has simple linear
> regression but we never added a full OLS,

On the other hand, the functions I propose are extremely simple, and would do well alongside linregress if they were renamed expregress and powregress.

> multiple linear regression to scipy.stats because there would be too much
> code duplication especially if there is feature creep and users want
> similar extended results as statsmodels provides.
>
> numpy has polynomial fitting as the only fitting function, AFAIK.

Definitely out of scope for numpy and borderline for scipy.

> Josef
>
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at python.org
>> https://mail.python.org/mailman/listinfo/scipy-dev
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From christoph.baumgarten at gmail.com Thu Aug 23 17:36:11 2018
From: christoph.baumgarten at gmail.com (Christoph Baumgarten)
Date: Thu, 23 Aug 2018 23:36:11 +0200
Subject: [SciPy-Dev] Proposal for some new fitting routines
In-Reply-To: 
References: 
Message-ID: 

I have never used these methods, but they look useful, and given the links to stackoverflow etc. there seems to be a certain "demand" for them. In stats, we have linregress as well as theilslopes (robust regression); I recently created PR #9101 to add another robust regression technique, so it could fit there.
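[For readers who haven't used the two estimators mentioned here, they behave quite differently when the data contain outliers. A small sketch with synthetic data — purely illustrative, not taken from any PR:]

```python
import numpy as np
from scipy import stats

# Straight-line data with Gaussian noise and one gross outlier.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)
y[45] += 30.0  # corrupt a single high-leverage point

# Ordinary least squares: sensitive to the outlier.
ols = stats.linregress(x, y)

# Theil-Sen: median of pairwise slopes, robust to a minority of outliers.
# Note the (y, x) argument order of theilslopes.
ts_slope, ts_intercept, lo, hi = stats.theilslopes(y, x)
```

Here the Theil-Sen slope stays close to the true value of 2.0 while the least-squares slope is dragged upward by the single corrupted point, which is the usual argument for keeping a robust alternative next to linregress.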
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From rlucente at pipeline.com Thu Aug 23 15:57:00 2018
From: rlucente at pipeline.com (Robert Lucente)
Date: Thu, 23 Aug 2018 15:57:00 -0400 (GMT-04:00)
Subject: [SciPy-Dev] Proposal for some new fitting routines
Message-ID: <778681881.5820.1535054220286@wamui-boogie.atl.sa.earthlink.net>

An HTML attachment was scrubbed...
URL: 

From josef.pktd at gmail.com Thu Aug 23 19:00:56 2018
From: josef.pktd at gmail.com (josef.pktd at gmail.com)
Date: Thu, 23 Aug 2018 19:00:56 -0400
Subject: [SciPy-Dev] Proposal for some new fitting routines
In-Reply-To: <778681881.5820.1535054220286@wamui-boogie.atl.sa.earthlink.net>
References: <778681881.5820.1535054220286@wamui-boogie.atl.sa.earthlink.net>
Message-ID: 

On Thu, Aug 23, 2018 at 3:57 PM, Robert Lucente wrote:

> >Except for the scipy.stats.distribution
>
> I found
>
> https://docs.scipy.org/doc/scipy/reference/stats.html
>
> Does the ".distribution" meant in the generic sense?
>
Both: it's the distributions that are in scipy.stats, which also gives the more direct access. scipy.stats has everything in its namespace, including the distributions, but `scipy.stats.distributions` was the original module and it still has the distributions in its namespace. The only reason to access that module directly is when we want to use the distribution classes themselves:

>>> import scipy.stats.distributions as sd
>>> dir(sd)

The continuous distributions have a fit method, but the discrete distributions do not.

Josef

> Humm
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mikofski at berkeley.edu Fri Aug 24 04:26:11 2018
From: mikofski at berkeley.edu (Mark Alexander Mikofski)
Date: Fri, 24 Aug 2018 01:26:11 -0700
Subject: [SciPy-Dev] please review cython optimize zeros api
Message-ID: 

Hi,

PR 8431 is ready for review, IMHO.

https://github.com/scipy/scipy/pull/8431

This PR adds a Cython API for root-finding of scalar functions in optimize. It exposes brentq, brenth, ridder, bisect, newton, secant, and halley with 3 different callback signatures, two of which are safe with Cython `nogil`, which allows them to be called with Cython `prange`.

This PR is a follow-on to #8357, and allows efficient looping from C over large arrays with the same callback using any of the scalar-function root finders.

I would appreciate your comments.

Best, Mark

--
Mark Mikofski, PhD (2005)
*Fiat Lux*
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From juanlu001 at gmail.com Fri Aug 24 04:48:41 2018
From: juanlu001 at gmail.com (Juan Luis Cano)
Date: Fri, 24 Aug 2018 10:48:41 +0200
Subject: [SciPy-Dev] Plans for PyPy3 wheels?
Message-ID: Hi all, [I'm sending this to scipy-dev because I'm already subscribed, but I guess someone could bring this discussion to numpy-discussion as well] Lately PyPy3 (I'm only interested in Python 3) has matured a lot and it's becoming more and more interesting with time. The developers invested a lot of effort in improving the compatibility with CPython C extensions, and I'm happy to see that SciPy in particular passes all tests in PyPy3: https://circleci.com/gh/scipy/scipy/8318 There was an effort by Antonio Cuni, a PyPy developer, to produce wheels of popular scientific packages. However, it's still a non official thing, and I tried to link them to OpenBLAS but I'm not skilled (or patient) enough: https://github.com/antocuni/pypy-wheels/pull/6 Are there any plans to officially support PyPy3 (and perhaps PyPy) and upload official wheels to pypi.org? Lack of binary packages is holding me back for playing more with PyPy, and there are still no clear plans on what to do with conda packages: https://github.com/conda-forge/pypy2.7-feedstock/issues/1 Did other SciPy or NumPy developers play with PyPy3? Do you think this would be an interesting thing to have? What needs to be done to get there? Best, -- Juan Luis Cano -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Fri Aug 24 15:26:35 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 24 Aug 2018 12:26:35 -0700 Subject: [SciPy-Dev] Plans for PyPy3 wheels? In-Reply-To: References: Message-ID: On Fri, Aug 24, 2018 at 1:49 AM Juan Luis Cano wrote: > Hi all, > > [I'm sending this to scipy-dev because I'm already subscribed, but I guess > someone could bring this discussion to numpy-discussion as well] > > Lately PyPy3 (I'm only interested in Python 3) has matured a lot and it's > becoming more and more interesting with time. 
The developers invested a lot > of effort in improving the compatibility with CPython C extensions, and I'm > happy to see that SciPy in particular passes all tests in PyPy3: > > https://circleci.com/gh/scipy/scipy/8318 > > There was an effort by Antonio Cuni, a PyPy developer, to produce wheels > of popular scientific packages. However, it's still a non official thing, > and I tried to link them to OpenBLAS but I'm not skilled (or patient) > enough: > > https://github.com/antocuni/pypy-wheels/pull/6 > > Are there any plans to officially support PyPy3 (and perhaps PyPy) and > upload official wheels to pypi.org? > As far as I can find so quickly, there's not yet a clear ABI tag or policy on ABI stability for Pypy wheels ( https://github.com/pypa/wheel/issues/187#issuecomment-320482719). That would be needed before putting wheels on PyPI. Once that's in place, I think we'd be happy to consider releasing official Pypy >=3.5 wheels. If you or someone else wants to look into the current status on this, that would be useful. Cheers, Ralf > Lack of binary packages is holding me back for playing more with PyPy, and > there are still no clear plans on what to do with conda packages: > > https://github.com/conda-forge/pypy2.7-feedstock/issues/1 > > Did other SciPy or NumPy developers play with PyPy3? Do you think this > would be an interesting thing to have? What needs to be done to get there? > > Best, > > -- > Juan Luis Cano > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Fri Aug 24 15:38:28 2018 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 24 Aug 2018 21:38:28 +0200 Subject: [SciPy-Dev] Plans for PyPy3 wheels? 
In-Reply-To: References: Message-ID: <14125554bdfc225e2fdd0a1a8122c6238e869aaf.camel@iki.fi> pe, 2018-08-24 kello 12:26 -0700, Ralf Gommers kirjoitti: [clip] > As far as I can find so quickly, there's not yet a clear ABI tag or > policy > on ABI stability for Pypy wheels ( > https://github.com/pypa/wheel/issues/187#issuecomment-320482719). > That > would be needed before putting wheels on PyPI. Once that's in place, > I > think we'd be happy to consider releasing official Pypy >=3.5 wheels. > > If you or someone else wants to look into the current status on this, > that > would be useful. The second issue as I understand is that there are unsolved issues in manylinux1 compatibility (https://bitbucket.org/pypy/pypy/issues/2617/) which is another hurdle for Linux wheels. Numpy/Scipy specifically shouldn't have any unusual needs here, so the efforts should probably be first directed at getting the general binary wheel infrastructure up for Pypy. Pauli From juanlu001 at gmail.com Tue Aug 28 09:05:08 2018 From: juanlu001 at gmail.com (Juan Luis Cano) Date: Tue, 28 Aug 2018 10:05:08 -0300 Subject: [SciPy-Dev] Plans for PyPy3 wheels? In-Reply-To: <14125554bdfc225e2fdd0a1a8122c6238e869aaf.camel@iki.fi> References: <14125554bdfc225e2fdd0a1a8122c6238e869aaf.camel@iki.fi> Message-ID: Hi, Thanks for your responses! Antonio Cuni clarified on Twitter that the major blocker is manylinux1 compatibility: https://twitter.com/antocuni/status/1034082028290011137 But today I learned that there's an upcoming standard, manylinux2010, that will supersede manylinux1 and will be based on CentOS 6 instead of 5, hence making things easier: https://github.com/pypa/manylinux/issues/179 Somebody pinged all the pending pull requests yesterday and apparently there are no major blockers, so perhaps we can speak soon about building Linux wheels with the new format, and hopefully PyPy will be compatible to it. 
On Fri, Aug 24, 2018 at 4:38 PM Pauli Virtanen wrote:

> pe, 2018-08-24 kello 12:26 -0700, Ralf Gommers kirjoitti:
> [clip]
> > As far as I can find so quickly, there's not yet a clear ABI tag or
> > policy on ABI stability for Pypy wheels (
> > https://github.com/pypa/wheel/issues/187#issuecomment-320482719). That
> > would be needed before putting wheels on PyPI. Once that's in place, I
> > think we'd be happy to consider releasing official Pypy >=3.5 wheels.
> >
> > If you or someone else wants to look into the current status on this,
> > that would be useful.
>
> The second issue as I understand is that there are unsolved issues in
> manylinux1 compatibility (https://bitbucket.org/pypy/pypy/issues/2617/)
> which is another hurdle for Linux wheels.
>
> Numpy/Scipy specifically shouldn't have any unusual needs here, so the
> efforts should probably be first directed at getting the general binary
> wheel infrastructure up for Pypy.
>
> Pauli
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>

--
Juan Luis Cano
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mikofski at berkeley.edu Thu Aug 30 14:53:53 2018
From: mikofski at berkeley.edu (Mark Alexander Mikofski)
Date: Thu, 30 Aug 2018 11:53:53 -0700
Subject: [SciPy-Dev] Initializing error number in optimize zeros
Message-ID: 

Hi,

I have a question about the scalar-function root finders in optimize/Zeros (brentq, etc.) that are written in C: the error number is never explicitly initialized, so it could still hold a previous value of -2 or -1, which could be misinterpreted as a failure. Any reason not to set the error number to zero explicitly at the start of the function?

I've created an issue to track this: ENH: initialize params->error_num to zero in optimize/Zeros/*.c · Issue #9189 ·
scipy/scipy https://github.com/scipy/scipy/issues/9189 and implemented it in the cython_optimize_struct branch of my fork, eg bisect.c: https://github.com/mikofski/scipy/blob/cython_optimize_struct/scipy/optimize/Zeros/bisect.c#L13 Please, any comment? Thanks! Mark -------------- next part -------------- An HTML attachment was scrubbed... URL: From mikofski at berkeley.edu Fri Aug 31 03:58:18 2018 From: mikofski at berkeley.edu (Mark Alexander Mikofski) Date: Fri, 31 Aug 2018 00:58:18 -0700 Subject: [SciPy-Dev] Cython code generation with Jinja2 templates Message-ID: Hi All, I would like to use Jinja2 templates to generate the Cython pyx and pxd files necessary for the Cython Optimize Zeros API proposed in #8431, maybe in a follow up PR https://github.com/scipy/scipy/pull/8431 Does anyone have any experience with code generation in SciPy? Is it best to generate the files offline and then commit them to the repository, or is it better to add some scripting to either setup.py or to the CI yaml config files? I put something together in a branch in my scipy fork, and it recreates the existing code verbatim. See the templates folder and _generate_zeros_type.py script. https://github.com/mikofski/scipy/tree/cython_optimize_generate_code/scipy/optimize/cython_optimize Any comments would be greatly appreciated. thanks! Mark -- Mark Mikofski, PhD (2005) *Fiat Lux* -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Fri Aug 31 04:03:46 2018 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Fri, 31 Aug 2018 11:03:46 +0300 Subject: [SciPy-Dev] Cython code generation with Jinja2 templates In-Reply-To: References: Message-ID: Maybe better use Tempita, which we use already. 
For instance, https://github.com/scipy/scipy/blob/master/scipy/linalg/_decomp_update.pyx.in On Fri, Aug 31, 2018 at 10:58 AM Mark Alexander Mikofski wrote: > > Hi All, > > I would like to use Jinja2 templates to generate the Cython pyx and pxd files necessary for the Cython Optimize Zeros API proposed in #8431, maybe in a follow up PR > > https://github.com/scipy/scipy/pull/8431 > > Does anyone have any experience with code generation in SciPy? Is it best to generate the files offline and then commit them to the repository, or is it better to add some scripting to either setup.py or to the CI yaml config files? > > I put something together in a branch in my scipy fork, and it recreates the existing code verbatim. See the templates folder and _generate_zeros_type.py script. > > https://github.com/mikofski/scipy/tree/cython_optimize_generate_code/scipy/optimize/cython_optimize > > Any comments would be greatly appreciated. > > thanks! > Mark > -- > Mark Mikofski, PhD (2005) > Fiat Lux > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev From mikofski at berkeley.edu Fri Aug 31 13:07:57 2018 From: mikofski at berkeley.edu (Mark Alexander Mikofski) Date: Fri, 31 Aug 2018 10:07:57 -0700 Subject: [SciPy-Dev] Cython code generation with Jinja2 templates In-Reply-To: References: Message-ID: Thanks! Although looks like Tempita is inactive, the site link is broken, and the last release was in 2013. It looks like the most current fork which adds py3 compat is https://github.com/gjhiggins/tempita - is that what you are using? Any opposition to Jinja2? Jinja2 seems to work and it's already a requirement of SciPy via Sphinx, and it looks very similar to Tempita. On Fri, Aug 31, 2018 at 1:04 AM Evgeni Burovski wrote: > Maybe better use Tempita, which we use already. 
For instance, > > > https://github.com/scipy/scipy/blob/master/scipy/linalg/_decomp_update.pyx.in > On Fri, Aug 31, 2018 at 10:58 AM Mark Alexander Mikofski > wrote: > > > > Hi All, > > > > I would like to use Jinja2 templates to generate the Cython pyx and pxd > files necessary for the Cython Optimize Zeros API proposed in #8431, maybe > in a follow up PR > > > > https://github.com/scipy/scipy/pull/8431 > > > > Does anyone have any experience with code generation in SciPy? Is it > best to generate the files offline and then commit them to the repository, > or is it better to add some scripting to either setup.py or to the CI yaml > config files? > > > > I put something together in a branch in my scipy fork, and it recreates > the existing code verbatim. See the templates folder and > _generate_zeros_type.py script. > > > > > https://github.com/mikofski/scipy/tree/cython_optimize_generate_code/scipy/optimize/cython_optimize > > > > Any comments would be greatly appreciated. > > > > thanks! > > Mark > > -- > > Mark Mikofski, PhD (2005) > > Fiat Lux > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at python.org > > https://mail.python.org/mailman/listinfo/scipy-dev > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -- Mark Mikofski, PhD (2005) *Fiat Lux* -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Fri Aug 31 15:44:40 2018 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 31 Aug 2018 21:44:40 +0200 Subject: [SciPy-Dev] Cython code generation with Jinja2 templates In-Reply-To: References: Message-ID: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi> Tempita is mature, and has worked fine for us. The "official" version at https://pypi.org/project/Tempita/ is Python 3 compatible (via 2to3). 
Jinja2 is much more heavily geared towards HTML generation, and for the
use here I don't see it having much advantage.

From robert.kern at gmail.com  Fri Aug 31 16:59:30 2018
From: robert.kern at gmail.com (Robert Kern)
Date: Fri, 31 Aug 2018 13:59:30 -0700
Subject: [SciPy-Dev] Cython code generation with Jinja2 templates
In-Reply-To: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi>
References: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi>
Message-ID: 

On Fri, Aug 31, 2018 at 12:47 PM Pauli Virtanen wrote:

> Tempita is mature, and has worked fine for us. The "official" version
> at https://pypi.org/project/Tempita/ is Python 3 compatible (via 2to3).
> Jinja2 is much more heavily geared towards HTML generation, and for the
> use here I don't see it having much advantage.

While we're on the subject, we might want to vendorize it. Right now,
we're preferentially using Cython's vendorized version, but they have
informed us that they consider that an implementation detail that
external users (like us) should not rely on.

--
Robert Kern

From charlesr.harris at gmail.com  Fri Aug 31 17:22:44 2018
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Fri, 31 Aug 2018 15:22:44 -0600
Subject: [SciPy-Dev] Cython code generation with Jinja2 templates
In-Reply-To: 
References: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi>
Message-ID: 

On Fri, Aug 31, 2018 at 3:00 PM Robert Kern wrote:

> On Fri, Aug 31, 2018 at 12:47 PM Pauli Virtanen wrote:
>
>> Tempita is mature, and has worked fine for us. The "official" version
>> at https://pypi.org/project/Tempita/ is Python 3 compatible (via 2to3).
>> Jinja2 is much more heavily geared towards HTML generation, and for
>> the use here I don't see it having much advantage.
>
> While we're on the subject, we might want to vendorize it.
> Right now, we're preferentially using Cython's vendorized version, but
> they have informed us that they consider that an implementation detail
> that external users (like us) should not rely on.

It is already vendorized in NumPy, see `tools/npy_tempita`; it works for
both Python 2 and Python 3.

Chuck

From robert.kern at gmail.com  Fri Aug 31 17:26:18 2018
From: robert.kern at gmail.com (Robert Kern)
Date: Fri, 31 Aug 2018 14:26:18 -0700
Subject: [SciPy-Dev] Cython code generation with Jinja2 templates
In-Reply-To: 
References: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi>
Message-ID: 

On Fri, Aug 31, 2018 at 2:23 PM Charles R Harris wrote:

> On Fri, Aug 31, 2018 at 3:00 PM Robert Kern wrote:
>
>> On Fri, Aug 31, 2018 at 12:47 PM Pauli Virtanen wrote:
>>
>>> Tempita is mature, and has worked fine for us. The "official" version
>>> at https://pypi.org/project/Tempita/ is Python 3 compatible (via
>>> 2to3). Jinja2 is much more heavily geared towards HTML generation,
>>> and for the use here I don't see it having much advantage.
>>
>> While we're on the subject, we might want to vendorize it. Right now,
>> we're preferentially using Cython's vendorized version, but they have
>> informed us that they consider that an implementation detail that
>> external users (like us) should not rely on.
>
> It is already vendorized in NumPy, see `tools/npy_tempita`; it works
> for both Python 2 and Python 3.

Cool. Should be easy to port that over to scipy, then.

https://github.com/scipy/scipy/blob/master/tools/cythonize.py#L112

--
Robert Kern
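[The `tools/cythonize.py` hook linked above is essentially a "find the
`.pyx.in` files and render them before Cython runs" step. A minimal,
dependency-free sketch of that idea follows; stdlib `string.Template`
stands in for Tempita, so the placeholder syntax here is `$NAME` rather
than Tempita's `{{name}}`, and the file and variable names are
illustrative only, not the actual SciPy build code.]

```python
import pathlib
import tempfile
from string import Template

def process_pyx_in(src_dir, **namespace):
    """Render every *.pyx.in under src_dir into a sibling *.pyx file,
    mimicking the template-rendering step a cythonize script performs."""
    rendered = []
    for path in sorted(pathlib.Path(src_dir).glob("*.pyx.in")):
        # Substitute $NAME placeholders from the given namespace.
        text = Template(path.read_text()).substitute(namespace)
        out = path.with_suffix("")  # "zeros.pyx.in" -> "zeros.pyx"
        out.write_text(text)
        rendered.append(out.name)
    return rendered

# Round-trip demo on a throwaway directory.
with tempfile.TemporaryDirectory() as d:
    (pathlib.Path(d) / "zeros.pyx.in").write_text(
        "cdef $CTYPE bisect($CTYPE a, $CTYPE b):\n    return (a + b) / 2\n"
    )
    print(process_pyx_in(d, CTYPE="double"))  # -> ['zeros.pyx']
```

[Whether this runs offline (with the rendered `.pyx` committed) or from
setup.py is exactly the trade-off raised earlier in the thread; the
sketch itself is agnostic.]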
From charlesr.harris at gmail.com  Fri Aug 31 17:26:49 2018
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Fri, 31 Aug 2018 15:26:49 -0600
Subject: [SciPy-Dev] Cython code generation with Jinja2 templates
In-Reply-To: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi>
References: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi>
Message-ID: 

On Fri, Aug 31, 2018 at 1:47 PM Pauli Virtanen wrote:

> Tempita is mature, and has worked fine for us. The "official" version
> at https://pypi.org/project/Tempita/ is Python 3 compatible (via 2to3).
> Jinja2 is much more heavily geared towards HTML generation, and for the
> use here I don't see it having much advantage.

The maintained version is now at https://github.com/gjhiggins/tempita.

Chuck

From mikofski at berkeley.edu  Fri Aug 31 18:05:58 2018
From: mikofski at berkeley.edu (Mark Alexander Mikofski)
Date: Fri, 31 Aug 2018 15:05:58 -0700
Subject: [SciPy-Dev] Cython code generation with Jinja2 templates
In-Reply-To: 
References: <40a06fc985d77cc15a79cdd0a3a6dd9ce9ff6706.camel@iki.fi>
Message-ID: 

Ok, I agree, I will use Tempita templates to generate the Cython code
for PR 8431. Thanks everyone!

On Fri, Aug 31, 2018, 2:27 PM Charles R Harris wrote:

> On Fri, Aug 31, 2018 at 1:47 PM Pauli Virtanen wrote:
>
>> Tempita is mature, and has worked fine for us. The "official" version
>> at https://pypi.org/project/Tempita/ is Python 3 compatible (via
>> 2to3). Jinja2 is much more heavily geared towards HTML generation,
>> and for the use here I don't see it having much advantage.
>
> The maintained version is now at https://github.com/gjhiggins/tempita.
>
> Chuck
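[The per-type generation the thread settles on can be illustrated with a
toy version of the substitution step. This is not the actual PR 8431
template - the template fragment and names below are made up - and a
small `re`-based helper stands in for Tempita, which evaluates each
`{{expr}}` against a namespace (real Tempita additionally supports
`{{for}}`/`{{endfor}}` loops and `{{py: ...}}` blocks).]

```python
import re

def render(template, **namespace):
    """Toy stand-in for Tempita's basic expression substitution:
    each {{expr}} in the template is evaluated against the namespace."""
    def repl(match):
        return str(eval(match.group(1).strip(), {}, namespace))
    return re.sub(r"\{\{(.*?)\}\}", repl, template)

# Hypothetical fragment of a zeros-API template; one copy of the
# wrapper is emitted per C floating-point type.
PYX_IN = """\
cdef {{CTYPE}} {{NAME}}_{{CTYPE}}({{CTYPE}} a, {{CTYPE}} b):
    return (a + b) / 2
"""

generated = "".join(
    render(PYX_IN, NAME="bisect", CTYPE=ctype)
    for ctype in ("float", "double")
)
print(generated)
```

[Running a script like this and committing the resulting `.pyx` files is
the "generate offline" option discussed above; wiring the same rendering
into `tools/cythonize.py` is the build-time alternative.]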