From evgeny.burovskiy at gmail.com Wed Mar 1 08:59:13 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 1 Mar 2017 16:59:13 +0300 Subject: [SciPy-Dev] GSoC17 - Question about candidacy In-Reply-To: References: Message-ID: On Tue, Feb 28, 2017 at 2:38 PM, Yevhenii Hyzyla wrote: > Dear developers > > I'm Yevhenii Hyzyla and I'm an undergraduate student from Taras Shevchenko > National University of Kyiv in Ukraine. I don't have any experience with open > source contributing, but I want to start with SciPy. I have good > knowledge of Python (some small pet projects) and of C at a sufficient level > (university course). Also, I have experience working with the SciPy > library and the SciPy stack in general. > > After seeing the wiki page with ideas, I think I could develop > scipy.diff, because I have good knowledge of derivatives and how to > calculate them using numerical or symbolic methods. (I had university courses in > mathematical analysis and numerical analysis.) And my > hobbies are math and Python. > > 1) If I make some patches to scipy, what are my chances of being accepted? > 2) How many students apply to scipy each year? > Could I get feedback about my candidacy from the developers? > > Thanks for reading! > > Yevhenii Hi Yevhenii, Glad to see your interest in this project! It's a bit hard to say anything about chances at this point. It'll depend on several things: the number and quality of applications, the availability of mentors, and so on. Our selection process relies a lot on the project proposals. From experience, writing a good proposal takes several iterations. I thus encourage you to write the first iteration and send it to this list. Cheers, Evgeni From aditya.dpsg.15 at gmail.com Wed Mar 1 09:06:44 2017 From: aditya.dpsg.15 at gmail.com (Aditya Singh) Date: Wed, 1 Mar 2017 19:36:44 +0530 Subject: [SciPy-Dev] GSoC 2017 Project- scipy.diff(numerical differentiation) Message-ID: Hello Developers! I am a GSoC enthusiast!
Any tips on how to begin? I am particularly interested in the project "Implement scipy.diff (numerical differentiation)". Also, this is the first time I am contributing to SciPy, so I need some help getting started! -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Wed Mar 1 09:20:05 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 1 Mar 2017 17:20:05 +0300 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: References: <20170222070745.GI3818748@phare.normalesup.org> <2ad3a609-d29d-205e-ccfe-5151c9202261@crans.org> Message-ID: On Wed, Mar 1, 2017 at 2:01 AM, Matt Haberland wrote: > Based on positive feedback, I've gone ahead and completed most of the > finishing touches: documentation, pep8 compliance, etc. for the > "interior-point" method of linprog. > > As this is my first contribution to SciPy, I would appreciate a real-time > conversation with somebody regarding some remaining questions. Would an > experienced contributor, like whoever might review the forthcoming pull > request, be willing to schedule a time to chat? > > Thanks! > Matt > > -------- > > P.S. Example questions: > > How strict does PEP8 compliance need to be? For instance, do I really need > to replace lambda functions (defined within functions) with regular > functions? (There are a few other things autopep8 couldn't fix that I want to ask > about.) > The code currently spits out warnings using the standard warnings module. Is > that OK or is there some other standard for SciPy? Does there need to be > an option to disable these, or can we let the user suppress them otherwise > if desired? > I have several non-essential feature questions like "Do I need to > implement a callback interface like the simplex method does, or can that be > left for an update?"
Some of these are technical but each can be explained > with about one sentence of background information. I'm just looking for > opinions from anyone willing to think about it for 10 seconds. > > P.P.S. Re: Pierre's comment: regarding lack of accuracy, you mean when > linprog fails to find a solution when it exists or reports that it has found > a solution when it hasn't? Agreed, I think the interior point method would > help fix that. And its presolve subroutine could help the simplex code avoid > some of those problems. > > However, because the simplex method traverses vertices to numerical > precision, its solutions (when it finds them) will typically be more > "exact", whereas interior point methods terminate when a user-specified > tolerance is met. Along those lines, interior point methods can return > solutions at the boundary of the polytope but not at a vertex, and so they > do not always identify an optimal basis, which some applications require. > There are "crossover" methods to do this from an interior point solution but > I haven't looked into those yet. > > For that reason, and to allow time for any bugs with the IP code to surface, > should "simplex" remain the default, if the linprog documentation were to > state that "interior-point" could be faster? Hi Matt, Our development process is quite asynchronous at this time, so please don't be discouraged if you don't get many responses about a real-time conversation. It's just that we are spread quite thin all across the globe. Since your code is at the polishing stage already, a way forward can be to send a work-in-progress pull request on GitHub now. This will both give you feedback about style things like pep8 (we have a specialized checker which runs automatically on new PRs), and will hopefully help to attract reviewers. The PR does not need to be perfect from the start; most PRs go over a series of iterations anyway.
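For readers of the archive, a minimal sketch of the kind of problem being discussed, using toy data and linprog's default method (the "interior-point" option from this thread was not yet merged at the time, so no method is pinned here):

```python
import numpy as np
from scipy.optimize import linprog

# Toy problem (hypothetical data, not from Matt's patch):
# maximize x + 2y subject to x + y <= 4, x <= 2, y <= 3, x >= 0, y >= 0.
# linprog minimizes, so we negate the objective.
c = [-1.0, -2.0]
A_ub = [[1.0, 1.0],
        [1.0, 0.0],
        [0.0, 1.0]]
b_ub = [4.0, 2.0, 3.0]

# Bounds default to x >= 0, so only c, A_ub, b_ub are needed here.
res = linprog(c, A_ub=A_ub, b_ub=b_ub)
print(res.x)    # optimal point, close to [1, 3]
print(res.fun)  # optimal objective value, close to -7
```

An interior-point solver reaches this answer only to within a tolerance, while a simplex solver lands on the optimal vertex itself, which is the accuracy trade-off Matt describes above.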
Cheers, Evgeni From ralf.gommers at gmail.com Thu Mar 2 06:18:20 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 3 Mar 2017 00:18:20 +1300 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Wed, Mar 1, 2017 at 8:01 AM, Robert Kern wrote: > On Tue, Feb 28, 2017 at 1:31 AM, Ralf Gommers > wrote: > > > > On Tue, Feb 28, 2017 at 12:08 PM, Robert Kern > wrote: > >> > >> On Mon, Feb 27, 2017 at 2:32 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > >> > > >> > > >> > > >> > On Mon, Feb 27, 2017 at 2:35 PM, Robert Kern > wrote: > >> >> > >> >> On Mon, Feb 27, 2017 at 1:03 PM, Pauli Virtanen wrote: > >> >> > > >> >> > Mon, 27 Feb 2017 19:13:48 +0300, Evgeni Burovski kirjoitti: > >> >> > [clip] > >> >> > > A few scattered thoughts about the project: > >> >> > > > >> >> > > - This is not exactly a transition to pytest as such. The goal > is to > >> >> > > allow an alternative test runner, with minimal changes to the > test > >> >> > > suite. > >> >> > > >> >> > I'd suggest we should consider actually dropping nose, if we make > the > >> >> > suite pytest compatible. > >> >> > > >> >> > I think retaining nose compatibility in the numpy/scipy test > suites does > >> >> > not bring advantages --- if we are planning to retain it, I would > like to > >> >> > understand why. Supporting two systems is more complicated, makes > it > >> >> > harder to add new tests, and if the other one is not regularly > used, > >> >> > it'll probably break often. > >> >> > > >> >> > I think we should avoid temptation to build compatibility layers, > or > >> >> > custom test runners --- there's already complication in > numpy.testing > >> >> > that IIRC originates from working around issues in nose, and I > think not > >> >> > many remember how it works in depth. 
> >> >> > > >> >> > I think the needs for Scipy and Numpy are not demanding, so > sticking with > >> >> > "vanilla" pytest features and using only its native test runner > would > >> >> > sound best. The main requirement is numerical assert functions, > but AFAIK > >> >> > the ones in numpy.testing are pytest compatible (and independent > of the > >> >> > nose machinery). > >> >> > >> >> If we're migrating test runners, I'd rather drop all dependencies > and target `python -m unittest discover` as the lowest common denominator > rather than target a fuzzy "vanilla" subset of pytest in particular. > >> > > >> > I'd certainly expect to make full use of the pytest features, why use > it otherwise? There is a reason that both nose and pytest extended > unittest. The only advantage I see to unittest is that it is less likely to > become abandonware. > >> > >> Well, that's definitely not what Pauli was advocating, and my response > was intending to clarify his position. > >> > >> >> This may well be the technical effect of what you are describing, > but I think it's worth explicitly stating that focus. > >> > > >> > I don't think that is where we were headed. Can you make the case for > unittest? > >> > >> There are two places where nose and pytest provide features; they are > each a *test runner* and also a *test framework*. As a test runner, they > provide features for discovering tests (essentially, they let you easily > point to a subset of tests to run) and reporting the test results in a > variety of user-friendly ways. Test discovery functionality works with > plain-old-TestCases, and pretty much any test runner works with the > plain-old-TestCases. These features face the user who is *running* the > tests. > >> > >> The other place they provide features is as test frameworks. This is > where things like the generator tests and additional fixtures come in (i.e. > setup_class, setup_module, etc.). These require you to write your tests in > framework-specific ways.
Those written for nose don't necessarily work with > those written for pytest and vice versa. And any other plain test runner is > right out. These features face the developer who is *writing* the tests. > >> > >> Test discovery was the main reason we started using nose over plain > unittest. At that time (and incidentally, when nose and pytest began), > `python -m unittest discover` didn't exist. > > > > As Josef said, "import scipy; scipy.test()" must work (always useful, > but especially important on Windows). > > Okay. And? > No and, just pointing out that `python -m unittest discover` doesn't cut it. > >> To run unit tests, you had to manually collect the TestCases into a > TestSuite and write your own logic for configuring any subsets from user > input. It was a pain in the ass for large hierarchical packages like numpy > and scipy. The test framework features of nose like generator tests were > nice bonuses, but were not the main motivating factor. [Source: me; it was > Jarrod and I who made the decision to use nose at a numpy sprint in > Berkeley.] > >> > >> If we're going to make a switch, I'd rather give up those framework > features (we don't use all that many) in order to get agnosticism on the > test runner front. Being agnostic to the test runner lets us optimize for > multiple use cases simultaneously. That is, the best test runner for "run > the whole suite and record every detail with coverage under time > constraints under Travis CI" is often different from what a developer wants > for "run just the one test module for the thing I'm working on and report > to me only what breaks as fast as possible with minimal clutter". > > > > I don't believe we can live with just plain unittest, we want to move to > a better maintained framework and gain some features in this GSoC project, > not go backwards in functionality. > > Ah, okay. The way the project abstract is written, this is not clear. 
It > reads as if you want pytest as a mere additional possible test runner > alongside nose. Instead, it seems that you want to migrate away from nose > and to pytest, wholesale. I'm fine with that. But get Pauli on board with > going beyond "vanilla" pytest and rewrite the abstract. Drop the bit about > ideally running the test suite with both nose and pytest. > Fair point. Will try to clean up the abstract over the weekend. > If you're looking to gain functionality, there's no point, just > maintenance headaches. If you want some runner-agnosticism, then unittest > is the layer to do it on. Maintaining runner-agnosticism for just two > runners (only one of which is maintained) is a wasted opportunity. That's > my main contention. > You're right. Runner agnosticism is not what we were aiming at, but if we were then it'd probably have to be plain unittest. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From karandesai281196 at gmail.com Thu Mar 2 06:30:24 2017 From: karandesai281196 at gmail.com (Karan Desai) Date: Thu, 02 Mar 2017 11:30:24 +0000 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: <0f0e1e24-c88d-a71c-3100-02016158b76e@mixmax.com> Hello developers, I was out of town for a while. Came back to see there's quite a lot of discussion on this idea. Just to acknowledge that it'll take time for me to process all the information above. I'll give my input on the discussion or contribute a patch directly in the near future. Regards, Karan.
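To make the runner-versus-framework distinction in this thread concrete, here is a sketch (hypothetical test names, not code from either project) of a nose-only generator test next to its pytest-native replacement:

```python
import pytest

def square(n):
    return n * n

def check_square(n):
    assert square(n) == n * n

# nose-style generator test: the nose runner calls each yielded
# (function, argument) pair as a separate test. pytest deprecated and
# later removed this style, which is one source of incompatibility.
def test_squares_nose_style():
    for n in (1, 2, 3):
        yield check_square, n

# pytest-native equivalent: parametrization replaces the generator, and
# each value of n is collected and reported as its own test case.
@pytest.mark.parametrize("n", [1, 2, 3])
def test_squares_pytest_style(n):
    assert square(n) == n * n
```

The first function only works as a test when nose drives it; the second only when pytest does, which is why supporting both runners over one suite is painful.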
On Thu, Mar 2, 2017 4:48 PM, Ralf Gommers ralf.gommers at gmail.com wrote: > [snip: full quote of the thread above] -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Thu Mar 2 06:54:08 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 3 Mar 2017 00:54:08 +1300 Subject: [SciPy-Dev] Interested in contributing BILP solver and interior-point method for linprog; SciPy-dev search down? In-Reply-To: References: <20170222070745.GI3818748@phare.normalesup.org> <2ad3a609-d29d-205e-ccfe-5151c9202261@crans.org> Message-ID: On Thu, Mar 2, 2017 at 3:20 AM, Evgeni Burovski wrote: > [snip: Matt's message and Evgeni's reply, quoted in full above]
+1 for submitting what you have already (start the PR title with WIP: ), always good to get people to comment on it. I'm happy to have a quick chat though if you prefer that. I'm 21 hours ahead of you; my 8-9am (your 11-12am) works well if you ping me in advance. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Thu Mar 2 13:06:19 2017 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 2 Mar 2017 10:06:19 -0800 Subject: [SciPy-Dev] GSoC'17 candidate - Interested in Nose to Pytest Migration. In-Reply-To: References: <085705b7-ae6c-7fb0-478d-c9f33a13b8b2@mixmax.com> Message-ID: On Thu, Mar 2, 2017 at 3:18 AM, Ralf Gommers wrote: > > On Wed, Mar 1, 2017 at 8:01 AM, Robert Kern wrote: >> >> On Tue, Feb 28, 2017 at 1:31 AM, Ralf Gommers wrote: >> > >> > On Tue, Feb 28, 2017 at 12:08 PM, Robert Kern wrote: >> >> Test discovery was the main reason we started using nose over plain unittest. At that time (and incidentally, when nose and pytest began), `python -m unittest discover` didn't exist. >> > >> > As Josef said, "import scipy; scipy.test()" must work (always useful, but especially important on Windows). >> >> Okay. And? > > No and, just pointing out that `python -m unittest discover` doesn't cut it. Ah, I see. When I wrote that, I didn't mean that we'd type in `python -m unittest discover` all the time. I was just referring to modern unittest's builtin test discovery functionality, which makes it a feasible tool to underlie `scipy.test()`. If that had existed at the time, we likely wouldn't have rewritten `scipy.test()` to use nose. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From evgeny.zhurko at gmail.com Thu Mar 2 19:59:36 2017 From: evgeny.zhurko at gmail.com (Evgeny Zhurko) Date: Fri, 3 Mar 2017 03:59:36 +0300 Subject: [SciPy-Dev] Owen's T function Message-ID: Hi all, I started implementing Owen's T function (https://github.com/scipy/scipy/pull/7120). I have already implemented Owen's T1, T2, T4, T6 from "Fast and accurate calculation of Owen's T Function" (https://www.jstatsoft.org/article/view/v005i05/t.pdf), not pushed yet. I have a problem with T3 and T5. I can't evaluate C2i, obtained from the coefficients of Chebyshev polynomials in T3, and I can't evaluate Wi (the weights) and Xi (the values of the abscissas). As far as I can see, the Boost library doesn't evaluate C2i, Wi, Xi (they are constants in the source). Maybe someone knows how I can evaluate them using scipy or numpy? Best Regards, Evgeny Zhurko -------------- next part -------------- An HTML attachment was scrubbed... URL: From josh.craig.wilson at gmail.com Thu Mar 2 21:02:13 2017 From: josh.craig.wilson at gmail.com (Joshua Wilson) Date: Thu, 2 Mar 2017 20:02:13 -0600 Subject: [SciPy-Dev] Owen's T function In-Reply-To: References: Message-ID: Why not borrow the constants from the Boost source? (Boost is license compatible.) On Thu, Mar 2, 2017 at 6:59 PM, Evgeny Zhurko wrote: > [snip]
> > Best Regards, > Evgeny Zhurko > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > From evgeny.zhurko at gmail.com Fri Mar 3 17:54:38 2017 From: evgeny.zhurko at gmail.com (Evgeny Zhurko) Date: Sat, 4 Mar 2017 01:54:38 +0300 Subject: [SciPy-Dev] Owen's T function Message-ID: Maybe someone has borrowed the constants before me, but I didn't find it. I included all the constants and test data from Boost in the last commit. But I have new questions: I implemented the T1-T6 methods and tried the test function on Boost's test data. All tests passed when I use assert_almost_equal with decimal=1. About 30 of 400 examples have a deviation of 0.01 from the real value, 40 of 400 examples have a deviation of about 1e-10, and the others less than 1e-18. Boost has improvements for some methods (http://www.boost.org/doc/libs/1_52_0/boost/math/special_functions/owens_t.hpp). Is it enough to insert a copyright notice in the source code if I implement this algorithm in SciPy? The test data is very big (about 100 characters per number). I inserted the data into test_basic.py. May I move the test data into a txt or csv file and read it in the tests? Does that require including the copyright in the file with the test data? The test data from Boost does not cover all methods (only T2, T4), so I want to use Owen, Donald B., "Tables for Computing Bivariate Normal Probabilities" (http://projecteuclid.org/download/pdf_1/euclid.aoms/1177728074). Pull request: https://github.com/scipy/scipy/pull/7120 -------------- next part -------------- An HTML attachment was scrubbed...
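A side note on the decimal=1 comparison above: assert_almost_equal checks an absolute difference, which can hide an arbitrarily large relative error for values near zero, while a relative-tolerance check does not. A small sketch of the distinction (values hypothetical):

```python
import numpy as np
from numpy.testing import assert_almost_equal, assert_allclose

# Two values that agree to one decimal place in absolute terms...
computed, expected = 2e-3, 1e-3
assert_almost_equal(computed, expected, decimal=1)  # passes: |diff| = 1e-3

# ...but the relative error is 100%, which a tight rtol exposes:
try:
    assert_allclose(computed, expected, rtol=1e-11)
except AssertionError:
    print("relative check fails, as it should")

# A pair that genuinely agrees to ~1e-13 relative error passes easily:
assert_allclose(1.0 + 1e-13, 1.0, rtol=1e-11)
```

For special-function tables whose values span many orders of magnitude, the rtol-based check is the meaningful one.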
URL: From josh.craig.wilson at gmail.com Fri Mar 3 18:04:42 2017 From: josh.craig.wilson at gmail.com (Joshua Wilson) Date: Fri, 3 Mar 2017 17:04:42 -0600 Subject: [SciPy-Dev] Owen's T function In-Reply-To: References: Message-ID: > All tests passed when i use assert_almost_equal with decimal=1 Don't measure accuracy by decimals--with special functions getting a small relative error is more important. In general a relative error of 1e-11 to 1e-14 or so is a good target. Sometimes there are good reasons why one can't achieve a relative error that small (such as a zero of a function). > Is it enough to insert copyright in source code if i'll implement this algorithm into Scipy? That should be fine. > I inserted data into test_basic.py You want to use the framework in `test_data.py`. On Fri, Mar 3, 2017 at 4:54 PM, Evgeny Zhurko wrote: > [snip]
> > Pull request: https://github.com/scipy/scipy/pull/7120 > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > From charlesr.harris at gmail.com Mon Mar 6 22:57:25 2017 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 6 Mar 2017 20:57:25 -0700 Subject: [SciPy-Dev] NumPy pre-release 1.12.1rc1 Message-ID: Hi All, I'm pleased to announce the release of NumPy 1.12.1rc1. NumPy 1.12.1rc1 supports Python 2.7 and 3.4 - 3.6 and fixes bugs and regressions found in NumPy 1.12.0. In particular, the regression in f2py constant parsing is fixed. Wheels for Linux, Windows, and OSX can be found on pypi. Archives can be downloaded from github . *Contributors* A total of 10 people contributed to this release. People with a "+" by their names contributed a patch for the first time. * Charles Harris * Eric Wieser * Greg Young * Joerg Behrmann + * John Kirkham * Julian Taylor * Marten van Kerkwijk * Matthew Brett * Shota Kawabuchi * Jean Utke + *Fixes Backported* * #8483: BUG: Fix wrong future nat warning and equiv type logic error... * #8489: BUG: Fix wrong masked median for some special cases * #8490: DOC: Place np.average in inline code * #8491: TST: Work around isfinite inconsistency on i386 * #8494: BUG: Guard against replacing constants without `'_'` spec in f2py. * #8524: BUG: Fix mean for float 16 non-array inputs for 1.12 * #8571: BUG: Fix calling python api with error set and minor leaks for... * #8602: BUG: Make iscomplexobj compatible with custom dtypes again * #8618: BUG: Fix undefined behaviour induced by bad `__array_wrap__` * #8648: BUG: Fix `MaskedArray.__setitem__` * #8659: BUG: PPC64el machines are POWER for Fortran in f2py * #8665: BUG: Look up methods on MaskedArray in `_frommethod` * #8674: BUG: Remove extra digit in `binary_repr` at limit * #8704: BUG: Fix deepcopy regression for empty arrays. 
* #8707: BUG: Fix ma.median for empty ndarrays

Cheers,

Chuck

From ralf.gommers at gmail.com Tue Mar 7 16:01:15 2017
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Wed, 8 Mar 2017 10:01:15 +1300
Subject: [SciPy-Dev] Fwd: Docathon Day 1 Update: The Doccening
In-Reply-To: <80bb9d960ec9c9896a0cf4b96.2abb39d3c3.20170307025755.cfdc895f86.b6ed6565@mail225.atl121.mcsv.net>
References: <80bb9d960ec9c9896a0cf4b96.2abb39d3c3.20170307025755.cfdc895f86.b6ed6565@mail225.atl121.mcsv.net>
Message-ID:

Hi all,

The docathon has started, and Warren got us off the mark with https://github.com/scipy/scipy/pull/7138 (thanks!). If you want to contribute, an easy way is what Warren did: pick a function with an incomplete docstring (e.g. missing example) and add it. Alternatively, look at one of the issues with the Documentation label: https://github.com/scipy/scipy/issues?q=is%3Aopen+is%3Aissue+label%3ADocumentation

And remember to start your commit message with DOC:.

Cheers,
Ralf

---------- Forwarded message ----------
From: The Docathon Team
Date: Tue, Mar 7, 2017 at 3:58 PM
Subject: Docathon Day 1 Update: The Doccening
To: ralf.gommers at gmail.com

*To err is human, to document is divine -Docrates*

The Docathon is One Day Old!

The docathon just turned one (day old)! Here's a quick update to congratulate everybody on a great first day.

Updates

- We kicked off the week at BIDS by hosting a half-day of Documentation Tutorials. These were live-streamed on YouTube for your viewing pleasure!
- We also demoed a template repository to help you get sphinx / numpydoc / sphinx-gallery working with your project. Use that along with this guide repository to help you get started!

Check out the docboards! My favorite thing about today is that you can totally see a bump in the documentation commits for our projects.
You can check that out on our global activity board below:

*Global Activity*

We should also give a shout out to our docstar projects that showed a big bump in activity today. Here's our docboard for project commits:

Three cheers for:

- PMagPy, a Python project for analyzing paleomagnetic data
- cottoncandy, a project for storing and flexibly accessing numpy data on Amazon S3
- Sylius, a platform for e-commerce using PHP

We also got a big bump in commits from individual users! Here's what everybody has been up to:

Let's give a shout out to this day's docstars *anwarnunez*, *willingc*, and *swanson-hysell*. Keep it going! We look forward to seeing what comes next tomorrow. The working groups will be holding sessions once again, though we've gotten a lot of great contributions from people all over the country! We're making great progress, so let's keep the momentum through tomorrow!

Until then,
*The Docathon Team*

Docathon website | Projects page | Hosts page | Slack channel | Github repo

From sturla.molden at gmail.com Wed Mar 8 08:34:45 2017
From: sturla.molden at gmail.com (Sturla Molden)
Date: Wed, 8 Mar 2017 14:34:45 +0100
Subject: [SciPy-Dev] Drop i386 part of OSX wheel build?
In-Reply-To: References: Message-ID:

On 24/02/2017 19:51, Matthew Brett wrote:
> The main reason for this is that, for OSX, we are building twice,
> first for i386, then for x86_64, and then combining these builds into
> a 'fat' binary to be compatible with either architecture. The builds
> take a long time, hence the timeout.

With only amd64 (x86_64) we can also use a recent gfortran to build SciPy, e.g. a binary from the GCC wiki.
Sturla

From evgeny.burovskiy at gmail.com Thu Mar 9 10:54:31 2017
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Thu, 9 Mar 2017 18:54:31 +0300
Subject: [SciPy-Dev] ANN: SciPy 0.19.0
Message-ID:

On behalf of the SciPy development team I am pleased to announce the availability of SciPy 0.19.0. This release contains several great new features and a large number of bug fixes and various improvements, as detailed in the release notes below. 121 people contributed to this release over the course of seven months. Thanks to everyone who contributed!

This release requires Python 2.7 or 3.4-3.6 and NumPy 1.8.2 or greater. Source tarballs and release notes can be found at https://github.com/scipy/scipy/releases/tag/v0.19.0. OS X and Linux wheels are available from PyPI. For the security-conscious, the wheels themselves are signed with my GPG key. Additionally, you can checksum the wheels and verify the checksums against those listed below or in the README file at https://github.com/scipy/scipy/releases/tag/v0.19.0.

Cheers,
Evgeni

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

==========================
SciPy 0.19.0 Release Notes
==========================

.. contents::

SciPy 0.19.0 is the culmination of 7 months of hard work. It contains many new features, numerous bug-fixes, improved test coverage and better documentation. There have been a number of deprecations and API changes in this release, which are documented below. All users are encouraged to upgrade to this release, as there are a large number of bug-fixes and optimizations. Moreover, our development attention will now shift to bug-fix releases on the 0.19.x branch, and to adding new features on the master branch.

This release requires Python 2.7 or 3.4-3.6 and NumPy 1.8.2 or greater.

Highlights of this release include:

- - A unified foreign function interface layer, `scipy.LowLevelCallable`.
- - Cython API for scalar, typed versions of the universal functions from the `scipy.special` module, via `cimport scipy.special.cython_special`.

New features
============

Foreign function interface improvements
- ---------------------------------------

`scipy.LowLevelCallable` provides a new unified interface for wrapping low-level compiled callback functions in the Python space. It supports Cython imported "api" functions, ctypes function pointers, CFFI function pointers, ``PyCapsules``, Numba jitted functions and more. See `gh-6509 `_ for details.

`scipy.linalg` improvements
- ---------------------------

The function `scipy.linalg.solve` gained two more keywords, ``assume_a`` and ``transposed``. The underlying LAPACK routines were replaced with "expert" versions, which can also be used to solve symmetric, Hermitian and positive definite coefficient matrices. Moreover, ill-conditioned matrices now cause a warning to be emitted with the estimated condition number information. The old ``sym_pos`` keyword is kept for backwards compatibility; however, it is identical to using ``assume_a='pos'``. The ``debug`` keyword, which did nothing except print the ``overwrite_`` values, is deprecated.

The function `scipy.linalg.matrix_balance` was added to perform the so-called matrix balancing using the LAPACK xGEBAL routine family. This can be used to approximately equate the row and column norms through diagonal similarity transformations.

The functions `scipy.linalg.solve_continuous_are` and `scipy.linalg.solve_discrete_are` have numerically more stable algorithms. These functions can also solve generalized algebraic matrix Riccati equations. Moreover, both gained a ``balanced`` keyword to turn balancing on and off.

`scipy.spatial` improvements
- ----------------------------

`scipy.spatial.SphericalVoronoi.sort_vertices_of_regions` has been re-written in Cython to improve performance.
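A minimal sketch of the new ``assume_a`` keyword on `scipy.linalg.solve` (the matrix and right-hand side below are illustrative, not from the release notes):

```python
import numpy as np
from scipy.linalg import solve

# Build a symmetric positive definite system so that assume_a='pos'
# can route to the Cholesky-based "expert" LAPACK driver.
rng = np.random.RandomState(0)
m = rng.rand(4, 4)
a = np.dot(m, m.T) + 4 * np.eye(4)   # symmetric positive definite
b = rng.rand(4)

x = solve(a, b, assume_a='pos')      # same result as the old sym_pos=True
residual = np.linalg.norm(np.dot(a, x) - b)
```

As the notes say, ``assume_a='pos'`` is identical to the older ``sym_pos=True`` spelling, which is kept for backwards compatibility.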
`scipy.spatial.SphericalVoronoi` can handle > 200 k points (at least 10 million) and has improved performance.

The function `scipy.spatial.distance.directed_hausdorff` was added to calculate the directed Hausdorff distance.

The ``count_neighbors`` method of `scipy.spatial.cKDTree` gained the ability to perform weighted pair counting via the new keywords ``weights`` and ``cumulative``. See `gh-5647 `_ for details.

`scipy.spatial.distance.pdist` and `scipy.spatial.distance.cdist` now support non-double custom metrics.

`scipy.ndimage` improvements
- ----------------------------

The callback function C API supports PyCapsules in Python 2.7.

Multidimensional filters now allow having different extrapolation modes for different axes.

`scipy.optimize` improvements
- -----------------------------

The `scipy.optimize.basinhopping` global minimizer gained a new keyword, `seed`, which can be used to seed the random number generator and obtain repeatable minimizations.

The keyword `sigma` in `scipy.optimize.curve_fit` was overloaded to also accept the covariance matrix of errors in the data.

`scipy.signal` improvements
- ---------------------------

The functions `scipy.signal.correlate` and `scipy.signal.convolve` have a new optional parameter, `method`. The default value, `auto`, estimates the faster of two computation methods, the direct approach and the Fourier transform approach. A new function, `scipy.signal.choose_conv_method`, has been added to choose the convolution/correlation method; it may be appropriate if convolutions or correlations are performed on many arrays of the same size.

New functions have been added to calculate complex short-time Fourier transforms of an input signal, and to invert the transform to recover the original signal: `scipy.signal.stft` and `scipy.signal.istft`. This implementation also fixes the previously incorrect output of `scipy.signal.spectrogram` when complex output data were requested.
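The new STFT pair can be exercised with a round trip; the signal, sampling rate and segment length below are arbitrary choices for illustration:

```python
import numpy as np
from scipy import signal

fs = 1024                                # sampling rate; arbitrary choice
t = np.arange(2 * fs) / float(fs)
x = np.sin(2 * np.pi * 50 * t)           # a 50 Hz test tone

# Forward transform: frequency bins, segment times, complex STFT values.
f, seg_times, Zxx = signal.stft(x, fs=fs, nperseg=256)

# Inverse transform recovers the original signal (up to trailing padding).
_, x_rec = signal.istft(Zxx, fs=fs, nperseg=256)
err = np.max(np.abs(x - x_rec[:x.size]))
```

With the default Hann window and 50% overlap the transform satisfies the COLA condition, so ``istft`` can reconstruct the input to within floating-point error.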
The function `scipy.signal.sosfreqz` was added to compute the frequency response from second-order sections.

The function `scipy.signal.unit_impulse` was added to conveniently generate an impulse function.

The function `scipy.signal.iirnotch` was added to design second-order IIR notch filters that can be used to remove a frequency component from a signal. The dual function `scipy.signal.iirpeak` was added to compute the coefficients of a second-order IIR peak (resonant) filter.

The function `scipy.signal.minimum_phase` was added to convert linear-phase FIR filters to minimum phase.

The functions `scipy.signal.upfirdn` and `scipy.signal.resample_poly` are now substantially faster when operating on some n-dimensional arrays when n > 1. The largest reduction in computation time is realized in cases where the size of the array is small (<1k samples or so) along the axis to be filtered.

`scipy.fftpack` improvements
- ----------------------------

Fast Fourier transform routines now accept `np.float16` inputs and upcast them to `np.float32`. Previously, they would raise an error.

`scipy.cluster` improvements
- ----------------------------

Methods ``"centroid"`` and ``"median"`` of `scipy.cluster.hierarchy.linkage` have been significantly sped up. Long-standing issues with using ``linkage`` on large input data (over 16 GB) have been resolved.

`scipy.sparse` improvements
- ---------------------------

The functions `scipy.sparse.save_npz` and `scipy.sparse.load_npz` were added, providing simple serialization for some sparse formats.

The `prune` method of classes `bsr_matrix`, `csc_matrix`, and `csr_matrix` was updated to reallocate backing arrays under certain conditions, reducing memory usage.

The methods `argmin` and `argmax` were added to classes `coo_matrix`, `csc_matrix`, `csr_matrix`, and `bsr_matrix`.

New function `scipy.sparse.csgraph.structural_rank` computes the structural rank of a graph with a given sparsity pattern.
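A minimal sketch of the new `scipy.sparse.save_npz` / `scipy.sparse.load_npz` helpers (the file name and matrix contents are illustrative):

```python
import os
import tempfile

import numpy as np
from scipy import sparse

# A small random CSR matrix; density and seed are arbitrary.
m = sparse.random(5, 5, density=0.4, format='csr', random_state=0)

fname = os.path.join(tempfile.mkdtemp(), 'matrix.npz')
sparse.save_npz(fname, m)          # writes a NumPy .npz archive
m2 = sparse.load_npz(fname)        # returns a sparse matrix again

same = np.allclose(m.toarray(), m2.toarray())
```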
New function `scipy.sparse.linalg.spsolve_triangular` solves a sparse linear system with a triangular left-hand-side matrix.

`scipy.special` improvements
- ----------------------------

Scalar, typed versions of universal functions from `scipy.special` are available in the Cython space via ``cimport`` from the new module `scipy.special.cython_special`. These scalar functions can be expected to be significantly faster than the universal functions for scalar arguments. See the `scipy.special` tutorial for details.

Better control over special-function errors is offered by the functions `scipy.special.geterr` and `scipy.special.seterr` and the context manager `scipy.special.errstate`.

The names of orthogonal polynomial root functions have been changed to be consistent with other functions relating to orthogonal polynomials. For example, `scipy.special.j_roots` has been renamed `scipy.special.roots_jacobi` for consistency with the related functions `scipy.special.jacobi` and `scipy.special.eval_jacobi`. To preserve backward compatibility, the old names have been left as aliases.

The Wright Omega function is implemented as `scipy.special.wrightomega`.

`scipy.stats` improvements
- --------------------------

The function `scipy.stats.weightedtau` was added. It provides a weighted version of Kendall's tau.

New class `scipy.stats.multinomial` implements the multinomial distribution.

New class `scipy.stats.rv_histogram` constructs a continuous univariate distribution with a piecewise linear CDF from a binned data sample.

New class `scipy.stats.argus` implements the Argus distribution.

`scipy.interpolate` improvements
- --------------------------------

New class `scipy.interpolate.BSpline` represents splines. ``BSpline`` objects contain knots and coefficients and can evaluate the spline.
The format is consistent with FITPACK, so that one can do, for example::

    >>> t, c, k = splrep(x, y, s=0)
    >>> spl = BSpline(t, c, k)
    >>> np.allclose(spl(x), y)

``spl*`` functions, `scipy.interpolate.splev`, `scipy.interpolate.splint`, `scipy.interpolate.splder` and `scipy.interpolate.splantider`, accept both ``BSpline`` objects and ``(t, c, k)`` tuples for backwards compatibility.

For multidimensional splines, ``c.ndim > 1``, ``BSpline`` objects are consistent with piecewise polynomials, `scipy.interpolate.PPoly`. This means that ``BSpline`` objects are not immediately consistent with `scipy.interpolate.splprep`, and one *cannot* do ``>>> BSpline(*splprep([x, y])[0])``. Consult the `scipy.interpolate` test suite for examples of the precise equivalence.

In new code, prefer using ``scipy.interpolate.BSpline`` objects instead of manipulating ``(t, c, k)`` tuples directly.

New function `scipy.interpolate.make_interp_spline` constructs an interpolating spline given data points and boundary conditions.

New function `scipy.interpolate.make_lsq_spline` constructs a least-squares spline approximation given data points.

`scipy.integrate` improvements
- ------------------------------

`scipy.integrate.fixed_quad` now supports vector-valued functions.

Deprecated features
===================

`scipy.interpolate.splmake`, `scipy.interpolate.spleval` and `scipy.interpolate.spline` are deprecated. The format used by `splmake/spleval` was inconsistent with that of `splrep/splev`, which was confusing to users.

`scipy.special.errprint` is deprecated. Improved functionality is available in `scipy.special.seterr`.

Calling `scipy.spatial.distance.pdist` or `scipy.spatial.distance.cdist` with arguments not needed by the chosen metric is deprecated. Also, metrics `"old_cosine"` and `"old_cos"` are deprecated.

Backwards incompatible changes
==============================

The deprecated ``scipy.weave`` submodule was removed.
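A minimal sketch of the `scipy.interpolate.make_interp_spline` constructor mentioned above (the data and spline degree are illustrative):

```python
import numpy as np
from scipy.interpolate import make_interp_spline

x = np.linspace(0, 2 * np.pi, 10)    # sample points; count is arbitrary
y = np.sin(x)

spl = make_interp_spline(x, y, k=3)  # cubic interpolating spline
hits_data = np.allclose(spl(x), y)   # an interpolant passes through the data
```

The returned object is a ``BSpline`` instance, so it can also be passed to the ``spl*`` functions described above.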
`scipy.spatial.distance.squareform` now returns arrays of the same dtype as the input, instead of always float64.

`scipy.special.errprint` now returns a boolean.

The function `scipy.signal.find_peaks_cwt` now returns an array instead of a list.

`scipy.stats.kendalltau` now computes the correct p-value in case the input contains ties. The p-value is also identical to that computed by `scipy.stats.mstats.kendalltau` and by R. If the input does not contain ties there is no change w.r.t. the previous implementation.

The function `scipy.linalg.block_diag` will not ignore zero-sized matrices anymore. Instead it will insert rows or columns of zeros of the appropriate size. See gh-4908 for more details.

Other changes
=============

SciPy wheels will now report their dependency on ``numpy`` on all platforms. This change was made because NumPy wheels are available, and because the pip upgrade behavior is finally changing for the better (use ``--upgrade-strategy=only-if-needed`` for ``pip >= 8.2``; that behavior will become the default in the next major version of ``pip``).

Numerical values returned by `scipy.interpolate.interp1d` with ``kind="cubic"`` and ``"quadratic"`` may change relative to previous scipy versions. If your code depended on specific numeric values (i.e., on implementation details of the interpolators), you may want to double-check your results.

Authors
=======

* @endolith * Max Argus + * Hervé
Audren * Alessandro Pietro Bardelli + * Michael Benfield + * Felix Berkenkamp * Matthew Brett * Per Brodtkorb * Evgeni Burovski * Pierre de Buyl * CJ Carey * Brandon Carter + * Tim Cera * Klesk Chonkin * Christian Häggström + * Luca Citi * Peadar Coyle + * Daniel da Silva + * Greg Dooper + * John Draper + * drlvk + * David Ellis + * Yu Feng * Baptiste Fontaine + * Jed Frey + * Siddhartha Gandhi + * Wim Glenn + * Akash Goel + * Christoph Gohlke * Ralf Gommers * Alexander Goncearenco + * Richard Gowers + * Alex Griffing * Radoslaw Guzinski + * Charles Harris * Callum Jacob Hays + * Ian Henriksen * Randy Heydon + * Lindsey Hiltner + * Gerrit Holl + * Hiroki IKEDA + * jfinkels + * Mher Kazandjian + * Thomas Keck + * keuj6 + * Kornel Kielczewski + * Sergey B Kirpichev + * Vasily Kokorev + * Eric Larson * Denis Laxalde * Gregory R. Lee * Josh Lefler + * Julien Lhermitte + * Evan Limanto + * Jin-Guo Liu + * Nikolay Mayorov * Geordie McBain + * Josue Melka + * Matthieu Melot * michaelvmartin15 + * Surhud More + * Brett M. 
Morris + * Chris Mutel + * Paul Nation * Andrew Nelson * David Nicholson + * Aaron Nielsen + * Joel Nothman * nrnrk + * Juan Nunez-Iglesias * Mikhail Pak + * Gavin Parnaby + * Thomas Pingel + * Ilhan Polat + * Aman Pratik + * Sebastian Pucilowski * Ted Pudlik * puenka + * Eric Quintero * Tyler Reddy * Joscha Reimer * Antonio Horta Ribeiro + * Edward Richards + * Roman Ring + * Rafael Rossi + * Colm Ryan + * Sami Salonen + * Alvaro Sanchez-Gonzalez + * Johannes Schmitz * Kari Schoonbee * Yurii Shevchuk + * Jonathan Siebert + * Jonathan Tammo Siebert + * Scott Sievert + * Sourav Singh * Byron Smith + * Srikiran + * Samuel St-Jean + * Yoni Teitelbaum + * Bhavika Tekwani * Martin Thoma * timbalam + * Svend Vanderveken + * Sebastiano Vigna + * Aditya Vijaykumar + * Santi Villalba + * Ze Vinicius * Pauli Virtanen * Matteo Visconti * Yusuke Watanabe + * Warren Weckesser * Phillip Weinberg + * Nils Werner * Jakub Wilk * Josh Wilson * wirew0rm + * David Wolever + * Nathan Woods * ybeltukov + * G Young * Evgeny Zhurko + A total of 121 people contributed to this release. People with a "+" by their names contributed a patch for the first time. This list of names is automatically generated, and may not be fully complete. Issues closed for 0.19.0 - ------------------------ - - `#1767 `__: Function definitions in __fitpack.h should be moved. (Trac #1240) - - `#1774 `__: _kmeans chokes on large thresholds (Trac #1247) - - `#2089 `__: Integer overflows cause segfault in linkage function with large... - - `#2190 `__: Are odd-length window functions supposed to be always symmetrical?... - - `#2251 `__: solve_discrete_are in scipy.linalg does (sometimes) not solve... - - `#2580 `__: scipy.interpolate.UnivariateSpline (or a new superclass of it)... 
- - `#2592 `__: scipy.stats.anderson assumes gumbel_l - - `#3054 `__: scipy.linalg.eig does not handle infinite eigenvalues - - `#3160 `__: multinomial pmf / logpmf - - `#3904 `__: scipy.special.ellipj dn wrong values at quarter period - - `#4044 `__: Inconsistent code book initialization in kmeans - - `#4234 `__: scipy.signal.flattop documentation doesn't list a source for... - - `#4831 `__: Bugs in C code in __quadpack.h - - `#4908 `__: bug: unnessesary validity check for block dimension in scipy.sparse.block_diag - - `#4917 `__: BUG: indexing error for sparse matrix with ix_ - - `#4938 `__: Docs on extending ndimage need to be updated. - - `#5056 `__: sparse matrix element-wise multiplying dense matrix returns dense... - - `#5337 `__: Formula in documentation for correlate is wrong - - `#5537 `__: use OrderedDict in io.netcdf - - `#5750 `__: [doc] missing data index value in KDTree, cKDTree - - `#5755 `__: p-value computation in scipy.stats.kendalltau() in broken in... - - `#5757 `__: BUG: Incorrect complex output of signal.spectrogram - - `#5964 `__: ENH: expose scalar versions of scipy.special functions to cython - - `#6107 `__: scipy.cluster.hierarchy.single segmentation fault with 2**16... - - `#6278 `__: optimize.basinhopping should take a RandomState object - - `#6296 `__: InterpolatedUnivariateSpline: check_finite fails when w is unspecified - - `#6306 `__: Anderson-Darling bad results - - `#6314 `__: scipy.stats.kendaltau() p value not in agreement with R, SPSS... - - `#6340 `__: Curve_fit bounds and maxfev - - `#6377 `__: expm_multiply, complex matrices not working using start,stop,ect... - - `#6382 `__: optimize.differential_evolution stopping criterion has unintuitive... - - `#6391 `__: Global Benchmarking times out at 600s. - - `#6397 `__: mmwrite errors with large (but still 64-bit) integers - - `#6413 `__: scipy.stats.dirichlet computes multivariate gaussian differential... 
- - `#6428 `__: scipy.stats.mstats.mode modifies input - - `#6440 `__: Figure out ABI break policy for scipy.special Cython API - - `#6441 `__: Using Qhull for halfspace intersection : segfault - - `#6442 `__: scipy.spatial : In incremental mode volume is not recomputed - - `#6451 `__: Documentation for scipy.cluster.hierarchy.to_tree is confusing... - - `#6490 `__: interp1d (kind=zero) returns wrong value for rightmost interpolation... - - `#6521 `__: scipy.stats.entropy does *not* calculate the KL divergence - - `#6530 `__: scipy.stats.spearmanr unexpected NaN handling - - `#6541 `__: Test runner does not run scipy._lib/tests? - - `#6552 `__: BUG: misc.bytescale returns unexpected results when using cmin/cmax... - - `#6556 `__: RectSphereBivariateSpline(u, v, r) fails if min(v) >= pi - - `#6559 `__: Differential_evolution maxiter causing memory overflow - - `#6565 `__: Coverage of spectral functions could be improved - - `#6628 `__: Incorrect parameter name in binomial documentation - - `#6634 `__: Expose LAPACK's xGESVX family for linalg.solve ill-conditioned... - - `#6657 `__: Confusing documentation for `scipy.special.sph_harm` - - `#6676 `__: optimize: Incorrect size of Jacobian returned by `minimize(...,... - - `#6681 `__: add a new context manager to wrap `scipy.special.seterr` - - `#6700 `__: BUG: scipy.io.wavfile.read stays in infinite loop, warns on wav... - - `#6721 `__: scipy.special.chebyt(N) throw a 'TypeError' when N > 64 - - `#6727 `__: Documentation for scipy.stats.norm.fit is incorrect - - `#6764 `__: Documentation for scipy.spatial.Delaunay is partially incorrect - - `#6811 `__: scipy.spatial.SphericalVoronoi fails for large number of points - - `#6841 `__: spearmanr fails when nan_policy='omit' is set - - `#6869 `__: Currently in gaussian_kde, the logpdf function is calculated... - - `#6875 `__: SLSQP inconsistent handling of invalid bounds - - `#6876 `__: Python stopped working (Segfault?) with minimum/maximum filter... 
- - `#6889 `__: dblquad gives different results under scipy 0.17.1 and 0.18.1 - - `#6898 `__: BUG: dblquad ignores error tolerances - - `#6901 `__: Solving sparse linear systems in CSR format with complex values - - `#6903 `__: issue in spatial.distance.pdist docstring - - `#6917 `__: Problem in passing drop_rule to scipy.sparse.linalg.spilu - - `#6926 `__: signature mismatches for LowLevelCallable - - `#6961 `__: Scipy contains shebang pointing to /usr/bin/python and /bin/bash... - - `#6972 `__: BUG: special: `generate_ufuncs.py` is broken - - `#6984 `__: Assert raises test failure for test_ill_condition_warning - - `#6990 `__: BUG: sparse: Bad documentation of the `k` argument in `sparse.linalg.eigs` - - `#6991 `__: Division by zero in linregress() - - `#7011 `__: possible speed improvment in rv_continuous.fit() - - `#7015 `__: Test failure with Python 3.5 and numpy master - - `#7055 `__: SciPy 0.19.0rc1 test errors and failures on Windows - - `#7096 `__: macOS test failues for test_solve_continuous_are - - `#7100 `__: test_distance.test_Xdist_deprecated_args test error in 0.19.0rc2 Pull requests for 0.19.0 - ------------------------ - - `#2908 `__: Scipy 1.0 Roadmap - - `#3174 `__: add b-splines - - `#4606 `__: ENH: Add a unit impulse waveform function - - `#5608 `__: Adds keyword argument to choose faster convolution method - - `#5647 `__: ENH: Faster count_neighour in cKDTree / + weighted input data - - `#6021 `__: Netcdf append - - `#6058 `__: ENH: scipy.signal - Add stft and istft - - `#6059 `__: ENH: More accurate signal.freqresp for zpk systems - - `#6195 `__: ENH: Cython interface for special - - `#6234 `__: DOC: Fixed a typo in ward() help - - `#6261 `__: ENH: add docstring and clean up code for signal.normalize - - `#6270 `__: MAINT: special: add tests for cdflib - - `#6271 `__: Fix for scipy.cluster.hierarchy.is_isomorphic - - `#6273 `__: optimize: rewrite while loops as for loops - - `#6279 `__: MAINT: Bessel tweaks - - `#6291 `__: Fixes gh-6219: 
remove runtime warning from genextreme distribution - - `#6294 `__: STY: Some PEP8 and cleaning up imports in stats/_continuous_distns.py - - `#6297 `__: Clarify docs in misc/__init__.py - - `#6300 `__: ENH: sparse: Loosen input validation for `diags` with empty inputs - - `#6301 `__: BUG: standardizes check_finite behavior re optional weights,... - - `#6303 `__: Fixing example in _lazyselect docstring. - - `#6307 `__: MAINT: more improvements to gammainc/gammaincc - - `#6308 `__: Clarified documentation of hypergeometric distribution. - - `#6309 `__: BUG: stats: Improve calculation of the Anderson-Darling statistic. - - `#6315 `__: ENH: Descending order of x in PPoly - - `#6317 `__: ENH: stats: Add support for nan_policy to stats.median_test - - `#6321 `__: TST: fix a typo in test name - - `#6328 `__: ENH: sosfreqz - - `#6335 `__: Define LinregressResult outside of linregress - - `#6337 `__: In anderson test, added support for right skewed gumbel distribution. - - `#6341 `__: Accept several spellings for the curve_fit max number of function... - - `#6342 `__: DOC: cluster: clarify hierarchy.linkage usage - - `#6352 `__: DOC: removed brentq from its own 'see also' - - `#6362 `__: ENH: stats: Use explicit formulas for sf, logsf, etc in weibull... - - `#6369 `__: MAINT: special: add a comment to hyp0f1_complex - - `#6375 `__: Added the multinomial distribution. - - `#6387 `__: MAINT: special: improve accuracy of ellipj's `dn` at quarter... - - `#6388 `__: BenchmarkGlobal - getting it to work in Python3 - - `#6394 `__: ENH: scipy.sparse: add save and load functions for sparse matrices - - `#6400 `__: MAINT: moves global benchmark run from setup_cache to track_all - - `#6403 `__: ENH: seed kwd for basinhopping. Closes #6278 - - `#6404 `__: ENH: signal: added irrnotch and iirpeak functions. - - `#6406 `__: ENH: special: extend `sici`/`shichi` to complex arguments - - `#6407 `__: ENH: Window functions should not accept non-integer or negative... 
- - `#6408 `__: MAINT: _differentialevolution now uses _lib._util.check_random_state - - `#6427 `__: MAINT: Fix gmpy build & test that mpmath uses gmpy - - `#6439 `__: MAINT: ndimage: update callback function c api - - `#6443 `__: BUG: Fix volume computation in incremental mode - - `#6447 `__: Fixes issue #6413 - Minor documentation fix in the entropy function... - - `#6448 `__: ENH: Add halfspace mode to Qhull - - `#6449 `__: ENH: rtol and atol for differential_evolution termination fixes... - - `#6453 `__: DOC: Add some See Also links between similar functions - - `#6454 `__: DOC: linalg: clarify callable signature in `ordqz` - - `#6457 `__: ENH: spatial: enable non-double dtypes in squareform - - `#6459 `__: BUG: Complex matrices not handled correctly by expm_multiply... - - `#6465 `__: TST DOC Window docs, tests, etc. - - `#6469 `__: ENH: linalg: better handling of infinite eigenvalues in `eig`/`eigvals` - - `#6475 `__: DOC: calling interp1d/interp2d with NaNs is undefined - - `#6477 `__: Document magic numbers in optimize.py - - `#6481 `__: TST: Supress some warnings from test_windows - - `#6485 `__: DOC: spatial: correct typo in procrustes - - `#6487 `__: Fix Bray-Curtis formula in pdist docstring - - `#6493 `__: ENH: Add covariance functionality to scipy.optimize.curve_fit - - `#6494 `__: ENH: stats: Use log1p() to improve some calculations. - - `#6495 `__: BUG: Use MST algorithm instead of SLINK for single linkage clustering - - `#6497 `__: MRG: Add minimum_phase filter function - - `#6505 `__: reset scipy.signal.resample window shape to 1-D - - `#6507 `__: BUG: linkage: Raise exception if y contains non-finite elements - - `#6509 `__: ENH: _lib: add common machinery for low-level callback functions - - `#6520 `__: scipy.sparse.base.__mul__ non-numpy/scipy objects with 'shape'... 
- - `#6522 `__: Replace kl_div by rel_entr in entropy - - `#6524 `__: DOC: add next_fast_len to list of functions - - `#6527 `__: DOC: Release notes to reflect the new covariance feature in optimize.curve_fit - - `#6532 `__: ENH: Simplify _cos_win, document it, add symmetric/periodic arg - - `#6535 `__: MAINT: sparse.csgraph: updating old cython loops - - `#6540 `__: DOC: add to documentation of orthogonal polynomials - - `#6544 `__: TST: Ensure tests for scipy._lib are run by scipy.test() - - `#6546 `__: updated docstring of stats.linregress - - `#6553 `__: commited changes that I originally submitted for scipy.signal.cspline? - - `#6561 `__: BUG: modify signal.find_peaks_cwt() to return array and accept... - - `#6562 `__: DOC: Negative binomial distribution clarification - - `#6563 `__: MAINT: be more liberal in requiring numpy - - `#6567 `__: MAINT: use xrange for iteration in differential_evolution fixes... - - `#6572 `__: BUG: "sp.linalg.solve_discrete_are" fails for random data - - `#6578 `__: BUG: misc: allow both cmin/cmax and low/high params in bytescale - - `#6581 `__: Fix some unfortunate typos - - `#6582 `__: MAINT: linalg: make handling of infinite eigenvalues in `ordqz`... - - `#6585 `__: DOC: interpolate: correct seealso links to ndimage - - `#6588 `__: Update docstring of scipy.spatial.distance_matrix - - `#6592 `__: DOC: Replace 'first' by 'smallest' in mode - - `#6593 `__: MAINT: remove scipy.weave submodule - - `#6594 `__: DOC: distance.squareform: fix html docs, add note about dtype... - - `#6598 `__: [DOC] Fix incorrect error message in medfilt2d - - `#6599 `__: MAINT: linalg: turn a `solve_discrete_are` test back on - - `#6600 `__: DOC: Add SOS goals to roadmap - - `#6601 `__: DEP: Raise minimum numpy version to 1.8.2 - - `#6605 `__: MAINT: 'new' module is deprecated, don't use it - - `#6607 `__: DOC: add note on change in wheel dependency on numpy and pip... 
- - `#6609 `__: Fixes #6602 - Typo in docs - - `#6616 `__: ENH: generalization of continuous and discrete Riccati solvers... - - `#6621 `__: DOC: improve cluster.hierarchy docstrings. - - `#6623 `__: CS matrix prune method should copy data from large unpruned arrays - - `#6625 `__: DOC: special: complete documentation of `eval_*` functions - - `#6626 `__: TST: special: silence some deprecation warnings - - `#6631 `__: fix parameter name doc for discrete distributions - - `#6632 `__: MAINT: stats: change some instances of `special` to `sc` - - `#6633 `__: MAINT: refguide: py2k long integers are equal to py3k integers - - `#6638 `__: MAINT: change type declaration in cluster.linkage, prevent overflow - - `#6640 `__: BUG: fix issue with duplicate values used in cluster.vq.kmeans - - `#6641 `__: BUG: fix corner case in cluster.vq.kmeans for large thresholds - - `#6643 `__: MAINT: clean up truncation modes of dendrogram - - `#6645 `__: MAINT: special: rename `*_roots` functions - - `#6646 `__: MAINT: clean up mpmath imports - - `#6647 `__: DOC: add sqrt to Mahalanobis description for pdist - - `#6648 `__: DOC: special: add a section on `cython_special` to the tutorial - - `#6649 `__: ENH: Added scipy.spatial.distance.directed_hausdorff - - `#6650 `__: DOC: add Sphinx roles for DOI and arXiv links - - `#6651 `__: BUG: mstats: make sure mode(..., None) does not modify its input - - `#6652 `__: DOC: special: add section to tutorial on functions not in special - - `#6653 `__: ENH: special: add the Wright Omega function - - `#6656 `__: ENH: don't coerce input to double with custom metric in cdist... 
- - `#6658 `__: Faster/shorter code for computation of discordances - - `#6659 `__: DOC: special: make __init__ summaries and html summaries match - - `#6661 `__: general.rst: Fix a typo - - `#6664 `__: TST: Spectral functions' window correction factor - - `#6665 `__: [DOC] Conditions on v in RectSphereBivariateSpline - - `#6668 `__: DOC: Mention negative masses for center of mass - - `#6675 `__: MAINT: special: remove outdated README - - `#6677 `__: BUG: Fixes computation of p-values. - - `#6679 `__: BUG: optimize: return correct Jacobian for method 'SLSQP' in... - - `#6680 `__: ENH: Add structural rank to sparse.csgraph - - `#6686 `__: TST: Added Airspeed Velocity benchmarks for SphericalVoronoi - - `#6687 `__: DOC: add section "deciding on new features" to developer guide. - - `#6691 `__: ENH: Clearer error when fmin_slsqp obj doesn't return scalar - - `#6702 `__: TST: Added airspeed velocity benchmarks for scipy.spatial.distance.cdist - - `#6707 `__: TST: interpolate: test fitpack wrappers, not _impl - - `#6709 `__: TST: fix a number of test failures on 32-bit systems - - `#6711 `__: MAINT: move function definitions from __fitpack.h to _fitpackmodule.c - - `#6712 `__: MAINT: clean up wishlist in stats.morestats, and copyright statement. - - `#6715 `__: DOC: update the release notes with BSpline et al. - - `#6716 `__: MAINT: scipy.io.wavfile: No infinite loop when trying to read... - - `#6717 `__: some style cleanup - - `#6723 `__: BUG: special: cast to float before in-place multiplication in... - - `#6726 `__: address performance regressions in interp1d - - `#6728 `__: DOC: made code examples in `integrate` tutorial copy-pasteable - - `#6731 `__: DOC: scipy.optimize: Added an example for wrapping complex-valued... 
- - `#6732 `__: MAINT: cython_special: remove `errprint` - - `#6733 `__: MAINT: special: fix some pyflakes warnings - - `#6734 `__: DOC: sparse.linalg: fixed matrix description in `bicgstab` doc - - `#6737 `__: BLD: update `cythonize.py` to detect changes in pxi files - - `#6740 `__: DOC: special: some small fixes to docstrings - - `#6741 `__: MAINT: remove dead code in interpolate.py - - `#6742 `__: BUG: fix ``linalg.block_diag`` to support zero-sized matrices. - - `#6744 `__: ENH: interpolate: make PPoly.from_spline accept BSpline objects - - `#6746 `__: DOC: special: clarify use of Condon-Shortley phase in `sph_harm`/`lpmv` - - `#6750 `__: ENH: sparse: avoid densification on broadcasted elem-wise mult - - `#6751 `__: sinm doc explained cosm - - `#6753 `__: ENH: special: allow for more fine-tuned error handling - - `#6759 `__: Move logsumexp and pade from scipy.misc to scipy.special and... - - `#6761 `__: ENH: argmax and argmin methods for sparse matrices - - `#6762 `__: DOC: Improve docstrings of sparse matrices - - `#6763 `__: ENH: Weighted tau - - `#6768 `__: ENH: cythonized spherical Voronoi region polygon vertex sorting - - `#6770 `__: Correction of Delaunay class' documentation - - `#6775 `__: ENH: Integrating LAPACK "expert" routines with conditioning warnings... - - `#6776 `__: MAINT: Removing the trivial f2py warnings - - `#6777 `__: DOC: Update rv_continuous.fit doc. - - `#6778 `__: MAINT: cluster.hierarchy: Improved wording of error msgs - - `#6786 `__: BLD: increase minimum Cython version to 0.23.4 - - `#6787 `__: DOC: expand on ``linalg.block_diag`` changes in 0.19.0 release... - - `#6789 `__: ENH: Add further documentation for norm.fit - - `#6790 `__: MAINT: Fix a potential problem in nn_chain linkage algorithm - - `#6791 `__: DOC: Add examples to scipy.ndimage.fourier - - `#6792 `__: DOC: fix some numpydoc / Sphinx issues. 
- - `#6793 `__: MAINT: fix circular import after moving functions out of misc - - `#6796 `__: TST: test importing each submodule. Regression test for gh-6793. - - `#6799 `__: ENH: stats: Argus distribution - - `#6801 `__: ENH: stats: Histogram distribution - - `#6803 `__: TST: make sure tests for ``_build_utils`` are run. - - `#6804 `__: MAINT: more fixes in `loggamma` - - `#6806 `__: ENH: Faster linkage for 'centroid' and 'median' methods - - `#6810 `__: ENH: speed up upfirdn and resample_poly for n-dimensional arrays - - `#6812 `__: TST: Added ConvexHull asv benchmark code - - `#6814 `__: ENH: Different extrapolation modes for different dimensions in... - - `#6826 `__: Signal spectral window default fix - - `#6828 `__: BUG: SphericalVoronoi Space Complexity (Fixes #6811) - - `#6830 `__: RealData docstring correction - - `#6834 `__: DOC: Added reference for skewtest function. See #6829 - - `#6836 `__: DOC: Added mode='mirror' in the docstring for the functions accepting... - - `#6838 `__: MAINT: sparse: start removing old BSR methods - - `#6844 `__: handle incompatible dimensions when input is not an ndarray in... - - `#6847 `__: Added maxiter to golden search. - - `#6850 `__: BUG: added check for optional param scipy.stats.spearmanr - - `#6858 `__: MAINT: Removing redundant tests - - `#6861 `__: DEP: Fix escape sequences deprecated in Python 3.6. - - `#6862 `__: DOC: dx should be float, not int - - `#6863 `__: updated documentation curve_fit - - `#6866 `__: DOC : added some documentation to j1 referring to spherical_jn - - `#6867 `__: DOC: cdist move long examples list into Notes section - - `#6868 `__: BUG: Make stats.mode return a ModeResult namedtuple on empty... - - `#6871 `__: Corrected documentation. 
- - `#6874 `__: ENH: gaussian_kde.logpdf based on logsumexp - - `#6877 `__: BUG: ndimage: guard against footprints of all zeros - - `#6881 `__: python 3.6 - - `#6885 `__: Vectorized integrate.fixed_quad - - `#6886 `__: fixed typo - - `#6891 `__: TST: fix failures for linalg.dare/care due to tightened test... - - `#6892 `__: DOC: fix a bunch of Sphinx errors. - - `#6894 `__: TST: Added asv benchmarks for scipy.spatial.Voronoi - - `#6908 `__: BUG: Fix return dtype for complex input in spsolve - - `#6909 `__: ENH: fftpack: use float32 routines for float16 inputs. - - `#6911 `__: added min/max support to binned_statistic - - `#6913 `__: Fix 6875: SLSQP raise ValueError for all invalid bounds. - - `#6914 `__: DOCS: GH6903 updating docs of Spatial.distance.pdist - - `#6916 `__: MAINT: fix some issues for 32-bit Python - - `#6924 `__: BLD: update Bento build for scipy.LowLevelCallable - - `#6932 `__: ENH: Use OrderedDict in io.netcdf. Closes gh-5537 - - `#6933 `__: BUG: fix LowLevelCallable issue on 32-bit Python. - - `#6936 `__: BUG: sparse: handle size-1 2D indexes correctly - - `#6938 `__: TST: fix test failures in special on 32-bit Python. - - `#6939 `__: Added attributes list to cKDTree docstring - - `#6940 `__: improve efficiency of dok_matrix.tocoo - - `#6942 `__: DOC: add link to liac-arff package in the io.arff docstring. - - `#6943 `__: MAINT: Docstring fixes and an additional test for linalg.solve - - `#6944 `__: DOC: Add example of odeint with a banded Jacobian to the integrate... 
- - `#6946 `__: ENH: hypergeom.logpmf in terms of betaln - - `#6947 `__: TST: speedup distance tests - - `#6948 `__: DEP: Deprecate the keyword "debug" from linalg.solve - - `#6950 `__: BUG: Correctly treat large integers in MMIO (fixes #6397) - - `#6952 `__: ENH: Minor user-friendliness cleanup in LowLevelCallable - - `#6956 `__: DOC: improve description of 'output' keyword for convolve - - `#6957 `__: ENH more informative error in sparse.bmat - - `#6962 `__: Shebang fixes - - `#6964 `__: DOC: note argmin/argmax addition - - `#6965 `__: BUG: Fix issues passing error tolerances in dblquad and tplquad. - - `#6971 `__: fix the docstring of signaltools.correlate - - `#6973 `__: Silence expected numpy warnings in scipy.ndimage.interpolation.zoom() - - `#6975 `__: BUG: special: fix regex in `generate_ufuncs.py` - - `#6976 `__: Update docstring for griddata - - `#6978 `__: Avoid division by zero in zoom factor calculation - - `#6979 `__: BUG: ARE solvers did not check the generalized case carefully - - `#6985 `__: ENH: sparse: add scipy.sparse.linalg.spsolve_triangular - - `#6994 `__: MAINT: spatial: updates to plotting utils - - `#6995 `__: DOC: Bad documentation of k in sparse.linalg.eigs See #6990 - - `#6997 `__: TST: Changed the test with a less singular example - - `#7000 `__: DOC: clarify interp1d 'zero' argument - - `#7007 `__: BUG: Fix division by zero in linregress() for 2 data points - - `#7009 `__: BUG: Fix problem in passing drop_rule to scipy.sparse.linalg.spilu - - `#7012 `__: speed improvment in _distn_infrastructure.py - - `#7014 `__: Fix Typo: add a single quotation mark to fix a slight typo - - `#7021 `__: MAINT: stats: use machine constants from np.finfo, not machar - - `#7026 `__: MAINT: update .mailmap - - `#7032 `__: Fix layout of rv_histogram docs - - `#7035 `__: DOC: update 0.19.0 release notes - - `#7036 `__: ENH: Add more boundary options to signal.stft - - `#7040 `__: TST: stats: skip too slow tests - - `#7042 `__: MAINT: sparse: speed up 
setdiag tests - - `#7043 `__: MAINT: refactory and code cleaning Xdist - - `#7053 `__: Fix msvc 9 and 10 compile errors - - `#7060 `__: DOC: updated release notes with #7043 and #6656 - - `#7062 `__: MAINT: Change defaut STFT boundary kwarg to "zeros" - - `#7064 `__: Fix ValueError: path is on mount 'X:', start on mount 'D:' on... - - `#7067 `__: TST: Fix PermissionError: [Errno 13] Permission denied on Windows - - `#7068 `__: TST: Fix UnboundLocalError: local variable 'data' referenced... - - `#7069 `__: Fix OverflowError: Python int too large to convert to C long... - - `#7071 `__: TST: silence RuntimeWarning for nan test of stats.spearmanr - - `#7072 `__: Fix OverflowError: Python int too large to convert to C long... - - `#7084 `__: TST: linalg: bump tolerance in test_falker - - `#7095 `__: TST: linalg: bump more tolerances in test_falker - - `#7101 `__: TST: Relax solve_continuous_are test case 2 and 12 - - `#7106 `__: BUG: stop cdist "correlation" modifying input - - `#7116 `__: Backports to 0.19.0rc2 Checksums ========= MD5 ~~~ dde4d5d44a0274a5abb01be4a3cd486a scipy-0.19.0-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 08809612b46e660e567e3272ec11c808 scipy-0.19.0-cp27-cp27m-manylinux1_i686.whl 0e49f7fc8d31c1c79f0a4d63b29e8a1f scipy-0.19.0-cp27-cp27m-manylinux1_x86_64.whl a2669158cf847856d292b8a60cdaa170 scipy-0.19.0-cp27-cp27mu-manylinux1_i686.whl adfa1f5127a789165dfe9ff140ec0d6e scipy-0.19.0-cp27-cp27mu-manylinux1_x86_64.whl d568c9f60683c33b81ebc1c39eea198a scipy-0.19.0-cp34-cp34m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl a90148fec477c1950578b40a1197509f scipy-0.19.0-cp34-cp34m-manylinux1_i686.whl ed27be5380e0aaf0229adf747e760f8c scipy-0.19.0-cp34-cp34m-manylinux1_x86_64.whl 4cda63dc7b73bd03bdf9e8ebc6027526 scipy-0.19.0-cp35-cp35m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 
ff8e652b5e918b276793f1ce542a5959 scipy-0.19.0-cp35-cp35m-manylinux1_i686.whl 60741a900a145eb924ec861ec2743582 scipy-0.19.0-cp35-cp35m-manylinux1_x86_64.whl 81685a961d6118459b7787e8465c8d36 scipy-0.19.0-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 83f0750862c80a659686797d4ec9bca0 scipy-0.19.0-cp36-cp36m-manylinux1_i686.whl 3cbb30615496fbbf9b52c9a643c6fe5e scipy-0.19.0-cp36-cp36m-manylinux1_x86_64.whl 735cdb6fbfcb9917535749816202d0af scipy-0.19.0.tar.gz b21466e87a642940fb9ba35be74940a3 scipy-0.19.0.tar.xz 91b8396231eec780222a57703d3ec550 scipy-0.19.0.zip SHA256 ~~~~~~ 517a85600d6574fef1a67e6d2001b847c27c8bfd136f7a12879c3f91e7bb291f scipy-0.19.0-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 3c34987ee52fd98b34f2e4a6d277b452d49056f1383550acc54c5bab408a194c scipy-0.19.0-cp27-cp27m-manylinux1_i686.whl eabdfde8276007e9aec9a400f9528645001a30f7d78b04a0ab215183d9523e2a scipy-0.19.0-cp27-cp27m-manylinux1_x86_64.whl fa67bbb0a3225fcd8610d693e7b2ca08fda107359e48229f7b83593bbb70cc97 scipy-0.19.0-cp27-cp27mu-manylinux1_i686.whl 4147b97709e75822e73d312e4d262410baafa961a7b11649a7b4b7c2d41fb4fe scipy-0.19.0-cp27-cp27mu-manylinux1_x86_64.whl 663e78bfa197376547424aff9fb5009e7b2f26855ee5aaf1a2ddbb2f4dc6af3b scipy-0.19.0-cp34-cp34m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 2e70ded029d51f6a48d4b2a154f583b85aa2e3290dfd71e0b6bbcfe9454cffdd scipy-0.19.0-cp34-cp34m-manylinux1_i686.whl 4b2731a191dfa48a05b2f5bc18881595a1418092092ecdd8d3feab80f72adc96 scipy-0.19.0-cp34-cp34m-manylinux1_x86_64.whl 57f7be33f1009ad6199132e8a7e5d4c9727224680d8cbc4596a2a8935a86f96b scipy-0.19.0-cp35-cp35m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 0496f2b204a63cde3797e5452bf671ee25afc11bd9489ae69cd4dccee13083a1 scipy-0.19.0-cp35-cp35m-manylinux1_i686.whl 
1bcf71f2e534a1aabf9f075700701bf3af434120b1b114dfa4723d02e076ed1f scipy-0.19.0-cp35-cp35m-manylinux1_x86_64.whl e1c45905f550b5f14e1f47697c92bab5c1e6ba77da5a441bd2affa4621c41b26 scipy-0.19.0-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl 47f537214293a5d74b05d217afa49b582b7fd9428ec9ea64be69210cfc56611a scipy-0.19.0-cp36-cp36m-manylinux1_i686.whl f5a3a7dcbeb345c227770029870aeb547a3c207a6cbc0106d6157139fd0c23e9 scipy-0.19.0-cp36-cp36m-manylinux1_x86_64.whl eba12b5f757a8c839b26a06f4936ecb80b65cb3674981ee25449b2a55663abe8 scipy-0.19.0.tar.gz ed52232afb2b321a4978e39040d94bf81af90176ba64f58c4499dc305a024437 scipy-0.19.0.tar.xz 4190d34bf9a09626cd42100bbb12e3d96b2daf1a8a3244e991263eb693732122 scipy-0.19.0.zip -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.11 (GNU/Linux) iQEcBAEBCAAGBQJYwWscAAoJEIp0pQ0zQcu+JG8H+gOy04ecVl+IBisy/Fz5wfuy xrsCT5yhRPQpxKph8g/6Us5Oh7s8ixrVt9gTVccAspmWXQJhMy5kcppC3s5WsvU+ jFOwUnW7c9QvtIf2ZD9Ay/56WojlXg1ui17MqoCbmkEn2QE8KTKu93hIZpVD5wmV 1fhd7u/ieeQ7sfj6gMZzt0AGpVjnGedEzHRY4zI0PkiCY+Ex8sc2W8G2h5Qbnx9r KoqECIuesLQzVbNgPhWaWaiE1TNX0EJYdWQll0T8scI4opUdg6vEaR05aPhxeR1r KaGEnvfASTZ369COkuVB4JINlKQj0dwLBFIr9NGVzX4vU74GMh5TuDfJlA/mvGU= =bOnA -----END PGP SIGNATURE----- From ashwin.pathak at students.iiit.ac.in Thu Mar 9 12:29:21 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Thu, 09 Mar 2017 22:59:21 +0530 Subject: [SciPy-Dev] Running Tests Message-ID: <013410d0d282beadc8cdc835f5af1dd3@students.iiit.ac.in> Hello all, I was learning about nose and test suites, I am not able to figure out how to run test for particular functions that I changed. Can someone help me know the procedure as to how to run tests for individual functions. 
Thank you

From pmhobson at gmail.com  Thu Mar  9 12:34:09 2017
From: pmhobson at gmail.com (Paul Hobson)
Date: Thu, 9 Mar 2017 09:34:09 -0800
Subject: [SciPy-Dev] Running Tests
In-Reply-To: <013410d0d282beadc8cdc835f5af1dd3@students.iiit.ac.in>
References: <013410d0d282beadc8cdc835f5af1dd3@students.iiit.ac.in>
Message-ID: 

Ashwin,

I typically do:

<test-runner> path/to/my/test_file.py:test_my_function

where <test-runner> is nosetests or pytest or whatever.

You can read more about that here:
http://stackoverflow.com/questions/3704473/how-do-i-run-a-single-test-with-nose-in-pylons

On Thu, Mar 9, 2017 at 9:29 AM, ashwin.pathak <
ashwin.pathak at students.iiit.ac.in> wrote:

> Hello all,
> I was learning about nose and test suites, I am not able to figure out
> how to run test for particular functions that I changed. Can someone help
> me know the procedure as to how to run tests for individual functions.
> Thank you
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at scipy.org
> https://mail.scipy.org/mailman/listinfo/scipy-dev
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evgeny.burovskiy at gmail.com  Thu Mar  9 12:59:15 2017
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Thu, 9 Mar 2017 20:59:15 +0300
Subject: [SciPy-Dev] Running Tests
In-Reply-To: 
References: <013410d0d282beadc8cdc835f5af1dd3@students.iiit.ac.in>
Message-ID: 

09.03.2017 20:34, "Paul Hobson" wrote:
>
> Ashwin,
>
> I typically do:
>
> <test-runner> path/to/my/test_file.py:test_my_function
>
> where <test-runner> is nosetests or pytest or whatever.
>
> You can read more about that here:
> http://stackoverflow.com/questions/3704473/how-do-i-run-a-single-test-with-nose-in-pylons
>
> On Thu, Mar 9, 2017 at 9:29 AM, ashwin.pathak <
ashwin.pathak at students.iiit.ac.in> wrote:
>>
>> Hello all,
>> I was learning about nose and test suites, I am not able to figure out
how to run test for particular functions that I changed.
Can someone help me know the procedure as to how to run tests for
individual functions.
>> Thank you

What Paul said, or

$ python runtests.py -t path/to/test/file:test_function

or

$ python runtests.py -s submodule

Evgeni

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From matthew.brett at gmail.com  Thu Mar  9 13:26:59 2017
From: matthew.brett at gmail.com (Matthew Brett)
Date: Thu, 9 Mar 2017 10:26:59 -0800
Subject: [SciPy-Dev] [Numpy-discussion] ANN: SciPy 0.19.0
In-Reply-To: 
References: 
Message-ID: 

On Thu, Mar 9, 2017 at 7:54 AM, Evgeni Burovski wrote:
> On behalf of the Scipy development team I am pleased to announce the
> availability of Scipy 0.19.0. This release contains several great new
> features and a large number of bug fixes and various improvements, as
> detailed in the release notes below.
> 121 people contributed to this release over the course of seven months.

Many thanks to you Evgeni for all your hard work on making the release.

Cheers,

Matthew

From antonior92 at gmail.com  Thu Mar  9 17:38:49 2017
From: antonior92 at gmail.com (Antonio Ribeiro)
Date: Thu, 9 Mar 2017 19:38:49 -0300
Subject: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy
Message-ID: 

Hello, my name is Antonio and I am a Brazilian electrical engineer
currently pursuing my master's degree. I have contributed to scipy.optimize
and scipy.signal, implementing the functions "iirnotch" and "iirpeak"
and the method "trust-region-exact" (under revision). I am interested in
applying for the Google Summer of Code 2017 to work with the Scipy
optimisation package.

My proposal is to improve scipy.optimize by adding optimisation methods
that are able to deal with non-linear constraints. Currently the only
implemented methods able to deal with non-linear constraints are the
FORTRAN wrappers SLSQP and COBYLA.
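Both existing wrappers are reached through ``scipy.optimize.minimize``. A
minimal sketch of a nonlinearly constrained solve with the SLSQP method
(the toy problem is invented for illustration, not taken from the thread):

```python
# Toy problem (illustration only): minimize x^2 + y^2 subject to the
# nonlinear inequality constraint x*y >= 1, using the existing SLSQP
# wrapper in scipy.optimize.
import numpy as np
from scipy.optimize import minimize

def objective(v):
    return v[0] ** 2 + v[1] ** 2

# Nonlinear inequality constraint: x*y - 1 >= 0.
constraints = [{"type": "ineq", "fun": lambda v: v[0] * v[1] - 1.0}]

res = minimize(objective, [2.0, 2.0], method="SLSQP",
               constraints=constraints)
# The constrained minimum is at x = y = 1 (objective value 2), where the
# constraint x*y >= 1 is active.
print(res.x, res.fun)
```

Swapping in ``method="COBYLA"`` runs the derivative-free wrapper on the
same problem definition.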
SLSQP is a sequential quadratic programming method and COBYLA is a
derivative-free optimisation method; they both have their limitations.
SLSQP is not able to deal with sparse Hessians and Jacobians and is
unfit for large-scale problems, while COBYLA, like other derivative-free
methods, is a good choice for optimising noisy objective functions but
usually performs worse than derivative-based methods when the
derivatives are available (or even when they are computed by automatic
differentiation or finite differences).

My proposal is to implement in Scipy one or more state-of-the-art
solvers (interior point and SQP methods) for constrained optimisation
problems. I would like to get some feedback about this, discuss the
relevance of it for Scipy and get some suggestions of possible mentors.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ralf.gommers at gmail.com  Thu Mar  9 19:14:56 2017
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Fri, 10 Mar 2017 13:14:56 +1300
Subject: [SciPy-Dev] [Numpy-discussion] ANN: SciPy 0.19.0
In-Reply-To: 
References: 
Message-ID: 

On Fri, Mar 10, 2017 at 7:26 AM, Matthew Brett wrote:

> On Thu, Mar 9, 2017 at 7:54 AM, Evgeni Burovski
> wrote:
> > On behalf of the Scipy development team I am pleased to announce the
> > availability of Scipy 0.19.0. This release contains several great new
> > features and a large number of bug fixes and various improvements, as
> > detailed in the release notes below.
> > 121 people contributed to this release over the course of seven months.
>
> Many thanks to you Evgeni for all your hard work on making the release.
>

Seconded - thanks a lot Evgeni! And now on to 1.0 ......

Ralf

-------------- next part --------------
An HTML attachment was scrubbed...
URL: From haberland at ucla.edu Fri Mar 10 13:01:32 2017 From: haberland at ucla.edu (Matt Haberland) Date: Fri, 10 Mar 2017 10:01:32 -0800 Subject: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy In-Reply-To: References: Message-ID: The choice of nonlinear optimization algorithm can have a dramatic impact on the speed and quality of the solution, and the best choice for a particular problem can be difficult to determine a priori, so it is important to have multiple options available. My work in optimal control leads to problems with (almost entirely) nonlinear constraints, and the use of derivative information is essential for reasonable performance, leaving SLSQP as the only option in SciPy right now. However, the problems are also huge and very sparse with a specific structure, so SLSQP is not very effective, and not nearly as effective as a nonlinear optimization routine could be. So despite SciPy boasting 14 options for minimization of a nonlinear objective, it wasn't suitable for this work (without the use of an external solver). I think SciPy is in need of at least one solver designed to handle large, fully nonlinear problems, and having two would be much better. Interior point and SQP are good, complementary options. On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro wrote: > Hello, my name is Antonio and I am a Brazilian electrical engineer > currently pursuing my master degree. I have contributed to scipy.optimize > and scipy.signal implementing functions "iirnotch", "irrpeak" > and the method > "trust-region-exact" (under > revision). I am interested in applying for the Google Summer of Code 2017 > to work with the Scipy optimisation package. > > My proposal is to improve scipy.optimize adding optimisation methods that > are able to deal with non-linear constraints. Currently the only > implemented methods able to deal with non-linear constraints are the > FORTRAN wrappers SLSQP and COBYLA. 
> > SLSQP is a sequential quadratic programming method and COBYLA is a > derivative-free optimisation method, they both have its limitations: > SLSQP is not able to deal with sparse > hessians and jacobians and is unfit for large-scale problems and COBYLA, > as other derivative-free methods, is a good choice for optimise noisy > objective functions, however usually presents a poorer performance then > derivative-based methods when the derivatives are available (or even when > they are computed by automatic differentiation or finite differences). > > My proposal is to implement in Scipy one or more state-of-the-art solvers > (interior point and SQP methods) for constrained optimisation problems. I > would like to get some feedback about this, discuss the relevance of it for > Scipy and get some suggestions of possible mentors. > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... URL: From nikolay.mayorov at zoho.com Fri Mar 10 14:40:00 2017 From: nikolay.mayorov at zoho.com (Nikolay Mayorov) Date: Sat, 11 Mar 2017 00:40:00 +0500 Subject: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy In-Reply-To: References: Message-ID: <15ab9bc2975.f4cd34ac5338.5894437124599888508@zoho.com> Hi, Antonio! I too think that moving towards more modern algorithms and their implementations is good for scipy. I would be happy to mentor this project, and most likely I will be able to. Some current thoughts and questions: 1. Have you figured what is done in SLSQP (especially "least squares" part)? Do you plan to use a similar approach or another approach to SQP? (I figured there are several somewhat different approaches.) 
Settling on a literature reference (or most likely several of them) is
essential.

2. I think it is not wrong to focus on a single solver if you feel that
it will likely take the whole time. Or maybe you can prioritize: do this
first for sure and then have alternatives, plan a) to switch to another
solver or plan b) to improve/add something more minor.

3. Consider whether to fit a new solver into minimize or make it a new
separate solver. The latter approach gives the freedom to implement
things exactly as you want (and not to depend on old suboptimal
choices), but I guess it can be considered impractical/inconsistent by
some people. Maybe it can be decided along the way.

4. I think it is important to start thinking about benchmark problems
early, maybe even start with them. It's hard to develop a complicated
optimization algorithm without the ability to see how efficiently it
works right away.

---- On Fri, 10 Mar 2017 23:01:32 +0500 Matt Haberland
<haberland at ucla.edu> wrote ----

The choice of nonlinear optimization algorithm can have a dramatic
impact on the speed and quality of the solution, and the best choice for
a particular problem can be difficult to determine a priori, so it is
important to have multiple options available.

My work in optimal control leads to problems with (almost entirely)
nonlinear constraints, and the use of derivative information is
essential for reasonable performance, leaving SLSQP as the only option
in SciPy right now. However, the problems are also huge and very sparse
with a specific structure, so SLSQP is not very effective, and not
nearly as effective as a nonlinear optimization routine could be. So
despite SciPy boasting 14 options for minimization of a nonlinear
objective, it wasn't suitable for this work (without the use of an
external solver).

I think SciPy is in need of at least one solver designed to handle
large, fully nonlinear problems, and having two would be much better.
Interior point and SQP are good, complementary options. On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro <antonior92 at gmail.com> wrote: -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev Hello, my name is Antonio and I am a Brazilian electrical engineer currently pursuing my master degree. I have contributed to scipy.optimize and scipy.signal implementing functions "iirnotch", "irrpeak"and the method "trust-region-exact" (under revision). I am interested in applying for the Google Summer of Code 2017 to work with the Scipy optimisation package. My proposal is to improve scipy.optimize adding optimisation methods that are able to deal with non-linear constraints. Currently the only implemented methods able to deal with non-linear constraints are the FORTRAN wrappers SLSQP and COBYLA. SLSQP is a sequential quadratic programming method and COBYLA is a derivative-free optimisation method, they both have its limitations: SLSQP is not able to deal with sparse hessians and jacobians and is unfit for large-scale problems and COBYLA, as other derivative-free methods, is a good choice for optimise noisy objective functions, however usually presents a poorer performance then derivative-based methods when the derivatives are available (or even when they are computed by automatic differentiation or finite differences). My proposal is to implement in Scipy one or more state-of-the-art solvers (interior point and SQP methods) for constrained optimisation problems. I would like to get some feedback about this, discuss the relevance of it for Scipy and get some suggestions of possible mentors. 
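The finite-difference derivative estimates mentioned in the quoted
proposal can be sketched in a few lines (``fd_gradient`` is a
hypothetical helper written for illustration; it is not a SciPy
function, and SciPy's solvers use their own internal machinery):

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    # Central-difference estimate of the gradient of f at x.
    # Hypothetical illustrative helper, not part of SciPy.
    x = np.asarray(x, dtype=float)
    grad = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = h
        grad[i] = (f(x + step) - f(x - step)) / (2.0 * h)
    return grad

# For f(x, y) = x**2 + 3*y the exact gradient at (1, 2) is (2, 3).
g = fd_gradient(lambda v: v[0] ** 2 + 3.0 * v[1], [1.0, 2.0])
print(g)
```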
_______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From denis.akhiyarov at gmail.com Sat Mar 11 10:08:17 2017 From: denis.akhiyarov at gmail.com (Denis Akhiyarov) Date: Sat, 11 Mar 2017 09:08:17 -0600 Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11 In-Reply-To: References: Message-ID: I used ipopt for interior-point method from python, are you trying to add something similar to scipy? if yes, why not just add a wrapper for ipopt, since the license looks not restrictive? On Fri, Mar 10, 2017 at 1:40 PM, wrote: > Send SciPy-Dev mailing list submissions to > scipy-dev at scipy.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.scipy.org/mailman/listinfo/scipy-dev > or, via email, send a message with subject or body 'help' to > scipy-dev-request at scipy.org > > You can reach the person managing the list at > scipy-dev-owner at scipy.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-Dev digest..." > > > Today's Topics: > > 1. Re: GSoC2017: Constrained Optimisation in Scipy (Matt Haberland) > 2. Re: GSoC2017: Constrained Optimisation in Scipy (Nikolay Mayorov) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Fri, 10 Mar 2017 10:01:32 -0800 > From: Matt Haberland > To: SciPy Developers List > Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy > Message-ID: > mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > The choice of nonlinear optimization algorithm can have a dramatic impact > on the speed and quality of the solution, and the best choice for a > particular problem can be difficult to determine a priori, so it is > important to have multiple options available. 
> > My work in optimal control leads to problems with (almost entirely) > nonlinear constraints, and the use of derivative information is essential > for reasonable performance, leaving SLSQP as the only option in SciPy right > now. However, the problems are also huge and very sparse with a specific > structure, so SLSQP is not very effective, and not nearly as effective as a > nonlinear optimization routine could be. So despite SciPy boasting 14 > options for minimization of a nonlinear objective, it wasn't suitable for > this work (without the use of an external solver). > > I think SciPy is in need of at least one solver designed to handle large, > fully nonlinear problems, and having two would be much better. Interior > point and SQP are good, complementary options. > > On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro > wrote: > > > Hello, my name is Antonio and I am a Brazilian electrical engineer > > currently pursuing my master degree. I have contributed to scipy.optimize > > and scipy.signal implementing functions "iirnotch", "irrpeak" > > and the method > > "trust-region-exact" (under > > revision). I am interested in applying for the Google Summer of Code 2017 > > to work with the Scipy optimisation package. > > > > My proposal is to improve scipy.optimize adding optimisation methods that > > are able to deal with non-linear constraints. Currently the only > > implemented methods able to deal with non-linear constraints are the > > FORTRAN wrappers SLSQP and COBYLA. 
> > > > SLSQP is a sequential quadratic programming method and COBYLA is a > > derivative-free optimisation method, they both have its limitations: > > SLSQP is not able to deal with sparse > > hessians and jacobians and is unfit for large-scale problems and COBYLA, > > as other derivative-free methods, is a good choice for optimise noisy > > objective functions, however usually presents a poorer performance then > > derivative-based methods when the derivatives are available (or even when > > they are computed by automatic differentiation or finite differences). > > > > My proposal is to implement in Scipy one or more state-of-the-art solvers > > (interior point and SQP methods) for constrained optimisation problems. I > > would like to get some feedback about this, discuss the relevance of it > for > > Scipy and get some suggestions of possible mentors. > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > > -- > Matt Haberland > Assistant Adjunct Professor in the Program in Computing > Department of Mathematics > 7620E Math Sciences Building, UCLA > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: attachments/20170310/00ea2b5d/attachment-0001.html> > > ------------------------------ > > Message: 2 > Date: Sat, 11 Mar 2017 00:40:00 +0500 > From: Nikolay Mayorov > To: "SciPy Developers List" > Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy > Message-ID: <15ab9bc2975.f4cd34ac5338.5894437124599888508 at zoho.com> > Content-Type: text/plain; charset="utf-8" > > Hi, Antonio! > > > > I too think that moving towards more modern algorithms and their > implementations is good for scipy. I would be happy to mentor this project, > and most likely I will be able to. > > > > Some current thoughts and questions: > > > > 1. Have you figured what is done in SLSQP (especially "least squares" > part)? 
> Do you plan to use a similar approach or another approach to SQP? (I
> figured there are several somewhat different approaches.) Settling on a
> literature reference (or most likely several of them) is essential.
>
> 2. I think it is not wrong to focus on a single solver if you feel that
> it will likely take the whole time. Or maybe you can prioritize: do this
> first for sure and then have alternatives, plan a) to switch to another
> solver or plan b) to improve/add something more minor.
>
> 3. Consider whether to fit a new solver into minimize or make it a new
> separate solver. The latter approach gives the freedom to implement
> things exactly as you want (and not depend on old suboptimal choices),
> but I guess it can be considered impractical/inconsistent by some
> people. Maybe it can be decided along the way.
>
> 4. I think it is important to start to think about benchmark problems
> early, maybe even start with them. It's hard to develop a complicated
> optimization algorithm without the ability to see how efficiently it
> works right away.
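A benchmark harness of the kind suggested in point 4 might be sketched like this; the test problem, the solver list and the reporting format are illustrative, not from the thread:

```python
# Run each constrained solver on a known test problem and record the final
# objective value, the function-evaluation count and the wall time.
import time
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

# One nonlinear inequality constraint: x0^2 + x1^2 <= 2. The unconstrained
# minimum (1, 1) lies exactly on this boundary.
cons = [{'type': 'ineq', 'fun': lambda x: 2.0 - x[0]**2 - x[1]**2}]

results = {}
for method in ('SLSQP', 'COBYLA'):
    t0 = time.perf_counter()
    res = minimize(rosenbrock, [-1.0, 2.0], method=method, constraints=cons)
    results[method] = (res.fun, res.nfev, time.perf_counter() - t0)

for method, (fun, nfev, t) in results.items():
    print(f'{method}: f={fun:.3g} nfev={nfev} time={t:.4f}s')
```

Collecting such runs over a suite of problems (e.g. the Hock–Schittkowski set mentioned in optimization literature) is what would let a new solver be compared against SLSQP and COBYLA during development.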
> ------------------------------
>
> Subject: Digest Footer
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at scipy.org
> https://mail.scipy.org/mailman/listinfo/scipy-dev
>
> ------------------------------
>
> End of SciPy-Dev Digest, Vol 161, Issue 11
> ******************************************

From haberland at ucla.edu Sat Mar 11 12:01:40 2017
From: haberland at ucla.edu (Matt Haberland)
Date: Sat, 11 Mar 2017 09:01:40 -0800
Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11
In-Reply-To:
References:
Message-ID:

IPOPT is under EPL. From the FAQ:

*Does the EPL allow me to take the Source Code for a Program licensed
under it and include all or part of it in another program licensed under
the GNU General Public License (GPL), Berkeley Software Distribution (BSD)
license or other Open Source license?*

No. Only the owner of software can decide whether and how to license it to
others. Contributors to a Program licensed under the EPL understand that
source code for the Program will be made available under the terms of the
EPL.
Unless you are the owner of the software or have received permission from
the owner, you are not authorized to apply the terms of another license to
the Program by including it in a program licensed under another Open
Source license.

It might be worth asking the contributors, but their permission would be
necessary.

On Mar 11, 2017 7:09 AM, "Denis Akhiyarov" wrote:

> I used ipopt for the interior-point method from Python. Are you trying
> to add something similar to scipy? If yes, why not just add a wrapper
> for ipopt, since the license looks not restrictive?
From benny.malengier at gmail.com Sat Mar 11 13:55:33 2017
From: benny.malengier at gmail.com (Benny Malengier)
Date: Sat, 11 Mar 2017 19:55:33 +0100
Subject: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy
In-Reply-To: <15ab9bc2975.f4cd34ac5338.5894437124599888508@zoho.com>
References: <15ab9bc2975.f4cd34ac5338.5894437124599888508@zoho.com>
Message-ID:

Just some thoughts. I have seen quite a few methods leveraging cvodes of
the sundials suite to obtain the derivatives, e.g. CasADi and DOTcvpSB.
The odes scikit I maintain has a cvode interface, but not yet cvodes,
though that should not be too hard to add. Other Python interfaces to
sundials might have it.

Independent of that, this gives you AD, but the work is in generating the
correct equations so as to leverage cvodes.

Benny
From msuzen at gmail.com Sat Mar 11 14:16:52 2017
From: msuzen at gmail.com (Suzen, Mehmet)
Date: Sat, 11 Mar 2017 20:16:52 +0100
Subject: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy
In-Reply-To:
References:
Message-ID:

I agree with Matt Haberland. A couple of different options would be great.
Some of R's code can be leveraged for the barrier method [1].
Embedding Steven Johnson's NLopt into SciPy would be nice for nonlinear
constraints [2], and a derivative-free method like COBYLA would be nice to
have [3].

-m

[1] https://stat.ethz.ch/R-manual/R-devel/library/stats/html/constrOptim.html
    https://svn.r-project.org/R/trunk/src/library/stats/R/constrOptim.R
[2] http://ab-initio.mit.edu/wiki/index.php/NLopt
    https://mail.scipy.org/pipermail/numpy-discussion/2010-June/051220.html
[3] http://ab-initio.mit.edu/wiki/index.php/NLopt_Algorithms#COBYLA_.28Constrained_Optimization_BY_Linear_Approximations.29

From antonior92 at gmail.com Sat Mar 11 14:41:39 2017
From: antonior92 at gmail.com (Antonio Ribeiro)
Date: Sat, 11 Mar 2017 16:41:39 -0300
Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 12
In-Reply-To:
References:
Message-ID: <1A30F939-8B48-4761-9005-EEBA3A3C1953@gmail.com>

I would like to start by thanking both Matt Haberland and Nikolay Mayorov
for making themselves available to mentor me. I would very much like the
opportunity to be mentored by one or even both of you.

About the questions raised about the project:

- To have a wrapper for "ipopt" in scipy would be nice and, if the owners
give permission, it could be one of the items of the proposal.
Nevertheless, the implementation of a solver in native Python/Cython still
has its advantages and should not be discarded: i) code in Python is much
more readable, and native code makes it easier for scipy collaborators to
add new features and maintain it; ii) using the proper NumPy and LAPACK
functions for expensive operations, and Cython when needed, our
implementation can be as fast as C or FORTRAN implementations; iii) it is
easier to maintain a coherent interface with other scipy methods using a
native implementation.

- About Nikolay's question about the interface to use: my plan is to try
to implement the solver inside the "minimize" interface in scipy. If
needed I can move to a new interface that gives me more freedom.
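The point about leaning on NumPy/LAPACK and the sparse routines for the expensive operations can be made concrete: the core of each interior-point iteration reduces to a structured linear solve, typically a KKT system, which scipy.sparse already handles. A toy sketch, with all dimensions and data illustrative:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

# Toy KKT system  [[H, A^T], [A, 0]] [dx; lam] = [-g; -c]  of the kind an
# interior-point step must solve: H is the Hessian (here the identity),
# A the constraint Jacobian, g the gradient, c the constraint residuals.
n, m = 4, 2
H = sparse.identity(n, format='csc')
A = sparse.csc_matrix([[1.0, 1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0, 1.0]])
g = np.array([1.0, 2.0, 3.0, 4.0])
c = np.array([0.5, -0.5])

K = sparse.bmat([[H, A.T], [A, None]], format='csc')  # assemble KKT matrix
rhs = np.concatenate([-g, -c])
sol = spsolve(K, rhs)                                 # sparse direct solve
dx, lam = sol[:n], sol[n:]                            # step and multipliers
```

A production solver would use a symmetric indefinite factorization rather than a general sparse solve, but the assembly pattern is the same.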
About what to implement and in what order of priority:

1) Nikolay's idea to implement a set of benchmarks from the beginning is
very good; it would allow me to compare my method with other already
implemented solvers and to assess the quality of my solver during its
development.

2) After having a set of benchmarks, my idea is to implement an
interior-point method. In my opinion interior-point methods should be the
priority because they are able to handle a large range of problems
reasonably well. They are very powerful algorithms for large-scale
problems and usually are able to deal with medium/small problems
reasonably well too.

3) After that I would like to include different possibilities of Hessian
approximation in the method (BFGS, L-BFGS, approximated by finite
differences or provided by the user...).

4) And, if I have time, I would like to also implement an SQP solver. It
would probably not be the same as the one used in SLSQP.

There are several variations of interior-point methods and I am still
deciding which one to implement among the available options. I have seen
both line-search and trust-region methods being used in the literature and
in implemented solvers (for instance "ipopt" uses a line-search approach
while "KNITRO" uses a trust-region approach). I have found a good
reference about trust-region interior-point methods in the Conn, Gould and
Toint "Trust-Region Methods" book; Nocedal and Wright's book also has a
small chapter about nonlinear interior-point methods that can be useful.
Furthermore, I am looking into papers and surveys about the subject. I
would much appreciate any recommendation or reference on the subject. Same
thing with SQP methods.
Best regards, Ant?nio > On Mar 11, 2017, at 14:01, scipy-dev-request at scipy.org wrote: > > Send SciPy-Dev mailing list submissions to > scipy-dev at scipy.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.scipy.org/mailman/listinfo/scipy-dev > or, via email, send a message with subject or body 'help' to > scipy-dev-request at scipy.org > > You can reach the person managing the list at > scipy-dev-owner at scipy.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-Dev digest..." > > > Today's Topics: > > 1. Re: SciPy-Dev Digest, Vol 161, Issue 11 (Denis Akhiyarov) > 2. Re: SciPy-Dev Digest, Vol 161, Issue 11 (Matt Haberland) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Sat, 11 Mar 2017 09:08:17 -0600 > From: Denis Akhiyarov > To: scipy-dev at scipy.org > Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11 > Message-ID: > > Content-Type: text/plain; charset="utf-8" > > I used ipopt for interior-point method from python, are you trying to add > something similar to scipy? if yes, why not just add a wrapper for ipopt, > since the license looks not restrictive? > > On Fri, Mar 10, 2017 at 1:40 PM, wrote: > >> Send SciPy-Dev mailing list submissions to >> scipy-dev at scipy.org >> >> To subscribe or unsubscribe via the World Wide Web, visit >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> or, via email, send a message with subject or body 'help' to >> scipy-dev-request at scipy.org >> >> You can reach the person managing the list at >> scipy-dev-owner at scipy.org >> >> When replying, please edit your Subject line so it is more specific >> than "Re: Contents of SciPy-Dev digest..." >> >> >> Today's Topics: >> >> 1. Re: GSoC2017: Constrained Optimisation in Scipy (Matt Haberland) >> 2. 
Re: GSoC2017: Constrained Optimisation in Scipy (Nikolay Mayorov) >> >> >> ---------------------------------------------------------------------- >> >> Message: 1 >> Date: Fri, 10 Mar 2017 10:01:32 -0800 >> From: Matt Haberland >> To: SciPy Developers List >> Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy >> Message-ID: >> > mail.gmail.com> >> Content-Type: text/plain; charset="utf-8" >> >> The choice of nonlinear optimization algorithm can have a dramatic impact >> on the speed and quality of the solution, and the best choice for a >> particular problem can be difficult to determine a priori, so it is >> important to have multiple options available. >> >> My work in optimal control leads to problems with (almost entirely) >> nonlinear constraints, and the use of derivative information is essential >> for reasonable performance, leaving SLSQP as the only option in SciPy right >> now. However, the problems are also huge and very sparse with a specific >> structure, so SLSQP is not very effective, and not nearly as effective as a >> nonlinear optimization routine could be. So despite SciPy boasting 14 >> options for minimization of a nonlinear objective, it wasn't suitable for >> this work (without the use of an external solver). >> >> I think SciPy is in need of at least one solver designed to handle large, >> fully nonlinear problems, and having two would be much better. Interior >> point and SQP are good, complementary options. >> >> On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro >> wrote: >> >>> Hello, my name is Antonio and I am a Brazilian electrical engineer >>> currently pursuing my master degree. I have contributed to scipy.optimize >>> and scipy.signal implementing functions "iirnotch", "irrpeak" >>> and the method >>> "trust-region-exact" (under >>> revision). I am interested in applying for the Google Summer of Code 2017 >>> to work with the Scipy optimisation package. 
>>> >>> My proposal is to improve scipy.optimize adding optimisation methods that >>> are able to deal with non-linear constraints. Currently the only >>> implemented methods able to deal with non-linear constraints are the >>> FORTRAN wrappers SLSQP and COBYLA. >>> >>> SLSQP is a sequential quadratic programming method and COBYLA is a >>> derivative-free optimisation method, they both have its limitations: >>> SLSQP is not able to deal with sparse >>> hessians and jacobians and is unfit for large-scale problems and COBYLA, >>> as other derivative-free methods, is a good choice for optimise noisy >>> objective functions, however usually presents a poorer performance then >>> derivative-based methods when the derivatives are available (or even when >>> they are computed by automatic differentiation or finite differences). >>> >>> My proposal is to implement in Scipy one or more state-of-the-art solvers >>> (interior point and SQP methods) for constrained optimisation problems. I >>> would like to get some feedback about this, discuss the relevance of it >> for >>> Scipy and get some suggestions of possible mentors. >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> >> >> -- >> Matt Haberland >> Assistant Adjunct Professor in the Program in Computing >> Department of Mathematics >> 7620E Math Sciences Building, UCLA >> -------------- next part -------------- >> An HTML attachment was scrubbed... >> URL: > attachments/20170310/00ea2b5d/attachment-0001.html> >> >> ------------------------------ >> >> Message: 2 >> Date: Sat, 11 Mar 2017 00:40:00 +0500 >> From: Nikolay Mayorov >> To: "SciPy Developers List" >> Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy >> Message-ID: <15ab9bc2975.f4cd34ac5338.5894437124599888508 at zoho.com> >> Content-Type: text/plain; charset="utf-8" >> >> Hi, Antonio! 
>> >> >> >> I too think that moving towards more modern algorithms and their >> implementations is good for scipy. I would be happy to mentor this project, >> and most likely I will be able to. >> >> >> >> Some current thoughts and questions: >> >> >> >> 1. Have you figured what is done in SLSQP (especially "least squares" >> part)? Do you plan to use a similar approach or another approach to SQP? (I >> figured there are several somewhat different approaches.) Setting on a >> literature reference (or most likely several of them) is essential. >> >> >> >> 2. I think it is not wrong to focus on a single solver if you feel that it >> will likely take the whole time. Or maybe you can prioritize: do this first >> for sure and then have alternatives, plan a) to switch to another solver or >> plan b) to improve\add something more minor. >> >> >> >> 3. Consider whether to fit a new solver into minimize or make it as a new >> separate solver. The latter approach gives a freedom to implement things >> exactly as you want (and not to depend on old suboptimal choices) , but I >> guess it can be considered as impractical/inconsistent by some people. >> Maybe it can be decided along the way. >> >> >> >> 4. I think it is important to start to think about benchmark problems >> early, maybe even start with them. It's hard to develop a complicated >> optimization algorithm without ability to see how efficiently it works >> right away. >> >> >> >> >> >> >> >> ---- On Fri, 10 Mar 2017 23:01:32 +0500 Matt Haberland & >> lt;haberland at ucla.edu> wrote ---- >> >> >> >> >> The choice of nonlinear optimization algorithm can have a dramatic impact >> on the speed and quality of the solution, and the best choice for a >> particular problem can be difficult to determine a priori, so it is >> important to have multiple options available. 
>> >> >> >> My work in optimal control leads to problems with (almost entirely) >> nonlinear constraints, and the use of derivative information is essential >> for reasonable performance, leaving SLSQP as the only option in SciPy right >> now. However, the problems are also huge and very sparse with a specific >> structure, so SLSQP is not very effective, and not nearly as effective as a >> nonlinear optimization routine could be. So despite SciPy boasting 14 >> options for minimization of a nonlinear objective, it wasn't suitable for >> this work (without the use of an external solver). >> >> >> >> I think SciPy is in need of at least one solver designed to handle large, >> fully nonlinear problems, and having two would be much better. Interior >> point and SQP are good, complementary options. >> >> >> >> >> On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro <antonior92 at gmail.com> >> wrote: >> >> >> >> >> -- >> >> Matt Haberland >> >> Assistant Adjunct Professor in the Program in Computing >> >> Department of Mathematics >> >> 7620E Math Sciences Building, UCLA >> >> >> >> >> _______________________________________________ >> >> SciPy-Dev mailing list >> >> SciPy-Dev at scipy.org >> >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> -------------- next part -------------- >> An HTML attachment was scrubbed... >> URL: > attachments/20170311/3bd6e920/attachment.html> >> >> ------------------------------ >> >> Subject: Digest Footer >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> >> ------------------------------ >> >> End of SciPy-Dev Digest, Vol 161, Issue 11 >> ****************************************** >> > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: > > ------------------------------ > > Message: 2 > Date: Sat, 11 Mar 2017 09:01:40 -0800 > From: Matt Haberland > To: SciPy Developers List > Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11 > Message-ID: > > Content-Type: text/plain; charset="utf-8" > > IPOPT is under EPL. 
From the FAQ: > > *Does the EPL allow me to take the Source Code for a Program licensed under > it and include all or part of it in another program licensed under the GNU > General Public License (GPL), Berkeley Software Distribution (BSD) license > or other Open Source license?* > No. Only the owner of software can decide whether and how to license it to > others. Contributors to a Program licensed under the EPL understand that > source code for the Program will be made available under the terms of the > EPL. Unless you are the owner of the software or have received permission > from the owner, you are not authorized to apply the terms of another > license to the Program by including it in a program licensed under another > Open Source license. > > It might be worth asking the contributors, but their permission would be > necessary. > > > On Mar 11, 2017 7:09 AM, "Denis Akhiyarov" > wrote: > >> I used ipopt for the interior-point method from Python. Are you trying to add >> something similar to scipy? If yes, why not just add a wrapper for ipopt, >> since the license does not look restrictive? >> >> On Fri, Mar 10, 2017 at 1:40 PM, wrote: >> >>> Send SciPy-Dev mailing list submissions to >>> scipy-dev at scipy.org >>> >>> To subscribe or unsubscribe via the World Wide Web, visit >>> https://mail.scipy.org/mailman/listinfo/scipy-dev >>> or, via email, send a message with subject or body 'help' to >>> scipy-dev-request at scipy.org >>> >>> You can reach the person managing the list at >>> scipy-dev-owner at scipy.org >>> >>> When replying, please edit your Subject line so it is more specific >>> than "Re: Contents of SciPy-Dev digest..." >>> >>> >>> Today's Topics: >>> >>> 1. Re: GSoC2017: Constrained Optimisation in Scipy (Matt Haberland) >>> 2. 
Re: GSoC2017: Constrained Optimisation in Scipy (Nikolay Mayorov) 
> ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > ------------------------------ > > End of SciPy-Dev Digest, Vol 161, Issue 12 > ****************************************** From iavlaserg at gmail.com Sat Mar 11 14:50:21 2017 From: iavlaserg at gmail.com (Vladislav Iakovlev) Date: Sat, 11 Mar 2017 22:50:21 +0300 Subject: [SciPy-Dev] GSoC'17: Circular statistics Message-ID: Hello! My name is Vladislav Iakovlev. I am a master's student in the Department of Applied Math, HSE University, Moscow, Russia. I have never contributed to open source projects, but I would be happy to start with SciPy. I noticed that the existing functionality for circular statistics is insufficient. So, I want to suggest an idea for GSoC: implement it in scipy.stats. My plan is to do it in the following steps: 1) Develop the class rv_circular, analogous to rv_continuous but adapted to circular statistics. 2) Develop derived classes for circular distributions. 3) Develop point estimation and statistical test functions. During the summer, I plan to implement materials from chapters 1-8 of the book "MARDIA, K. V. AND JUPP, P. E. 
(2000), Directional Statistics, John Wiley", documentation and unit tests for it. Is this idea interesting for the Community? I'm glad of any feedback. -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sat Mar 11 16:11:04 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 11 Mar 2017 16:11:04 -0500 Subject: Re: [SciPy-Dev] GSoC'17: Circular statistics In-Reply-To: References: Message-ID: On Sat, Mar 11, 2017 at 2:50 PM, Vladislav Iakovlev wrote: > Hello! > > > My name is Vladislav Iakovlev. I am a master's student in the Department of > Applied Math, HSE University, Moscow, Russia. I have never contributed to > open source projects, but I would be happy to start with SciPy. > > > I noticed that the existing functionality for circular statistics is > insufficient. So, I want to suggest an idea for GSoC: implement it in > scipy.stats. My plan is to do it in the following steps: > > 1) Develop the class rv_circular, analogous to rv_continuous but adapted > to circular statistics. > > 2) Develop derived classes for circular distributions. > > 3) Develop point estimation and statistical test functions. > > During the summer, I plan to implement materials from chapters 1-8 of the > book "MARDIA, K. V. AND JUPP, P. E. (2000), Directional Statistics, John > Wiley", documentation and unit tests for it. > > > Is this idea interesting for the Community? I'm glad of any feedback. > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > More general question: What should be the future of circular stats in python? https://github.com/circstat/pycircstat is a python translation of the matlab toolbox. It doesn't have a license (MIT is commented out in setup.py) but the original matlab toolbox was BSD licensed on the file exchange. 
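For reference, what scipy.stats already covers for circular data is essentially the descriptive helpers and the von Mises distribution; a quick sketch of the existing pieces:

```python
import numpy as np
from scipy.stats import circmean, circstd, vonmises

# Angles (radians) clustered around the wrap point 0 / 2*pi.
angles = np.array([0.1, 0.2, 6.1, 6.2])

print(np.mean(angles))                   # ~3.15: the naive mean is meaningless here
print(circmean(angles, high=2 * np.pi))  # near the wrap point, as it should be

# The one circular distribution already in scipy.stats: von Mises on [-pi, pi],
# with concentration parameter kappa.
sample = vonmises.rvs(kappa=4.0, size=1000, random_state=0)
print(circstd(sample, low=-np.pi, high=np.pi))  # small for concentrated data
```

An rv_circular base class as proposed would put fitting, tests, and further distributions (wrapped normal, wrapped Cauchy, etc.) behind one consistent interface.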
I worked on circularstats for several months in 2012 and haven't looked at it since. I also translated the matlab toolbox and added some more parts based on Mardia. http://josef-pkt.github.io/pages/packages/circularstats/circular.html table of contents only, I never open sourced it. (*) (Related: some work on mixtures of VonMises distributions https://github.com/rc/dist_mixtures ) I looked into it briefly again https://github.com/statsmodels/statsmodels/issues/3530 but mainly with respect to adding regression with a circular response variable. (*) AFAIR, I didn't implement (m)any distributions in the scipy distribution style. I was looking mainly at generic constructors for distributions, e.g. using wrapping to go from distributions on the real line to distributions on the circle based on the characteristic function and Fourier expansion. Josef From karandesai281196 at gmail.com Sun Mar 12 04:19:39 2017 From: karandesai281196 at gmail.com (Karan Desai) Date: Sun, 12 Mar 2017 08:19:39 +0000 Subject: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread Message-ID: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> Hello developers, I wrote an introductory thread about two weeks ago, expressing my interest in this project. The thread grew quite large, and it took me a while to aggregate the suggestions and views of all the fellow developers of the community. Meanwhile, I also inspected the test suite of scipy. I am putting up my conclusions and thoughts below. As additional information, I should mention, as earlier, that I have dropped nose and integrated pytest properly into "joblib". Some of the test suite designs were positively affirmed by a pytest core developer on the issue thread. Here they go: 1. The homepage of Nose after its release of 1.3.7 states that it is no longer maintained. I see no reason to continue supporting it, as sooner or later we will have to let it go. 
Also, I am sure many prominent libraries that have scipy or numpy as a dependency have already started the shift. 2. If we are using pytest, then we should make full use of pytest alone. A small utility library like joblib, with a few hundred tests, cut down around 700 lines of code by adopting the pytest design. 3. While testing utils can be kept independent of pytest, we should fully embrace pytest for our own unit tests. Or even better, we can thinly wrap pytest and make array assertions possible by plain assert keyword statements in some way. (I haven't thought about it yet.) 4. Making the test suites just work with pytest is a matter of one PR, or a weekend's work, and I agree with the developers of scikit-image. But it takes more time to completely redesign tests in the pytest flavour; after all, we have to think about future contributors joining in. People experienced with pytest should find the test suite design familiar, for contributions. I think that scikit-image has achieved the former, but might still need some PRs for the latter (I'm glad to help in future). 5. Here are some notable changes required for the pytest transition: plain assert keyword statements; pytest's own raises and warns assertion helpers, which produce cleaner error logs on failure; pytest's parametrization to get rid of yield-based tests and form compact tests; and fixtures like tmpdir, capsys and monkeypatch, which help reduce a lot of boilerplate code. 6. Analyzing the test suite of scipy, I think that it is perfectly fine to adopt a strategy I used before in joblib. I will present my first draft soon, and it will be heavily inspired by this issue thread: https://www.github.com/joblib/joblib/issues/411 . I made around 20 PRs, and after the merge of every PR the suite was in a working condition, so keeping a separate branch should not be required. I suggest not falling back to unittest, as it would increase the boilerplate code heavily. 
The reason for not supporting both nose and pytest is that they have different flavours; it would be difficult to keep up with nose, and pytest always comes with something interesting in new releases every two months. I will be happy to hear and work upon further suggestions. Thanks. Regards, Karan. -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Sun Mar 12 10:56:07 2017 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 12 Mar 2017 07:56:07 -0700 Subject: Re: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread In-Reply-To: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> Message-ID: On Sun, Mar 12, 2017 at 12:19 AM, Karan Desai wrote: > While testing utils can be kept independent of pytest, we should fully embrace pytest for our own unit tests. Or even better, we can thinly wrap pytest and make array assertions possible by plain assert keyword statements in some way. (I haven't thought about it yet.) I don't think it's possible, no. pytest and nose don't change the operation of the `assert` statement at all; they just introspect the code and the namespace when the assertion fails to provide a helpful message. `assert x == y` just doesn't work when the arguments are arrays because the result is not coercible to a proper `bool`. > Making the test suites just work with pytest is a matter of one PR, or a weekend's work, and I agree with the developers of scikit-image. But it takes more time to completely redesign tests in the pytest flavour; after all, we have to think about future contributors joining in. People experienced with pytest should find the test suite design familiar, for contributions. I think that scikit-image has achieved the former, but might still need some PRs for the latter (I'm glad to help in future). 
> > Here are some notable changes which are required for pytest transition: > > Plain assert keyword statements. Just a note: numpy's test suite, at least, must not use `assert` statements. numpy's test suite must be runnable under `python -O`. numpy does some docstring manipulation that could fail in principle under those conditions (i.e. it did fail at one point until we fixed it, so now we have to ensure that it doesn't regress). scipy might have the same issue (e.g. `scipy.special` ufuncs and `scipy.stats` distribution objects), but I forget if we've made that a policy there too. So if you mean, by this requirement, that we convert all of our `assert_equal(x, y)` calls to `assert x == y`, no, we won't be doing that, even in the cases where it would be possible. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Sun Mar 12 11:09:50 2017 From: njs at pobox.com (Nathaniel Smith) Date: Sun, 12 Mar 2017 08:09:50 -0700 Subject: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread In-Reply-To: References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> Message-ID: On Mar 12, 2017 7:56 AM, "Robert Kern" wrote: On Sun, Mar 12, 2017 at 12:19 AM, Karan Desai wrote: > Plain assert keyword statements. Just a note: numpy's test suite, at least, must not use `assert` statements. numpy's test suite must be runnable under `python -O`. numpy does some docstring manipulation that could fail in principle under those conditions (i.e. it did fail at one point until we fixed it, so now we have to ensure that it doesn't regress). scipy might have the same issue (e.g. `scipy.special` ufuncs and `scipy.stats` distribution objects), but I forget if we've made that a policy there too. So if you mean, by this requirement, that we convert all of our `assert_equal(x, y)` calls to `assert x == y`, no, we won't be doing that, even in the cases where it would be possible. 
Pytest arranges for assert statements in test modules to be run even if Python has assert statements disabled in general. See http://doc.pytest.org/en/latest/announce/release-2.1.0.html -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sun Mar 12 11:15:18 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 12 Mar 2017 11:15:18 -0400 Subject: Re: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread In-Reply-To: References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> Message-ID: On Sun, Mar 12, 2017 at 11:09 AM, Nathaniel Smith wrote: > Pytest arranges for assert statements in test modules to be run even if > Python has assert statements disabled in general. See > > http://doc.pytest.org/en/latest/announce/release-2.1.0.html Sounds way too magic to me. I like the numpy asserts: no statements, just nice functions with options instead of magic code rewriting. 
(and assert_allclose has better defaults than np.allclose) Josef > > -n > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > From robert.kern at gmail.com Sun Mar 12 11:19:16 2017 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 12 Mar 2017 08:19:16 -0700 Subject: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread In-Reply-To: References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> Message-ID: On Sun, Mar 12, 2017 at 8:09 AM, Nathaniel Smith wrote: > > On Mar 12, 2017 7:56 AM, "Robert Kern" wrote: > > On Sun, Mar 12, 2017 at 12:19 AM, Karan Desai wrote: > > Plain assert keyword statements. > > Just a note: numpy's test suite, at least, must not use `assert` statements. numpy's test suite must be runnable under `python -O`. numpy does some docstring manipulation that could fail in principle under those conditions (i.e. it did fail at one point until we fixed it, so now we have to ensure that it doesn't regress). scipy might have the same issue (e.g. `scipy.special` ufuncs and `scipy.stats` distribution objects), but I forget if we've made that a policy there too. > > So if you mean, by this requirement, that we convert all of our `assert_equal(x, y)` calls to `assert x == y`, no, we won't be doing that, even in the cases where it would be possible. > > Pytest arranges for assert statements in test modules to be run even if Python has assert statements disabled in general. See > > http://doc.pytest.org/en/latest/announce/release-2.1.0.html Neat! Nonetheless, converting the current tests to use `assert` statements is hardly a requirement, just something that might be enabled by the transition. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From evgeny.burovskiy at gmail.com Sun Mar 12 12:43:06 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Sun, 12 Mar 2017 19:43:06 +0300 Subject: Re: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread In-Reply-To: References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> Message-ID: On Sun, Mar 12, 2017 at 6:19 PM, Robert Kern wrote: > Neat! > > Nonetheless, converting the current tests to use `assert` statements is > hardly a requirement, just something that might be enabled by the > transition. > > -- > Robert Kern And since the subject is inevitably going to show up more than once, I think the GSoC proposal should explicitly mention that assert rewriting is not a part of the GSoC. And speaking of the proposal, I think it's time to see the first draft. 
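To illustrate the testing style under discussion (a sketch with a made-up function, not SciPy code): parametrization replaces nose-style yield tests, plain `assert` works on scalars thanks to pytest's assertion rewriting, and the numpy.testing helpers remain the right tool for arrays:

```python
import numpy as np
import pytest
from numpy.testing import assert_allclose

def wrap_angle(theta):
    """Map an angle in radians onto [-pi, pi); a stand-in function to test."""
    return (theta + np.pi) % (2 * np.pi) - np.pi

# One parametrized test replaces a nose-style ``yield``-based generator test.
@pytest.mark.parametrize("theta, expected", [
    (0.0, 0.0),
    (3 * np.pi, -np.pi),   # wraps around
    (-0.5, -0.5),
])
def test_wrap_angle_scalar(theta, expected):
    # Plain assert: pytest rewrites it to report both operands on failure.
    assert abs(wrap_angle(theta) - expected) < 1e-12

def test_wrap_angle_array():
    # ``assert x == y`` is ambiguous for arrays; keep the numpy helpers.
    theta = np.array([0.0, 3 * np.pi, -0.5])
    assert_allclose(wrap_angle(theta), [0.0, -np.pi, -0.5], atol=1e-12)
```

Run with `pytest <file>`; `pytest.raises`, `pytest.warns` and fixtures such as `tmpdir` and `monkeypatch` cover most of the remaining nose idioms.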
Evgeni From nikolay.mayorov at zoho.com Sun Mar 12 15:49:19 2017 From: nikolay.mayorov at zoho.com (Nikolay Mayorov) Date: Mon, 13 Mar 2017 00:49:19 +0500 Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 12 In-Reply-To: <1A30F939-8B48-4761-9005-EEBA3A3C1953@gmail.com> References: <1A30F939-8B48-4761-9005-EEBA3A3C1953@gmail.com> Message-ID: <15ac4116913.d223fc4b16616.7056875246372910426@zoho.com> Nevertheless, the implementation of a solver in native Python/Cython still has its advantages and should not be discarded: Totally support this. Having an "in-house" implementation has the advantages listed by Antonio, and additionally it adds substance to the scipy project, such that it is not just a collection of wrappers. Obviously some balance between the two approaches should be maintained. I think GSoC is the right place to write new mathematical/algorithmic code. I would even suggest not including any wrapper considerations in your proposal (for this particular GSoC case), but it's only my opinion here. - About Nikolay's question about the interface to use: my plan is to try to implement the solver inside the "minimize" interface in scipy. If needed I can move to a new interface that gives me more freedom. I think fitting it into "minimize" would be optimal, but if something very annoying/inconvenient/inconsistent pops up it can be reconsidered, I guess. After having a set of benchmarks my idea is to implement an interior point method. It sounds interesting, especially because it will be completely new for scipy. 3) After that I would like to include different possibilities of Hessian approximation to the method (BFGS, L-BFGS, approximated by finite differences, or provided by the user). And what would the "default" method for point 2 be? 4) And, if I have time, I would like to also implement an SQP solver. It would probably not be the same as the one used in SLSQP. 
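On the Hessian approximations of point 3, the dense BFGS variant can be made concrete with the textbook update formula (an illustrative NumPy sketch, not the proposed implementation; L-BFGS would store recent (s, y) pairs instead of the full matrix):

```python
import numpy as np

def bfgs_update(B, s, y):
    """One BFGS update of the Hessian approximation B.

    s = x_new - x_old (step), y = g_new - g_old (gradient change).
    B stays symmetric positive definite as long as s @ y > 0.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# On a quadratic f(x) = 0.5 * x @ A @ x we have y = A @ s exactly, and the
# updated B satisfies the secant equation B_new @ s = y by construction.
A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.eye(2)
s = np.array([0.5, -0.2])
y = A @ s
B = bfgs_update(B, s, y)
print(np.allclose(B @ s, y))  # True
```

The same update is what SLSQP maintains internally as a dense matrix, which is exactly why it does not scale to large sparse problems.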
If you develop good intuition for the problem and code base during the first 3 points it might be possible to do it rather quickly, I believe. There are several variations of interior-point methods and I am still deciding which one to implement among the several available options. I have seen both line-search and trust-region methods being used in literature and in implemented solvers. My intuition (not necessarily correct) is that trust-region methods are somewhat more powerful / advanced than line-search methods + you already started to study / implement trust-region methods in your PR + there is a very solid and detailed book on trust-region methods you mentioned + I too have some experience with trust-region methods + there is some code for solving trust-region subproblems (like your exact method, or the 2-by-2 method I wrote for the subspace approach). I would suggest looking into the trust-region approach more closely based on all that. As for the references, I suggest scanning the MATLAB manual (https://www.mathworks.com/help/optim/ug/constrained-nonlinear-optimization-algorithms.html#brnpd5f), which contains good references to papers (but you probably found them already). ---- On Sun, 12 Mar 2017 00:41:39 +0500 Antonio Ribeiro <antonior92 at gmail.com> wrote ---- I would like to start by thanking both Matt Haberland and Nikolay Mayorov for making themselves available for mentoring me. I would like very much to have the opportunity to be mentored by one or even both of you. About the questions raised about the project: - To have a wrapper for "ipopt" in scipy would be nice and, if the owners give permission, it could be one of the items of the proposal.
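[Editor's note: to make the trust-region discussion above a bit more concrete, here is the classic Cauchy-point step that trust-region solvers commonly use as a fallback, in the standard textbook form (Nocedal & Wright). This is a pure-NumPy sketch; the function name and toy data are illustrative, not scipy API.]

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Minimize the quadratic model m(p) = g.p + 0.5 p.B.p along -g,
    subject to the trust-region bound ||p|| <= delta."""
    gnorm = np.linalg.norm(g)
    gBg = g @ B @ g
    if gBg <= 0:
        # Negative curvature along -g: step all the way to the boundary.
        tau = 1.0
    else:
        tau = min(gnorm ** 3 / (delta * gBg), 1.0)
    return -tau * (delta / gnorm) * g

# For g = (2, 0), B = I, delta = 5 the unconstrained model minimizer -g
# lies inside the region, and the Cauchy point recovers it exactly.
p = cauchy_point(np.array([2.0, 0.0]), np.eye(2), 5.0)
print(p)  # approximately [-2, 0]
```

A full solver would refine this step (dogleg, subspace, or nearly exact subproblem solutions), but the Cauchy point already guarantees the sufficient-decrease condition the convergence theory needs.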
Nevertheless, the implementation of a solver in native Python/Cython still has its advantages and should not be discarded: i) the code in Python is much more readable; furthermore, native code makes it easier for scipy collaborators to add new features and maintain it; ii) using the proper NumPy and LAPACK functions for expensive operations and using Cython when needed, our implementation can be as fast as C or FORTRAN implementations; iii) it is easier to maintain coherence of interface with other scipy methods using a native implementation. - About Nikolay's question about the interface to use: my plan is to try to implement the solver inside the "minimize" interface in scipy. If needed I can move to a new interface that gives me more freedom. About what to implement and in what order of priority: 1) Nikolay's idea to implement a set of benchmarks from the beginning is very good; it would allow me to compare my method with other already-implemented solvers and to assess the quality of my solver during its development. 2) After having a set of benchmarks my idea is to implement an interior-point method. In my opinion interior-point methods should be the priority because they are able to handle a large range of problems reasonably well. They are very powerful algorithms for large-scale problems and usually are able to deal with medium/small problems reasonably well too. 3) After that I would like to include different possibilities of Hessian approximation to the method (BFGS, L-BFGS, approximated by finite-differences or provided by the user?) 4) And, if I have time, I would like to also implement a SQP solver. It would probably not be the same as the one used in SLSQP. There are several variations of interior-point methods and I am still deciding which one to implement among the several available options. I have seen both line-search and trust-region methods being used in literature and in implemented solvers (for instance "ipopt"
uses a line-search approach while "KNITRO" uses a trust-region approach). I have found a good reference about trust-region interior point methods in the Conn, Gould and Toint "Trust-Region Methods" book; Nocedal and Wright's book also has a small chapter about nonlinear interior point methods that can be useful. Furthermore, I am looking into papers and surveys about the subject. I would much appreciate any recommendation or reference about the subject. Same thing with SQP methods. Best regards, Antônio > On Mar 11, 2017, at 14:01, scipy-dev-request at scipy.org wrote: > > Send SciPy-Dev mailing list submissions to > scipy-dev at scipy.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.scipy.org/mailman/listinfo/scipy-dev > or, via email, send a message with subject or body 'help' to > scipy-dev-request at scipy.org > > You can reach the person managing the list at > scipy-dev-owner at scipy.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-Dev digest..." > > > Today's Topics: > > 1. Re: SciPy-Dev Digest, Vol 161, Issue 11 (Denis Akhiyarov) > 2. Re: SciPy-Dev Digest, Vol 161, Issue 11 (Matt Haberland) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Sat, 11 Mar 2017 09:08:17 -0600 > From: Denis Akhiyarov <denis.akhiyarov at gmail.com> > To: scipy-dev at scipy.org > Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11 > Message-ID: > <CALxxJLQhh-e-tM5u0EijBbWaC0etaChxkLt61LEy=EPg0A-J-A at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > I used ipopt for interior-point method from python, are you trying to add > something similar to scipy? if yes, why not just add a wrapper for ipopt, > since the license looks not restrictive?
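[Editor's note: for readers skimming the thread, the log-barrier reformulation at the core of the interior-point methods being discussed can be sketched as follows; this is the standard textbook form (as in Nocedal & Wright), and the notation is the editor's, not from the thread.]

```latex
% Original problem:
%   minimize f(x)   subject to   c_i(x) >= 0,   i = 1, ..., m
% Barrier subproblem: introduce slacks s_i > 0 and a parameter mu > 0:
\min_{x,\, s} \; f(x) - \mu \sum_{i=1}^{m} \log s_i
\quad \text{subject to} \quad c_i(x) - s_i = 0, \qquad i = 1, \ldots, m
```

Solving a sequence of these subproblems while driving $\mu \to 0$ recovers the original constrained problem; the line-search and trust-region variants mentioned above differ in how each subproblem step is globalized.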
> > On Fri, Mar 10, 2017 at 1:40 PM, <scipy-dev-request at scipy.org> wrote: >> >> Today's Topics: >> >> 1. Re: GSoC2017: Constrained Optimisation in Scipy (Matt Haberland) >> 2. Re: GSoC2017: Constrained Optimisation in Scipy (Nikolay Mayorov) >> >> ---------------------------------------------------------------------- >> >> Message: 1 >> Date: Fri, 10 Mar 2017 10:01:32 -0800 >> From: Matt Haberland <haberland at ucla.edu> >> To: SciPy Developers List <scipy-dev at scipy.org> >> Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy >> Message-ID: >> <CADuxUiyK_d3BvjinikmG_k8LTh8JczMFPwboNaSerdnsWFp=yQ@mail.gmail.com> >> Content-Type: text/plain; charset="utf-8" >> >> The choice of nonlinear optimization algorithm can have a dramatic impact >> on the speed and quality of the solution, and the best choice for a >> particular problem can be difficult to determine a priori, so it is >> important to have multiple options available. >> >> My work in optimal control leads to problems with (almost entirely) >> nonlinear constraints, and the use of derivative information is essential >> for reasonable performance, leaving SLSQP as the only option in SciPy right >> now. However, the problems are also huge and very sparse with a specific >> structure, so SLSQP is not very effective, and not nearly as effective as a >> nonlinear optimization routine could be.
So despite SciPy boasting 14 >> options for minimization of a nonlinear objective, it wasn't suitable for >> this work (without the use of an external solver). >> >> I think SciPy is in need of at least one solver designed to handle large, >> fully nonlinear problems, and having two would be much better. Interior >> point and SQP are good, complementary options. >> >> On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro <antonior92 at gmail.com> >> wrote: >> >>> Hello, my name is Antonio and I am a Brazilian electrical engineer >>> currently pursuing my master's degree. I have contributed to scipy.optimize >>> and scipy.signal, implementing the functions "iirnotch" and "iirpeak" >>> <https://github.com/scipy/scipy/pull/6404> and the method >>> "trust-region-exact" <https://github.com/scipy/scipy/pull/6919> (under >>> revision). I am interested in applying for the Google Summer of Code 2017 >>> to work with the Scipy optimisation package. >>> >>> My proposal is to improve scipy.optimize by adding optimisation methods that >>> are able to deal with non-linear constraints. Currently the only >>> implemented methods able to deal with non-linear constraints are the >>> FORTRAN wrappers SLSQP and COBYLA. >>> >>> SLSQP is a sequential quadratic programming method and COBYLA is a >>> derivative-free optimisation method; they both have their limitations: >>> SLSQP is not able to deal with sparse >>> Hessians and Jacobians and is unfit for large-scale problems, and COBYLA, >>> like other derivative-free methods, is a good choice for optimising noisy >>> objective functions but usually presents poorer performance than >>> derivative-based methods when the derivatives are available (or even when >>> they are computed by automatic differentiation or finite differences). >>> >>> My proposal is to implement in Scipy one or more state-of-the-art solvers >>> (interior point and SQP methods) for constrained optimisation problems.
I >> would like to get some feedback about this, discuss the relevance of it >> for >> Scipy and get some suggestions of possible mentors. >> >> -- >> Matt Haberland >> Assistant Adjunct Professor in the Program in Computing >> Department of Mathematics >> 7620E Math Sciences Building, UCLA >> >> ------------------------------ >> >> Message: 2 >> Date: Sat, 11 Mar 2017 00:40:00 +0500 >> From: Nikolay Mayorov <nikolay.mayorov at zoho.com> >> To: "SciPy Developers List" <scipy-dev at scipy.org> >> Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy >> Message-ID: <15ab9bc2975.f4cd34ac5338.5894437124599888508 at zoho.com> >> Content-Type: text/plain; charset="utf-8" >> >> Hi, Antonio! >> >> I too think that moving towards more modern algorithms and their >> implementations is good for scipy. I would be happy to mentor this project, >> and most likely I will be able to. >> >> Some current thoughts and questions: >> >> 1. Have you figured out what is done in SLSQP (especially the "least squares" >> part)? Do you plan to use a similar approach or another approach to SQP? (I >> figured there are several somewhat different approaches.) Settling on a >> literature reference (or most likely several of them) is essential. >> >> 2. I think it is not wrong to focus on a single solver if you feel that it >> will likely take the whole time. Or maybe you can prioritize: do this first >> for sure and then have alternatives, plan a) to switch to another solver or >> plan b) to improve/add something more minor. >> >> 3.
Consider whether to fit a new solver into minimize or make it a new >> separate solver. The latter approach gives freedom to implement things >> exactly as you want (and not to depend on old suboptimal choices), but I >> guess it can be considered impractical/inconsistent by some people. >> Maybe it can be decided along the way. >> >> 4. I think it is important to start thinking about benchmark problems >> early, maybe even start with them. It's hard to develop a complicated >> optimization algorithm without the ability to see how efficiently it works >> right away.
>> End of SciPy-Dev Digest, Vol 161, Issue 11 >> ****************************************** > ------------------------------ > > Message: 2 > Date: Sat, 11 Mar 2017 09:01:40 -0800 > From: Matt Haberland <haberland at ucla.edu> > To: SciPy Developers List <scipy-dev at scipy.org> > Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11 > Message-ID: > <CADuxUiyT==+J_5tPjPw_q=LfAKwv1t+e+6wVsF8sHASVO+PYxA at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > IPOPT is under EPL. From the FAQ: > > *Does the EPL allow me to take the Source Code for a Program licensed under > it and include all or part of it in another program licensed under the GNU > General Public License (GPL), Berkeley Software Distribution (BSD) license > or other Open Source license?* > No. Only the owner of software can decide whether and how to license it to > others. Contributors to a Program licensed under the EPL understand that > source code for the Program will be made available under the terms of the > EPL.
Unless you are the owner of the software or have received permission > from the owner, you are not authorized to apply the terms of another > license to the Program by including it in a program licensed under another > Open Source license. > > It might be worth asking the contributors, but their permission would be > necessary.
> URL: <https://mail.scipy.org/pipermail/scipy-dev/attachments/20170311/1fe49fd7/attachment.html> > > ------------------------------ > > End of SciPy-Dev Digest, Vol 161, Issue 12 > ****************************************** From karandesai281196 at gmail.com Sun Mar 12 16:47:57 2017 From: karandesai281196 at gmail.com (Karan Desai) Date: Sun, 12 Mar 2017 20:47:57 +0000 Subject: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread In-Reply-To: References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> Message-ID: <6a3f2824-f319-d37d-3463-e9e3b3ccf457@mixmax.com> Okay, so I'll clarify one thing based on all the suggestions above: rewriting assert methods as plain statements will not be a part of my proposal. I am talking about two main things here - one is the numpy testing utils, which are used by other libraries directly (like the assert array methods), and the other is the test suite of scipy and numpy itself, comprising the unit tests of the two libraries. My idea is to keep the former as independent of the testing framework as possible, while adopting the pytest flavor (viz. fixtures and parametrization) for the latter. > "And speaking of the proposal, I think it's time to see the first draft." Sure, coming up soon. Regards, Karan.
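[Editor's note: for a sense of the "pytest flavor" (fixtures plus parametrization) mentioned above, a hypothetical migrated test might look roughly like this; all names are illustrative, and this is not existing numpy/scipy test code.]

```python
import math
import pytest

@pytest.fixture
def tol():
    # Shared setup, replacing ad-hoc module-level constants.
    return 1e-12

@pytest.mark.parametrize("x, expected", [(0.0, 0.0), (math.pi / 2, 1.0)])
def test_sin(x, expected, tol):
    # pytest generates and reports one test per (x, expected) pair.
    assert abs(math.sin(x) - expected) <= tol
```

Compared to nose-era generator tests, parametrized cases report individually, and fixtures make setup explicit and reusable.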
On Sun, Mar 12, 2017 10:13 PM, Evgeni Burovski evgeny.burovskiy at gmail.com wrote: > And since the subject is inevitably going to show up more than once, I > think the GSoC proposal should explicitly mention that assert rewriting > is not a part of the GSoC. And speaking of the proposal, I think it's > time to see the first draft. > Evgeni _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed...
URL: From antonior92 at gmail.com Sun Mar 12 17:38:08 2017 From: antonior92 at gmail.com (Antonio Ribeiro) Date: Sun, 12 Mar 2017 18:38:08 -0300 Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 16 In-Reply-To: References: Message-ID: <7F895B76-1057-405A-BCDF-3B2487F0DC18@gmail.com> *And what would the "default" method for point 2 be?* I am thinking about making it dependent on what is provided by the user: if the user provides the Hessian, the method would use it; otherwise the default would be a quasi-Newton Hessian approximation. I agree with the choice of a trust-region approach; it really seems the most sensible choice. Furthermore, thank you for the references, I will take a look. > On Mar 12, 2017, at 16:49, scipy-dev-request at scipy.org wrote: > > Today's Topics: > > 1. Re: SciPy-Dev Digest, Vol 161, Issue 12 (Nikolay Mayorov) > > ---------------------------------------------------------------------- > > Message: 1 > Date: Mon, 13 Mar 2017 00:49:19 +0500 > From: Nikolay Mayorov > To: "SciPy Developers List" > Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 12 > Message-ID: <15ac4116913.d223fc4b16616.7056875246372910426 at zoho.com> > Content-Type: text/plain; charset="utf-8" > > Nevertheless, the implementation > > of a solver in native Python/Cython still has its advantages and should not be discarded: > > Totally support this.
Having an "inhouse" implementation has the advantages listed by Antonio and additionally it adds substance to the scipy project, such that it is not just a collection of wrappers. Obviously some balance between the two approaches should be maintained. I think GSoC is the most proper place to write new mathematical/algorithmic code. I would even suggest not to include any wrapper considerations in your proposal (for this particular GSoC case), but it's only my opinion here. > > > > > - About the Nikolay question about the interface to use: my plan is to try to implement the > > solver inside the 'minimize' interface in scipy. If needed I can move to a new interface > > that gives me more freedom. > > > > > > I think fitting it into "minimize" would be optimal, but if something very annoying/inconvenient/inconsistent pops up it can be reconsidered I guess. > > > > After having a set of benchmarks my idea is to implement an interior point method. > > > > > > It sounds interesting, especially because it will be completely new for scipy. > > > > > 3) After that I would like to include different possibilities of Hessian approximation to > > the method (BFGS, L-BFGS, approximated by finite-differences or provided by the user...) > > > > > And what would the "default" method for point 2 be? > > > 4) And, if I have time, I would like to also implement an SQP solver. It would probably not be the > > > same as the one used in SLSQP. > > > > > If you develop good intuition for the problem and code base during the first 3 points it might be possible to do it rather quickly, I believe. > > > > > There are several variations of interior point methods and I am still deciding which > > one to implement among the several available options.
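On the interface question: `minimize` already accepts a callable as `method` (custom methods were added in scipy 0.14), so a new solver can be prototyped against the `minimize` calling convention before any integration work. A deliberately naive sketch (`toy_solver` is purely illustrative, a fixed-step coordinate search, not a proposed algorithm):

```python
import numpy as np
from scipy.optimize import minimize, OptimizeResult

def toy_solver(fun, x0, args=(), maxiter=200, step=0.1, **unknown_options):
    """Toy coordinate search obeying the custom-`method` protocol:
    accept (fun, x0, args, **options) and return an OptimizeResult."""
    x = np.asarray(x0, dtype=float)
    fx = fun(x, *args)
    nfev = 1
    for _ in range(maxiter):
        improved = False
        for i in range(x.size):
            for d in (step, -step):
                trial = x.copy()
                trial[i] += d
                ft = fun(trial, *args)
                nfev += 1
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2            # shrink the step once no move helps
            if step < 1e-8:
                break
    return OptimizeResult(x=x, fun=fx, nfev=nfev, success=True)

res = minimize(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
               [0.0, 0.0], method=toy_solver)
print(res.x)  # near [1, -2]
```

Anything that fits this plumbing can later be promoted to a named `method` string, which is presumably the path a new interior point solver would take.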
I have seen both > > line-search and trust-region methods being used in literature and in implemented solvers > > > > > My intuition (not necessarily correct) is that trust-region methods are somewhat more powerful / advanced than line-search methods + you already started to study / implement trust-region methods in your PR + there is a very solid and detailed book on trust-region methods you mentioned + I too have some experience with trust-region methods + there is some code for solving trust-region subproblems (like your exact method, or the 2-by-2 method I wrote for the subspace approach). I would maybe suggest looking into the trust-region approach more closely based on all that. > > > > As for the references, I suggest scanning the MATLAB manual https://www.mathworks.com/help/optim/ug/constrained-nonlinear-optimization-algorithms.html#brnpd5f which contains good references to papers (but you probably found them already). > > > > > > ---- On Sun, 12 Mar 2017 00:41:39 +0500 Antonio Ribeiro <antonior92 at gmail.com> wrote ---- > > > > > I would like to start by thanking both Matt Haberland and Nikolay Mayorov for making > > themselves available for mentoring me. I would like very much to have the opportunity > > to be mentored by one or even both of you. > > > > About the questions raised about the project: > > > > - To have a wrapper for 'ipopt' in scipy would be nice and, if the owners give permission, > > it could be one of the items of the proposal.
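For readers following the trust-region subproblem discussion above: the simplest approximate subproblem solution is the Cauchy point, the minimizer of the quadratic model along the steepest-descent direction within the radius (standard textbook formula, e.g. Nocedal & Wright; sketch only, not code from the PRs mentioned):

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Cauchy point for the trust-region subproblem
        minimize  g @ p + 0.5 * p @ B @ p   subject to  ||p|| <= delta.
    Scales the steepest-descent direction to the model minimizer or the
    trust-region boundary, whichever comes first."""
    gnorm = np.linalg.norm(g)
    gBg = g @ B @ g
    # negative curvature along -g: go all the way to the boundary
    tau = 1.0 if gBg <= 0 else min(1.0, gnorm**3 / (delta * gBg))
    return -(tau * delta / gnorm) * g

g = np.array([1.0, 2.0])
B = np.array([[2.0, 0.0], [0.0, 1.0]])
p = cauchy_point(g, B, delta=0.5)
model = g @ p + 0.5 * p @ B @ p
print(np.linalg.norm(p) <= 0.5 + 1e-12, model < 0)  # feasible, model decrease
```

Exact and 2-by-2 subspace solvers refine this same subproblem; the Cauchy point is the baseline that already guarantees the global-convergence theory goes through.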
Nevertheless, the implementation > > of a solver in native Python/Cython still has its advantages and should not be discarded: > > i) the code in python is much more readable; furthermore, native code makes it easier for scipy collaborators > > to add new features and maintain it; ii) using the proper Numpy and LAPACK functions > > for expensive operations and using Cython when needed, our implementation can be as > > fast as C or FORTRAN implementations; iii) it is easier to maintain coherence of interface > > with other scipy methods using a native implementation. > > - About the Nikolay question about the interface to use: my plan is to try to implement the > > solver inside the 'minimize' interface in scipy. If needed I can move to a new interface > > that gives me more freedom. > > > > About what to implement and in what order of priority: > > > > 1) Nikolay's idea to implement a set of benchmarks from the beginning is very good; > > it would allow me to compare my method with other already implemented solvers > > and to assess the quality of my solver during its development. > > 2) After having a set of benchmarks my idea is to implement an interior point method. > > In my opinion interior point methods should be the priority because they are able to > > handle a large range of problems reasonably well. They are very powerful algorithms > > for large-scale problems and usually are able to deal with medium/small > > problems reasonably well too. > > 3) After that I would like to include different possibilities of Hessian approximation to > > the method (BFGS, L-BFGS, approximated by finite-differences or provided by the user...) > > 4) And, if I have time, I would like to also implement an SQP solver. It would probably not be the > > same as the one used in SLSQP. > > > > There are several variations of interior point methods and I am still deciding which > > one to implement among the several available options.
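The benchmark set mentioned in point 1 can start very small, e.g. running the existing `minimize` methods on a standard test function and recording the final objective value and the number of function evaluations (a rough sketch using the Rosenbrock helpers shipped with scipy.optimize):

```python
from scipy.optimize import minimize, rosen, rosen_der

# Compare several existing minimize methods on the 2-D Rosenbrock function;
# a real benchmark suite would sweep many problems and dimensions.
x0 = [1.3, 0.7]
for method in ("BFGS", "L-BFGS-B", "SLSQP"):
    res = minimize(rosen, x0, jac=rosen_der, method=method)
    print(f"{method:>8}: f = {res.fun:.2e}, nfev = {res.nfev}")
```

scipy's existing `benchmarks/` directory (run via airspeed velocity) is the natural home for this kind of harness once a new solver exists to compare against.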
I have seen both > > line-search and trust-region methods being used in literature and in implemented solvers > > (for instance, "ipopt" uses a line-search approach while "KNITRO" uses a trust-region approach). > > I have found a good reference about trust-region interior point methods in the > > Conn, Gould and Toint "Trust-Region Methods" book; Nocedal and Wright's book also has > > a small chapter about nonlinear interior point methods that can be useful. Furthermore, > > I am looking into papers and surveys about the subject. I would much appreciate > > any recommendation or reference about the subject. Same thing with SQP methods. > > > > Best regards, > > Antônio > > > On Mar 11, 2017, at 14:01, scipy-dev-request at scipy.org wrote: > > > > > > Send SciPy-Dev mailing list submissions to > > > scipy-dev at scipy.org > > > > > > To subscribe or unsubscribe via the World Wide Web, visit > > > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > or, via email, send a message with subject or body 'help' to > > > scipy-dev-request at scipy.org > > > > > > You can reach the person managing the list at > > > scipy-dev-owner at scipy.org > > > > > > When replying, please edit your Subject line so it is more specific > > > than "Re: Contents of SciPy-Dev digest..." > > > > > > > > > Today's Topics: > > > > > > 1. Re: SciPy-Dev Digest, Vol 161, Issue 11 (Denis Akhiyarov) > > > 2.
Re: SciPy-Dev Digest, Vol 161, Issue 11 (Matt Haberland) > > > > > > > > > ---------------------------------------------------------------------- > > > > > > Message: 1 > > > Date: Sat, 11 Mar 2017 09:08:17 -0600 > > > From: Denis Akhiyarov <denis.akhiyarov at gmail.com> > > > To: scipy-dev at scipy.org > > > Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11 > > > Message-ID: > > > <CALxxJLQhh-e-tM5u0EijBbWaC0etaChxkLt61LEy=EPg0A-J-A at mail.gmail.com> > > > Content-Type: text/plain; charset="utf-8" > > > > > > I used ipopt for interior-point method from python, are you trying to add > > > something similar to scipy? if yes, why not just add a wrapper for ipopt, > > > since the license looks not restrictive? > > > > > > On Fri, Mar 10, 2017 at 1:40 PM, <scipy-dev-request at scipy.org> wrote: > > > > > >> Send SciPy-Dev mailing list submissions to > > >> scipy-dev at scipy.org > > >> > > >> To subscribe or unsubscribe via the World Wide Web, visit > > >> https://mail.scipy.org/mailman/listinfo/scipy-dev > > >> or, via email, send a message with subject or body 'help' to > > >> scipy-dev-request at scipy.org > > >> > > >> You can reach the person managing the list at > > >> scipy-dev-owner at scipy.org > > >> > > >> When replying, please edit your Subject line so it is more specific > > >> than "Re: Contents of SciPy-Dev digest..." > > >> > > >> > > >> Today's Topics: > > >> > > >> 1. Re: GSoC2017: Constrained Optimisation in Scipy (Matt Haberland) > > >> 2. 
Re: GSoC2017: Constrained Optimisation in Scipy (Nikolay Mayorov) > > >> > > >> > > >> ---------------------------------------------------------------------- > > >> > > >> Message: 1 > > >> Date: Fri, 10 Mar 2017 10:01:32 -0800 > > >> From: Matt Haberland <haberland at ucla.edu> > > >> To: SciPy Developers List <scipy-dev at scipy.org> > > >> Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy > > >> Message-ID: > > >> <CADuxUiyK_d3BvjinikmG_k8LTh8JczMFPwboNaSerdnsWFp=yQ@ > > >> mail.gmail.com> > > >> Content-Type: text/plain; charset="utf-8" > > >> > > >> The choice of nonlinear optimization algorithm can have a dramatic impact > > >> on the speed and quality of the solution, and the best choice for a > > >> particular problem can be difficult to determine a priori, so it is > > >> important to have multiple options available. > > >> > > >> My work in optimal control leads to problems with (almost entirely) > > >> nonlinear constraints, and the use of derivative information is essential > > >> for reasonable performance, leaving SLSQP as the only option in SciPy right > > >> now. However, the problems are also huge and very sparse with a specific > > >> structure, so SLSQP is not very effective, and not nearly as effective as a > > >> nonlinear optimization routine could be. So despite SciPy boasting 14 > > >> options for minimization of a nonlinear objective, it wasn't suitable for > > >> this work (without the use of an external solver). > > >> > > >> I think SciPy is in need of at least one solver designed to handle large, > > >> fully nonlinear problems, and having two would be much better. Interior > > >> point and SQP are good, complementary options. > > >> > > >> On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro <antonior92 at gmail.com> > > >> wrote: > > >> > > >>> Hello, my name is Antonio and I am a Brazilian electrical engineer > > >>> currently pursuing my master degree. 
I have contributed to scipy.optimize > > >>> and scipy.signal implementing the functions "iirnotch", "iirpeak" > > >>> <https://github.com/scipy/scipy/pull/6404> and the method > > >>> "trust-region-exact" <https://github.com/scipy/scipy/pull/6919> (under > > >>> revision). I am interested in applying for the Google Summer of Code 2017 > > >>> to work with the Scipy optimisation package. > > >>> > > >>> My proposal is to improve scipy.optimize, adding optimisation methods that > > >>> are able to deal with non-linear constraints. Currently the only > > >>> implemented methods able to deal with non-linear constraints are the > > >>> FORTRAN wrappers SLSQP and COBYLA. > > >>> > > >>> SLSQP is a sequential quadratic programming method and COBYLA is a > > >>> derivative-free optimisation method; they both have their limitations: > > >>> SLSQP is not able to deal with sparse > > >>> hessians and jacobians and is unfit for large-scale problems, and COBYLA, > > >>> like other derivative-free methods, is a good choice for optimising noisy > > >>> objective functions, however it usually presents poorer performance than > > >>> derivative-based methods when the derivatives are available (or even when > > >>> they are computed by automatic differentiation or finite differences). > > >>> > > >>> My proposal is to implement in Scipy one or more state-of-the-art solvers > > >>> (interior point and SQP methods) for constrained optimisation problems. I > > >>> would like to get some feedback about this, discuss the relevance of it > > >> for > > >>> Scipy and get some suggestions of possible mentors.
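For context on the status quo described above, a nonlinearly constrained problem is currently passed to SLSQP roughly like this (illustrative example):

```python
import numpy as np
from scipy.optimize import minimize

# Minimize x0**2 + x1**2 subject to the nonlinear constraint x0 * x1 >= 1.
# SLSQP (along with COBYLA) is one of the few scipy methods accepting
# such constraints; both use this dense, dict-based constraint format.
con = {"type": "ineq", "fun": lambda x: x[0] * x[1] - 1.0}
res = minimize(lambda x: x @ x, x0=[2.0, 1.0],
               method="SLSQP", constraints=[con])
print(res.x)  # close to [1, 1], where the constraint is active
```

Nothing in this interface exposes sparsity of the constraint Jacobian or the Hessian, which is the gap the proposed large-scale solvers would fill.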
> > >>> > > >>> _______________________________________________ > > >>> SciPy-Dev mailing list > > >>> SciPy-Dev at scipy.org > > >>> https://mail.scipy.org/mailman/listinfo/scipy-dev > > >>> > > >>> > > >> > > >> > > >> -- > > >> Matt Haberland > > >> Assistant Adjunct Professor in the Program in Computing > > >> Department of Mathematics > > >> 7620E Math Sciences Building, UCLA > > >> -------------- next part -------------- > > >> An HTML attachment was scrubbed... > > >> URL: <https://mail.scipy.org/pipermail/scipy-dev/ > > >> attachments/20170310/00ea2b5d/attachment-0001.html> > > >> > > >> ------------------------------ > > >> > > >> Message: 2 > > >> Date: Sat, 11 Mar 2017 00:40:00 +0500 > > >> From: Nikolay Mayorov <nikolay.mayorov at zoho.com> > > >> To: "SciPy Developers List" <scipy-dev at scipy.org> > > >> Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy > > >> Message-ID: <15ab9bc2975.f4cd34ac5338.5894437124599888508 at zoho.com> > > >> Content-Type: text/plain; charset="utf-8" > > >> > > >> Hi, Antonio! > > >> > > >> > > >> > > >> I too think that moving towards more modern algorithms and their > > >> implementations is good for scipy. I would be happy to mentor this project, > > >> and most likely I will be able to. > > >> > > >> > > >> > > >> Some current thoughts and questions: > > >> > > >> > > >> > > >> 1. Have you figured what is done in SLSQP (especially "least squares" > > >> part)? Do you plan to use a similar approach or another approach to SQP? (I > > >> figured there are several somewhat different approaches.) Setting on a > > >> literature reference (or most likely several of them) is essential. > > >> > > >> > > >> > > >> 2. I think it is not wrong to focus on a single solver if you feel that it > > >> will likely take the whole time. Or maybe you can prioritize: do this first > > >> for sure and then have alternatives, plan a) to switch to another solver or > > >> plan b) to improve\add something more minor. 
> > >> > > >> > > >> > > >> 3. Consider whether to fit a new solver into minimize or make it as a new > > >> separate solver. The latter approach gives a freedom to implement things > > >> exactly as you want (and not to depend on old suboptimal choices) , but I > > >> guess it can be considered as impractical/inconsistent by some people. > > >> Maybe it can be decided along the way. > > >> > > >> > > >> > > >> 4. I think it is important to start to think about benchmark problems > > >> early, maybe even start with them. It's hard to develop a complicated > > >> optimization algorithm without ability to see how efficiently it works > > >> right away. > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> ---- On Fri, 10 Mar 2017 23:01:32 +0500 Matt Haberland & > > >> lt;haberland at ucla.edu&gt; wrote ---- > > >> > > >> > > >> > > >> > > >> The choice of nonlinear optimization algorithm can have a dramatic impact > > >> on the speed and quality of the solution, and the best choice for a > > >> particular problem can be difficult to determine a priori, so it is > > >> important to have multiple options available. > > >> > > >> > > >> > > >> My work in optimal control leads to problems with (almost entirely) > > >> nonlinear constraints, and the use of derivative information is essential > > >> for reasonable performance, leaving SLSQP as the only option in SciPy right > > >> now. However, the problems are also huge and very sparse with a specific > > >> structure, so SLSQP is not very effective, and not nearly as effective as a > > >> nonlinear optimization routine could be. So despite SciPy boasting 14 > > >> options for minimization of a nonlinear objective, it wasn't suitable for > > >> this work (without the use of an external solver). > > >> > > >> > > >> > > >> I think SciPy is in need of at least one solver designed to handle large, > > >> fully nonlinear problems, and having two would be much better. 
Interior > > >> point and SQP are good, complementary options. > > >> > > >> > > >> > > >> > > >> On Thu, Mar 9, 2017 at 2:38 PM, Antonio Ribeiro &lt;antonior92 at gmail.com&gt; > > >> wrote: > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> -- > > >> > > >> Matt Haberland > > >> > > >> Assistant Adjunct Professor in the Program in Computing > > >> > > >> Department of Mathematics > > >> > > >> 7620E Math Sciences Building, UCLA > > >> > > >> > > >> > > >> > > >> _______________________________________________ > > >> > > >> SciPy-Dev mailing list > > >> > > >> SciPy-Dev at scipy.org > > >> > > >> https://mail.scipy.org/mailman/listinfo/scipy-dev > > >> > > >> > > >> Hello, my name is Antonio and I am a Brazilian electrical engineer > > >> currently pursuing my master degree. I have contributed to scipy.optimize > > >> and scipy.signal implementing functions "iirnotch", "irrpeak"and the method > > >> "trust-region-exact" (under revision). I am interested in applying for the > > >> Google Summer of Code 2017 to work with the Scipy optimisation package. > > >> > > >> My proposal is to improve scipy.optimize adding optimisation methods that > > >> are able to deal with non-linear constraints. Currently the only > > >> implemented methods able to deal with non-linear constraints are the > > >> FORTRAN wrappers SLSQP and COBYLA. > > >> > > >> SLSQP is a sequential quadratic programming method and COBYLA is a > > >> derivative-free optimisation method, they both have its limitations: SLSQP > > >> is not able to deal with sparse > > >> hessians and jacobians and is unfit for large-scale problems and COBYLA, > > >> as other derivative-free methods, is a good choice for optimise noisy > > >> objective functions, however usually presents a poorer performance then > > >> derivative-based methods when the derivatives are available (or even when > > >> they are computed by automatic differentiation or finite differences). 
> > >> My proposal is to implement in Scipy one or more state-of-the-art solvers > > >> (interior point and SQP methods) for constrained optimisation problems. I > > >> would like to get some feedback about this, discuss the relevance of it for > > >> Scipy and get some suggestions of possible mentors. > > >> > > >> > > >> > > >> > > >> _______________________________________________ > > >> > > >> SciPy-Dev mailing list > > >> > > >> SciPy-Dev at scipy.org > > >> > > >> https://mail.scipy.org/mailman/listinfo/scipy-dev > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> -------------- next part -------------- > > >> An HTML attachment was scrubbed... > > >> URL: <https://mail.scipy.org/pipermail/scipy-dev/ > > >> attachments/20170311/3bd6e920/attachment.html> > > >> > > >> ------------------------------ > > >> > > >> Subject: Digest Footer > > >> > > >> _______________________________________________ > > >> SciPy-Dev mailing list > > >> SciPy-Dev at scipy.org > > >> https://mail.scipy.org/mailman/listinfo/scipy-dev > > >> > > >> > > >> ------------------------------ > > >> > > >> End of SciPy-Dev Digest, Vol 161, Issue 11 > > >> ****************************************** > > >> > > > -------------- next part -------------- > > > An HTML attachment was scrubbed... > > > URL: <https://mail.scipy.org/pipermail/scipy-dev/attachments/20170311/098a8651/attachment-0001.html> > > > > > > ------------------------------ > > > > > > Message: 2 > > > Date: Sat, 11 Mar 2017 09:01:40 -0800 > > > From: Matt Haberland <haberland at ucla.edu> > > > To: SciPy Developers List <scipy-dev at scipy.org> > > > Subject: Re: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11 > > > Message-ID: > > > <CADuxUiyT==+J_5tPjPw_q=LfAKwv1t+e+6wVsF8sHASVO+PYxA at mail.gmail.com> > > > Content-Type: text/plain; charset="utf-8" > > > > > > IPOPT is under EPL. 
From the FAQ: > > > *Does the EPL allow me to take the Source Code for a Program licensed under > > > it and include all or part of it in another program licensed under the GNU > > > General Public License (GPL), Berkeley Software Distribution (BSD) license > > > or other Open Source license?* > > > No. Only the owner of software can decide whether and how to license it to > > > others. Contributors to a Program licensed under the EPL understand that > > > source code for the Program will be made available under the terms of the > > > EPL. Unless you are the owner of the software or have received permission > > > from the owner, you are not authorized to apply the terms of another > > > license to the Program by including it in a program licensed under another > > > Open Source license. > > > > > > It might be worth asking the contributors, but their permission would be > > > necessary. > > > > > > On Mar 11, 2017 7:09 AM, "Denis Akhiyarov" <denis.akhiyarov at gmail.com> > > > wrote: > > >> I used ipopt for interior-point method from python, are you trying to add > > >> something similar to scipy? if yes, why not just add a wrapper for ipopt, > > >> since the license looks not restrictive?
> URL: > > ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > ------------------------------ > > End of SciPy-Dev Digest, Vol 161, Issue 16 > ****************************************** From ralf.gommers at gmail.com Mon Mar 13 03:58:14 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 13 Mar 2017 20:58:14 +1300 Subject: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions from previous thread In-Reply-To: <6a3f2824-f319-d37d-3463-e9e3b3ccf457@mixmax.com> References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com> <6a3f2824-f319-d37d-3463-e9e3b3ccf457@mixmax.com> Message-ID: On Mon, Mar 13, 2017 at 9:47 AM, Karan Desai wrote: > Okay, so I'll clarify one thing based on all the suggestions above: > > Rewrting assert methods as statements will not be a part of my proposal. I > am talking about mainly two things here - one is about numpy testing utils, > which will be used by other libraries directly (like assert array methods), > and other is about the test suite of scipy and numpy itself, comprising of > unit tests of these two libraries itself. > > My idea is to keep the former as much independent of testing framework as > possible, while adopt pytest flavor (viz. fixtures and parametrization) for > the latter. > That sounds right to me. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From ralf.gommers at gmail.com  Mon Mar 13 03:59:58 2017
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Mon, 13 Mar 2017 20:59:58 +1300
Subject: [SciPy-Dev] [GSoC 2017] Nose to Pytest Migration - conclusions
 from previous thread
In-Reply-To: 
References: <683dd04a-34f3-268f-2e78-1b8d71084baa@mixmax.com>
Message-ID:

On Mon, Mar 13, 2017 at 3:56 AM, Robert Kern wrote:

> On Sun, Mar 12, 2017 at 12:19 AM, Karan Desai
> wrote:
> >
> > Plain assert keyword statements.
>
> Just a note: numpy's test suite, at least, must not use `assert`
> statements. numpy's test suite must be runnable under `python -O`. numpy
> does some docstring manipulation that could fail in principle under those
> conditions (i.e. it did fail at one point until we fixed it, so now we have
> to ensure that it doesn't regress). scipy might have the same issue (e.g.
> `scipy.special` ufuncs and `scipy.stats` distribution objects), but I
> forget if we've made that a policy there too.
>

Yes, that's the policy - no plain assert statements anywhere.

Ralf
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evgeny.zhurko at gmail.com  Mon Mar 13 18:00:31 2017
From: evgeny.zhurko at gmail.com (Evgeny Zhurko)
Date: Tue, 14 Mar 2017 01:00:31 +0300
Subject: [SciPy-Dev] GSoC17 scipy.diff & B-splines questions
Message-ID:

Hi,

My name's Evgeny Zhurko. I'm a 3rd-year student of the Belarusian State
University of Informatics and Radioelectronics, Minsk, Belarus. I want to
participate in GSoC with SciPy.

About scipy.diff:
There's no need to reinvent the wheel, so, as I see it, the best way to
implement this idea is to move numerical and algorithmic differentiation
from numdifftools into scipy with some changes (a new class NDerivative
for n >= 1 - from the GSoC15 discussion).
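The estimator under discussion - a central difference sharpened by one step of Richardson extrapolation - fits in a few lines of pure Python. This is an illustrative sketch only (the name `nderivative` is made up here to echo the proposed class); numdifftools and any eventual scipy.diff also select the step size adaptively:

```python
import math

def nderivative(f, x, h=1e-4):
    """Estimate f'(x): central differences plus one Richardson step.

    A central difference has O(h**2) truncation error, so combining
    the estimates at steps h and h/2 as (4*D(h/2) - D(h)) / 3 cancels
    the leading error term, giving an O(h**4) estimate at the cost of
    two extra function evaluations.
    """
    def central(step):
        return (f(x + step) - f(x - step)) / (2.0 * step)

    return (4.0 * central(h / 2.0) - central(h)) / 3.0

# d/dx sin(x) at x = 1 should match cos(1) very closely.
print(abs(nderivative(math.sin, 1.0) - math.cos(1.0)))  # ~1e-12 or smaller
```

The same combination trick extends to higher orders of extrapolation, which is why folding Richardson in as "a required part of computing", as suggested below, is natural.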
I find the API proposed by Maniteja Nandana (GSoC15) reasonable, but with
a few debatable points:
I think it's not necessary to separate `richardson` as a special method,
because we can include Richardson extrapolation as a required part of the
computation.
N (the derivative order) was proposed as 1-4. Numdifftools can compute
derivatives of order 1-10, so scipy will compute the same :-)

The current scipy functionality has a "method" parameter taking '2-point'
or '3-point'. I prefer the naming 'forward', 'backward', 'central',
'complex', 'multicomplex' for the "method" parameter. An optional boundary
parameter should be added in the new API as a change to numdifftools.

Estimates: it's not a very hard task to move numdifftools into scipy with
small changes to its functionality. So the optional part of the idea can
be partially implemented as a required part of the idea.

About the B-spline improvement:
Do you expect the student to already know the relevant literature? :-)
Generally speaking, I find this idea the most interesting and the hardest
for me, because I have never worked with splines before.

Best Regards,
Evgeny Zhurko
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evgeny.zhurko at gmail.com  Tue Mar 14 05:01:30 2017
From: evgeny.zhurko at gmail.com (Evgeny Zhurko)
Date: Tue, 14 Mar 2017 12:01:30 +0300
Subject: [SciPy-Dev] GSoC17 scipy.diff & B-splines questions
In-Reply-To: 
References: 
Message-ID:

About the B-spline improvement (I lost part of the question in my previous
message):
Do you expect a student with good prior knowledge of splines, or can it be
a hard-working student who spends about a quarter of the time (maybe more)
reading the relevant literature? :-)
Generally speaking, I find this idea the most interesting and the hardest
for me, because I have never worked with splines before.

2017-03-14 1:00 GMT+03:00 Evgeny Zhurko :

> Hi,
>
> My name's Evgeny Zhurko. I'm a 3'rd year student of the Belarusian State
I want to > participate in GSoC with Scipy. > > About scipy.diff: > It's not necessary to invent the wheel, so, as for me, the best way to > implement this idea will be moving numerical and algorithm differentiation > from numdifftools to scipy with some changes (new class NDerivative for n> > = 1 - from GSOC15 discussion). > API proposed by Maniteja Nandana (GSoC15) i find reasonable, but with > small controversal questions: > I think it's not necessary to separate `richardson` as a special method, > because we can include richardson interpolation as a required part of > computing. > N (derivative order) was proposed 1-4. Numdifftools can compute derivative > of order 1-10, so scipy will compute the same :-) > > Current scipy functionality has parameter "method" as '2-point' or > '3-point'. I adhere to the naming 'forward', 'backward', 'central', > 'complex', 'multicomplex' for parameter "method". Boundary optional param > should be added in new API as a change numdifftools. > > Estimates: it's not very hard task to move numdifftools to scipy with > small changing functionality. So, optional part of idea can be partially > implemented as required part of idea. > > About B-spline improvement: > Do you want to be able to read the relevant literature? :-) > Generally speaking, i find this idea the most interesting and the hardest > for me, because i never worked with splines before. > > Best Regards, > Evgeny Zhurko > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nikolay.mayorov at zoho.com Tue Mar 14 14:24:34 2017 From: nikolay.mayorov at zoho.com (Nikolay Mayorov) Date: Tue, 14 Mar 2017 23:24:34 +0500 Subject: [SciPy-Dev] GSoC17 scipy.diff & B-splines questions In-Reply-To: References: Message-ID: <15ace108756.e14c041639491.7794770190852580833@zoho.com> Hi Evgeny, I'll give you my general opinion about your last question. 
In order to write a good proposal for the B-splines project you'll need to
have a rather good understanding of the subject, that's just how it is. So
if you are really interested in this subject then go ahead, start thinking
about your proposal, and learn the things about B-splines necessary for
the proposal. But you definitely don't need to develop complete and expert
knowledge of this subject to apply; just get a good idea of what you want
to implement during GSoC and how you can achieve it.

Also, I would strongly recommend working on one of the easier issues
listed by Evgeni Burovski here https://github.com/scipy/scipy/issues/6730
as part of the application process.

Evgeni himself may have a different opinion from mine :). I'm sure he will
answer you too.

Kind regards,
Nikolay Mayorov.


---- On Tue, 14 Mar 2017 14:01:30 +0500 Evgeny Zhurko
<evgeny.zhurko at gmail.com> wrote ----

About B-spline improvement(i lost a part of question in previous message):
Do you expect student with a good knowledge of splines or it's can be
hard-working student that spend about quarter of the all time(maybe more)
to reading the relevant literature? :-)
Generally speaking, i find this idea the most interesting and the hardest
for me, because i never worked with splines before.

2017-03-14 1:00 GMT+03:00 Evgeny Zhurko <evgeny.zhurko at gmail.com>:

Hi,

My name's Evgeny Zhurko. I'm a 3'rd year student of the Belarusian State
University of Informatics and Radioelectronics, Minsk, Belarus. I want to
participate in GSoC with Scipy.

About scipy.diff:
It's not necessary to invent the wheel, so, as for me, the best way to
implement this idea will be moving numerical and algorithm differentiation
from numdifftools to scipy with some changes (new class NDerivative for
n >= 1 - from GSOC15 discussion).
API proposed by Maniteja Nandana (GSoC15) i find reasonable, but with small controversal questions: I think it's not necessary to separate `richardson` as a special method, because we can include richardson interpolation as a required part of computing. N (derivative order) was proposed 1-4. Numdifftools can compute derivative of order 1-10, so scipy will compute the same :-) Current scipy functionality has parameter "method" as '2-point' or '3-point'. I adhere to the naming 'forward', 'backward', 'central', 'complex', 'multicomplex' for parameter "method". Boundary optional param should be added in new API as a change numdifftools. Estimates: it's not very hard task to move numdifftools to scipy with small changing functionality. So, optional part of idea can be partially implemented as required part of idea. About B-spline improvement: Do you want to be able to read the relevant literature? :-) Generally speaking, i find this idea the most interesting and the hardest for me, because i never worked with splines before. Best Regards, Evgeny Zhurko _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From toddrjen at gmail.com Wed Mar 15 18:47:57 2017 From: toddrjen at gmail.com (Todd) Date: Wed, 15 Mar 2017 18:47:57 -0400 Subject: [SciPy-Dev] GSoC'17: Circular statistics In-Reply-To: References: Message-ID: On Mar 11, 2017 14:50, "Vladislav Iakovlev" wrote: Hello! My name is Vladislav Iakovlev. I am a master student of Department of Applied Math, HSE University, Moscow, Russia. I have never contributed to open source projects, but I would be happy to start it with SciPy. I noticed that existing functionality for circular statistics is insufficient. So, I want to suggest an idea for GSoC: implement it to scipy.stats. 
My plan is to do it through the following steps:

1) Develop the class rv_circular, analogous to rv_continuous, adjusted to
circular statistic functions.

2) Develop derived classes for circular distributions.

3) Develop point estimation and statistical test functions.

During the summer, I plan to implement material from chapters 1-8 of the
book "MARDIA, K. V. AND JUPP, P. E. (2000), Directional Statistics, John
Wiley", plus documentation and unit tests for it.

Is this idea interesting for the Community? I'm glad of any feedback.


This may be a little crazy, but would it be possible to wait for
parameterized dtypes and create a circular dtype to handle all this
automagically?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From josef.pktd at gmail.com  Wed Mar 15 19:30:49 2017
From: josef.pktd at gmail.com (josef.pktd at gmail.com)
Date: Wed, 15 Mar 2017 19:30:49 -0400
Subject: [SciPy-Dev] GSoC'17: Circular statistics
In-Reply-To: 
References: 
Message-ID:

On Wed, Mar 15, 2017 at 6:47 PM, Todd wrote:
>
>
> On Mar 11, 2017 14:50, "Vladislav Iakovlev" wrote:
>
> Hello!
>
>
> My name is Vladislav Iakovlev. I am a master student of Department of
> Applied Math, HSE University, Moscow, Russia. I have never contributed to
> open source projects, but I would be happy to start it with SciPy.
>
>
> I noticed that existing functionality for circular statistics is
> insufficient. So, I want to suggest an idea for GSoC: implement it to
> scipy.stats. My plan is to do it through the next steps:
>
> 1) Develop the class rv_circular, analogous to rv_continuous adjusted
> to circular statistic functions.
>
> 2) Develop derived classes for circular distributions.
>
> 3) Develop point estimations and statistical tests functions.
>
> During the summer, I assume to implement materials from chapters 1-8 of the
> book "MARDIA, K. V. AND JUPP, P. E. (2000), Directional Statistics, John
> Wiley", documentation and unit tests for it.
>
>
> Is this idea interesting for the Community? I'm glad of any feedback.
>
>
> This may be a little crazy, but would it be possible to wait for
> parameterized dtypes and create a circular dtype to handle all this
> automagically?

A dtype will not know statistics. AFAIU, it would be just something like a
substitute for units as in astropy; somewhere there still needs to be the
code to compute statistics on a circle or sphere or ...

(Personally, I like plain float and complex ndarrays and prefer to leave
all units, datetimes and whatever in a wrapper for the interface but out
of the algorithms.)

One design decision for rv_circular is how to handle integration on a
circle, i.e. how to map a cdf on R into a cdf on a circle, and the
associated integration over subintervals. vonmises.cdf is defined on R,
which turned out to be computationally convenient but doesn't make a
"proper" cdf that is e.g. limited to [0, 1], and doesn't define support
bounds for integration.

Josef

>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at scipy.org
> https://mail.scipy.org/mailman/listinfo/scipy-dev
>

From amanpratik10 at gmail.com  Thu Mar 16 04:06:09 2017
From: amanpratik10 at gmail.com (Aman Pratik)
Date: Thu, 16 Mar 2017 13:36:09 +0530
Subject: [SciPy-Dev] GSoC 2017: BSpline support
Message-ID:

Hello Developers,

This is Aman Pratik. I am currently pursuing my B.Tech at the Indian
Institute of Technology, Varanasi. I am a keen software developer and not
entirely new to the open source community. I am interested in your project
"Improve support for B-splines" for GSoC 2017.

I have been working in Python for the past 2 years and have a good grasp
of mathematical methods and techniques. I am quite familiar with scipy,
mainly as a user and also as a developer. I also have some experience with
Cython.

These are the PRs I have worked/am working on over the past month.
ENH: Add evaluation of periodic splines #6730

DOC: Bad documentation of k in sparse.linalg.eigs See #6990 (Merged)

ENH: scipy.linalg.eigsh cannot get all eigenvalues #7004

My GitHub Profile: https://www.github.com/amanp10

I am interested in mathematical models and functions and am therefore keen
to work on splines. I am also familiar with benchmark tests, unit tests
and the other technical knowledge I would need for this project. I have
started studying the subject and am looking forward to guidance from the
potential mentors or other developers.

Thank You
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evgeny.burovskiy at gmail.com  Thu Mar 16 10:13:25 2017
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Thu, 16 Mar 2017 17:13:25 +0300
Subject: [SciPy-Dev] Opinions on deprecating and removing
 scipy.interpolate.interpolate_wrapper?
Message-ID:

Hi,

Here's this corner of scipy.interpolate, as seen from IPython tab
completion:

In [1]: from scipy.interpolate.interpolate_wrapper import
_interpolate                 linear
absolute_import              logarithmic
atleast_1d_and_contiguous    nearest
block                        np
block_average_above          print_function
division

These functions are tested, but are not documented and not exposed in the
top-level namespace. They were added in 2008 and have only received a
handful of maintenance touches since then. They look like bits and pieces
of some larger effort which went unfinished and was abandoned.

Here's a PR which proposes to deprecate and eventually remove them:
https://github.com/scipy/scipy/pull/7187

Are you using these functions? Would you miss them if they're gone? Do you
see a benefit from them being kept? Would you use them if they were
documented? Please speak up.
Cheers, Evgeni From evgeny.burovskiy at gmail.com Thu Mar 16 10:33:59 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Thu, 16 Mar 2017 17:33:59 +0300 Subject: [SciPy-Dev] GSoC 2017: BSpline support In-Reply-To: References: Message-ID: On Thu, Mar 16, 2017 at 11:06 AM, Aman Pratik wrote: > Hello Developers, > > This is Aman Pratik. I am currently pursuing my B.Tech from Indian > Institute of Technology, Varanasi. I am a keen software developer and not > very new to the open source community. I am interested in your project > "Improve support for B-splines" for GSoC 2017. > > I have been working in Python for the past 2 years and have good idea > about Mathematical Methods and Techniques. I am quite familiar with scipy > mainly as a user and also as a developer. I also have some experience with > Cython. > > These are the PRs I have worked/working on for the past month. > > ENH: Add evaluation of periodic splines #6730 > > > DOC: Bad documentation of k in sparse.linalg.eigs See #6990 (Merged) > > > ENH: scipy.linalg.eigsh cannot get all eigenvalues #7004 > > > My GitHub Profile: https://www.github.com/amanp10 > > I am interested in Mathematical models and functions and therefore willing > to work on splines. Also, I am familiar with Benchmark tests, Unit tests > and other technical knowledge I would require for this project. I have > started my study for the subject and am looking forward to guidance from > the potential mentors or other developers. > > Thank You > > Hi Aman, and welcome! I see that you already started with https://github.com/scipy/scipy/pull/7185, great! We'll also need a proposal. Since writing a good proposal typically takes several iterations, I encourage you to start working on it, too. When you have a draft, please send it to the list. All the best, Evgeni -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From evgeny.burovskiy at gmail.com Thu Mar 16 10:50:26 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Thu, 16 Mar 2017 17:50:26 +0300 Subject: [SciPy-Dev] GSoC17 scipy.diff & B-splines questions In-Reply-To: <15ace108756.e14c041639491.7794770190852580833@zoho.com> References: <15ace108756.e14c041639491.7794770190852580833@zoho.com> Message-ID: On Tue, Mar 14, 2017 at 9:24 PM, Nikolay Mayorov wrote: > Hi Evgeny, > > I'll give you my general opinion about your last question. In order to write > a good proposal for B-splines project you'll need to have a rather good > understanding of the subject, it's just how it is. So if you are really > interested in this subject then go ahead and start thinking about your > proposal and learn necessary (for the proposal) things about B-splines. But > you definitely don't need to develop complete and expert knowledge of this > subject to apply, just get a good idea of what you want to implement during > GSoC and how you can achieve that. > > Also I would strongly recommend working on one of easier issues listed by > Evgeni Burovski here https://github.com/scipy/scipy/issues/6730 as part of > the application process. > > Evgeni himself might have his opinion different from mine :). I'm sure he > will answer you too. > > Kind regards, > Nikolay Mayorov. > > > ---- On Tue, 14 Mar 2017 14:01:30 +0500 Evgeny Zhurko > wrote ---- > > About B-spline improvement(i lost a part of question in previous message): > Do you expect student with a good knowledge of splines or it's can be > hard-working student that spend about quarter of the all time(maybe more) to > reading the relevant literature? :-) > Generally speaking, i find this idea the most interesting and the hardest > for me, because i never worked with splines before. > > 2017-03-14 1:00 GMT+03:00 Evgeny Zhurko : > > Hi, > > My name's Evgeny Zhurko. I'm a 3'rd year student of the Belarusian State > University of Informatics and Radioelectronics, Minsk, Belarus. 
I want to
> participate in GSoC with Scipy.
>
> About scipy.diff:
> It's not necessary to invent the wheel, so, as for me, the best way to
> implement this idea will be moving numerical and algorithm differentiation
> from numdifftools to scipy with some changes (new class NDerivative for
> n >= 1 - from GSOC15 discussion).
> API proposed by Maniteja Nandana (GSoC15) i find reasonable, but with small
> controversal questions:
> I think it's not necessary to separate `richardson` as a special method,
> because we can include richardson interpolation as a required part of
> computing.
> N (derivative order) was proposed 1-4. Numdifftools can compute derivative
> of order 1-10, so scipy will compute the same :-)
>
> Current scipy functionality has parameter "method" as '2-point' or
> '3-point'. I adhere to the naming 'forward', 'backward', 'central',
> 'complex', 'multicomplex' for parameter "method". Boundary optional param
> should be added in new API as a change numdifftools.
>
> Estimates: it's not very hard task to move numdifftools to scipy with small
> changing functionality. So, optional part of idea can be partially
> implemented as required part of idea.
>
> About B-spline improvement:
> Do you want to be able to read the relevant literature? :-)
> Generally speaking, i find this idea the most interesting and the hardest
> for me, because i never worked with splines before.
>
> Best Regards,
> Evgeny Zhurko

Hi Evgeny,

To echo what Nikolay said: we certainly do not expect a student to have
pre-existing expert knowledge of the subject! The ability to read the
literature and implement algorithms off papers and/or books can be
essential, however.

This project can also naturally be broken into a series of sub-projects,
some of them algorithmic, some more about UI improvements. Thus a student
can use a "rolling wave" type approach to the literature, reading up as
needed.
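For a taste of that literature, the Cox-de Boor recursion that defines the B-spline basis is compact enough to sketch in pure Python. This naive recursive version is for illustration only; scipy's own evaluation code uses the far more efficient and numerically careful de Boor algorithm:

```python
def bspline_basis(i, k, t, x):
    """Value of the i-th B-spline basis function of degree k at x,
    via the Cox-de Boor recursion on the knot vector t.

    Degree 0 is an indicator of the knot interval [t[i], t[i+1]);
    higher degrees blend two lower-degree functions with weights
    linear in x.  The guards skip terms whose denominator would be
    zero on repeated (clamped) knots.
    """
    if k == 0:
        return 1.0 if t[i] <= x < t[i + 1] else 0.0
    left = right = 0.0
    if t[i + k] > t[i]:
        left = (x - t[i]) / (t[i + k] - t[i]) * bspline_basis(i, k - 1, t, x)
    if t[i + k + 1] > t[i + 1]:
        right = ((t[i + k + 1] - x) / (t[i + k + 1] - t[i + 1])
                 * bspline_basis(i + 1, k - 1, t, x))
    return left + right

# On a clamped knot vector, the n = len(t) - k - 1 basis functions form
# a partition of unity inside the base interval.
t = [0.0, 0.0, 0.0, 1.0, 2.0, 3.0, 3.0, 3.0]
k = 2
total = sum(bspline_basis(i, k, t, 1.5) for i in range(len(t) - k - 1))
print(total)  # 1.0 up to rounding
```

The partition-of-unity check above is exactly the kind of small property test that makes a good starting exercise on the issues linked earlier in the thread.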
And also the first draft of the proposal does not need to be perfect: it typically takes several iterations to converge. Cheers, Evgeni From evgeny.burovskiy at gmail.com Thu Mar 16 11:46:00 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Thu, 16 Mar 2017 18:46:00 +0300 Subject: [SciPy-Dev] GSoC'17: Circular statistics In-Reply-To: References: Message-ID: Hi, On Sun, Mar 12, 2017 at 12:11 AM, wrote: > On Sat, Mar 11, 2017 at 2:50 PM, Vladislav Iakovlev wrote: >> Hello! >> >> >> My name is Vladislav Iakovlev. I am a master student of Department of >> Applied Math, HSE University, Moscow, Russia. I have never contributed to >> open source projects, but I would be happy to start it with SciPy. >> >> >> I noticed that existing functionality for circular statistics is >> insufficient. So, I want to suggest an idea for GSoC: implement it to >> scipy.stats. My plan is to do it through the next steps: >> >> 1) Develop the class rv_circular, analogous to rv_continuous adjusted >> to circular statistic functions. >> >> 2) Develop derived classes for circular distributions. >> >> 3) Develop point estimations and statistical tests functions. >> >> During the summer, I assume to implement materials from chapters 1-8 of the >> book ?MARDIA, K. V. AND JUPP , P. E. (2000), Directional Statistics, John >> Wiley?, documentation and unit tests for it. >> >> >> Is this idea interesting for the Community? I?m glad to any feedback. >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> Full disclosure: I work at HSE where Vladislav is a student and we are in touch off-line. > More general question: What should be the future of circular stats in python? My personal take is that basic things are in scope for scipy and more advanced things are better suited for statsmodels (e.g., a regression w.r.t. a circular variable? 
maybe except the analog of scipy.stats.linregress). There is certainly room for rv_circular variable (https://github.com/scipy/scipy/issues/4598#issuecomment-77359593). There is a series of PRs from astropy which stall because of backwards compat: https://github.com/scipy/scipy/pull/5747, also https://github.com/scipy/scipy/issues/6644 I personally find it a bit odd to have to install the whole of astropy to do a one-off calculation of a circular median or something like that. But it can be just me :-). If astropy devs want to collaborate, all the better. IMO, this can be a small subpackage in scipy.stats, scipy.stats.circular (a made-up name). > https://github.com/circstat/pycircstat is a python translation of the > matlab toolbox. It doesn't have a license (MIT is commented out in > setup.py) but the original matlab toolbox was BSD licensed on the file > exchange. The (lack of a) license is a bit of a problem. It also seems to depend on pandas. > I worked on circularstats for several months in 2012 and didn't look > at it since then. I also translated the matlab toolbox and added some > more parts based on Mardia. > http://josef-pkt.github.io/pages/packages/circularstats/circular.html > table of content only, I never open sourced it. (*) Would you be interested in working on this again? (working, advising, reviewing, some other form of participation?) Cheers, Evgeni From warren.weckesser at gmail.com Thu Mar 16 15:19:08 2017 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Thu, 16 Mar 2017 15:19:08 -0400 Subject: [SciPy-Dev] Adding a convenient method to create ufuncs in for scipy.stats Message-ID: I'm working on an update to the Frechet distribution in scipy.stats (see https://github.com/scipy/scipy/issues/3258 and https://github.com/scipy/scipy/pull/3275). 
Instead of jumping through the "lazy_where" hoops that are required for
conditional computations, it would be much easier to create a ufunc for
the standard PDF, CDF and possibly other required functions. Easier, that
is, if I use the ufunc generation tools that we have over in
scipy.special. Would there be any objections to this?
We already have quite a few > functions for probability distributions in scipy.special: > https://docs.scipy.org/doc/scipy/reference/special.html# > raw-statistical-functions > > I wouldn't mind creating ufuncs for some of the other distributions, too. > A ufunc implementation is more efficient, simplifies the code in > scipy.stats, and automatically handles broadcasting. > > I'm bringing this up here to see if anyone has any objections to the > expansion of the statistical functions in scipy.special. > > Warren > > In my previous email, the heading hints at an alternative that I didn't mention in the text. The question implied in the heading is: what do folks think about adding ufunc generation tools to scipy.stats, instead of generating the ufuncs in scipy.special. There are a lot of conditional computations in scipy.stats that would benefit from being implemented as ufuncs, but probably don't need to be public functions. So instead of adding more functions to scipy.special, perhaps we could add code in scipy.stats for generating ufuncs, many of which would be private. Of course, we could just generate private ufuncs in scipy.special, and only use them in scipy.stats. What do you think? Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Thu Mar 16 16:14:32 2017 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 16 Mar 2017 13:14:32 -0700 Subject: [SciPy-Dev] Adding a convenient method to create ufuncs in for scipy.stats In-Reply-To: References: Message-ID: On Thu, Mar 16, 2017 at 12:39 PM, Warren Weckesser < warren.weckesser at gmail.com> wrote: > > On Thu, Mar 16, 2017 at 3:19 PM, Warren Weckesser < warren.weckesser at gmail.com> wrote: >> >> I'm working on an update to the Frechet distribution in scipy.stats (see https://github.com/scipy/scipy/issues/3258 and https://github.com/scipy/scipy/pull/3275). 
>>
>> Instead jumping through the "lazy_where" hoops that are required for
>> conditional computations, it would be much easier to create a ufunc for the
>> standard PDF, CDF and possibly other required functions. Easier, that is,
>> if I use the ufunc generation tools that we have over in scipy.special.
>> Would there be any objections to this? We already have quite a few
>> functions for probability distributions in scipy.special:
>> https://docs.scipy.org/doc/scipy/reference/special.html#raw-statistical-functions
>>
>> I wouldn't mind creating ufuncs for some of the other distributions, too.
>> A ufunc implementation is more efficient, simplifies the code in
>> scipy.stats, and automatically handles broadcasting.
>>
>> I'm bringing this up here to see if anyone has any objections to the
>> expansion of the statistical functions in scipy.special.
>>
>> Warren
>
> In my previous email, the heading hints at an alternative that I didn't
> mention in the text. The question implied in the heading is: what do folks
> think about adding ufunc generation tools to scipy.stats, instead of
> generating the ufuncs in scipy.special. There are a lot of conditional
> computations in scipy.stats that would benefit from being implemented as
> ufuncs, but probably don't need to be public functions. So instead of
> adding more functions to scipy.special, perhaps we could add code in
> scipy.stats for generating ufuncs, many of which would be private. Of
> course, we could just generate private ufuncs in scipy.special, and only
> use them in scipy.stats.

+1 for adding more standard PDF/CDF functions to scipy.special as needed.

There's already precedent for putting statistics-related but not
distribution-related ufuncs into scipy.special, specifically for the
conditional operations, e.g. boxcox(). On the other hand, if the functions
you are thinking of would not be part of the public API, then I'd prefer to
implement them in scipy.stats instead of scipy.special.
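To make the contrast concrete: for the standard Frechet (inverse Weibull) density, the support check is a one-line branch in a scalar kernel, whereas a vectorized implementation has to mask with np.where/lazy_where. A rough pure-Python sketch of such a kernel (illustrative only, not scipy's actual code):

```python
import math

def frechet_pdf(x, c):
    """Standard Frechet (inverse Weibull) pdf with shape c:
    c * x**(-c - 1) * exp(-x**(-c)) for x > 0, and 0 otherwise.

    Written as a scalar kernel, the support check is a plain branch;
    wrapped as a ufunc, broadcasting over arrays comes for free.
    """
    if x <= 0.0:
        return 0.0
    return c * x ** (-c - 1.0) * math.exp(-(x ** -c))

print(frechet_pdf(-1.0, 2.0))  # 0.0: outside the support
print(frechet_pdf(1.0, 1.0))   # exp(-1) = 0.367879...
```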
What work do you think is entailed in implementing the ufuncs in scipy.stats? Is there infrastructure we need to duplicate? Can we abstract out the build infrastructure to a common place? I haven't looked at the build details for scipy.special in some time. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From josh.craig.wilson at gmail.com Thu Mar 16 16:28:10 2017 From: josh.craig.wilson at gmail.com (Joshua Wilson) Date: Thu, 16 Mar 2017 15:28:10 -0500 Subject: [SciPy-Dev] Adding a convenient method to create ufuncs in for scipy.stats In-Reply-To: References: Message-ID: > Can we abstract out the build infrastructure to a common place? This should be fairly easy to do. It could be set up so that each module that needs ufuncs could have a local config file (cribbed off of the current `FUNCS` string in `generate_ufuncs.py`). Though I also don't object to adding more functions to special. On Thu, Mar 16, 2017 at 3:14 PM, Robert Kern wrote: > On Thu, Mar 16, 2017 at 12:39 PM, Warren Weckesser > wrote: >> >> On Thu, Mar 16, 2017 at 3:19 PM, Warren Weckesser >> wrote: >>> >>> I'm working on an update to the Frechet distribution in scipy.stats (see >>> https://github.com/scipy/scipy/issues/3258 and >>> https://github.com/scipy/scipy/pull/3275). >>> >>> Instead jumping through the "lazy_where" hoops that are required for >>> conditional computations, it would be much easier to create a ufunc for the >>> standard PDF, CDF and possibly other required functions. Easier, that is, >>> if I use the ufunc generation tools that we have over in scipy.special. >>> Would there be any objections to this? We already have quite a few >>> functions for probability distributions in scipy.special: >>> https://docs.scipy.org/doc/scipy/reference/special.html#raw-statistical-functions >>> >>> I wouldn't mind creating ufuncs for some of the other distributions, too. 
>>> A ufunc implementation is more efficient, simplifies the code in >>> scipy.stats, and automatically handles broadcasting. >>> >>> I'm bringing this up here to see if anyone has any objections to the >>> expansion of the statistical functions in scipy.special. >>> >>> Warren >> >> In my previous email, the heading hints at an alternative that I didn't >> mention in the text. The question implied in the heading is: what do folks >> think about adding ufunc generation tools to scipy.stats, instead of >> generating the ufuncs in scipy.special. There are a lot of conditional >> computations in scipy.stats that would benefit from being implemented as >> ufuncs, but probably don't need to be public functions. So instead of >> adding more functions to scipy.special, perhaps we could add code in >> scipy.stats for generating ufuncs, many of which would be private. Of >> course, we could just generate private ufuncs in scipy.special, and only use >> them in scipy.stats. > > +1 for adding additional more standard PDF/CDF functions to scipy.special as > needed. > > There's already precedent for putting statistics-related but not > distribution-related ufuncs into scipy.special, specifically for the > conditional operations, e.g. boxcox(). On the other hand, if the functions > you are thinking of would not be part of the public API, then I'd prefer to > implement them in scipy.stats instead of scipy.special. > > What work do you think is entailed in implementing the ufuncs in > scipy.stats? Is there infrastructure we need to duplicate? Can we abstract > out the build infrastructure to a common place? I haven't looked at the > build details for scipy.special in some time. 
> > -- > Robert Kern > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > From josh.craig.wilson at gmail.com Thu Mar 16 16:38:41 2017 From: josh.craig.wilson at gmail.com (Joshua Wilson) Date: Thu, 16 Mar 2017 15:38:41 -0500 Subject: [SciPy-Dev] Adding a convenient method to create ufuncs in for scipy.stats In-Reply-To: References: Message-ID: ps--if we decide to abstract out the infrastructure I'd be willing to write the code. On Thu, Mar 16, 2017 at 3:28 PM, Joshua Wilson wrote: >> Can we abstract out the build infrastructure to a common place? > > This should be fairly easy to do. It could be set up so that each > module that needs ufuncs could have a local config file (cribbed off > of the current `FUNCS` string in `generate_ufuncs.py`). > > Though I also don't object to adding more functions to special. > > On Thu, Mar 16, 2017 at 3:14 PM, Robert Kern wrote: >> On Thu, Mar 16, 2017 at 12:39 PM, Warren Weckesser >> wrote: >>> >>> On Thu, Mar 16, 2017 at 3:19 PM, Warren Weckesser >>> wrote: >>>> >>>> I'm working on an update to the Frechet distribution in scipy.stats (see >>>> https://github.com/scipy/scipy/issues/3258 and >>>> https://github.com/scipy/scipy/pull/3275). >>>> >>>> Instead jumping through the "lazy_where" hoops that are required for >>>> conditional computations, it would be much easier to create a ufunc for the >>>> standard PDF, CDF and possibly other required functions. Easier, that is, >>>> if I use the ufunc generation tools that we have over in scipy.special. >>>> Would there be any objections to this? We already have quite a few >>>> functions for probability distributions in scipy.special: >>>> https://docs.scipy.org/doc/scipy/reference/special.html#raw-statistical-functions >>>> >>>> I wouldn't mind creating ufuncs for some of the other distributions, too. 
>>>> A ufunc implementation is more efficient, simplifies the code in >>>> scipy.stats, and automatically handles broadcasting. >>>> >>>> I'm bringing this up here to see if anyone has any objections to the >>>> expansion of the statistical functions in scipy.special. >>>> >>>> Warren >>> >>> In my previous email, the heading hints at an alternative that I didn't >>> mention in the text. The question implied in the heading is: what do folks >>> think about adding ufunc generation tools to scipy.stats, instead of >>> generating the ufuncs in scipy.special. There are a lot of conditional >>> computations in scipy.stats that would benefit from being implemented as >>> ufuncs, but probably don't need to be public functions. So instead of >>> adding more functions to scipy.special, perhaps we could add code in >>> scipy.stats for generating ufuncs, many of which would be private. Of >>> course, we could just generate private ufuncs in scipy.special, and only use >>> them in scipy.stats. >> >> +1 for adding additional more standard PDF/CDF functions to scipy.special as >> needed. >> >> There's already precedent for putting statistics-related but not >> distribution-related ufuncs into scipy.special, specifically for the >> conditional operations, e.g. boxcox(). On the other hand, if the functions >> you are thinking of would not be part of the public API, then I'd prefer to >> implement them in scipy.stats instead of scipy.special. >> >> What work do you think is entailed in implementing the ufuncs in >> scipy.stats? Is there infrastructure we need to duplicate? Can we abstract >> out the build infrastructure to a common place? I haven't looked at the >> build details for scipy.special in some time. 
>> >> -- >> Robert Kern >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> From warren.weckesser at gmail.com Fri Mar 17 00:20:52 2017 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Fri, 17 Mar 2017 00:20:52 -0400 Subject: [SciPy-Dev] Adding a convenient method to create ufuncs in for scipy.stats In-Reply-To: References: Message-ID: On Thu, Mar 16, 2017 at 4:14 PM, Robert Kern wrote: > On Thu, Mar 16, 2017 at 12:39 PM, Warren Weckesser < > warren.weckesser at gmail.com> wrote: > > > > On Thu, Mar 16, 2017 at 3:19 PM, Warren Weckesser < > warren.weckesser at gmail.com> wrote: > >> > >> I'm working on an update to the Frechet distribution in scipy.stats > (see https://github.com/scipy/scipy/issues/3258 and > https://github.com/scipy/scipy/pull/3275). > >> > >> Instead jumping through the "lazy_where" hoops that are required for > conditional computations, it would be much easier to create a ufunc for the > standard PDF, CDF and possibly other required functions. Easier, that is, > if I use the ufunc generation tools that we have over in scipy.special. > Would there be any objections to this? We already have quite a few > functions for probability distributions in scipy.special: > https://docs.scipy.org/doc/scipy/reference/special.html# > raw-statistical-functions > >> > >> I wouldn't mind creating ufuncs for some of the other distributions, > too. A ufunc implementation is more efficient, simplifies the code in > scipy.stats, and automatically handles broadcasting. > >> > >> I'm bringing this up here to see if anyone has any objections to the > expansion of the statistical functions in scipy.special. > >> > >> Warren > > > > In my previous email, the heading hints at an alternative that I didn't > mention in the text. 
The question implied in the heading is: what do folks > think about adding ufunc generation tools to scipy.stats, instead of > generating the ufuncs in scipy.special. There are a lot of conditional > computations in scipy.stats that would benefit from being implemented as > ufuncs, but probably don't need to be public functions. So instead of > adding more functions to scipy.special, perhaps we could add code in > scipy.stats for generating ufuncs, many of which would be private. Of > course, we could just generate private ufuncs in scipy.special, and only > use them in scipy.stats. > > +1 for adding additional more standard PDF/CDF functions to scipy.special > as needed. > > There's already precedent for putting statistics-related but not > distribution-related ufuncs into scipy.special, specifically for the > conditional operations, e.g. boxcox(). On the other hand, if the functions > you are thinking of would not be part of the public API, then I'd prefer to > implement them in scipy.stats instead of scipy.special. > > What work do you think is entailed in implementing the ufuncs in > scipy.stats? Is there infrastructure we need to duplicate? Can we abstract > out the build infrastructure to a common place? I haven't looked at the > build details for scipy.special in some time. > The code that generates the ufunc boilerplate code is in scipy/special/generate_ufuncs.py. It generates the appropriate wrapper code for a core scalar function that is written in Cython, C or C++. I just submitted a pull request (https://github.com/scipy/scipy/pull/7190, still WIP) in which I wrote the core distribution functions for the Frechet distribution in Cython, added the signature information to the big honkin' FUNCS string in generate_ufuncs.py, added placeholders for the docstrings in add_newdocs.py, and then used the ufuncs in the implementation of the `frechet` class in stats. 
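To make the "lazy_where" bookkeeping discussed in this thread concrete, here is a minimal pure-numpy sketch of conditional PDF evaluation. The Frechet PDF formula is the standard one, but the function itself is an illustrative stand-in, not the scipy.special or scipy.stats code:

```python
import numpy as np

def frechet_pdf(x, c):
    """Frechet PDF: f(x; c) = c * x**(-1 - c) * exp(-x**(-c)) for x > 0, else 0.

    The formula is evaluated only on the x > 0 mask; this is exactly the
    kind of conditional bookkeeping a compiled ufunc would make unnecessary.
    """
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    pos = x > 0
    xp = x[pos]
    out[pos] = c * xp**(-1.0 - c) * np.exp(-xp**(-c))
    return out

print(frechet_pdf([-1.0, 0.0, 1.0], 2.0))  # nonpositive inputs give 0
```

A ufunc version pushes the branch into compiled code, so scipy.stats gets the conditional and the broadcasting for free.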
For the moment, the Frechet distribution ufuncs are in scipy.special, and they are private, but a trivial change will make them public, if there is interest. I don't have a strong opinion either way, but as you say, there is a precedent for including them as public functions in scipy.special. If we start converting existing distribution implementations (which I think would be a good thing for the stats code), we'll end up with a *lot* more functions being added somewhere. Warren > > -- > Robert Kern > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at gmail.com Fri Mar 17 00:24:59 2017 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Fri, 17 Mar 2017 00:24:59 -0400 Subject: [SciPy-Dev] Adding a convenient method to create ufuncs in for scipy.stats In-Reply-To: References: Message-ID: On Thu, Mar 16, 2017 at 4:38 PM, Joshua Wilson wrote: > ps--if we decide to abstract out the infrastructure I'd be willing to > write the code. > > Great! I know you've been doing a lot of work in scipy.special, so you probably know the code generation parts better than most of us. Warren > On Thu, Mar 16, 2017 at 3:28 PM, Joshua Wilson > wrote: > >> Can we abstract out the build infrastructure to a common place? > > > > This should be fairly easy to do. It could be set up so that each > > module that needs ufuncs could have a local config file (cribbed off > > of the current `FUNCS` string in `generate_ufuncs.py`). > > > > Though I also don't object to adding more functions to special. 
> > > > On Thu, Mar 16, 2017 at 3:14 PM, Robert Kern > wrote: > >> On Thu, Mar 16, 2017 at 12:39 PM, Warren Weckesser > >> wrote: > >>> > >>> On Thu, Mar 16, 2017 at 3:19 PM, Warren Weckesser > >>> wrote: > >>>> > >>>> I'm working on an update to the Frechet distribution in scipy.stats > (see > >>>> https://github.com/scipy/scipy/issues/3258 and > >>>> https://github.com/scipy/scipy/pull/3275). > >>>> > >>>> Instead jumping through the "lazy_where" hoops that are required for > >>>> conditional computations, it would be much easier to create a ufunc > for the > >>>> standard PDF, CDF and possibly other required functions. Easier, > that is, > >>>> if I use the ufunc generation tools that we have over in > scipy.special. > >>>> Would there be any objections to this? We already have quite a few > >>>> functions for probability distributions in scipy.special: > >>>> https://docs.scipy.org/doc/scipy/reference/special.html# > raw-statistical-functions > >>>> > >>>> I wouldn't mind creating ufuncs for some of the other distributions, > too. > >>>> A ufunc implementation is more efficient, simplifies the code in > >>>> scipy.stats, and automatically handles broadcasting. > >>>> > >>>> I'm bringing this up here to see if anyone has any objections to the > >>>> expansion of the statistical functions in scipy.special. > >>>> > >>>> Warren > >>> > >>> In my previous email, the heading hints at an alternative that I didn't > >>> mention in the text. The question implied in the heading is: what do > folks > >>> think about adding ufunc generation tools to scipy.stats, instead of > >>> generating the ufuncs in scipy.special. There are a lot of conditional > >>> computations in scipy.stats that would benefit from being implemented > as > >>> ufuncs, but probably don't need to be public functions. So instead of > >>> adding more functions to scipy.special, perhaps we could add code in > >>> scipy.stats for generating ufuncs, many of which would be private. 
Of > >>> course, we could just generate private ufuncs in scipy.special, and > only use > them in scipy.stats. > >> > >> +1 for adding additional more standard PDF/CDF functions to > scipy.special as > >> needed. > >> > >> There's already precedent for putting statistics-related but not > >> distribution-related ufuncs into scipy.special, specifically for the > >> conditional operations, e.g. boxcox(). On the other hand, if the > functions > >> you are thinking of would not be part of the public API, then I'd > prefer to > >> implement them in scipy.stats instead of scipy.special. > >> > >> What work do you think is entailed in implementing the ufuncs in > >> scipy.stats? Is there infrastructure we need to duplicate? Can we > abstract > >> out the build infrastructure to a common place? I haven't looked at the > >> build details for scipy.special in some time. > >> > >> -- > >> Robert Kern > >> > >> _______________________________________________ > >> SciPy-Dev mailing list > >> SciPy-Dev at scipy.org > >> https://mail.scipy.org/mailman/listinfo/scipy-dev > >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From iavlaserg at gmail.com Fri Mar 17 07:59:34 2017 From: iavlaserg at gmail.com (Vladislav Iakovlev) Date: Fri, 17 Mar 2017 14:59:34 +0300 Subject: [SciPy-Dev] GSoC Proposal: Circular statistics Message-ID: Thanks to Josef and other colleagues for productive discussion! It clarified a lot for me! This is a first iteration of my GSoC proposal. The goal of the project is to implement circular statistics functionality in scipy.stats. Motivation: There exist several Python packages related to this theme, but most of them have some shortcomings. Also, it will be very convenient to have this functionality in SciPy. 
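As a small taste of the proposed functionality, the most basic circular statistic, the circular mean (the angle of the mean resultant vector), fits in a few lines of numpy. This is an illustrative sketch only, not code from the proposal or the linked demo repository:

```python
import numpy as np

def circular_mean(angles):
    """Angle of the mean resultant vector of a sample of angles in radians."""
    z = np.mean(np.exp(1j * np.asarray(angles, dtype=float)))
    return np.angle(z)

# Two angles just either side of +/-pi: the arithmetic mean is 0, which is
# the wrong answer on the circle, while the circular mean lands at pi.
print(np.mean([3.1, -3.1]), circular_mean([3.1, -3.1]))
```

The length of the resultant vector, abs(z), would similarly give a natural concentration measure for the dispersion statistics listed in the proposal.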
It will simplify users' work because they will not have to search for and install other packages. What will be done: Base class rv_circular that will provide infrastructure for distribution characteristics. List of supposed characteristics: pdf, cdf, moments, mean, dispersion, standard deviation, percentiles, median, entropy, kurtosis, skewness. Derived classes for distributions. Methods for calculating values of characteristics can be redefined in these classes. List of supposed distributions: Uniform distribution, Triangular distribution, Cardioid distribution, Wrapped distributions: Cauchy, Normal, Von Mises. Functions for statistical tests: Tests of uniformity and Goodness-of-Fit. Tests related to Von Mises distribution: tests for mean and concentration parameters, multi-sample tests. Some non-parametric tests. I have developed a little "demo-version" of a circular stats toolbox: https://github.com/yakovlevvs/Circular-Statistics It is very raw and will be rewritten; I created this just to demonstrate my vision of how it will be arranged. Approximate timeline: April: Reading books, understanding related mathematics; May – June 20: Implementing rv_circular class, documentation and tests for it; June 20 – July 10: Implementing distribution classes; July 10 – August 20: Implementing statistical tests and point estimations. Related literature: Kanti V. Mardia, Peter E. Jupp: Directional Statistics, 2000. S. Rao Jammalamadaka, A. SenGupta: Topics in Circular Statistics. -------------- next part -------------- An HTML attachment was scrubbed... URL: From amanpratik10 at gmail.com Fri Mar 17 16:00:19 2017 From: amanpratik10 at gmail.com (Aman Pratik) Date: Sat, 18 Mar 2017 01:30:19 +0530 Subject: [SciPy-Dev] GSoC 2017: BSpline support In-Reply-To: References: Message-ID: After looking at the issue tracker #6730 I have made my first proposal for the GSoC project "Improve support for B-splines". 
This is a continuation of my first mail regarding the same matter. This is a long list of what I think could be done as part of the project. I don't have much knowledge of Splines or B-Splines, hence the work plan is not very detailed. Please have a look. *Minor changes:-* ::: Documentation: *> Rework the tutorial for scipy.interpolate:* The 1D interpolation section prominently recommends interp1d. We likely want to tone it down and recommend instead CubicSpline, Akima1DInterpolator, BSpline etc. This can be done after reading some basics about these algorithms and making the changes as required. *> Come up with better names for make_interp_spline/make_lsq_spline routines.* This too can be handled after some discussion. Both these documentation changes can be done within a few days' time; 3 to 4 days should be enough. ::: Enhancements: *> Add string aliases to the bc_type keyword of make_interp_spline* to match what CubicSpline allows. I.e. bc_type="natural" should be equivalent to bc_type=((2, 0), (2, 0)). This requires mainly imitating what's there in CubicSpline, so it won't be very difficult. *> Convert a CubicSpline object to a BSpline* This could be done by assigning the __class__ attribute to the instance. However, since this is not preferred, we may construct a new method which will create a new BSpline object from a CubicSpline object by assigning the corresponding attribute values. (I need your suggestion, which approach to follow?) Both these tasks should not take more than 6 to 7 days, including all the documentation and unit tests. *Major Enhancements:-* *> Knot insertion:* BSpline class should grow a method insert_knot (or insert_knots). This can either wrap fitpack.insert directly, or reuse parts of it (e.g. retain the buffer swapping plumbing from _fitpack and reimplement the Fortran routine fitpack/fpinst.f in C or Cython). Since fitpack.insert is already implemented for BSplines, wrapping around it would be the simplest solution as I see it. 
*> Root-finding:* At a minimum, sproot should be wrapped into a method of the BSpline class. Better still, it should be generalized to allow non-cubic splines. The interface should mirror PPoly.solve and PPoly.roots. This too can be done by imitating PPoly.roots and PPoly.solve. Both these tasks can be completed within 10 days' time, including all the documentation and tests. *BSpline Variations:-* * > Cardinal BSplines (BSplines with equidistant knots).* We need representation, evaluation and construction. scipy.signal has some. The task here is to have an interface which is both compatible with regular b-splines and piecewise polynomials and is useful for the signal processing folks. For implementing this feature I will have to read some material to get a good understanding of Cardinal BSplines and their implementation. These are two of the places (for now) I would start my study from. The implementation in scipy.signal can be of help. https://en.wikipedia.org/wiki/B-spline#Cardinal_B-spline http://dx.doi.org/10.1016/j.aml.2010.06.029 This task could take a long time, maybe 3 weeks or more. *> Spline fitting to data.* ATM there's only FITPACK, and we want a more controllable alternative. A first exploratory shot could be to implement de Boor's newknot routine and see how it behaves in real life. This too requires reading some literature to understand the math and also looking at the FITPACK code for guidance. The following literature can be helpful. https://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/spline/B-spline/de-Boor.html https://en.wikipedia.org/wiki/De_Boor%27s_algorithm Both these tasks would take over a month for complete implementation. 
It refers to using the B-spline representation where the coefficients are determined partly by the data to be fitted, and partly by an additional penalty function that aims to impose smoothness to avoid overfitting. Adequate literature can be found here, https://projecteuclid.org/download/pdf_1/euclid.ss/1038425655 SAS has already implemented it, https://support.sas.com/documentation/cdl/en/statug/63033/HTML/default/viewer.htm#statug_transreg_sect020.htm This feature could take up a lot of time, including documentation, tests, examples, etc. *> Non-Uniform Rational B-splines (NURBS):* Unlike simple B-splines, the control points each have a weight. The following literature can be helpful, https://en.wikipedia.org/wiki/Non-uniform_rational_B-spline https://www.rhino3d.com/nurbs The following articles can help with the implementation. https://www.codeproject.com/Articles/996281/NURBS-curve-made-easy http://www.nar-associates.com/nurbs/c_code.html#chapter4 This feature too can take up lots of time for full completion. *> Constructing periodic splines*. Fitpack does it. With full matrices it's straightforward; a naive implementation with banded matrices has issues with numerical stability. We can definitely look into the banded matrix issue if time permits. I am not sure what we may find, so I can't say anything about the time required. Final Assessment:- After these tasks have been completed, all the new features and methods would undergo rigorous tests and doctests. Benchmark tests could also be added wherever necessary. Features with remaining scope for work could be reported on GitHub. I plan on reading as much as I can related to Cardinal BSplines, De Boor's newknot routine, Penalized BSplines, NURBS, etc. before the start of GSoC. This will help me save time finding literature related to all these new additions. Looking forward to feedback and corrections. 
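On the bc_type item above: the claimed equivalence can be exercised against the existing interface, which already accepts an iterable of (order, value) pairs per interval edge. A hedged sketch, assuming a scipy new enough (0.19+) to ship make_interp_spline:

```python
import numpy as np
from scipy.interpolate import make_interp_spline

x = np.linspace(0, 2 * np.pi, 11)
y = np.sin(x)

# "Natural" boundary conditions spelled out explicitly: the second
# derivative is pinned to zero at each end of the interval.
natural = ([(2, 0.0)], [(2, 0.0)])
spl = make_interp_spline(x, y, k=3, bc_type=natural)

# The resulting cubic B-spline interpolates the data and has a vanishing
# second derivative at both boundaries, which is what a bc_type="natural"
# string alias would stand for.
print(spl(x[0], nu=2), spl(x[-1], nu=2))
```

The string alias would then just translate "natural" (and similar names) into pairs of this form before the existing code path runs.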
On 16 March 2017 at 20:03, Evgeni Burovski wrote: > > > On Thu, Mar 16, 2017 at 11:06 AM, Aman Pratik > wrote: > >> Hello Developers, >> >> This is Aman Pratik. I am currently pursuing my B.Tech from Indian >> Institute of Technology, Varanasi. I am a keen software developer and not >> very new to the open source community. I am interested in your project >> "Improve support for B-splines" for GSoC 2017. >> >> I have been working in Python for the past 2 years and have good idea >> about Mathematical Methods and Techniques. I am quite familiar with scipy >> mainly as a user and also as a developer. I also have some experience with >> Cython. >> >> These are the PRs I have worked/working on for the past month. >> >> ENH: Add evaluation of periodic splines #6730 >> >> >> DOC: Bad documentation of k in sparse.linalg.eigs See #6990 (Merged) >> >> >> ENH: scipy.linalg.eigsh cannot get all eigenvalues #7004 >> >> >> My GitHub Profile: https://www.github.com/amanp10 >> >> I am interested in Mathematical models and functions and therefore >> willing to work on splines. Also, I am familiar with Benchmark tests, Unit >> tests and other technical knowledge I would require for this project. I >> have started my study for the subject and am looking forward to guidance >> from the potential mentors or other developers. >> >> Thank You >> >> > > Hi Aman, and welcome! > > I see that you already started with https://github.com/scipy/ > scipy/pull/7185, great! > > We'll also need a proposal. Since writing a good proposal typically takes > several iterations, I encourage you to start working on it, too. When you > have a draft, please send it to the list. > > All the best, > > Evgeni > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From charlesr.harris at gmail.com Sat Mar 18 13:57:28 2017 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 18 Mar 2017 11:57:28 -0600 Subject: [SciPy-Dev] NumPy 1.12.1 released Message-ID: Hi All, I'm pleased to announce the release of NumPy 1.12.1. NumPy 1.12.1 supports Python 2.7 and 3.4 - 3.6 and fixes bugs and regressions found in NumPy 1.12.0. In particular, the regression in f2py constant parsing is fixed. Wheels for Linux, Windows, and OSX can be found on pypi. Archives can be downloaded from github . *Contributors* A total of 10 people contributed to this release. People with a "+" by their names contributed a patch for the first time. * Charles Harris * Eric Wieser * Greg Young * Joerg Behrmann + * John Kirkham * Julian Taylor * Marten van Kerkwijk * Matthew Brett * Shota Kawabuchi * Jean Utke + *Fixes Backported* * #8483: BUG: Fix wrong future nat warning and equiv type logic error... * #8489: BUG: Fix wrong masked median for some special cases * #8490: DOC: Place np.average in inline code * #8491: TST: Work around isfinite inconsistency on i386 * #8494: BUG: Guard against replacing constants without `'_'` spec in f2py. * #8524: BUG: Fix mean for float 16 non-array inputs for 1.12 * #8571: BUG: Fix calling python api with error set and minor leaks for... * #8602: BUG: Make iscomplexobj compatible with custom dtypes again * #8618: BUG: Fix undefined behaviour induced by bad `__array_wrap__` * #8648: BUG: Fix `MaskedArray.__setitem__` * #8659: BUG: PPC64el machines are POWER for Fortran in f2py * #8665: BUG: Look up methods on MaskedArray in `_frommethod` * #8674: BUG: Remove extra digit in `binary_repr` at limit * #8704: BUG: Fix deepcopy regression for empty arrays. * #8707: BUG: Fix ma.median for empty ndarrays Cheers, Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From stellamberv at gmail.com Sat Mar 18 22:32:30 2017 From: stellamberv at gmail.com (Qianhui Wan) Date: Sat, 18 Mar 2017 22:32:30 -0400 Subject: [SciPy-Dev] (no subject) Message-ID: Hi all, I'm a senior undergraduate in the US. My direction is in applied math and numerical analysis, so I want to contribute to our suborg this year :) This is my first time applying for GSoC. As I mentioned above, I already have some background in the related math and I have some experience in solving PDEs numerically with classical schemes. In the ideas list, I'm more interested in "implement scipy.diff (numerical differentiation)" and "improve the parabolic cylinder functions". For programming languages, I'm an intermediate learner in Python, MATLAB and R, and I also have some skills in C/C++. If anyone can give me some tips on the requirements of these two projects and on proposal writing, I'd be very appreciative, and if you have suggestions on choosing projects or developing ideas please let me know. Thanks! Best, Qianhui Qianhui Wan, senior Fall 2016 - Spring 2017, visiting, Math Department, University of Wisconsin, Madison, US Spring 2016, visiting, Math Department, University of California, Berkeley, US Fall 2013 - Spring 2017, Math School, Sun Yat-sen University, Guangzhou, China -------------- next part -------------- An HTML attachment was scrubbed... URL: From nikolay.mayorov at zoho.com Sun Mar 19 18:08:13 2017 From: nikolay.mayorov at zoho.com (Nikolay Mayorov) Date: Mon, 20 Mar 2017 03:08:13 +0500 Subject: [SciPy-Dev] GSoC 2017: BSpline support In-Reply-To: References: Message-ID: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> For the future, I suggest putting your proposal in Markdown format somewhere on GitHub (the scipy wiki is a good place). I don't know the most productive way to review such a proposal, so these are some thoughts on some of your points. > Rework the tutorial for scipy.interpolate: The 1D interpolation section prominently recommends interp1d. 
We likely want to tone it down and recommend instead CubicSpline, Akima1DInterpolator, BSpline etc. This can be done after reading some basics about these algorithms and making the changes as required. Honestly, I don't think it is that easy; you definitely need to have a certain taste and interest to rewrite this tutorial. My guess is that you will be able to do it better after you have finished the main algorithmic and coding work. > Come up with better names for make_interp_spline/make_lsq_spline routines. This too can be handled after some discussion. I don't think it is necessary to put things like that in the proposal; you'll figure it out (with Evgeni) as the work progresses. > Add string aliases to the bc_type keyword of make_interp_spline to match what CubicSpline allows. I.e. bc_type="natural" should be equivalent to bc_type=((2, 0), (2, 0)) This requires mainly imitating whats there in CubicSpline so wont be very difficult. This can indeed be your first small objective. The documentation for bc_type in make_interp_spline reads "Each of these must be an iterable of pairs (order, value) which gives the values of derivatives of specified orders at the given edge of the interpolation interval." So BSpline is quite general and allows specifying conditions for several derivatives at one end (as opposed to CubicSpline). Maybe it will need some special considerations, just something I noticed. > Convert a CubicSpline object to a BSpline This could be done by assigning the __class__ attribute to the instance. However, since this is not preferred, we may construct a new method which will create a new BSpline object using a CubicSpline object by assigning the corresponding attribute values. (I need your suggestion, which approach to follow?) I believe ideally there should be a factory method like `BSpline.from_ppoly(...)`, or, as a compromise, something like `BSpline.from_cubic_spline`, but I'm not sure whether it will be that valuable in that form. 
The problem might not be that simple; do some research on how to convert between the polynomial and B-spline bases. > Knot insertion: BSpline class should grow a method insert_knot (or insert_knots). This can either wrap fitpack.insert directly, or reuse parts of it (e.g. retain the buffer swapping plumbing from _fitpack and reimplement the Fortran routine fitpack/fpinst.f in C or Cython). Since fitpack.insert is already implemented for BSplines, wrapping around it would be the simplest solution as I perceive. As I see, there is already a public function insert (implemented in fitpack.py). So just moving it in as a method pretty much does the job. However, if the final aim is to move away from FITPACK, then reimplementing the insertion algorithm in Cython is useful. Additionally, it can be an entry point into the more serious problems to follow and into doing things in Cython. (Btw, I'm not sure what Evgeni meant by "e.g. retain the buffer swapping plumbing".) > Root-finding: At a minimum, sproot should be wrapped into a method of BSpline class. Better still, it should be generalized to allow non-cubic splines. The interface should mirror PPoly.solve and PPoly.roots. This too can be done by imitating PPoly.roots and PPoly.solve. Writing a new implementation for arbitrary spline orders looks like an interesting and challenging problem. Consider doing it instead of just a minimal wrapper. All in all, the two previous problems, if done in full capacity, don't seem easy and can occupy you for a good part of the GSoC. BSpline Variations:- > Cardinal BSplines (BSplines with equidistant knots). We need representation, evaluation and construction. scipy.signal has some. The task here is to have an interface which is both compatible with regular b-splines and piecewise polynomials and is useful for the signal processing folks. 
For implementing this feature I will have to read some material to get a good understanding of Cardinal BSplines and their implementation. These are two of the places (for now) I would start my study from. The implementation in scipy.signal can be of help.
https://en.wikipedia.org/wiki/B-spline#Cardinal_B-spline
http://dx.doi.org/10.1016/j.aml.2010.06.029
This task could take a long time, maybe 3 weeks or more.

> Spline fitting to data. ATM there's only FITPACK, and we want a more controllable
> alternative. A first exploratory shot could be to implement de Boor's newknot
> routine and see how it behaves in real life.

This too requires reading some literature to understand the math, and also looking at the FITPACK code for guidance. The following literature can be helpful.
https://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/spline/B-spline/de-Boor.html
https://en.wikipedia.org/wiki/De_Boor%27s_algorithm
Both these tasks would take over a month for complete implementation.

Generally I suggest picking (at least for now) the one subject most interesting to you and thinking it through in more detail. Right now it's unclear what exactly you want to program (which classes, functions, etc.), and the time estimates don't look reliable.

For "Spline fitting to data" I'm somewhat confused, because make_lsq_spline and make_interp_spline seem to already solve fitting problems without relying on FITPACK. Evgeni, what is the situation here?

For now that's all from me. To sum up: focus on fewer things with more details; you don't necessarily need to go into the most advanced things. If I'm not mistaken, the whole project can be organized as "get rid of some things from fitpack", and that can be good. I think Evgeni will clarify/comment on things.

Nikolay.

After looking at the issue tracker #6730, I have made my first proposal for the GSoC project "Improve support for B-splines". This is in continuation of my first mail regarding the same matter.
This is a long list of what I think could be done as part of the project. I don't have much knowledge of Splines or B-Splines, hence the work plan is not very detailed. Please have a look.

Minor changes:-

::: Documentation:

> Rework the tutorial for scipy.interpolate: The 1D interpolation section prominently recommends interp1d. We likely want to tone it down and recommend instead CubicSpline, Akima1DInterpolator, BSpline etc.

This can be done after reading some basics about these algorithms and making the changes as required.

> Come up with better names for make_interp_spline/make_lsq_spline routines.

This too can be handled after some discussion.

Both these documentation changes can be done within a few days' time; 3 to 4 days should be enough.

::: Enhancements:

> Add string aliases to the bc_type keyword of make_interp_spline to match what CubicSpline allows. I.e. bc_type="natural" should be equivalent to bc_type=((2, 0), (2, 0))

This mainly requires imitating what's there in CubicSpline, so it won't be very difficult.

> Convert a CubicSpline object to a BSpline

This could be done by assigning the __class__ attribute to the instance. However, since this is not preferred, we may construct a new method which will create a new BSpline object from a CubicSpline object by assigning the corresponding attribute values. (I need your suggestion: which approach to follow?)

Both these tasks should not take more than 6 to 7 days, including all the documentation and unit tests.

Major Enhancements:-

> Knot insertion: BSpline class should grow a method insert_knot (or insert_knots). This can either wrap fitpack.insert directly, or reuse parts of it (e.g. retain the buffer swapping plumbing from _fitpack and reimplement the Fortran routine fitpack/fpinst.f in C or Cython).

Since fitpack.insert is already implemented for B-splines, wrapping it would be the simplest solution as I perceive it.

> Root-finding: At a minimum, sproot should be wrapped into a method of BSpline class.
> Better still, it should be generalized to allow non-cubic splines. The interface should mirror PPoly.solve and PPoly.roots.

This too can be done by imitating PPoly.roots and PPoly.solve.

Both these tasks can be completed within 10 days' time, including all the documentation and tests.

BSpline Variations:-

> Cardinal BSplines (BSplines with equidistant knots). We need representation, evaluation and construction. scipy.signal has some. The task here is to have an interface which is both compatible with regular b-splines and piecewise polynomials and is useful for the signal processing folks.

For implementing this feature I will have to read some material to get a good understanding of Cardinal BSplines and their implementation. These are two of the places (for now) I would start my study from. The implementation in scipy.signal can be of help.
https://en.wikipedia.org/wiki/B-spline#Cardinal_B-spline
http://dx.doi.org/10.1016/j.aml.2010.06.029
This task could take a long time, maybe 3 weeks or more.

> Spline fitting to data. ATM there's only FITPACK, and we want a more controllable alternative. A first exploratory shot could be to implement de Boor's newknot routine and see how it behaves in real life.

This too requires reading some literature to understand the math, and also looking at the FITPACK code for guidance. The following literature can be helpful.
https://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/spline/B-spline/de-Boor.html
https://en.wikipedia.org/wiki/De_Boor%27s_algorithm
Both these tasks would take over a month for complete implementation.

Optional: I am not sure about the scope of these additions. Also, if we decide to implement them, I am not quite sure they can be completed within the span of GSoC. Probably, we can just take up one of these tasks.

> PSpline: The term P-spline stands for "penalized B-spline".
It refers to using the B-spline representation where the coefficients are determined partly by the data to be fitted, and partly by an additional penalty function that aims to impose smoothness to avoid overfitting.

Adequate literature can be found here:
https://projecteuclid.org/download/pdf_1/euclid.ss/1038425655
SAS have already implemented it:
https://support.sas.com/documentation/cdl/en/statug/63033/HTML/default/viewer.htm#statug_transreg_sect020.htm
This feature could take up a lot of time, including documentation, tests, examples etc.

> Non-Uniform Rational B-splines (NURBS): Unlike simple B-splines, the control points each have a weight.

The following literature can be helpful:
https://en.wikipedia.org/wiki/Non-uniform_rational_B-spline
https://www.rhino3d.com/nurbs
The following articles can help with the implementation:
https://www.codeproject.com/Articles/996281/NURBS-curve-made-easy
http://www.nar-associates.com/nurbs/c_code.html#chapter4
This feature too can take up a lot of time for full completion.

> Constructing periodic splines. Fitpack does it. With full matrices it's straightforward; a naive implementation with banded matrices has issues with numerical stability.

We can definitely look into the banded matrix issue if time permits. I am not sure what we may find, so I can't say anything about the time required.

Final Assessment:-

After these tasks have been completed, all the new features and methods would undergo rigorous tests and doctests. Benchmark tests could also be added wherever necessary. Features having scope for further work could be reported on GitHub.

I plan on reading as much as I can related to Cardinal BSplines, De Boor's newknot routine, Penalized BSplines, NURBS etc. before the start of GSoC. This will help me save time finding literature related to all these new additions.

Looking forward to feedback and corrections.
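The first enhancement in the proposal (string aliases for bc_type) can already be sanity-checked, because a natural cubic spline is expressible both ways today. A minimal sketch (assuming scipy >= 0.19, where CubicSpline, BSpline and make_interp_spline exist); the explicit pair notation ([(2, 0.0)], [(2, 0.0)]) is what a "natural" alias would presumably map to:

```python
import numpy as np
from scipy.interpolate import CubicSpline, make_interp_spline

# Sample data to interpolate.
x = np.linspace(0, 2 * np.pi, 11)
y = np.sin(x)

# CubicSpline already accepts the string alias.
cs = CubicSpline(x, y, bc_type="natural")

# make_interp_spline currently needs the explicit (order, value) pairs:
# "natural" means the second derivative is zero at both ends.
bs = make_interp_spline(x, y, k=3, bc_type=([(2, 0.0)], [(2, 0.0)]))

# The natural cubic interpolant is unique, so both constructions agree.
xx = np.linspace(0, 2 * np.pi, 101)
assert np.allclose(cs(xx), bs(xx))
```

A string alias would then just be sugar that translates "natural" (and similar names) into these pairs before the knot vector is built.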
On 16 March 2017 at 20:03, Evgeni Burovski <evgeny.burovskiy at gmail.com> wrote:
_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at scipy.org
https://mail.scipy.org/mailman/listinfo/scipy-dev

On Thu, Mar 16, 2017 at 11:06 AM, Aman Pratik <amanpratik10 at gmail.com> wrote:

Hello Developers,

This is Aman Pratik. I am currently pursuing my B.Tech from Indian Institute of Technology, Varanasi. I am a keen software developer and not very new to the open source community. I am interested in your project "Improve support for B-splines" for GSoC 2017.

I have been working in Python for the past 2 years and have a good grasp of mathematical methods and techniques. I am quite familiar with scipy, mainly as a user and also as a developer. I also have some experience with Cython. These are the PRs I have worked/am working on for the past month:

ENH: Add evaluation of periodic splines #6730
DOC: Bad documentation of k in sparse.linalg.eigs See #6990 (Merged)
ENH: scipy.linalg.eigsh cannot get all eigenvalues #7004

My GitHub Profile: https://www.github.com/amanp10

I am interested in mathematical models and functions and therefore willing to work on splines. Also, I am familiar with benchmark tests, unit tests and the other technical knowledge I would require for this project. I have started my study of the subject and am looking forward to guidance from the potential mentors or other developers.

Thank You

Hi Aman, and welcome! I see that you already started with https://github.com/scipy/scipy/pull/7185, great! We'll also need a proposal. Since writing a good proposal typically takes several iterations, I encourage you to start working on it, too. When you have a draft, please send it to the list.

All the best, Evgeni
_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at scipy.org
https://mail.scipy.org/mailman/listinfo/scipy-dev
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From antonior92 at gmail.com Mon Mar 20 14:24:14 2017
From: antonior92 at gmail.com (Antonio Ribeiro)
Date: Mon, 20 Mar 2017 15:24:14 -0300
Subject: [SciPy-Dev] CUTEst in Scipy
Message-ID:

Hello,

I am developing my Google Summer of Code proposal about constrained optimisation in Scipy, and it will be very important to have a good set of benchmarks.

There is a great library with diverse optimisation benchmarks called CUTEst. It is under the LGPL 2.1 license. This CUTEst library includes a huge number of problem sets and is often used in optimisation papers. Furthermore, many of the available optimisation software packages provide some interface to it. I am interested in using problems from this set in my project, and I want to ask how I should proceed:

1) Select a subset of tests from the CUTEst library and implement them in native Python under scipy.

2) Include some interface to CUTEst in scipy. From what I understand, the LGPL license is more restrictive than the BSD-3 license used by Scipy. In this case, could we ask for permission?

3) There is an interface for CUTEst implemented in the library pyOpus (under the GPL3 license). Since this is a library external to Scipy (with an incompatible license), how should I proceed in this case: can I make my benchmarks dependent on it? Can we ask permission to include this interface in Scipy?

Any suggestions on how to proceed and what to include in my proposal are very welcome.

António
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From nikolay.mayorov at zoho.com Mon Mar 20 15:26:55 2017
From: nikolay.mayorov at zoho.com (Nikolay Mayorov)
Date: Tue, 21 Mar 2017 00:26:55 +0500
Subject: [SciPy-Dev] CUTEst in Scipy
In-Reply-To:
References:
Message-ID: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com>

Hi!

I heard about CUTEst, but it looked quite complex, so I decided not to deal with it. To your options:

1) It might be the most practical option, considering you'll have other (main) parts of work to do.
Also it's not clear whether to include the test problems in scipy or store them somewhere else. My opinion is that having a collection of problems right in scipy.optimize is good. I don't exactly remember why we decided not to include test problems for least_squares; I guess because it required many considerations. In theory there are ASV benchmarks in scipy, but they don't feel adequate for benchmarking optimization problems. So developing benchmarking infrastructure in scipy.optimize might be a good side project; my concern is that it will distract you from the main path greatly.

To sum up: the simplest way is to pick some problems, write your own benchmark suite and store it somewhere else. This way you will be able to focus on the algorithms. Other options are possible if you have a really good idea of how to do them and are ready to devote time to it.

2) My understanding is that you don't need any permissions, as you won't use CUTEst code (someone could correct me). As I see it: you add an interface into scipy; a user installs CUTEst and uses this interface to work with CUTEst. It doesn't seem very practical, because installing CUTEst looks like a pain. Do you agree, or am I mistaken?

3) I guess you can ask the author for permission to reuse his code in scipy to implement an interface to CUTEst.

Generally, about 2 and 3: it might be a good idea to use CUTEst for your internal benchmarking during development, and optionally you can add an interface to CUTEst into scipy. But I think it will be hardly ever used after that.

To sum up the idea: adding a CUTEst dependency for scipy.optimize benchmarking seems impractical. It would be useful if you actually tried working with CUTEst; maybe it will turn out to be not that difficult. In that case many of my points are not valid.

I hope you'll be able to make some sense from my rambling.

Nikolay.
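Nikolay's option 1 (a small, self-contained problem collection) can be quite lightweight. A hedged sketch of what one entry in such a suite might look like — the BenchmarkProblem class and the problems list are hypothetical names for illustration, not an existing scipy API:

```python
import numpy as np
from scipy.optimize import minimize

class BenchmarkProblem:
    """One self-contained test problem: objective, gradient, start, solution.

    Hypothetical container, just to show the shape such a suite could take.
    """
    def __init__(self, name, fun, grad, x0, x_opt):
        self.name, self.fun, self.grad = name, fun, grad
        self.x0, self.x_opt = x0, x_opt

def rosen(x):
    # Classic 2-D Rosenbrock function, minimum at (1, 1).
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosen_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

problems = [
    BenchmarkProblem("rosenbrock-2d", rosen, rosen_grad,
                     np.array([-1.2, 1.0]), np.array([1.0, 1.0])),
]

# Run every solver of interest over every problem and record the error.
for p in problems:
    res = minimize(p.fun, p.x0, jac=p.grad, method="BFGS")
    err = np.linalg.norm(res.x - p.x_opt)
    print(p.name, res.nit, err)
```

Each CUTEst problem chosen for reimplementation would become one more entry in the list, with its published starting point and solution.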
---- On Mon, 20 Mar 2017 23:24:14 +0500 Antonio Ribeiro <antonior92 at gmail.com> wrote ---- Hello, I am developing my google of summer proposal about constrained optimisation in Scipy and it will be very important to have a good set of benchmarks. There is a great library with diverse optimisation benchmarks called CUTEst <https://ccpforge.cse.rl.ac.uk/gf/project/cutest/wiki/>. It is under LGPL 2.1 license. This CUTEst library include a huge amount of problem sets and is often used in optimisation papers. Furthermore, many of the available optimisation software provide some interface to it. I am interested in using problems from this set in my project and I want to ask how should I proceed? 1) Select a subset of tests from the CUTEst library and implement them in native python under scipy. 2) Include some interface to CUTEst in scipy. By what I understand LGPL license is more restrictive than BSD-3 used by Scipy. In this case, could we ask for permission? 3) There is an interface for CUTEst implemented in the library pyOpus (under GPL3 license) <http://fides.fe.uni-lj.si/pyopus/download1.html>. Since this is a library external to Scipy (with an incompatible license) how should I proceed in this case: can I make my benchmarks dependent on it? Can we ask permission for include this interface in Scipy? Any suggestions on how to proceed and what to include in my proposal is very welcome. Ant?nio _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From andyfaff at gmail.com Mon Mar 20 18:21:59 2017 From: andyfaff at gmail.com (Andrew Nelson) Date: Tue, 21 Mar 2017 09:21:59 +1100 Subject: [SciPy-Dev] CUTEst in Scipy In-Reply-To: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com> References: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com> Message-ID: There are some least squares problems from the NIST test suite in the benchmarks/benchmarks/go_benchmark_functions, but they're couched in terms of a simple scalar (chi2) minimisation problem, and they are all bounded problems. On 21 March 2017 at 06:26, Nikolay Mayorov wrote: > Hi! > > I heard about CUTEst, but it looked quite complex so I decided not to deal > with it. To your options: > > 1) It might be the most practical option, considering you'll have other > (main) parts of work to do. Also it's not clear whether to include test > problems in scipy or store them somewhere else. My opinion is that having a > collection of problems right in scipy.optimize is good. I don't exactly > remember why we decided not to include test problems for least_squares, I > guess because it required many considerations. In theory there are ASV > benchmarks in scipy, but they don't feel adequate for benchmarking > optimization problems. So developing benchmarking infrastructure in > scipy.optimize might be a good side project, my concern is that it will > distract you from the main path greatly. > > To sum up: the simplest way is to pick some problems, write your own > benchmark suite and store it somewhere else. This way you will be able to > focus on the algorithms. Other options are possible if you have a really > good idea how to do them and ready to devote time to it. > > 2) My understanding is that you don't need any permissions as you won't > use CUTEst code (someone could correct me). As I see it: your add an > interface into scipy, a user install CUTEst and use this interface to work > with CUTEst. 
It doesn't seem very practical, because installing CUTEst > looks like a pain. Do you agree or I am mistaken? > > 3) I guess you can ask an author to reuse his code in scipy to implement > an interface to CUTEst. > > Generally about 2 and 3: it might be a good idea to use CUTEst for your > internal benchmarking during the development and optionally you can add an > interface to CUTEst into scipy. But I think it will be hardly ever used > after that. > > To sum the idea: adding CUTEst dependency for scipy.optimize benchmarking > seems unpractical. It would be useful if you will actually try to work with > CUTEst, maybe it will turn out to be not that difficult. In such case many > of my points are not valid. > > I hope you'll be able to make some sense from my rambling. > > Nikolay. > > > ---- On Mon, 20 Mar 2017 23:24:14 +0500 *Antonio Ribeiro > >* wrote ---- > > Hello, > > I am developing my google of summer proposal about constrained > optimisation in Scipy > and it will be very important to have a good set of benchmarks. > > There is a great library with diverse optimisation benchmarks called > CUTEst . It is > under LGPL 2.1 license. > > This CUTEst library include a huge amount of problem sets > and is often used in optimisation papers. Furthermore, many of the > available > optimisation software provide some interface to it. I am interested in > using problems from this set in my project and I want to ask how > should I proceed? > > 1) Select a subset of tests from the CUTEst library and implement them in > native python under scipy. > > 2) Include some interface to CUTEst in scipy. By what I understand LGPL > license is > more restrictive than BSD-3 used by Scipy. In this case, could we ask for > permission? > > 3) There is an interface for CUTEst implemented in the library pyOpus > (under GPL3 license) > . Since this is a > library external to Scipy (with an incompatible license) how should I > proceed in this case: can I make my benchmarks dependent on it? 
Can we ask > permission for include this interface in Scipy? > > Any suggestions on how to proceed and what to include in my proposal is > very welcome. > > Ant?nio > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From newville at cars.uchicago.edu Mon Mar 20 19:35:15 2017 From: newville at cars.uchicago.edu (Matt Newville) Date: Mon, 20 Mar 2017 18:35:15 -0500 Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 28 In-Reply-To: References: Message-ID: Date: Tue, 21 Mar 2017 09:21:59 +1100 > From: Andrew Nelson > To: SciPy Developers List > Subject: Re: [SciPy-Dev] CUTEst in Scipy > Message-ID: > gmail.com> > Content-Type: text/plain; charset="utf-8" > > There are some least squares problems from the NIST test suite in the > benchmarks/benchmarks/go_benchmark_functions, but they're couched in terms > of a simple scalar (chi2) minimisation problem, and they are all bounded > problems. > > The NIST StRD test suite ( http://www.itl.nist.gov/div898/strd/nls/nls_main.shtml) could be considered for inclusion as a test suite for scipy.optimize. These are generally small but non-trivial problems for unconstrained non-linear optimization. Each of the 20+ data sets comes with 2 sets of starting values for the variables and certified values for both best fit values and estimated uncertainties. I believe that none of the optimizers in scipy.optimize will get the right answer to published precision in every case, though leastsq and least_squares can solve most of these problems reasonably well from at least 1 of the 2 starting points. 
This may imply that these problems are not "sparse" in the sense of what (as far as I understand) the GSoC project intends to focus on. But they are a good set of test cases to include in any assessment of an optimization algorithm. The NIST StRD datasets are public domain.

There is existing test code in lmfit to use these data sets with most of the scipy optimizers. This could easily be modified to be part of scipy. See https://github.com/lmfit/lmfit-py/tree/master/NIST_STRD and https://github.com/lmfit/lmfit-py/blob/master/tests/test_NIST_Strd.py for more details. Also, note that the tests focus only on the quality of the results. They may record the number of iterations, but make no attempt to benchmark the runtime.

I think that migrating this test suite to scipy would not be a huge undertaking, though I cannot say whether that is an appropriate use of time for the proposed GSoC project.

Cheers, --Matt
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From antonior92 at gmail.com Tue Mar 21 09:46:03 2017
From: antonior92 at gmail.com (Antonio Ribeiro)
Date: Tue, 21 Mar 2017 10:46:03 -0300
Subject: [SciPy-Dev] CUTEst in Scipy
Message-ID:

Thank you for the insightful answer, Nikolay! I agree with you that including a CUTEst dependency in Scipy could be troublesome, and furthermore it could be very painful to make this interface work well across more than one operating system... The solution of implementing and storing the benchmarks somewhere else is not ideal, but seems a good option because it makes things simpler.

About the use of the NIST test suite: thank you for the suggestion, Andrew, but I don't think it is a good test suite for the methods I intend to implement.
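For reference, the NIST protocol Matt describes (one model function, two starting points, certified parameter values) is easy to mimic. The sketch below uses the same functional form as NIST's Misra1a problem but synthetic data generated on the spot — it is not the certified NIST dataset, just an illustration of the harness shape:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, b1, b2):
    # Exponential-saturation model, same functional form as NIST's Misra1a.
    return b1 * (1.0 - np.exp(-b2 * x))

# Synthetic data generated from known "true" parameters -- NOT the
# certified NIST values, just a stand-in so the harness is runnable.
rng = np.random.RandomState(0)
b_true = (240.0, 5.5e-4)
x = np.linspace(50.0, 800.0, 14)
y = model(x, *b_true) + rng.normal(scale=0.05, size=x.size)

# NIST certifies each problem from two starting points: one far from the
# solution and one near it.  A real port would compare the fitted values
# against the certified ones to published precision.
for start in [(500.0, 1e-4), (250.0, 5e-4)]:
    popt, _ = curve_fit(model, x, y, p0=start)
    print(start, popt)
```

A scipy port of the suite would loop this pattern over all 20+ datasets and all optimizers, asserting agreement with the certified values rather than timing anything.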
2017-03-20 19:23 GMT-03:00 : > Send SciPy-Dev mailing list submissions to > scipy-dev at scipy.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.scipy.org/mailman/listinfo/scipy-dev > or, via email, send a message with subject or body 'help' to > scipy-dev-request at scipy.org > > You can reach the person managing the list at > scipy-dev-owner at scipy.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-Dev digest..." > > > Today's Topics: > > 1. CUTEst in Scipy (Antonio Ribeiro) > 2. Re: CUTEst in Scipy (Nikolay Mayorov) > 3. Re: CUTEst in Scipy (Andrew Nelson) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Mon, 20 Mar 2017 15:24:14 -0300 > From: Antonio Ribeiro > To: scipy-dev at scipy.org > Subject: [SciPy-Dev] CUTEst in Scipy > Message-ID: > gmail.com> > Content-Type: text/plain; charset="utf-8" > > Hello, > > I am developing my google of summer proposal about constrained optimisation > in Scipy > and it will be very important to have a good set of benchmarks. > > There is a great library with diverse optimisation benchmarks called CUTEst > . It is under LGPL > 2.1 license. > > This CUTEst library include a huge amount of problem sets > and is often used in optimisation papers. Furthermore, many of the > available > optimisation software provide some interface to it. I am interested in > using problems from this set in my project and I want to ask how > should I proceed? > > 1) Select a subset of tests from the CUTEst library and implement them in > native python under scipy. > > 2) Include some interface to CUTEst in scipy. By what I understand LGPL > license is > more restrictive than BSD-3 used by Scipy. In this case, could we ask for > permission? > > 3) There is an interface for CUTEst implemented in the library pyOpus > (under GPL3 license) > . 
Since this is a > library > external to Scipy (with an incompatible license) how should I proceed in > this case: can I make my benchmarks dependent on it? Can we ask permission > for include this interface in Scipy? > > Any suggestions on how to proceed and what to include in my proposal is > very welcome. > > Ant?nio > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: attachments/20170320/f2417724/attachment-0001.html> > > ------------------------------ > > Message: 2 > Date: Tue, 21 Mar 2017 00:26:55 +0500 > From: Nikolay Mayorov > To: "SciPy Developers List" > Subject: Re: [SciPy-Dev] CUTEst in Scipy > Message-ID: <15aed2fc740.ec8f556f6477.4506434177608850793 at zoho.com> > Content-Type: text/plain; charset="utf-8" > > Hi! > > > > I heard about CUTEst, but it looked quite complex so I decided not to deal > with it. To your options: > > > > 1) It might be the most practical option, considering you'll have other > (main) parts of work to do. Also it's not clear whether to include test > problems in scipy or store them somewhere else. My opinion is that having a > collection of problems right in scipy.optimize is good. I don't exactly > remember why we decided not to include test problems for least_squares, I > guess because it required many considerations. In theory there are ASV > benchmarks in scipy, but they don't feel adequate for benchmarking > optimization problems. So developing benchmarking infrastructure in > scipy.optimize might be a good side project, my concern is that it will > distract you from the main path greatly. > > > > To sum up: the simplest way is to pick some problems, write your own > benchmark suite and store it somewhere else. This way you will be able to > focus on the algorithms. Other options are possible if you have a really > good idea how to do them and ready to devote time to it. 
> > > > 2) My understanding is that you don't need any permissions as you won't > use CUTEst code (someone could correct me). As I see it: your add an > interface into scipy, a user install CUTEst and use this interface to work > with CUTEst. It doesn't seem very practical, because installing CUTEst > looks like a pain. Do you agree or I am mistaken? > > > > 3) I guess you can ask an author to reuse his code in scipy to implement > an interface to CUTEst. > > > > Generally about 2 and 3: it might be a good idea to use CUTEst for your > internal benchmarking during the development and optionally you can add an > interface to CUTEst into scipy. But I think it will be hardly ever used > after that. > > > > To sum the idea: adding CUTEst dependency for scipy.optimize benchmarking > seems unpractical. It would be useful if you will actually try to work with > CUTEst, maybe it will turn out to be not that difficult. In such case many > of my points are not valid. > > > > I hope you'll be able to make some sense from my rambling. > > > Nikolay. > > > > > ---- On Mon, 20 Mar 2017 23:24:14 +0500 Antonio Ribeiro & > lt;antonior92 at gmail.com> wrote ---- > > > > > Hello, > > > > I am developing my google of summer proposal about constrained > optimisation in Scipy > > and it will be very important to have a good set of benchmarks. > > > > There is a great library with diverse optimisation benchmarks called > CUTEst <https://ccpforge.cse.rl.ac.uk/gf/project/cutest/wiki/>. It > is under LGPL 2.1 license. > > > > This CUTEst library include a huge amount of problem sets > > and is often used in optimisation papers. Furthermore, many of the > available > > optimisation software provide some interface to it. I am interested in > > using problems from this set in my project and I want to ask how > > should I proceed? > > > > > 1) Select a subset of tests from the CUTEst library and implement them in > native python under scipy. > > > > 2) Include some interface to CUTEst in scipy. 
By what I understand LGPL > license is > > more restrictive than BSD-3 used by Scipy. In this case, could we ask for > permission? > > > > 3) There is an interface for CUTEst implemented in the library pyOpus > (under GPL3 license) > > <http://fides.fe.uni-lj.si/pyopus/download1.html>. Since this is a > library external to Scipy (with an incompatible license) how should I > proceed in this case: can I make my benchmarks dependent on it? Can we ask > permission for include this interface in Scipy? > > > > > > Any suggestions on how to proceed and what to include in my proposal is > very welcome. > > > > Ant?nio > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: attachments/20170321/751cb63f/attachment-0001.html> > > ------------------------------ > > Message: 3 > Date: Tue, 21 Mar 2017 09:21:59 +1100 > From: Andrew Nelson > To: SciPy Developers List > Subject: Re: [SciPy-Dev] CUTEst in Scipy > Message-ID: > gmail.com> > Content-Type: text/plain; charset="utf-8" > > There are some least squares problems from the NIST test suite in the > benchmarks/benchmarks/go_benchmark_functions, but they're couched in terms > of a simple scalar (chi2) minimisation problem, and they are all bounded > problems. > > > On 21 March 2017 at 06:26, Nikolay Mayorov > wrote: > > > Hi! > > > > I heard about CUTEst, but it looked quite complex so I decided not to > deal > > with it. To your options: > > > > 1) It might be the most practical option, considering you'll have other > > (main) parts of work to do. Also it's not clear whether to include test > > problems in scipy or store them somewhere else. My opinion is that > having a > > collection of problems right in scipy.optimize is good. 
I don't exactly > > remember why we decided not to include test problems for least_squares, I > > guess because it required many considerations. In theory there are ASV > > benchmarks in scipy, but they don't feel adequate for benchmarking > > optimization problems. So developing benchmarking infrastructure in > > scipy.optimize might be a good side project, my concern is that it will > > distract you from the main path greatly. > > > > To sum up: the simplest way is to pick some problems, write your own > > benchmark suite and store it somewhere else. This way you will be able to > > focus on the algorithms. Other options are possible if you have a really > > good idea how to do them and ready to devote time to it. > > > > 2) My understanding is that you don't need any permissions as you won't > > use CUTEst code (someone could correct me). As I see it: your add an > > interface into scipy, a user install CUTEst and use this interface to > work > > with CUTEst. It doesn't seem very practical, because installing CUTEst > > looks like a pain. Do you agree or I am mistaken? > > > > 3) I guess you can ask an author to reuse his code in scipy to implement > > an interface to CUTEst. > > > > Generally about 2 and 3: it might be a good idea to use CUTEst for your > > internal benchmarking during the development and optionally you can add > an > > interface to CUTEst into scipy. But I think it will be hardly ever used > > after that. > > > > To sum the idea: adding CUTEst dependency for scipy.optimize benchmarking > > seems unpractical. It would be useful if you will actually try to work > with > > CUTEst, maybe it will turn out to be not that difficult. In such case > many > > of my points are not valid. > > > > I hope you'll be able to make some sense from my rambling. > > > > Nikolay. 
> > > > > > ---- On Mon, 20 Mar 2017 23:24:14 +0500 *Antonio Ribeiro > > >* wrote ---- > > > > Hello, > > > > I am developing my google of summer proposal about constrained > > optimisation in Scipy > > and it will be very important to have a good set of benchmarks. > > > > There is a great library with diverse optimisation benchmarks called > > CUTEst . It is > > under LGPL 2.1 license. > > > > This CUTEst library include a huge amount of problem sets > > and is often used in optimisation papers. Furthermore, many of the > > available > > optimisation software provide some interface to it. I am interested in > > using problems from this set in my project and I want to ask how > > should I proceed? > > > > 1) Select a subset of tests from the CUTEst library and implement them in > > native python under scipy. > > > > 2) Include some interface to CUTEst in scipy. By what I understand LGPL > > license is > > more restrictive than BSD-3 used by Scipy. In this case, could we ask for > > permission? > > > > 3) There is an interface for CUTEst implemented in the library pyOpus > > (under GPL3 license) > > . Since this is a > > library external to Scipy (with an incompatible license) how should I > > proceed in this case: can I make my benchmarks dependent on it? Can we > ask > > permission for include this interface in Scipy? > > > > Any suggestions on how to proceed and what to include in my proposal is > > very welcome. > > > > Ant?nio > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > > -- > _____________________________________ > Dr. Andrew Nelson > > > _____________________________________ > -------------- next part -------------- > An HTML attachment was scrubbed... 
> URL: attachments/20170321/de9dde91/attachment.html> > > ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > ------------------------------ > > End of SciPy-Dev Digest, Vol 161, Issue 28 > ****************************************** > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jomsdev at gmail.com Wed Mar 22 02:07:05 2017 From: jomsdev at gmail.com (Jordi Montes) Date: Tue, 21 Mar 2017 23:07:05 -0700 Subject: [SciPy-Dev] Sketching-based Matrix Computations Message-ID: *# I started this discussion on GitHub. This email is a digested version of the conversation to check what people think about this on the mailing list.* TL;DR: *I think having an implementation of some randomized matrix algorithms would be great. I would like to discuss whether the community finds it useful too.* Hello, I am Jordi. I have been working on an xdata open source project for the last year called libSkylark. The library is suitable for general statistical data analysis and optimization applications, but it is heavily focused on distributed systems. *Randomized matrix algorithms* have been a hot topic in research in recent years. Recent developments have shown their utility in large-scale machine learning and statistical data analysis applications. Although many ML applications would take advantage of implementations of these algorithms, their scope goes far beyond ML. I believe it would be a good idea to integrate some of them in Scipy and I would like to know the opinion of others to check whether it would be worth it. To my eyes, Randomized Numerical Linear Algebra is big/useful enough to have its own submodule. However, this is something that the community has to decide.
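To give readers a concrete taste of what such randomized algorithms look like, here is a rough sketch-and-solve least-squares example built on a CountSketch-style transform. Everything below is illustrative (the helper name and problem sizes are made up), not part of any proposed implementation:

```python
import numpy as np

rng = np.random.RandomState(0)

def countsketch(A, k):
    """Apply a k x m CountSketch matrix S to A without ever forming S.

    Each input row is hashed to one sketch row with a random sign, so
    S @ A costs O(nnz(A)) -- the property that makes these transforms
    attractive for sparse inputs."""
    m = A.shape[0]
    rows = rng.randint(0, k, size=m)
    signs = rng.choice([-1.0, 1.0], size=m)
    SA = np.zeros((k, A.shape[1]))
    np.add.at(SA, rows, signs[:, None] * A)   # unbuffered accumulation
    return SA

# Sketch-and-solve for min ||Ax - b||: solve the much smaller sketched
# problem min ||(SA)x - (Sb)|| instead of the full one.
m, n, k = 2000, 10, 250
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
SAb = countsketch(np.column_stack([A, b]), k)
x_sketch = np.linalg.lstsq(SAb[:, :n], SAb[:, n], rcond=None)[0]
x_exact = np.linalg.lstsq(A, b, rcond=None)[0]
# For k sufficiently larger than n, the sketched residual is within a
# small factor of the optimal one.
```

Blendenpik, discussed below, uses the sketch differently — the QR factors of SA become a preconditioner for an iterative solve of the full problem — but the building block is the same.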
Meanwhile, to demonstrate that these methods can be useful I would implement them under scipy.linalg, where the scipy linear algebra functions live. The idea is to bring value from the first moment. I will implement Blendenpik, a least-squares solver based on these techniques that outperforms LAPACK by significant factors and scales significantly better than any QR-based solver. The proposal is:
- Implement CWT. CWT lets us create a matrix *S* such that *SA* is shorter than *A* but conserves some of its properties.
- Compute *QR = SA* using scipy.
- From here, there are two potential features:
  - Implement Blendenpik for solving linear least-squares problems faster without losing any accuracy.
  - Compute the squared row norms to obtain leverage scores.
It would be better to start with Blendenpik, because it solves a more common problem by far. I am thinking about doing it for sparse matrices first because CWT can take advantage of it. However, I wonder if the QR method in scipy takes advantage of sparsity too. I have already set up the development environment so I could start as soon as we decide what to do. As I said, I am open to discussion, so anyone who could be interested is welcome. Thank you, Jordi. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Mar 22 05:23:15 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 22 Mar 2017 22:23:15 +1300 Subject: [SciPy-Dev] migration of all scipy.org mailing lists Message-ID: Hi all, The server for the scipy.org mailing list is in very bad shape, so we (led by Didrik Pinte) are planning to complete the migration of active mailing lists to the python.org infrastructure and to decommission the lists that seem dormant/obsolete. The scipy-user mailing list was already moved to python.org a month or two ago, and that migration went smoothly.
These are the lists we plan to migrate: astropy ipython-dev ipython-user numpy-discussion numpy-svn scipy-dev scipy-organizers scipy-svn And these we plan to retire: Announce APUG Ipython-tickets Mailman numpy-refactor numpy-refactor-git numpy-tickets Pyxg scipy-tickets NiPy-devel There will be a short period (<24 hours) where messages to the list may be refused, with an informative message as to why. The mailing list address will change from listname at scipy.org to listname at python.org This will happen asap, likely within a day or two. So two requests: 1. If you see any issue with this plan, please reply and keep Didrik and myself on Cc (we are not subscribed to all lists). 2. If you see this message on a numpy/scipy list, but not on another list (could be due to a moderation queue) then please forward this message again to that other list. Thanks, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Mar 22 05:24:37 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 22 Mar 2017 22:24:37 +1300 Subject: [SciPy-Dev] migration of all scipy.org mailing lists In-Reply-To: References: Message-ID: (and now with Didrik on Cc - apologies) On Wed, Mar 22, 2017 at 10:23 PM, Ralf Gommers wrote: > Hi all, > > The server for the scipy.org mailing list is in very bad shape, so we > (led by Didrik Pinte) are planning to complete the migration of active > mailing lists to the python.org infrastructure and to decommission the > lists than seem dormant/obsolete. > > The scipy-user mailing list was already moved to python.org a month or > two ago, and that migration went smoothly. 
> > These are the lists we plan to migrate: > > astropy > ipython-dev > ipython-user > numpy-discussion > numpy-svn > scipy-dev > scipy-organizers > scipy-svn > > And these we plan to retire: > > Announce > APUG > Ipython-tickets > Mailman > numpy-refactor > numpy-refactor-git > numpy-tickets > Pyxg > scipy-tickets > NiPy-devel > > There will be a short period (<24 hours) where messages to the list may be > refused, with an informative message as to why. The mailing list address > will change from listname at scipy.org to listname at python.org > > This will happen asap, likely within a day or two. So two requests: > 1. If you see any issue with this plan, please reply and keep Didrik and > myself on Cc (we are not subscribed to all lists). > 2. If you see this message on a numpy/scipy list, but not on another list > (could be due to a moderation queue) then please forward this message again > to that other list. > > Thanks, > Ralf > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From juanshishido at gmail.com Sat Mar 11 23:56:06 2017 From: juanshishido at gmail.com (Juan Shishido) Date: Sat, 11 Mar 2017 20:56:06 -0800 Subject: [SciPy-Dev] Fwd: SciPy 2017 Registration Now Open In-Reply-To: References: <1773.1488919711.61692924258bf1c9f6c9c26.43380076@etouches.com> Message-ID: Hello, I'm part of the SciPy 2017 organizing committee and would like to share some information regarding our upcoming conference. SciPy is a community dedicated to the advancement of scientific computing through open source Python software for mathematics, science, and engineering. The annual SciPy Conference allows participants from all types of organizations to showcase their latest projects, learn from skilled users and developers, and collaborate on code development. The conference features two days of tutorials by followed by three days of presentations, and concludes with two days of developer sprints on projects of interest to attendees. 
The topics presented at SciPy are very diverse, with a focus on advanced software engineering and original uses of Python and its scientific libraries, either in theoretical or experimental research, from both academia and the industry. Please consider participating by submitting a talk or tutorial proposal or by simply attending. Thanks! ---------- Forwarded message ---------- From: "SciPy 2017 Organizers" Date: Mar 7, 2017 12:48 PM Subject: SciPy 2017 Registration Now Open To: Cc: Early Bird Registration until May 22. Register now ! T-Minus 15 Days for Tutorial Submissions *Due Date: March 22, 2016* The SciPy experience kicks off with two days of tutorials (July 10-11). These sessions provide affordable access to expert training, and consistently receive fantastic feedback from participants. We're looking for submissions on topics from introductory to advanced - we'll have attendees across the gamut looking to learn. Plus, you can earn an instructor stipend to apply towards your conference participation. Visit the SciPy 2017 website for details or: Submit a Tutorial Proposal Here Talk and Poster Proposals Due March 27th *Visit the SciPy 2017 website for full details or click here to submit a proposal .* Choose a topic in one of the 3 main conference tracks: - Scientific Computing in Python (General track) - AI and Machine Learning - SciPy Tools * And/or submit for one of the 10 domain-specific mini-symposia:* - Astronomy - Biology, Biophysics and Biostatistics - Computational Science and Numerical Techniques - Data Science - Earth, Ocean and Geo Science - Materials Science - Neuroscience - Open Data and Reproducibility - Python and Hardware Data Acquisition - Social Sciences Submit a Talk or Poster Proposal Here Financial Scholarship Applications Now Being Accepted With the support of our sponsors, the SciPy conference provides financial assistance to attendees based on both community contribution and financial need. 
In 2016 we were able to support 20 applicants, including 5 diversity aid recipients, selected for their outstanding contributions to open source scientific Python projects. *If you'd like to be considered for financial scholarship, please apply here before April 21, 2017.* Calendar and Important Deadlines - *Currently Open: *Sprint submissions, BoF submissions, financial scholarship applications, talk submissions, registration - *March 22, 2017: *Tutorial submission deadline *March 27, 2017*: Talk and Poster submission deadline - *Apr 21, 2017:* Financial scholarship application deadline - *Apr 24, 2017:*Tutorials speakers and schedule announced - *May 12, 2017:* General conference speakers announced - *May 22, 2017*: Early-bird registration ends - *Jun 6, 2017:* BoF and Sprint submission deadline (for pre-scheduling and inclusion in program) Follow @SciPy on Twitter , Facebook & Google+ for highlights from previous SciPy conferences and all of the latest 2017 updates. ------------------------------ Sponsor Thank Yous! We gratefully recognize two of our SciPy conference gold sponsors, whose contributions ensure the accessibility of the conference and growth of the important work being done by the scientific Python community. ------------------------------ Know a company that might want to sponsor SciPy 2017 or provide scholarship funding? Share the sponsorship prospectus -------------------------------------- To unsubscribe from this mailing list, please click here -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at gmail.com Wed Mar 22 05:13:54 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 22 Mar 2017 22:13:54 +1300 Subject: [SciPy-Dev] migration of all scipy.org mailing lists Message-ID: Hi all, The server for the scipy.org mailing list is in very bad shape, so we (led by Didrik Pinte) are planning to complete the migration of active mailing lists to the python.org infrastructure and to decommission the lists than seem dormant/obsolete. The scipy-user mailing list was already moved to python.org a month or two ago, and that migration went smoothly. These are the lists we plan to migrate: astropy ipython-dev ipython-user numpy-discussion numpy-svn scipy-dev scipy-organizers scipy-svn And these we plan to retire: Announce APUG Ipython-tickets Mailman numpy-refactor numpy-refactor-git numpy-tickets Pyxg scipy-tickets NiPy-devel This will happen asap, likely within a day or two. So two requests: 1. If you see any issue with this plan, please reply and keep Didrik and myself on Cc (we are not subscribed to all lists). 2. If you see this message on a numpy/scipy list, but not on another list (could be due to a moderation queue) then please forward this message again to that other list. Thanks, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From haberland at ucla.edu Wed Mar 22 18:35:06 2017 From: haberland at ucla.edu (Matt Haberland) Date: Wed, 22 Mar 2017 15:35:06 -0700 Subject: [SciPy-Dev] (no subject) In-Reply-To: References: Message-ID: Hi Qianhui, I'm Matt; I'll be co-mentoring the scipy.diff project if a proposal is selected, so my answer is geared towards that. It seems that you have found https://github.com/scipy/scipy/wiki/GSoC-2017-project-ideas as you wrote "implement scipy.diff (numerical differentiation)". If you haven't already followed the recommended reading links, I would continue with those. Read carefully, researching things you're not familiar with. 
I suggest that you adopt the liberal interpretation of 'numerical differentiation' - the evaluation of the numerical values of derivatives - rather than the more restrictive definition of 'finite differences'. Please research and consider the pros and cons of the various methods of evaluating the numerical values of derivatives, including automatic differentiation and complex step methods. Before writing a proposal, consider the following: What are some common applications that require numerical derivatives, what differentiation methods are most suitable for these applications, and how does that information suggest what capabilities scipy.diff should have? What algorithms can you find (in academic literature and textbooks), and under what determines which is best for a particular application? Which can you hope to implement, given constraints on time and expertise? (Your mentors may not be differentiation experts - I am not - so you might have to find answers about complicated algorithms on your own!) What existing code can you draw from, and what shortcomings of that code will you need to address? (Check bug reports, for example.) With all that in mind, synthesize a schedule for creating the most useful scipy.diff in the time you'll have available, and outline a path for future work. I'll say that I'm particularly interested in derivatives for nonlinear programming, especially for solving optimal control problems using direct methods. In particular, I have Python code for evaluating the objective function and constraints, and accurate derivates can greatly improve convergence rates, so automatic differentiation (AD) is the natural choice. However, other applications may have black box functions, in which case AD is not an option. Are there situations in which AD is not possible but complex step methods can be used and would outperform (real) finite differences? 
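One concrete data point for that question: the complex-step trick needs source access only in the weak sense that f must accept complex input; unlike AD it needs no tape or operator overloading, and unlike real finite differences it suffers no subtractive cancellation, so the step can be taken absurdly small. A small illustrative sketch (the test function and step sizes are chosen arbitrarily):

```python
import numpy as np

def complex_step(f, x, h=1e-20):
    # f'(x) ~ Im f(x + ih) / h: the first derivative lands in the
    # imaginary part, so nothing is subtracted and h can be tiny.
    return np.imag(f(x + 1j * h)) / h

def forward_diff(f, x, h=1e-8):
    # classic forward difference: accuracy capped near sqrt(eps)
    return (f(x + h) - f(x)) / h

f = lambda x: np.exp(x) * np.sin(x)
df = lambda x: np.exp(x) * (np.sin(x) + np.cos(x))  # exact derivative

x0 = 0.5
err_cs = abs(complex_step(f, x0) - df(x0))   # ~ machine epsilon
err_fd = abs(forward_diff(f, x0) - df(x0))   # ~ 1e-8 at best
```

The catch, of course, is that the whole computation must propagate complex values correctly (no abs(), branches on sign, or external black-box calls), and those are much the same conditions under which AD is available anyway.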
You can find more general thoughts about writing good GSoC proposals online; I but these are the things that come to my mind when I think about a scipy.diff proposal. Matt On Sat, Mar 18, 2017 at 7:32 PM, Qianhui Wan wrote: > Hi all, > > I'm a senior undergraduate in US. My direction is in applied math and > numerical analysis so I want to contribute to our suborg this year :) This > is my first time to apply for GSoC. > > As I mentioned above, I've already had some background in related math and > I have some experience in solving PDEs numerically with classical schemes. > In the ideas list, I'm more interested in "implement scipy.diff (numerical > differentiation)" and "improve the parabolic cylinder functions". For > programming languages, I'm a intermediate learner in Python, MATLAB and R, > also having some skills in C/C++. > > If anyone can give me some tips in the requirements of these two projects > and proposal writing I'll be very appreciate, and if you have suggestions > in choosing projects or developing ideas please let me know. Thanks! > > Best, > Qianhui > > Qianhui Wan, senior > Fall 2016 - Spring 2017, visiting, Math Department, University of > Wisconsin, Madison, US > Spring 2016, visiting, Math Department, University of California, > Berkeley, US > Fall 2013 - Spring 2017, Math School, Sun Yat-sen University, Guangzhou, > China > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -- Matt Haberland Assistant Adjunct Professor in the Program in Computing Department of Mathematics 7620E Math Sciences Building, UCLA -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From amanpratik10 at gmail.com Thu Mar 23 10:31:32 2017 From: amanpratik10 at gmail.com (Aman Pratik) Date: Thu, 23 Mar 2017 20:01:32 +0530 Subject: [SciPy-Dev] GSoC 2017: BSpline support In-Reply-To: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> Message-ID: Hello, I need some help with *"**Convert a CubicSpline/PPoly object to a BSpline". *I am unable to find suitable literature for it, or maybe I am overlooking something. I would be really grateful if you could guide me to some reading material or code samples. Thank You On 20 March 2017 at 03:38, Nikolay Mayorov wrote: > For future I suggest to put your proposal in markdown format somewhere on > github (scipy wiki is a good place). > > I don't know what is the correct and productive way to review such > proposal, so these are some thoughts on some of your points. > > *> Rework the tutorial for scipy.interpolate:* The 1D interpolation > section prominently recommends interp1d. We likely want to > tone it down and recommend instead CubicSpline, Akima1DInterpolator, > BSpline etc. > This can be done after reading some basics about these algorithms and > making the changes as required. > > > Honestly I don't think it is that easy, you definitely need to have a > certain taste and interest to rewrite this tutorial. My guess is that you > will be able to do it better after you finished the main algorithmic and > coding work. > > *> Come up with better names for make_interp_spline/make_lsq_spline > routines.* > This too can be handled after some discussion. > > > I don't think it is necessary to put things like that in the proposal, > you'll figure it out (with Evgeni) as the work progresses. > > *> Add string aliases to the bc_type keyword of make_interp_spline* to > match what CubicSpline allows. I.e. 
> bc_type="natural" should be equivalent to bc_type=((2, 0), (2, 0)) > This requires mainly imitating whats there in CubicSpline so wont be very > difficult. > > > This can be indeed your first small objective. The documentation for > bc_type in make_interp_spline reads > > "Each of these must be an *iterable* of pairs (order, value) which gives > the values of derivatives of specified orders at the given edge of the > interpolation interval." > > So BSpline is quite general and allows specifying conditions for several > derivatives at one end (as opposed to CubicSpline). Maybe it will need some > special considerations, just something I noticed. > > > *> Convert a CubicSpline object to a BSpline* > This could be done by assigning the __class__ attribute to the instance. > However, since this is not preferred, we may > construct a new method which will create a new BSpline object using a > CubicSpline object by assigning the corresponding attribute > values. (I need your suggestion, which approach to follow?) > > > I believe ideally there should be a factory method like > `BSpline.from_ppoly(...)`, as a compromise something like > `BSpline.from_cubic_spline`, but I'm not sure whether it will be that > valuable in that form. The problem might be not that simple, do some > research on now to make conversion between polynomial and B-spline basis. > > *> Knot insertion:* BSpline class should grow a method insert_knot (or > insert_knots). This can either wrap fitpack.insert > directly, or reuse parts of it (e.g. retain the buffer swapping plumbing > from _fitpack and reimplement the Fortran routine > fitpack/fpinst.f in C or Cython). > Since fitpack.insert is already implemented for BSplines, wrapping > around it would be the simplest solution as I perceive. > > > As I see there is already a public function insert (implemented in > fitpack.py). So just moving it in as a method pretty much does the job. 
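For reference, the public function in question is scipy.interpolate.insert, and the invariant any BSpline.insert_knot method would have to preserve is easy to state: one more knot, same spline. A quick sketch (the data and knot location are arbitrary):

```python
import numpy as np
from scipy.interpolate import splrep, splev, insert

x = np.linspace(0, 10, 11)
tck = splrep(x, np.sin(x))   # cubic B-spline as a (t, c, k) tuple

tck2 = insert(4.5, tck)      # insert one new knot at x = 4.5

xs = np.linspace(0, 10, 201)
# the knot vector grows by one entry, but knot insertion is exact:
# the spline evaluated anywhere on [0, 10] is numerically unchanged.
```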
> However is the final aim is to move away from FITPACK, then reimplementing > insertion algorithm in Cython is useful. Additionally it can be an entry > into more serious problems to follow and doing things in Cython. (Btw, I'm > not sure what Evgeni meant by "e.g. retain the buffer swapping plumbing".) > > *> Root-finding:* At a minimum, sproot should be wrapped into a method of > BSpline class. Better still, it should be generalized to allow > non-cubic splines. The interface should mirror PPoly.solve and PPoly.roots. > This too can be done by imitating PPoly.roots and PPoly.solve. > > > Writing a new implementation for arbitrary spline orders looks like an > interesting and challenging problem. Consider doing it instead of just > minimal wrapper. > > All in all the two previous problems, if done in full capacity, don't seem > easy and can occupy you for the good part of the GSoC. > > *BSpline Variations:-* > *> Cardinal BSplines (BSplines with equidistant knots).* We need > representation, evaluation and construction. scipy.signal has > some. The task here is to have an interface which is both compatible with > regular b-splines and piecewise polynomials and is useful for > the signal processing folks. > For implementing this feature I will have to read some material to get > a good understanding of Cardinal BSplines and its > implementation > These are two of the places (for now) I would start my study from. The > implementation in scipy.signal can be of help. > https://en.wikipedia.org/wiki/B-spline#Cardinal_B-spline > http://dx.doi.org/10.1016/j.aml.2010.06.029 > This task could take a long time, maybe 3 weeks or more. > > *> Spline fitting to data.* ATM there's only FITPACK, and we want a more > controllable alternative. A first exploratory shot could be > to implement de Boors's newknot routine and see how it behaves in real life. > This too requires reading of some literature to understand the math and > also looking at the FITPACK code for guidance. 
> The following literature can be helpful. > https://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/ > spline/B-spline/de-Boor.html > https://en.wikipedia.org/wiki/De_Boor%27s_algorithm > > Both these tasks would take over a month for complete implementation. > > > Generally I suggest to pick (at least for now) one subject most > interesting to you and think it through in more details. Now it's unclear > what exactly you want to program (which classes, functions, etc.) and time > estimates don't look reliable. For "Spline fitting to data" I'm somewhat > confused, because make_lsq_spline and make_interp_spline seem to already > solve fitting problems without relying on FITPACK. Evgeni, what is the > situation here? > > For now that's all from me. To sum up: focus on fewer things with more > details, you don't necessary need to go into the most advanced things. If > I'm not mistaken, the whole project can be organized as "get rid of some > things from fitpack" and it can be good. > > I think Evgeni will clarify/comment on things. > > > Nikolay. > > > After looking at the issue tracker #6730 > I have made my first > proposal for the GSoC project "Improve support for B-splines". This is in > continuation with my first mail regarding the same matter. > > This is a long list of what I think could be done as part of the project. > I don't have much knowledge of Splines or B-Splines hence the work plan is > not very detailed. Please have a look. > > *Minor changes:-* > ::: Documentation: > *> Rework the tutorial for scipy.interpolate:* The 1D interpolation > section prominently recommends interp1d. We likely want to > tone it down and recommend instead CubicSpline, Akima1DInterpolator, > BSpline etc. > This can be done after reading some basics about these algorithms and > making the changes as required. > > *> Come up with better names for make_interp_spline/make_lsq_spline > routines.* > This too can be handled after some discussion. 
> > Both these documentation changes can be done within a few days time, 3 to > 4 days should be enough. > > ::: Enhancements: > *> Add string aliases to the bc_type keyword of make_interp_spline* to > match what CubicSpline allows. I.e. > bc_type="natural" should be equivalent to bc_type=((2, 0), (2, 0)) > This requires mainly imitating whats there in CubicSpline so wont be very > difficult. > > *> Convert a CubicSpline object to a BSpline* > This could be done by assigning the __class__ attribute to the instance. > However, since this is not preferred, we may > construct a new method which will create a new BSpline object using a > CubicSpline object by assigning the corresponding attribute > values. (I need your suggestion, which approach to follow?) > > Both these tasks should not take more than 6 to 7 days, including all the > documentation and unit tests. > > > *Major Enhancements:-* > *> Knot insertion:* BSpline class should grow a method insert_knot (or > insert_knots). This can either wrap fitpack.insert > directly, or reuse parts of it (e.g. retain the buffer swapping plumbing > from _fitpack and reimplement the Fortran routine > fitpack/fpinst.f in C or Cython). > Since fitpack.insert is already implemented for BSplines, wrapping > around it would be the simplest solution as I perceive. > > *> Root-finding:* At a minimum, sproot should be wrapped into a method of > BSpline class. Better still, it should be generalized to allow > non-cubic splines. The interface should mirror PPoly.solve and PPoly.roots. > This too can be done by imitating PPoly.roots and PPoly.solve. > > Both these tasks can be completed within 10 days time including all the > documentation and tests. > > *BSpline Variations:-* > *> Cardinal BSplines (BSplines with equidistant knots).* We need > representation, evaluation and construction. scipy.signal has > some. 
The task here is to have an interface which is both compatible with > regular b-splines and piecewise polynomials and is useful for > the signal processing folks. > For implementing this feature I will have to read some material to get > a good understanding of Cardinal BSplines and its > implementation > These are two of the places (for now) I would start my study from. The > implementation in scipy.signal can be of help. > https://en.wikipedia.org/wiki/B-spline#Cardinal_B-spline > http://dx.doi.org/10.1016/j.aml.2010.06.029 > This task could take a long time, maybe 3 weeks or more. > > *> Spline fitting to data.* ATM there's only FITPACK, and we want a more > controllable alternative. A first exploratory shot could be > to implement de Boors's newknot routine and see how it behaves in real life. > This too requires reading of some literature to understand the math and > also looking at the FITPACK code for guidance. > The following literature can be helpful. > https://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/ > spline/B-spline/de-Boor.html > https://en.wikipedia.org/wiki/De_Boor%27s_algorithm > > Both these tasks would take over a month for complete implementation. > > *optional:* > I am not sure about the scope of these additions. Also, if we decide to > implement these, I am not quite sure they can be > completed within the span of GSoC. Probably, we can just take up one of > these tasks. > > *> PSpline:* The term P-spline stands for "penalized B-spline". It refers > to using the B-spline representation where the > coefficients are determined partly by the data to be fitted, and partly by > an additional penalty function that aims to impose > smoothness to avoid overfitting. 
> Adequate literature can be found here, > https://projecteuclid.org/download/pdf_1/euclid.ss/1038425655 > > SAS have already implemented it, > https://support.sas.com/documentation/cdl/en/statug/ > 63033/HTML/default/viewer.htm#statug_transreg_sect020.htm > > This feature could take up a lot of time, including > documentation,tests,examples etc. > > *> Non-Uniform Rational B-splines (NURBS):* Unlike simple B-splines, the > control points each have a weight. > The following literature can be helpful, > https://en.wikipedia.org/wiki/Non-uniform_rational_B-spline > https://www.rhino3d.com/nurbs > The following articles can help with the implementation. > https://www.codeproject.com/Articles/996281/NURBS-curve-made-easy > http://www.nar-associates.com/nurbs/c_code.html#chapter4 > This feature too can take up lots of time for full completion. > > *> Constructing periodic splines*. Fitpack does it. With full matrices > it's straightforward, a naive implementation with banded > matrices has issues with numerical stability. > We can definitely look into the banded matrix issue if time permits. I > am not sure what we may find so cant say anything about > the time required. > > Final Assessment:- > After these tasks have been completed. All the new features,methods would > undergo rigorous tests and doctests. Benchmark tests could also be added > wherever necessary. Features having scope for work could be reported on > GitHub. > > I plan on reading as much as I can related to Cardinal BSplines,De Boor's > newknot routine,Penalized BSplines,NURBS etc. before the start of GSoC. > This will help me in saving time finding literature related to all these > new additions. > > Looking forward to feedback and corrections. 
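On the periodic-spline point: CubicSpline already accepts bc_type='periodic', which at least gives a reference behaviour to test any FITPACK-free implementation against. An illustrative check (note that the periodic case requires the first and last y values to match exactly):

```python
import numpy as np
from scipy.interpolate import CubicSpline

theta = np.linspace(0, 2 * np.pi, 9)
y = np.sin(theta)
y[-1] = y[0]   # periodic boundary conditions need exact equality here

cs = CubicSpline(theta, y, bc_type='periodic')

# value, slope and curvature all wrap around at the interval ends
ends = [(cs(0.0, nu), cs(2 * np.pi, nu)) for nu in range(3)]
```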
> > > On 16 March 2017 at 20:03, Evgeni Burovski > wrote: > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > On Thu, Mar 16, 2017 at 11:06 AM, Aman Pratik > wrote: > > > Hello Developers, > > This is Aman Pratik. I am currently pursuing my B.Tech from Indian > Institute of Technology, Varanasi. I am a keen software developer and not > very new to the open source community. I am interested in your project > "Improve support for B-splines" for GSoC 2017. > > I have been working in Python for the past 2 years and have good idea > about Mathematical Methods and Techniques. I am quite familiar with scipy > mainly as a user and also as a developer. I also have some experience with > Cython. > > These are the PRs I have worked/working on for the past month. > > ENH: Add evaluation of periodic splines #6730 > > > DOC: Bad documentation of k in sparse.linalg.eigs See #6990 (Merged) > > > ENH: scipy.linalg.eigsh cannot get all eigenvalues #7004 > > > My GitHub Profile: https://www.github.com/amanp10 > > I am interested in Mathematical models and functions and therefore willing > to work on splines. Also, I am familiar with Benchmark tests, Unit tests > and other technical knowledge I would require for this project. I have > started my study for the subject and am looking forward to guidance from > the potential mentors or other developers. > > Thank You > > > > Hi Aman, and welcome! > > I see that you already started with https://github.com/scipy/ > scipy/pull/7185, great! > > We'll also need a proposal. Since writing a good proposal typically takes > several iterations, I encourage you to start working on it, too. When you > have a draft, please send it to the list. 
> > All the best, > > Evgeni > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Fri Mar 24 05:08:37 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 24 Mar 2017 22:08:37 +1300 Subject: [SciPy-Dev] mailing list moved Message-ID: Hi all, This mailing list moved, it's now scipy-dev at python.org. Please update the stored address in the contacts list of your email client - messages sent to @scipy.org will not arrive anymore. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From nikolay.mayorov at zoho.com Fri Mar 24 07:15:32 2017 From: nikolay.mayorov at zoho.com (Nikolay Mayorov) Date: Fri, 24 Mar 2017 16:15:32 +0500 Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support In-Reply-To: <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> Message-ID: <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> Resent my answer to the new address. ============ Forwarded message ============ >From : Nikolay Mayorov <nikolay.mayorov at zoho.com> To : "SciPy Developers List"<scipy-dev at scipy.org> Date : Fri, 24 Mar 2017 01:07:10 +0500 Subject : Re: [SciPy-Dev] GSoC 2017: BSpline support ============ Forwarded message ============ De Boor, "Practical guide to splines", Chapter IX should be the place for your. Likely you will need to thoroughly study several chapters in it, it's not an easy book to quickly pick up a recipe from it. Nikolay. 
---- On Thu, 23 Mar 2017 19:31:32 +0500 Aman Pratik <amanpratik10 at gmail.com> wrote ----

Hello,
I need some help with "Convert a CubicSpline/PPoly object to a BSpline". I am unable to find suitable literature for it, or maybe I am overlooking something. I would be really grateful if you could guide me to some reading material or code samples.

Thank You

On 20 March 2017 at 03:38, Nikolay Mayorov <nikolay.mayorov at zoho.com> wrote:

_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at scipy.org
https://mail.scipy.org/mailman/listinfo/scipy-dev

For the future I suggest putting your proposal in markdown format somewhere on github (the scipy wiki is a good place). I don't know what the correct and productive way to review such a proposal is, so these are some thoughts on some of your points.

> Rework the tutorial for scipy.interpolate: The 1D interpolation section prominently recommends interp1d. We likely want to tone it down and recommend instead CubicSpline, Akima1DInterpolator, BSpline etc. This can be done after reading some basics about these algorithms and making the changes as required.

Honestly I don't think it is that easy; you definitely need to have a certain taste and interest to rewrite this tutorial. My guess is that you will be able to do it better after you have finished the main algorithmic and coding work.

> Come up with better names for make_interp_spline/make_lsq_spline routines. This too can be handled after some discussion.

I don't think it is necessary to put things like that in the proposal; you'll figure it out (with Evgeni) as the work progresses.

> Add string aliases to the bc_type keyword of make_interp_spline to match what CubicSpline allows. I.e. bc_type="natural" should be equivalent to bc_type=((2, 0), (2, 0)) This mainly requires imitating what's there in CubicSpline, so it won't be very difficult.

This can indeed be your first small objective.
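For reference, the equivalence in question can already be checked against the existing tuple-form interface; a minimal sketch (this assumes scipy >= 0.19, where make_interp_spline accepts per-end lists of (order, value) pairs — the string alias itself is what the proposal would add):

```python
import numpy as np
from scipy.interpolate import CubicSpline, make_interp_spline

x = np.linspace(0, 10, 8)
y = np.cos(x)

# "natural" boundary conditions: zero second derivative at both ends
cs = CubicSpline(x, y, bc_type='natural')
bs = make_interp_spline(x, y, k=3, bc_type=([(2, 0.0)], [(2, 0.0)]))

xx = np.linspace(0, 10, 101)
print(np.allclose(cs(xx), bs(xx)))  # the two representations agree
```

The natural cubic interpolant of a given data set is unique, so agreement here (up to floating point) is expected; the string alias would just spell the tuple form.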
The documentation for bc_type in make_interp_spline reads "Each of these must be an iterable of pairs (order, value) which gives the values of derivatives of specified orders at the given edge of the interpolation interval." So BSpline is quite general and allows specifying conditions for several derivatives at one end (as opposed to CubicSpline). Maybe it will need some special considerations, just something I noticed.

> Convert a CubicSpline object to a BSpline This could be done by assigning the __class__ attribute to the instance. However, since this is not preferred, we may construct a new method which will create a new BSpline object using a CubicSpline object by assigning the corresponding attribute values. (I need your suggestion, which approach to follow?)

I believe ideally there should be a factory method like `BSpline.from_ppoly(...)`, or as a compromise something like `BSpline.from_cubic_spline`, but I'm not sure whether it will be that valuable in that form. The problem might not be that simple; do some research on how to make the conversion between the polynomial and B-spline bases.

> Knot insertion: BSpline class should grow a method insert_knot (or insert_knots). This can either wrap fitpack.insert directly, or reuse parts of it (e.g. retain the buffer swapping plumbing from _fitpack and reimplement the Fortran routine fitpack/fpinst.f in C or Cython). Since fitpack.insert is already implemented for BSplines, wrapping around it would be the simplest solution as I perceive it.

As I see it, there is already a public function insert (implemented in fitpack.py), so just moving it in as a method pretty much does the job. However, if the final aim is to move away from FITPACK, then reimplementing the insertion algorithm in Cython is useful. Additionally, it can be an entry point into the more serious problems to follow, and into doing things in Cython. (Btw, I'm not sure what Evgeni meant by "e.g. retain the buffer swapping plumbing".)
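As a sanity check for the wrapping route, the existing FITPACK-based insert can be exercised directly; a small sketch (using the long-standing public scipy.interpolate.insert, which the proposed insert_knot method would wrap):

```python
import numpy as np
from scipy.interpolate import splrep, splev, insert

x = np.linspace(0, 10, 20)
y = np.sin(x)
tck = splrep(x, y, s=0)      # interpolating cubic B-spline, tck = (t, c, k)

tck_new = insert(4.5, tck)   # insert a single knot at x = 4.5

xx = np.linspace(0, 10, 101)
# knot insertion refines the representation without changing the curve
print(np.allclose(splev(xx, tck), splev(xx, tck_new)))
print(len(tck_new[0]) - len(tck[0]))  # exactly one knot was added
```

This invariance (same curve, one more knot) is also what a reimplementation in C or Cython would be tested against.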
> Root-finding: At a minimum, sproot should be wrapped into a method of BSpline class. Better still, it should be generalized to allow non-cubic splines. The interface should mirror PPoly.solve and PPoly.roots. This too can be done by imitating PPoly.roots and PPoly.solve.

Writing a new implementation for arbitrary spline orders looks like an interesting and challenging problem. Consider doing that instead of just a minimal wrapper. All in all, the two previous problems, if done in full capacity, don't seem easy and can occupy you for a good part of the GSoC.

BSpline Variations:-

> Cardinal BSplines (BSplines with equidistant knots). We need representation, evaluation and construction. scipy.signal has some. The task here is to have an interface which is both compatible with regular b-splines and piecewise polynomials and is useful for the signal processing folks. For implementing this feature I will have to read some material to get a good understanding of Cardinal BSplines and their implementation. These are two of the places (for now) I would start my study from. The implementation in scipy.signal can be of help. https://en.wikipedia.org/wiki/B-spline#Cardinal_B-spline http://dx.doi.org/10.1016/j.aml.2010.06.029 This task could take a long time, maybe 3 weeks or more.

> Spline fitting to data. ATM there's only FITPACK, and we want a more controllable alternative. A first exploratory shot could be to implement de Boor's newknot routine and see how it behaves in real life. This too requires reading some literature to understand the math and also looking at the FITPACK code for guidance. The following literature can be helpful. https://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/spline/B-spline/de-Boor.html https://en.wikipedia.org/wiki/De_Boor%27s_algorithm Both these tasks would take over a month for complete implementation.

Generally I suggest picking (at least for now) the one subject most interesting to you and thinking it through in more detail.
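On the de Boor side, the evaluation recurrence from the links above is small enough to prototype and check against scipy directly; a rough sketch (the deboor function below is illustrative, not an existing scipy routine, and skips edge handling at the interval boundaries):

```python
import numpy as np
from scipy.interpolate import make_interp_spline

def deboor(t, c, k, x):
    """Evaluate a B-spline (knots t, coefficients c, degree k) at a
    strictly interior point x, using de Boor's recurrence."""
    i = np.searchsorted(t, x, side='right') - 1   # t[i] <= x < t[i+1]
    d = [c[i - k + j] for j in range(k + 1)]      # local control points
    for r in range(1, k + 1):
        for j in range(k, r - 1, -1):
            alpha = (x - t[i - k + j]) / (t[i + j - r + 1] - t[i - k + j])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[k]

xdata = np.linspace(0, 2 * np.pi, 11)
spl = make_interp_spline(xdata, np.sin(xdata), k=3)  # BSpline with .t, .c, .k
print(deboor(spl.t, spl.c, spl.k, 1.3), spl(1.3))   # both approximately sin(1.3)
```

Getting this recurrence right by hand is also good preparation for the Cython reimplementation work discussed above.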
Now it's unclear what exactly you want to program (which classes, functions, etc.) and the time estimates don't look reliable. For "Spline fitting to data" I'm somewhat confused, because make_lsq_spline and make_interp_spline seem to already solve fitting problems without relying on FITPACK. Evgeni, what is the situation here?

For now that's all from me. To sum up: focus on fewer things with more details; you don't necessarily need to go into the most advanced things. If I'm not mistaken, the whole project can be organized as "get rid of some things from fitpack" and it can be good. I think Evgeni will clarify/comment on things.

Nikolay.

After looking at the issue tracker #6730 I have made my first proposal for the GSoC project "Improve support for B-splines". This is in continuation of my first mail regarding the same matter. This is a long list of what I think could be done as part of the project. I don't have much knowledge of Splines or B-Splines, hence the work plan is not very detailed. Please have a look.

Minor changes:-

::: Documentation:

> Rework the tutorial for scipy.interpolate: The 1D interpolation section prominently recommends interp1d. We likely want to tone it down and recommend instead CubicSpline, Akima1DInterpolator, BSpline etc.

This can be done after reading some basics about these algorithms and making the changes as required.

> Come up with better names for make_interp_spline/make_lsq_spline routines.

This too can be handled after some discussion. Both these documentation changes can be done within a few days' time; 3 to 4 days should be enough.

::: Enhancements:

> Add string aliases to the bc_type keyword of make_interp_spline to match what CubicSpline allows. I.e. bc_type="natural" should be equivalent to bc_type=((2, 0), (2, 0))

This mainly requires imitating what's there in CubicSpline, so it won't be very difficult.

> Convert a CubicSpline object to a BSpline

This could be done by assigning the __class__ attribute to the instance.
However, since this is not preferred, we may construct a new method which will create a new BSpline object using a CubicSpline object by assigning the corresponding attribute values. (I need your suggestion, which approach to follow?) Both these tasks should not take more than 6 to 7 days, including all the documentation and unit tests.

Major Enhancements:-

> Knot insertion: BSpline class should grow a method insert_knot (or insert_knots). This can either wrap fitpack.insert directly, or reuse parts of it (e.g. retain the buffer swapping plumbing from _fitpack and reimplement the Fortran routine fitpack/fpinst.f in C or Cython).

Since fitpack.insert is already implemented for BSplines, wrapping around it would be the simplest solution as I perceive it.

> Root-finding: At a minimum, sproot should be wrapped into a method of BSpline class. Better still, it should be generalized to allow non-cubic splines. The interface should mirror PPoly.solve and PPoly.roots.

This too can be done by imitating PPoly.roots and PPoly.solve. Both these tasks can be completed within 10 days' time, including all the documentation and tests.

BSpline Variations:-

> Cardinal BSplines (BSplines with equidistant knots). We need representation, evaluation and construction. scipy.signal has some. The task here is to have an interface which is both compatible with regular b-splines and piecewise polynomials and is useful for the signal processing folks.

For implementing this feature I will have to read some material to get a good understanding of Cardinal BSplines and their implementation. These are two of the places (for now) I would start my study from. The implementation in scipy.signal can be of help. https://en.wikipedia.org/wiki/B-spline#Cardinal_B-spline http://dx.doi.org/10.1016/j.aml.2010.06.029 This task could take a long time, maybe 3 weeks or more.

> Spline fitting to data. ATM there's only FITPACK, and we want a more controllable alternative.
A first exploratory shot could be to implement de Boor's newknot routine and see how it behaves in real life. This too requires reading some literature to understand the math and also looking at the FITPACK code for guidance. The following literature can be helpful. https://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/spline/B-spline/de-Boor.html https://en.wikipedia.org/wiki/De_Boor%27s_algorithm Both these tasks would take over a month for complete implementation.

optional: I am not sure about the scope of these additions. Also, if we decide to implement these, I am not quite sure they can be completed within the span of GSoC. Probably, we can just take up one of these tasks.

> PSpline: The term P-spline stands for "penalized B-spline". It refers to using the B-spline representation where the coefficients are determined partly by the data to be fitted, and partly by an additional penalty function that aims to impose smoothness to avoid overfitting.

Adequate literature can be found here, https://projecteuclid.org/download/pdf_1/euclid.ss/1038425655 SAS has already implemented it, https://support.sas.com/documentation/cdl/en/statug/63033/HTML/default/viewer.htm#statug_transreg_sect020.htm This feature could take up a lot of time, including documentation, tests, examples etc.

> Non-Uniform Rational B-splines (NURBS): Unlike simple B-splines, the control points each have a weight.

The following literature can be helpful, https://en.wikipedia.org/wiki/Non-uniform_rational_B-spline https://www.rhino3d.com/nurbs The following articles can help with the implementation. https://www.codeproject.com/Articles/996281/NURBS-curve-made-easy http://www.nar-associates.com/nurbs/c_code.html#chapter4 This feature too can take up lots of time for full completion.

> Constructing periodic splines. Fitpack does it. With full matrices it's straightforward; a naive implementation with banded matrices has issues with numerical stability.
We can definitely look into the banded matrix issue if time permits. I am not sure what we may find, so I can't say anything about the time required.

Final Assessment:-

After these tasks have been completed, all the new features/methods would undergo rigorous tests and doctests. Benchmark tests could also be added wherever necessary. Features having scope for further work could be reported on GitHub.

I plan on reading as much as I can related to Cardinal BSplines, de Boor's newknot routine, Penalized BSplines, NURBS etc. before the start of GSoC. This will save me time finding literature related to all these new additions.

Looking forward to feedback and corrections.

On 16 March 2017 at 20:03, Evgeni Burovski <evgeny.burovskiy at gmail.com> wrote:

_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at scipy.org
https://mail.scipy.org/mailman/listinfo/scipy-dev

On Thu, Mar 16, 2017 at 11:06 AM, Aman Pratik <amanpratik10 at gmail.com> wrote:

Hello Developers,

This is Aman Pratik. I am currently pursuing my B.Tech from Indian Institute of Technology, Varanasi. I am a keen software developer and not very new to the open source community. I am interested in your project "Improve support for B-splines" for GSoC 2017.

I have been working in Python for the past 2 years and have a good idea about Mathematical Methods and Techniques. I am quite familiar with scipy, mainly as a user and also as a developer. I also have some experience with Cython.

These are the PRs I have worked / am working on for the past month.

ENH: Add evaluation of periodic splines #6730
DOC: Bad documentation of k in sparse.linalg.eigs See #6990 (Merged)
ENH: scipy.linalg.eigsh cannot get all eigenvalues #7004

My GitHub Profile: https://www.github.com/amanp10

I am interested in Mathematical models and functions and am therefore willing to work on splines. Also, I am familiar with Benchmark tests, Unit tests and other technical knowledge I would require for this project.
I have started my study for the subject and am looking forward to guidance from the potential mentors or other developers.

Thank You

Hi Aman, and welcome!

I see that you already started with https://github.com/scipy/scipy/pull/7185, great!

We'll also need a proposal. Since writing a good proposal typically takes several iterations, I encourage you to start working on it, too. When you have a draft, please send it to the list.

All the best,

Evgeni

_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at scipy.org
https://mail.scipy.org/mailman/listinfo/scipy-dev

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tyler.je.reddy at gmail.com  Fri Mar 24 12:35:47 2017
From: tyler.je.reddy at gmail.com (Tyler Reddy)
Date: Fri, 24 Mar 2017 10:35:47 -0600
Subject: [SciPy-Dev] License / Copyright Stuff
Message-ID: 

Hi,

I'm just wondering about the 'lowest friction' way to continue contributing to scipy when your employer owns the copyright to your open source contributions (but respects the license used by the project).

In short, Los Alamos would want me to put one line in one file that says they 'own' my contributions under the license of the project after Date xx/yy/zzzz. Hopefully that's a reasonable compromise (they originally wanted to put a line in each file & I said that definitely would not fly).

Not sure how annoyed people would be by that, or what specific file would be used, but they have been accommodating so if there's a request on the scipy end I could bring it up with them.

The biggest drawback I'm aware of is that changes to licenses may sometimes require (?) the involvement of the contributors (copyright holders)--if one such copyright holder is a national lab instead of a single individual, this might make the process a bit clunkier.
Best wishes, Tyler -------------- next part -------------- An HTML attachment was scrubbed... URL: From peridot.faceted at gmail.com Fri Mar 24 12:48:02 2017 From: peridot.faceted at gmail.com (Anne Archibald) Date: Fri, 24 Mar 2017 16:48:02 +0000 Subject: [SciPy-Dev] License / Copyright Stuff In-Reply-To: References: Message-ID: Not a lawyer, but currently the AUTHORS file indicates who owns the copyrights to scipy code. Perhaps a note on your line in the AUTHORS file to indicate that your contributions (on some dates) are owned by Los Alamos? I would note in passing that many scipy contributors (including me) probably have or had similar constraints but are lucky enough that their employers don't care (or don't know). Many universities have rules about who owns intellectual property that can be surprising to their employees. For many open source projects, there are enough contributors already that contacting everyone who owns the copyright is extremely difficult, and getting them all to agree to anything is almost impossible; thus large open-source projects rarely make such license changes, or do so with rather dubious legality by getting agreement from current and major past contributors only. I don't know that a national lab is any more likely to be trouble than an individual open-source contributor; at the least they should be easier to get a hold of. Anne On Fri, Mar 24, 2017 at 5:36 PM Tyler Reddy wrote: > Hi, > > I'm just wondering about the 'lowest friction' way to continue > contributing to scipy when your employer owns the copyright to your open > source contributions (but respects the license used by the project). > > In short, Los Alamos would want me to put one line in one file that says > they 'own' my contributions under the license of the project after Date > xx/yy/zzzz. Hopefully that's a reasonable compromise (they originally > wanted to put a line in each file & I said that definitely would not fly). 
> > Not sure how annoyed people would be by that, or what specific file would > be used, but they have been accommodating so if there's a request on the > scipy end I could bring it up with them. > > The biggest drawback I'm aware of is that changes to licenses may > sometimes require (?) the involvement of the contributors (copyright > holders)--if one such copyright holder is a national lab instead of a > single individual, this might make the process a bit clunkier. > > Best wishes, > Tyler > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From shoyer at gmail.com Fri Mar 24 13:18:49 2017 From: shoyer at gmail.com (Stephan Hoyer) Date: Fri, 24 Mar 2017 10:18:49 -0700 Subject: [SciPy-Dev] License / Copyright Stuff In-Reply-To: References: Message-ID: The SciPy license says that it is "Copyright (c) 2003-2016 SciPy Developers", but you'd have to look in the git logs to figure out who those "SciPy Developers" are. That's good enough for some employers. For example, Google owns my recent contributions to NumPy but did not insist on adding a "Copyright Google" line to NumPy's license. On Fri, Mar 24, 2017 at 9:48 AM, Anne Archibald wrote: > Not a lawyer, but currently the AUTHORS file indicates who owns the > copyrights to scipy code. Perhaps a note on your line in the AUTHORS file > to indicate that your contributions (on some dates) are owned by Los > Alamos? > > I would note in passing that many scipy contributors (including me) > probably have or had similar constraints but are lucky enough that their > employers don't care (or don't know). Many universities have rules about > who owns intellectual property that can be surprising to their employees. 
> For many open source projects, there are enough contributors already that contacting everyone who owns the copyright is extremely difficult, and getting them all to agree to anything is almost impossible; thus large open-source projects rarely make such license changes, or do so with rather dubious legality by getting agreement from current and major past contributors only. I don't know that a national lab is any more likely to be trouble than an individual open-source contributor; at the least they should be easier to get a hold of.
>
> Anne
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jomsdev at gmail.com  Fri Mar 24 13:23:25 2017
From: jomsdev at gmail.com (Jordi Montes)
Date: Fri, 24 Mar 2017 10:23:25 -0700
Subject: [SciPy-Dev] Sketching-based Matrix Computations
In-Reply-To: References: Message-ID: 

Short answer: Yes.

Long answer: Yes, but not for all kinds of matrices. What I am proposing here is: given a matrix A, create a matrix S (using CWT) such that, with probability at least 9/10,

||SAx||_2 = (1 ± epsilon) ||Ax||_2 for all x in R^d.

Such matrices S are called subspace embeddings, and SA can be computed in O(nnz(A)) time. So basically it is an improvement for overconstrained least-squares regression, low-rank approximation, approximating all leverage scores, and l_p-regression. On top of that, implementing Blendenpik would remove the uncertainty (I would use the result of CWT as a preconditioner). Blendenpik beats LAPACK's direct dense least-squares solver by a large margin on essentially any dense *tall* matrix.

On 23 March 2017 at 12:24, Matt Haberland wrote:

> Do I understand that this is intended to be a faster but otherwise drop-in replacement for scipy.linalg.lstsq or scipy.sparse.linalg.lsqr? I would definitely be interested in an alternative sparse least squares solver. I could use it.
>
> On Mar 21, 2017 11:09 PM, "Jordi Montes" wrote:
>
>> *# I started this discussion on GitHub
>> .
>> This email is a digested version of the conversation to check what people think about this on the mailing list.*
>>
>> TL;DR: *I think having an implementation of some randomized matrix algorithms would be great. I would like to discuss whether the community finds it useful too.*
>>
>> Hello,
>>
>> I am Jordi. I have been working on a xdata open source project for the last year called libSkylark. The library is suitable for general statistical data analysis and optimization applications, but it is heavily focused on distributed systems.
>>
>> *Randomized matrix algorithms* have been a hot topic in research in recent years. Recent developments have shown their utility in large-scale machine learning and statistical data analysis applications.
>>
>> Although many ML applications would take advantage of the implementation of these algorithms, the scope of them goes far beyond. I believe it would be a good idea to integrate some of them in Scipy, and I would like to know the opinion of others to check if it would be worth it.
>>
>> To my eyes, Randomized Numerical Linear Algebra is big/useful enough to have its own submodule. However, this is something that the community has to decide.
>>
>> Meanwhile, to demonstrate that these methods can be useful I would implement them under scipy.linalg, where the scipy linear algebra functions live. The idea is to bring value from the first moment. I will implement Blendenpik, a least-squares solver based on these techniques that outperforms LAPACK by significant factors and scales significantly better than any QR-based solver.
>>
>> The proposal is:
>>
>> - Implement CWT. CWT lets us create a matrix *S* such that *SA* is shorter than *A* while preserving some properties.
>> - Compute *QR = SA* using scipy.
>> - From here, there are two potential features:
>>   - Implement Blendenpik for solving linear least-squares problems faster without losing any accuracy.
>>   - Compute the squared row norms to obtain leverage scores.
>>
>> It would be better to start with Blendenpik, because it solves a more common problem by far.
>>
>> I am thinking about doing it for sparse matrices first, because CWT can take advantage of it. However, I wonder if the QR method in scipy takes advantage of sparsity too.
>>
>> I have already set up the development environment, so I could start as soon as we decide what to do. As I said, I am open to discussion, so anyone who could be interested is welcome.
>>
>> Thank you,
>>
>> Jordi.
>>
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at scipy.org
>> https://mail.scipy.org/mailman/listinfo/scipy-dev
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at scipy.org
> https://mail.scipy.org/mailman/listinfo/scipy-dev

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evgeny.burovskiy at gmail.com  Fri Mar 24 13:42:42 2017
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Fri, 24 Mar 2017 20:42:42 +0300
Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support
In-Reply-To: <15b000755f1.c4355fc52947.1795299634416699239@zoho.com>
References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com>
Message-ID: 

On Fri, Mar 24, 2017 at 2:15 PM, Nikolay Mayorov wrote:
> Resent my answer to the new address.
> > > ============ Forwarded message ============ > From : Nikolay Mayorov > To : "SciPy Developers List" > Date : Fri, 24 Mar 2017 01:07:10 +0500 > Subject : Re: [SciPy-Dev] GSoC 2017: BSpline support > ============ Forwarded message ============ > > De Boor, "Practical guide to splines", Chapter IX should be the place for > your. Likely you will need to thoroughly study several chapters in it, it's > not an easy book to quickly pick up a recipe from it. > > Nikolay. > > > ---- On Thu, 23 Mar 2017 19:31:32 +0500 *Aman Pratik > >* wrote ---- > > > > Hello, > I need some help with *"**Convert a CubicSpline/PPoly object to a > BSpline". *I am unable to find suitable literature for it, or maybe I am > overlooking something. I would be really grateful if you could guide me to > some reading material or code samples. > > Lyche and Morken, http://www.uio.no/studier/emner/matnat/ifi/INF-MAT5340/v05/undervisningsmateriale/ can be an easier first read than de Boor (which is, indeed, *the* reference). (Hat tip to Pauli who suggested this to me a while ago). Cheers, Evgeni -------------- next part -------------- An HTML attachment was scrubbed... URL: From tyler.je.reddy at gmail.com Fri Mar 24 14:14:04 2017 From: tyler.je.reddy at gmail.com (Tyler Reddy) Date: Fri, 24 Mar 2017 12:14:04 -0600 Subject: [SciPy-Dev] License / Copyright Stuff In-Reply-To: References: Message-ID: Thanks--a few minutes after posting this another scipy developer who works here (Nathan Woods) called me with another approach (!), so there might be a better solution I can look into. On 24 March 2017 at 11:18, Stephan Hoyer wrote: > The SciPy license says that it is "Copyright (c) 2003-2016 SciPy > Developers", but you'd have to look in the git logs to figure out who those > "SciPy Developers" are. That's good enough for some employers. For example, > Google owns my recent contributions to NumPy but did not insist on adding a > "Copyright Google" line to NumPy's license. 
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From amanpratik10 at gmail.com  Fri Mar 24 15:19:28 2017
From: amanpratik10 at gmail.com (Aman Pratik)
Date: Sat, 25 Mar 2017 00:49:28 +0530
Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support
In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com>
Message-ID: 

Hello,

This is my second Draft, Proposal : Second Draft

This is somewhat incomplete in detailing. It will take me some time to make it more clear and obvious as to what I am going to do. I am facing a problem though, in getting my hands on the book "A Practical Guide to Splines" by Carl R de Boor. Also, I need some more time to get the algorithms clear in my head. So, please bear with me.

Looking forward to your feedback.

Thank You

On 24 March 2017 at 23:12, Evgeni Burovski wrote:
> On Fri, Mar 24, 2017 at 2:15 PM, Nikolay Mayorov wrote:
>> Resent my answer to the new address.
>> >> >> ============ Forwarded message ============ >> From : Nikolay Mayorov >> To : "SciPy Developers List" >> Date : Fri, 24 Mar 2017 01:07:10 +0500 >> Subject : Re: [SciPy-Dev] GSoC 2017: BSpline support >> ============ Forwarded message ============ >> >> De Boor, "Practical guide to splines", Chapter IX should be the place for >> your. Likely you will need to thoroughly study several chapters in it, it's >> not an easy book to quickly pick up a recipe from it. >> >> Nikolay. >> >> >> ---- On Thu, 23 Mar 2017 19:31:32 +0500 *Aman Pratik >> >* wrote ---- >> >> >> >> Hello, >> I need some help with *"**Convert a CubicSpline/PPoly object to a >> BSpline". *I am unable to find suitable literature for it, or maybe I am >> overlooking something. I would be really grateful if you could guide me to >> some reading material or code samples. >> >> > > Lyche and Morken, http://www.uio.no/studier/emner/matnat/ifi/INF-MAT5340/ > v05/undervisningsmateriale/ > can be an easier first read than de Boor (which is, indeed, *the* > reference). > > (Hat tip to Pauli who suggested this to me a while ago). > > Cheers, > > Evgeni > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From amanpratik10 at gmail.com Fri Mar 24 16:25:30 2017 From: amanpratik10 at gmail.com (Aman Pratik) Date: Sat, 25 Mar 2017 01:55:30 +0530 Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> Message-ID: I will go through it properly. On 24 March 2017 at 23:12, Evgeni Burovski wrote: > > > On Fri, Mar 24, 2017 at 2:15 PM, Nikolay Mayorov > wrote: > >> Resent my answer to the new address. 
>> >> >> ============ Forwarded message ============ >> From : Nikolay Mayorov >> To : "SciPy Developers List" >> Date : Fri, 24 Mar 2017 01:07:10 +0500 >> Subject : Re: [SciPy-Dev] GSoC 2017: BSpline support >> ============ Forwarded message ============ >> >> De Boor, "Practical guide to splines", Chapter IX should be the place for >> your. Likely you will need to thoroughly study several chapters in it, it's >> not an easy book to quickly pick up a recipe from it. >> >> Nikolay. >> >> >> ---- On Thu, 23 Mar 2017 19:31:32 +0500 *Aman Pratik >> >* wrote ---- >> >> >> >> Hello, >> I need some help with *"**Convert a CubicSpline/PPoly object to a >> BSpline". *I am unable to find suitable literature for it, or maybe I am >> overlooking something. I would be really grateful if you could guide me to >> some reading material or code samples. >> >> > > Lyche and Morken, http://www.uio.no/studier/emner/matnat/ifi/INF-MAT5340/ > v05/undervisningsmateriale/ > can be an easier first read than de Boor (which is, indeed, *the* > reference). > > (Hat tip to Pauli who suggested this to me a while ago). > > Cheers, > > Evgeni > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Fri Mar 24 16:55:18 2017 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 24 Mar 2017 13:55:18 -0700 Subject: [SciPy-Dev] License / Copyright Stuff In-Reply-To: References: Message-ID: On Fri, Mar 24, 2017 at 9:48 AM, Anne Archibald wrote: > > Not a lawyer, but currently the AUTHORS file indicates who owns the copyrights to scipy code. Perhaps a note on your line in the AUTHORS file to indicate that your contributions (on some dates) are owned by Los Alamos? That would certainly work on scipy's end. 
We have plenty of space for each entry to include this note, either next to Tyler's name or under the Institutions subhead. https://github.com/scipy/scipy/blob/master/THANKS.txt > I would note in passing that many scipy contributors (including me) probably have or had similar constraints but are lucky enough that their employers don't care (or don't know). Many universities have rules about who owns intellectual property that can be surprising to their employees. > > For many open source projects, there are enough contributors already that contacting everyone who owns the copyright is extremely difficult, and getting them all to agree to anything is almost impossible; thus large open-source projects rarely make such license changes, or do so with rather dubious legality by getting agreement from current and major past contributors only. I don't know that a national lab is any more likely to be trouble than an individual open-source contributor; at the least they should be easier to get a hold of. Yeah, we're never going to change the overall license both because it's already more trouble than it's worth (adding LANL makes no difference) and our current license is so permissive that there would be no real reason to in any case. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From stefanv at berkeley.edu Fri Mar 24 19:13:06 2017
From: stefanv at berkeley.edu (Stefan van der Walt)
Date: Fri, 24 Mar 2017 16:13:06 -0700
Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support
In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com>
Message-ID: <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com>

On Fri, Mar 24, 2017, at 12:19, Aman Pratik wrote:
> Hello,
> This is my second Draft, Proposal : Second Draft[1]

Would this be a good opportunity to add subdivision splines to SciPy?
Their implementation (at least in 1D) is simple, and they give good
locality around nodes (i.e., yield predictable curves, useful for spline
design).

An example is given here:

https://gist.github.com/stefanv/a28d61694fe8f3ef2325b23cbc51312a

Stéfan

Links: 1. https://github.com/amanp10/scipy/wiki/GSoC-2017-:-Improve-support-for-B-splines
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From amanpratik10 at gmail.com Sat Mar 25 00:29:22 2017
From: amanpratik10 at gmail.com (Aman Pratik)
Date: Sat, 25 Mar 2017 09:59:22 +0530
Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support
In-Reply-To: <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com>
References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com>
Message-ID: 

Seems interesting. Though adding this would require doing away with some
of the other part(s) of the project. I will do some research on it.
Meanwhile, let's wait for feedback from the mentors.
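[As an aside for readers of the archive: the subdivision idea Stefan mentions can be illustrated with the classic 1D corner-cutting scheme, Chaikin's algorithm, whose limit curve is a quadratic B-spline over the control polygon. This is a generic textbook sketch, not necessarily what the linked gist implements:]

```python
import numpy as np

def chaikin(points, iterations=4):
    """Chaikin corner cutting: each refinement replaces the edge P[i] -> P[i+1]
    with the two points 3/4*P[i] + 1/4*P[i+1] and 1/4*P[i] + 3/4*P[i+1]."""
    pts = np.asarray(points, dtype=float)
    for _ in range(iterations):
        q = 0.75 * pts[:-1] + 0.25 * pts[1:]   # point 1/4 of the way along each edge
        r = 0.25 * pts[:-1] + 0.75 * pts[1:]   # point 3/4 of the way along each edge
        pts = np.empty((2 * len(q),) + pts.shape[1:])
        pts[0::2] = q                          # interleave: q0, r0, q1, r1, ...
        pts[1::2] = r
    return pts

# control polygon with a sharp corner; refinement rounds it off locally
poly = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, 0.0], [3.0, 1.0]])
curve = chaikin(poly)
```

Moving a single control point only changes the refined curve near that point, which is the locality property described above.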
On 25 March 2017 at 04:43, Stefan van der Walt wrote:
> On Fri, Mar 24, 2017, at 12:19, Aman Pratik wrote:
> > Hello,
> > This is my second Draft, Proposal : Second Draft
>
> Would this be a good opportunity to add subdivision splines to SciPy?
> Their implementation (at least in 1D) is simple, and they give good
> locality around nodes (i.e., yield predictable curves, useful for spline
> design).
>
> An example is given here:
>
> https://gist.github.com/stefanv/a28d61694fe8f3ef2325b23cbc51312a
>
> Stéfan
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From xabart at gmail.com Sat Mar 25 02:23:33 2017
From: xabart at gmail.com (Xavier Barthelemy)
Date: Sat, 25 Mar 2017 17:23:33 +1100
Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support
In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com>
Message-ID: 

Dear everyone,

While we are talking about splines, I would like to add a request if possible.
I am a heavy user of the splprep and co routines, and many times I need to
get the local polynomial coefficients to compute the derivatives and find
the analytical zeros. I found a recipe on Stack Overflow years ago on how
to get back to the local coefficients from the spline description.

Maybe a method, or an option, returning the local polynomial coefficients
(and the interval?) could be implemented? It's my 2 cents as a user.

Thanks for considering it, and please discard if you think it's not a fit.

All the best
Xavier

2017-03-25 15:29 GMT+11:00 Aman Pratik :
> Seems interesting. Though adding this would require doing away with some
> of the other part(s) of the project.
> I will do some research on it.
> Meanwhile, let's wait for feedback from the mentors.
>
> On 25 March 2017 at 04:43, Stefan van der Walt wrote:
>
>> On Fri, Mar 24, 2017, at 12:19, Aman Pratik wrote:
>>
>> Hello,
>> This is my second Draft, Proposal : Second Draft
>>
>> Would this be a good opportunity to add subdivision splines to SciPy?
>> Their implementation (at least in 1D) is simple, and they give good
>> locality around nodes (i.e., yield predictable curves, useful for spline
>> design).
>>
>> An example is given here:
>>
>> https://gist.github.com/stefanv/a28d61694fe8f3ef2325b23cbc51312a
>>
>> Stéfan
>>
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at python.org
>> https://mail.python.org/mailman/listinfo/scipy-dev
>>
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>

--
« Quand le gouvernement viole les droits du peuple, l'insurrection est,
pour le peuple et pour chaque portion du peuple, le plus sacré des droits
et le plus indispensable des devoirs »

Déclaration des droits de l'homme et du citoyen, article 35, 1793
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From amanpratik10 at gmail.com Sat Mar 25 03:23:47 2017
From: amanpratik10 at gmail.com (Aman Pratik)
Date: Sat, 25 Mar 2017 12:53:47 +0530
Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support
In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com>
Message-ID: 

This seems like a nice option, though I am not sure of its importance for
computing derivatives. I mean, you can use any one of the provided methods
to compute the derivative. Please correct me if I am missing something.
On 25 March 2017 at 11:53, Xavier Barthelemy wrote:
> Dear everyone,
>
> While we are talking about splines, I would like to add a request if possible.
> I am a heavy user of the splprep and co routines, and many times I need to
> get the local polynomial coefficients to compute the derivatives and find
> the analytical zeros. I found a recipe on Stack Overflow years ago on how
> to get back to the local coefficients from the spline description.
>
> Maybe a method, or an option, returning the local polynomial coefficients
> (and the interval?) could be implemented? It's my 2 cents as a user.
>
> Thanks for considering it, and please discard if you think it's not a fit.
>
> All the best
> Xavier
>
> 2017-03-25 15:29 GMT+11:00 Aman Pratik :
>
>> Seems interesting. Though adding this would require doing away with some
>> of the other part(s) of the project. I will do some research on it.
>> Meanwhile, let's wait for feedback from the mentors.
>>
>> On 25 March 2017 at 04:43, Stefan van der Walt wrote:
>>
>>> On Fri, Mar 24, 2017, at 12:19, Aman Pratik wrote:
>>>
>>> Hello,
>>> This is my second Draft, Proposal : Second Draft
>>>
>>> Would this be a good opportunity to add subdivision splines to SciPy?
>>> Their implementation (at least in 1D) is simple, and they give good
>>> locality around nodes (i.e., yield predictable curves, useful for spline
>>> design).
>>>
>>> An example is given here:
>>>
>>> https://gist.github.com/stefanv/a28d61694fe8f3ef2325b23cbc51312a
>>>
>>> Stéfan
>>>
>>> _______________________________________________
>>> SciPy-Dev mailing list
>>> SciPy-Dev at python.org
>>> https://mail.python.org/mailman/listinfo/scipy-dev
>>>
>>
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at python.org
>> https://mail.python.org/mailman/listinfo/scipy-dev
>>
>
> --
> « Quand le gouvernement viole les droits du peuple, l'insurrection est,
> pour le peuple et pour chaque portion du peuple, le plus sacré des droits
> et le plus indispensable des devoirs »
>
> Déclaration des droits de l'homme et du citoyen, article 35, 1793
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From evgeny.burovskiy at gmail.com Sat Mar 25 04:30:45 2017
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Sat, 25 Mar 2017 11:30:45 +0300
Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support
In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com>
Message-ID: 

On Sat, Mar 25, 2017 at 9:23 AM, Xavier Barthelemy wrote:
> Dear everyone,
>
> While we are talking about splines, I would like to add a request if possible.
> I am a heavy user of the splprep and co routines, and many times I need to
> get the local polynomial coefficients to compute the derivatives and find
> the analytical zeros. I found a recipe on Stack Overflow years ago on how
> to get back to the local coefficients from the spline description.
>
> Maybe a method, or an option, returning the local polynomial coefficients
> (and the interval?) could be implemented? It's my 2 cents as a user.
>

You can convert to piecewise polynomials via

>>> tck = splrep(x, y)
>>> p = PPoly.from_spline(tck)

> Thanks for considering it, and please discard if you think it's not a fit.
>
> All the best
> Xavier
>
> 2017-03-25 15:29 GMT+11:00 Aman Pratik :
>
>> Seems interesting. Though adding this would require doing away with some
>> of the other part(s) of the project. I will do some research on it.
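[Expanded into a self-contained sketch (the sine data here is purely illustrative), Evgeni's two-liner also covers Xavier's use case of taking derivatives and finding their analytical zeros:]

```python
import numpy as np
from scipy.interpolate import splrep, PPoly

x = np.linspace(0, 10, 50)
y = np.sin(x)

tck = splrep(x, y)            # B-spline representation (t, c, k)
p = PPoly.from_spline(tck)    # breakpoints in p.x, per-interval coefficients in p.c

dp = p.derivative()           # the derivative is again a piecewise polynomial
critical_points = dp.roots()  # analytical zeros of the derivative
```

For a cubic spline, `p.c` has shape `(4, n_intervals)` -- i.e., the local polynomial coefficients Xavier asks for, one column per interval between breakpoints.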
>> Meanwhile, let's wait for feedback from the mentors.
>>
>> On 25 March 2017 at 04:43, Stefan van der Walt wrote:
>>
>>> On Fri, Mar 24, 2017, at 12:19, Aman Pratik wrote:
>>>
>>> Hello,
>>> This is my second Draft, Proposal : Second Draft
>>>
>>> Would this be a good opportunity to add subdivision splines to SciPy?
>>> Their implementation (at least in 1D) is simple, and they give good
>>> locality around nodes (i.e., yield predictable curves, useful for spline
>>> design).
>>>
>>> An example is given here:
>>>
>>> https://gist.github.com/stefanv/a28d61694fe8f3ef2325b23cbc51312a
>>>
>>> Stéfan
>>>
>>> _______________________________________________
>>> SciPy-Dev mailing list
>>> SciPy-Dev at python.org
>>> https://mail.python.org/mailman/listinfo/scipy-dev
>>>
>>
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at python.org
>> https://mail.python.org/mailman/listinfo/scipy-dev
>>
>
> --
> « Quand le gouvernement viole les droits du peuple, l'insurrection est,
> pour le peuple et pour chaque portion du peuple, le plus sacré des droits
> et le plus indispensable des devoirs »
>
> Déclaration des droits de l'homme et du citoyen, article 35, 1793
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ashwin.pathak at students.iiit.ac.in Sat Mar 25 11:27:30 2017
From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak)
Date: Sat, 25 Mar 2017 08:27:30 -0700
Subject: [SciPy-Dev] GSoc-17: scipy.diff
Message-ID: 

Hello everyone,
I was writing a proposal for my project and got stuck at a few points: what
exactly should be included in scipy.diff, and what kinds of mathematical
tools and formulations should be incorporated. I was thinking of using the
numdiff tools from scipy.optimize to calculate the Jacobian.
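[For reference, the finite-difference idea behind those numdiff tools can be sketched in a few lines. This is a generic central-difference scheme, not the actual scipy.optimize implementation, which additionally handles step-size selection, bounds, and sparsity:]

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Central-difference Jacobian of f: R^n -> R^m, built one column at a time."""
    x = np.asarray(x, dtype=float)
    m = np.atleast_1d(f(x)).size
    J = np.empty((m, x.size))
    for j in range(x.size):
        h = np.zeros_like(x)
        h[j] = eps
        # central difference: (f(x + h) - f(x - h)) / (2 * eps), O(eps**2) accurate
        J[:, j] = (np.atleast_1d(f(x + h)) - np.atleast_1d(f(x - h))) / (2.0 * eps)
    return J

# example: f(x, y) = (x**2, x*y) has Jacobian [[2x, 0], [y, x]]
J = jacobian_fd(lambda v: np.array([v[0]**2, v[0] * v[1]]), np.array([1.0, 2.0]))
```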
Can someone suggest some reading material or useful links? Also, can
someone point out exactly what other features and mathematical
formulations to include?

From ralf.gommers at gmail.com Sat Mar 25 21:18:26 2017
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Sun, 26 Mar 2017 14:18:26 +1300
Subject: [SciPy-Dev] CUTEst in Scipy
In-Reply-To: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com>
References: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com>
Message-ID: 

(when replying to a thread, manually change scipy-dev at scipy.org to
scipy-dev at python.org)

On Tue, Mar 21, 2017 at 8:26 AM, Nikolay Mayorov wrote:
> Hi!
>
> I heard about CUTEst, but it looked quite complex so I decided not to deal
> with it. Regarding your options:
>
> 1) It might be the most practical option, considering you'll have other
> (main) parts of work to do. Also it's not clear whether to include test
> problems in scipy or store them somewhere else. My opinion is that having a
> collection of problems right in scipy.optimize is good. I don't exactly
> remember why we decided not to include test problems for least_squares; I
> guess because it required many considerations. In theory there are ASV
> benchmarks in scipy, but they don't feel adequate for benchmarking
> optimization problems.
This way you will be able to > focus on the algorithms. Other options are possible if you have a really > good idea how to do them and ready to devote time to it. > > 2) My understanding is that you don't need any permissions as you won't > use CUTEst code (someone could correct me). As I see it: your add an > interface into scipy, a user install CUTEst and use this interface to work > with CUTEst. It doesn't seem very practical, because installing CUTEst > looks like a pain. Do you agree or I am mistaken? > > 3) I guess you can ask an author to reuse his code in scipy to implement > an interface to CUTEst. > > Generally about 2 and 3: it might be a good idea to use CUTEst for your > internal benchmarking during the development and optionally you can add an > interface to CUTEst into scipy. But I think it will be hardly ever used > after that. > > To sum the idea: adding CUTEst dependency for scipy.optimize benchmarking > seems unpractical. It would be useful if you will actually try to work with > CUTEst, maybe it will turn out to be not that difficult. In such case many > of my points are not valid. > > I hope you'll be able to make some sense from my rambling. > > Nikolay. > > > ---- On Mon, 20 Mar 2017 23:24:14 +0500 *Antonio Ribeiro > >* wrote ---- > > Hello, > > I am developing my google of summer proposal about constrained > optimisation in Scipy > and it will be very important to have a good set of benchmarks. > > There is a great library with diverse optimisation benchmarks called > CUTEst . It is > under LGPL 2.1 license. > > This CUTEst library include a huge amount of problem sets > and is often used in optimisation papers. Furthermore, many of the > available > optimisation software provide some interface to it. I am interested in > using problems from this set in my project and I want to ask how > should I proceed? > > 1) Select a subset of tests from the CUTEst library and implement them in > native python under scipy. 
> > 2) Include some interface to CUTEst in scipy. By what I understand LGPL > license is > more restrictive than BSD-3 used by Scipy. In this case, could we ask for > permission? > > 3) There is an interface for CUTEst implemented in the library pyOpus > (under GPL3 license) > . Since this is a > library external to Scipy (with an incompatible license) how should I > proceed in this case: can I make my benchmarks dependent on it? Can we ask > permission for include this interface in Scipy? > > Any suggestions on how to proceed and what to include in my proposal is > very welcome. > > Ant?nio > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Mar 25 21:21:07 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 26 Mar 2017 14:21:07 +1300 Subject: [SciPy-Dev] Adding a convenient method to create ufuncs in for scipy.stats In-Reply-To: References: Message-ID: On Fri, Mar 17, 2017 at 5:20 PM, Warren Weckesser < warren.weckesser at gmail.com> wrote: > > > On Thu, Mar 16, 2017 at 4:14 PM, Robert Kern > wrote: > >> On Thu, Mar 16, 2017 at 12:39 PM, Warren Weckesser < >> warren.weckesser at gmail.com> wrote: >> > >> > On Thu, Mar 16, 2017 at 3:19 PM, Warren Weckesser < >> warren.weckesser at gmail.com> wrote: >> >> >> >> I'm working on an update to the Frechet distribution in scipy.stats >> (see https://github.com/scipy/scipy/issues/3258 and >> https://github.com/scipy/scipy/pull/3275). 
>>
>> Instead of jumping through the "lazy_where" hoops that are required for
>> conditional computations, it would be much easier to create a ufunc for the
>> standard PDF, CDF and possibly other required functions. Easier, that is,
>> if I use the ufunc generation tools that we have over in scipy.special.
>> Would there be any objections to this? We already have quite a few
>> functions for probability distributions in scipy.special:
>> https://docs.scipy.org/doc/scipy/reference/special.html#raw-statistical-functions
>>
>> I wouldn't mind creating ufuncs for some of the other distributions,
>> too. A ufunc implementation is more efficient, simplifies the code in
>> scipy.stats, and automatically handles broadcasting.
>>
>> I'm bringing this up here to see if anyone has any objections to the
>> expansion of the statistical functions in scipy.special.
>>
>> Warren
>
> In my previous email, the heading hints at an alternative that I didn't
> mention in the text. The question implied in the heading is: what do folks
> think about adding ufunc generation tools to scipy.stats, instead of
> generating the ufuncs in scipy.special. There are a lot of conditional
> computations in scipy.stats that would benefit from being implemented as
> ufuncs, but probably don't need to be public functions. So instead of
> adding more functions to scipy.special, perhaps we could add code in
> scipy.stats for generating ufuncs, many of which would be private. Of
> course, we could just generate private ufuncs in scipy.special, and only
> use them in scipy.stats.

The change to ufuncs instead of lazywhere usage looks good. If the ufuncs
remain private I don't have much of a preference where they live. Putting
the ufunc generation machinery in scipy._lib may be useful long term for
other purposes as well, so in that case these ufuncs could live in
scipy.stats.
> >> +1 for adding additional more standard PDF/CDF functions to scipy.special >> as needed. >> > If we'd do that for all or most distributions, that'd be several hundred more functions. I don't think those should all be added to the scipy.special namespace, it'll become too large. Admittedly it's already a mess, but let's not make it worse. Not sure this is too relevant though - we first need to decide on public vs private. Currently I don't see the point in exposing frechet_pdf and frechet_cdf as public functions. They'll get very limited use, and they don't add much over using stats.frechet.pdf/cdf. So my vote is for keeping them private. Ralf > >> There's already precedent for putting statistics-related but not >> distribution-related ufuncs into scipy.special, specifically for the >> conditional operations, e.g. boxcox(). On the other hand, if the functions >> you are thinking of would not be part of the public API, then I'd prefer to >> implement them in scipy.stats instead of scipy.special. >> >> What work do you think is entailed in implementing the ufuncs in >> scipy.stats? Is there infrastructure we need to duplicate? Can we abstract >> out the build infrastructure to a common place? I haven't looked at the >> build details for scipy.special in some time. >> > > > The code that generates the ufunc boilerplate code is in > scipy/special/generate_ufuncs.py. It generates the appropriate wrapper > code for a core scalar function that is written in Cython, C or C++. I > just submitted a pull request (https://github.com/scipy/scipy/pull/7190, > still WIP) in which I wrote the core distribution functions for the Frechet > distribution in Cython, added the signature information to the big honkin' > FUNCS string in generate_ufuncs.py, added placeholders for the docstrings > in add_newdocs.py, and then used the ufuncs in the implementation of the > `frechet` class in stats. 
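[To make the "conditional computations" concrete for the archive: the Frechet (inverse Weibull) pdf is c * x**(-c - 1) * exp(-x**(-c)) for x > 0 and zero elsewhere, so a naive vectorized evaluation triggers invalid-value warnings for x <= 0. A masked pure-NumPy sketch of the branching that lazywhere (or a compiled ufunc) takes care of, assuming a scalar shape parameter c -- illustrative only, not the scipy implementation:]

```python
import numpy as np

def frechet_pdf(x, c):
    """Frechet pdf, evaluated only on the support x > 0 to avoid invalid ops."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)      # pdf is 0 outside the support
    pos = x > 0
    xp = x[pos]
    out[pos] = c * xp**(-c - 1.0) * np.exp(-xp**(-c))
    return out
```

A ufunc written in Cython/C does this branching per scalar element instead, which avoids the mask temporaries and handles broadcasting over array-valued shape parameters for free.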
> > For the moment, the Frechet distribution ufuncs are in scipy.special, and > they are private, but a trivial change will make them public, if there is > interest. I don't have a strong opinion either way, but as you say, there > is a precedent for including them as public functions in scipy.special. > If we start converting existing distribution implementations (which I think > would be a good thing for the stats code), we'll end up with a *lot* more > functions being added somewhere. > > Warren > > > > >> >> -- >> Robert Kern >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Mar 25 22:13:57 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 26 Mar 2017 15:13:57 +1300 Subject: [SciPy-Dev] CUTEst in Scipy Message-ID: On Tue, Mar 21, 2017 at 12:35 PM, Matt Newville wrote: > > > Date: Tue, 21 Mar 2017 09:21:59 +1100 >> From: Andrew Nelson >> To: SciPy Developers List >> Subject: Re: [SciPy-Dev] CUTEst in Scipy >> Message-ID: >> > ail.com> >> Content-Type: text/plain; charset="utf-8" >> >> There are some least squares problems from the NIST test suite in the >> benchmarks/benchmarks/go_benchmark_functions, but they're couched in >> terms >> of a simple scalar (chi2) minimisation problem, and they are all bounded >> problems. >> >> > The NIST StRD test suite (http://www.itl.nist.gov/ > div898/strd/nls/nls_main.shtml) could be considered for inclusion as a > test suite for scipy.optimize. These are generally small but non-trivial > problems for unconstrained non-linear optimization. 
> Each of the 20+ data sets comes with 2 sets of starting values for the
> variables and certified values for both best fit values and estimated
> uncertainties. I believe that none of the optimizers in scipy.optimize
> will get the right answer to published precision in every case, though
> leastsq and least_squares can solve most of these problems reasonably
> well from at least 1 of the 2 starting points. This may imply that these
> problems are not "sparse" in the sense of what (as far as I understand)
> the GSOC project intends to focus on. But, they are a good set of test
> cases to include in any assessment of an optimization algorithm.
>
> The NIST StRD datasets are public domain. There is existing test code in
> lmfit to use these data sets with most of the scipy optimizers. This could
> easily be modified to be part of scipy. See
> https://github.com/lmfit/lmfit-py/tree/master/NIST_STRD and
> https://github.com/lmfit/lmfit-py/blob/master/tests/test_NIST_Strd.py
> for more details. Also, note that the tests focus only on the quality of
> the results. They may record number of iterations, but make no attempt to
> benchmark the runtime.

That may be a good thing - in some places I think there's a lack of
verified accuracy tests, while there are checks that the number of
iterations doesn't increase. In scipy.stats we also have NIST tests for
anova and linregress; those were quite useful.

> I think that migrating this test suite to scipy would not be a huge
> undertaking, though I cannot say whether that is an appropriate use of
> time for the proposed GSOC project.

Can you open an issue for this to not lose track of the idea? Could be a
nice extra for GSoC or done separately.

Ralf
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From andyfaff at gmail.com Sat Mar 25 22:29:16 2017 From: andyfaff at gmail.com (Andrew Nelson) Date: Sun, 26 Mar 2017 13:29:16 +1100 Subject: [SciPy-Dev] CUTEst in Scipy In-Reply-To: References: Message-ID: Many of the NIST tests are in the global optimisation benchmark functions. It might be a good idea to have them all in one place though. On 26 Mar 2017 1:14 pm, "Ralf Gommers" wrote: > > > On Tue, Mar 21, 2017 at 12:35 PM, Matt Newville < > newville at cars.uchicago.edu> wrote: > >> >> >> Date: Tue, 21 Mar 2017 09:21:59 +1100 >>> From: Andrew Nelson >>> To: SciPy Developers List >>> Subject: Re: [SciPy-Dev] CUTEst in Scipy >>> Message-ID: >>> >> ail.com> >>> Content-Type: text/plain; charset="utf-8" >>> >>> There are some least squares problems from the NIST test suite in the >>> benchmarks/benchmarks/go_benchmark_functions, but they're couched in >>> terms >>> of a simple scalar (chi2) minimisation problem, and they are all bounded >>> problems. >>> >>> >> The NIST StRD test suite (http://www.itl.nist.gov/div89 >> 8/strd/nls/nls_main.shtml) could be considered for inclusion as a test >> suite for scipy.optimize. These are generally small but non-trivial >> problems for unconstrained non-linear optimization. Each of the 20+ data >> sets comes with 2 sets of starting values for the variables and certified >> values for both best fit values and estimated uncertainties. I believe >> that none of the optimizers in scipy.optimize will get the right answer to >> published precision in every case, though leastsq and least_squares can >> solve most of these problems reasonably well from at least 1 of the 2 >> starting points. This may imply that these problems are not "sparse" in >> the sense of what (as far as I understand) the GSOC project intends to >> focus on. But, they are a good set of test cases to include in any >> assessment of an optimization algorithm. >> >> The NIST StRD datasets are public domain. 
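The NIST StRD workflow described in this thread — run the solver from each of the two supplied starting vectors and compare against certified values — is easy to sketch with scipy.optimize.least_squares. The exponential model, data, and starting points below are synthetic stand-ins, not actual StRD values; they only show the shape of such a harness:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic stand-in for an StRD problem: y = b1 * (1 - exp(-b2 * x)),
# noise-free data generated from known "certified" parameters.
x = np.linspace(0.0, 10.0, 50)
certified = np.array([2.5, 0.8])
y = certified[0] * (1.0 - np.exp(-certified[1] * x))

def residuals(b):
    return b[0] * (1.0 - np.exp(-b[1] * x)) - y

# StRD-style: one easier and one harder starting vector per problem.
starts = [np.array([1.0, 0.1]), np.array([4.0, 1.5])]
fits = [least_squares(residuals, b0).x for b0 in starts]
```

A real port would loop over the StRD datasets, check both fits against the certified parameters and uncertainties to the published number of digits, and record nfev as a quality metric rather than a runtime one.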
There is existing test code in >> lmfit to use these data sets with most of the scipy optimizers. This could >> easily be modified to be part of scipy. See >> https://github.com/lmfit/lmfit-py/tree/master/NIST_STRD and >> https://github.com/lmfit/lmfit-py/blob/master/tests/test_NIST_Strd.py >> for more details. Also, note that the tests focus only on the quality of >> the results. They may record number of iterations, but make no attempt to >> benchmark the runtime. >> > > That may be a good thing - I think that there's in some places a lack of > verified accuracy tests while there are checks on no increases in number of > iterations. In scipy.stats we also have NIST tests for anova and > linregress, those were quite useful. > > >> I think that migrating this test suite to scipy would not be a huge >> undertaking, though I cannot say whether that is an appropriate use of time >> for the proposed GSOC project. >> > > Can you open an issue for this to not lose track of the idea? Could be a > nice extra for GSoC or done separately. > > Ralf > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Mar 26 01:38:10 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 26 Mar 2017 18:38:10 +1300 Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> Message-ID: Hi Aman, On Sat, Mar 25, 2017 at 8:19 AM, Aman Pratik wrote: > Hello, > This is my second Draft, Proposal : Second Draft > > > This is somewhat incomplete in detailing. It will take me some time to > make it more clear and obvious as to what I am going to do. 
> My main feedback is that you need to try to focus the proposal more, it feels a bit scattergun. You state that the major topic is cardinal B-splines, however you don't plan to start on those until week 8 and only allocate 3 out of 12 weeks for it. Other minor comments: - no need to repeat for every week that you'll add docs and tests; instead just state that once somewhere. - string aliases for `bc_type` should not take a full week, that's less than a day of work (so consider removing it). A lot of your task descriptions are taken from https://github.com/scipy/scipy/issues/6730. I'd suggest to limit the easy-fix tasks on there to say the first two weeks, and in your proposal tackle one main topic and add enough content about it to show that you have an understanding of how to get that done. Cheers, Ralf > I am facing a problem though, in getting my hands on the book "A Practical > Guide to Splines" by Carl R de Boor. Also, I need some more time to get the > algorithms clear in my head. So, please bear with me. Looking forward to > your feedback. > > Thank You > > > > > On 24 March 2017 at 23:12, Evgeni Burovski > wrote: > >> >> >> On Fri, Mar 24, 2017 at 2:15 PM, Nikolay Mayorov < >> nikolay.mayorov at zoho.com> wrote: >> >>> Resent my answer to the new address. >>> >>> >>> ============ Forwarded message ============ >>> From : Nikolay Mayorov >>> To : "SciPy Developers List" >>> Date : Fri, 24 Mar 2017 01:07:10 +0500 >>> Subject : Re: [SciPy-Dev] GSoC 2017: BSpline support >>> ============ Forwarded message ============ >>> >>> De Boor, "Practical guide to splines", Chapter IX should be the place >>> for you. Likely you will need to thoroughly study several chapters in it, >>> it's not an easy book to quickly pick up a recipe from it. >>> >>> Nikolay. >>> >>> >>> ---- On Thu, 23 Mar 2017 19:31:32 +0500 *Aman Pratik >>> >* wrote ---- >>> >>> >>> >>> Hello, >>> I need some help with *"**Convert a CubicSpline/PPoly object to a >>> BSpline". 
*I am unable to find suitable literature for it, or maybe I >>> am overlooking something. I would be really grateful if you could guide me >>> to some reading material or code samples. >>> >>> >> >> Lyche and Morken, http://www.uio.no/studier/emne >> r/matnat/ifi/INF-MAT5340/v05/undervisningsmateriale/ >> can be an easier first read than de Boor (which is, indeed, *the* >> reference). >> >> (Hat tip to Pauli who suggested this to me a while ago). >> >> Cheers, >> >> Evgeni >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Mar 26 03:55:57 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 26 Mar 2017 20:55:57 +1300 Subject: [SciPy-Dev] (no subject) In-Reply-To: References: Message-ID: On Thu, Mar 23, 2017 at 11:35 AM, Matt Haberland wrote: > Hi Qianhui, > > I'm Matt; I'll be co-mentoring the scipy.diff project if a proposal is > selected, so my answer is geared towards that. > > It seems that you have found https://github.com/ > scipy/scipy/wiki/GSoC-2017-project-ideas as you wrote "implement > scipy.diff (numerical differentiation)". If you haven't already followed > the recommended reading links, I would continue with those. Read carefully, > researching things you're not familiar with. > > I suggest that you adopt the liberal interpretation of 'numerical > differentiation' - the evaluation of the numerical values of derivatives - > rather than the more restrictive definition of 'finite differences'. 
Please > research and consider the pros and cons of the various methods of > evaluating the numerical values of derivatives, including automatic > differentiation and complex step methods. > I agree with the sentiment of looking at the big picture of differentiation methods first, but the discussion around https://github.com/scipy/ scipy/issues/2035#issuecomment-23638210 converged on automatic differentiation not being a priority. So I suggest quickly focusing on a finite differences / complex step approximation method. Note that complex-step is more or less a separate flavor of finite differences rather than a separate category of methods, and numdifftools does include it. Ralf > Before writing a proposal, consider the following: > What are some common applications that require numerical derivatives, what > differentiation methods are most suitable for these applications, and how > does that information suggest what capabilities scipy.diff should have? > What algorithms can you find (in academic literature and textbooks), and > under what determines which is best for a particular application? Which can > you hope to implement, given constraints on time and expertise? (Your > mentors may not be differentiation experts - I am not - so you might have > to find answers about complicated algorithms on your own!) > What existing code can you draw from, and what shortcomings of that code > will you need to address? (Check bug reports, for example.) > With all that in mind, synthesize a schedule for creating the most useful > scipy.diff in the time you'll have available, and outline a path for > future work. > > I'll say that I'm particularly interested in derivatives for nonlinear > programming, especially for solving optimal control problems using direct > methods. 
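As a concrete aside on the complex-step method mentioned here: for an analytic function implemented with complex-safe operations, f'(x) is approximately Im f(x + ih) / h with no subtractive cancellation, so h can be made tiny. A minimal sketch (the helper names are illustrative, not an existing scipy API):

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-20):
    """Derivative of an analytic, complex-safe f via Im f(x + ih) / h."""
    return np.imag(f(x + 1j * h)) / h

def central_difference(f, x, h=1e-6):
    """Classic central difference, limited by subtractive cancellation."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

d_cs = complex_step_derivative(np.exp, 1.0)
d_cd = central_difference(np.exp, 1.0)
```

Unlike the central difference, the complex step loses no precision as h shrinks — but it only applies when f is built from complex-safe operations (no abs, comparisons, table lookups, etc.), which is the trade-off versus black-box finite differences raised above.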
In particular, I have Python code > for evaluating the objective > function and constraints, and accurate derivatives can greatly improve > convergence rates, so automatic differentiation (AD) is the natural choice. > However, other applications may have black box functions, in which case AD > is not an option. Are there situations in which AD is not possible but > complex step methods can be used and would outperform (real) finite > differences? > > You can find more general thoughts about writing good GSoC proposals > online; but these are the things that come to my mind when I think about > a scipy.diff proposal. > > Matt > > > > > On Sat, Mar 18, 2017 at 7:32 PM, Qianhui Wan > wrote: > >> Hi all, >> >> I'm a senior undergraduate in US. My direction is in applied math and >> numerical analysis so I want to contribute to our suborg this year :) This >> is my first time to apply for GSoC. >> >> As I mentioned above, I've already had some background in related math >> and I have some experience in solving PDEs numerically with classical >> schemes. In the ideas list, I'm more interested in "implement scipy.diff >> (numerical differentiation)" and "improve the parabolic cylinder >> functions". For programming languages, I'm a intermediate learner in >> Python, MATLAB and R, also having some skills in C/C++. >> >> If anyone can give me some tips in the requirements of these two projects >> and proposal writing I'll be very appreciate, and if you have suggestions >> in choosing projects or developing ideas please let me know. Thanks! 
>> >> Best, >> Qianhui >> >> Qianhui Wan, senior >> Fall 2016 - Spring 2017, visiting, Math Department, University of >> Wisconsin, Madison, US >> Spring 2016, visiting, Math Department, University of California, >> Berkeley, US >> Fall 2013 - Spring 2017, Math School, Sun Yat-sen University, Guangzhou, >> China >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > > -- > Matt Haberland > Assistant Adjunct Professor in the Program in Computing > Department of Mathematics > 7620E Math Sciences Building, UCLA > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nikolay.mayorov at zoho.com Sun Mar 26 09:18:30 2017 From: nikolay.mayorov at zoho.com (Nikolay Mayorov) Date: Sun, 26 Mar 2017 18:18:30 +0500 Subject: [SciPy-Dev] CUTEst in Scipy In-Reply-To: References: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com> Message-ID: <15b0ac4a303.123ac38ca14424.451023420470098371@zoho.com> Hi, Ralf! Why do you say that? The optimize benchmarks are quite extensive, and came from http://infinity77.net/global_optimization/index.html which is doing exactly what you say our asv benchmarks are not adequate for. If it's about usability/runtime, how about just fixing that? That plus adding some problems relevant for Antonio's GSoC project seems to make the most sense to me. Ralf I thought about it later and realized that I didn't think through my answer properly (but later neglected to update it). I see that a large collection of global optimization problems was introduced, so asv approach works. I just see optimization benchmarks somewhat different. 
I believe they should be more user facing and enhance the scipy.optimize module itself by providing examples and the ability to quickly run different methods with different settings. Now the benchmarks are a completely internal thing and not very exciting for people to try (in my opinion). Additionally they are not flexible, as they fix most of the method parameters, i.e. it's better to decouple problem formulations from optimization algorithms and their settings. Other than that, if I'm not mistaken ASV still requires running benchmarks multiple times to measure different metrics (like number of function evaluations and Jacobian evaluations), which is probably the main obvious shortcoming. Also I'm not sure how "alive" the benchmarks code is and how often it is used; for example BenchGlobal doesn't seem to work for me with the current ASV master. So my suggestion is to use ASV as a tool to maintain the library quality (similar to unit tests), but not to try to put everything in there, and maybe automate its runs on Travis. Specifically for scipy.optimize I suggest having a submodule like "Benchmarks" or "Test problems" or "Example problems" which will contain problem definitions and some tools to run a selected subset for specific solvers and options. These problems can then be used in unit tests and maybe even in ASV benchmarks. There is the "Rosenbrock function" already, but it is just not enough. My guess is that it will be more convenient (compared to ASV) when making enhancements to the old code or working on new solvers, and additionally it will serve as a documentation improvement. I believe it will be good to have such a thing. What do you think? Nikolay. (when replying to a thread, manually change scipy-dev at scipy.org to scipy-dev at python.org) On Tue, Mar 21, 2017 at 8:26 AM, Nikolay Mayorov <nikolay.mayorov at zoho.com> wrote: Hi! I heard about CUTEst, but it looked quite complex so I decided not to deal with it. 
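A "Test problems" submodule of the kind suggested here could decouple problem definitions from solver choice with a small record type. A hypothetical sketch (the names `TestProblem`, `PROBLEMS` and `run_problems` are invented for illustration, not an existing scipy API), seeded with the Rosenbrock function that scipy.optimize already ships:

```python
from dataclasses import dataclass
from typing import Callable

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

@dataclass
class TestProblem:
    name: str
    fun: Callable        # objective function
    grad: Callable       # gradient of the objective
    x0: np.ndarray       # standard starting point
    x_opt: np.ndarray    # known solution, for accuracy checks

# A registry of problems, independent of any particular solver.
PROBLEMS = [
    TestProblem("rosenbrock", rosen, rosen_der,
                np.array([-1.2, 1.0]), np.ones(2)),
]

def run_problems(problems, method="BFGS"):
    """Run one solver over a problem subset; report nfev and final error."""
    summary = {}
    for p in problems:
        res = minimize(p.fun, p.x0, jac=p.grad, method=method)
        summary[p.name] = (res.nfev, float(np.linalg.norm(res.x - p.x_opt)))
    return summary
```

The same problem records could then feed unit tests, ASV benchmarks, or ad hoc comparisons of solver settings, which is the decoupling argued for above.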
To your options: 1) It might be the most practical option, considering you'll have other (main) parts of work to do. Also it's not clear whether to include test problems in scipy or store them somewhere else. My opinion is that having a collection of problems right in scipy.optimize is good. I don't exactly remember why we decided not to include test problems for least_squares, I guess because it required many considerations. In theory there are ASV benchmarks in scipy, but they don't feel adequate for benchmarking optimization problems. Why do you say that? The optimize benchmarks are quite extensive, and came from http://infinity77.net/global_optimization/index.html which is doing exactly what you say our asv benchmarks are not adequate for. If it's about usability/runtime, how about just fixing that? That plus adding some problems relevant for Antonio's GSoC project seems to make the most sense to me. Ralf So developing benchmarking infrastructure in scipy.optimize might be a good side project; my concern is that it will distract you from the main path greatly. To sum up: the simplest way is to pick some problems, write your own benchmark suite and store it somewhere else. This way you will be able to focus on the algorithms. Other options are possible if you have a really good idea how to do them and are ready to devote time to it. 2) My understanding is that you don't need any permissions as you won't use CUTEst code (someone could correct me). As I see it: you add an interface into scipy, a user installs CUTEst and uses this interface to work with CUTEst. It doesn't seem very practical, because installing CUTEst looks like a pain. Do you agree, or am I mistaken? 3) I guess you can ask an author to reuse his code in scipy to implement an interface to CUTEst. Generally about 2 and 3: it might be a good idea to use CUTEst for your internal benchmarking during the development and optionally you can add an interface to CUTEst into scipy. 
But I think it will be hardly ever used after that. To sum up the idea: adding a CUTEst dependency for scipy.optimize benchmarking seems impractical. It would be useful if you actually try to work with CUTEst; maybe it will turn out to be not that difficult. In such a case many of my points are not valid. I hope you'll be able to make some sense from my rambling. Nikolay. ---- On Mon, 20 Mar 2017 23:24:14 +0500 Antonio Ribeiro <antonior92 at gmail.com> wrote ---- Hello, I am developing my Google Summer of Code proposal about constrained optimisation in Scipy and it will be very important to have a good set of benchmarks. There is a great library with diverse optimisation benchmarks called CUTEst <https://ccpforge.cse.rl.ac.uk/gf/project/cutest/wiki/>. It is under the LGPL 2.1 license. The CUTEst library includes a huge number of problem sets and is often used in optimisation papers. Furthermore, many of the available optimisation packages provide some interface to it. I am interested in using problems from this set in my project and I want to ask how I should proceed: 1) Select a subset of tests from the CUTEst library and implement them in native python under scipy. 2) Include some interface to CUTEst in scipy. By what I understand, the LGPL license is more restrictive than the BSD-3 used by Scipy. In this case, could we ask for permission? 3) There is an interface for CUTEst implemented in the library pyOpus (under the GPL3 license) <http://fides.fe.uni-lj.si/pyopus/download1.html>. Since this is a library external to Scipy (with an incompatible license) how should I proceed in this case: can I make my benchmarks dependent on it? Can we ask permission to include this interface in Scipy? Any suggestions on how to proceed and what to include in my proposal are very welcome. 
António _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org https://mail.scipy.org/mailman/listinfo/scipy-dev _______________________________________________ SciPy-Dev mailing list SciPy-Dev at python.org https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Mar 27 04:43:04 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 27 Mar 2017 21:43:04 +1300 Subject: [SciPy-Dev] Fwd: [numfocus] Grants up to $3k available to NumFOCUS projects (sponsored & affiliated) In-Reply-To: References: <1489688042-5554705.54580375.fv2GIDbTc031721@rs159.luxsci.com> Message-ID: Hi all, This call may be of interest. If anyone has a good idea for SciPy that would fit in this budget, let's hear it! Cheers, Ralf ---------- Forwarded message ---------- From: Gina Helfrich Date: Fri, Mar 17, 2017 at 10:51 AM Subject: Re: [numfocus] Grants up to $3k available to NumFOCUS projects (sponsored & affiliated) To: numfocus at googlegroups.com, devs at numfocus.org There is no specific template, but proposals should be kept under 2 pages. Maximum 1 submission per project. Required elements of the proposal are: - title - project description - benefit to project/community - project team - and budget Submit proposals to info at numfocus.org Best, Gina On Thu, Mar 16, 2017 at 1:13 PM, Bob Carpenter wrote: > Is there a format you're expecting for proposals? > > Are applications limited to one per project? > > Hard to imagine a project won't have something to do with $3K, > so I imagine you'll get applications from all of the projects if > the proposals take less than an hour or two to put together. 
> > - Bob > > > On Mar 16, 2017, at 12:14 PM, Gina Helfrich wrote: > > > > Call for Proposals - Small Development Grants > > > > NumFOCUS is asking for proposals from its sponsored and affiliated > projects for targeted small development projects with a clear benefit to > those projects. This call is motivated by the success of our 2016 > end-of-year fundraising drive; we want to direct the donated funds to our > projects in a way that has impact and visibility to donors and the wider > community. > > > > There are no restrictions on what the funding can be used for. Whether > it's code development, documentation work, an educational, sustainability > or diversity initiative, or yet something else, we trust the projects > themselves to understand what they need and explain that need in the > proposal. > > > > Available Funding: > > - Up to $3,000 per proposal > > - Allocated funding is $9,000; depending on the number and quality > of proposals this may be adjusted up or down. > > > > Eligibility: > > - Proposals must be approved by the leadership of a NumFOCUS > sponsored or affiliated project. > > - Proposed work must have a clear outcome, achievable within 2017. > > - The call is open to applicants from any nationality and can be > performed at any university, institute or business worldwide (US export > laws permitting). > > > > Timeline: > > - Mid-March 2017: Call for Proposals released > > - 3 April 2017: deadline for proposal submissions > > - 17 April: successful proposals announced > > > > -- > > Dr. Gina Helfrich > > Communications Director, NumFOCUS > > gina at numfocus.org > > 512-222-5449 > > > > > > > > > > -- > > You received this message because you are subscribed to the Google > Groups "NumFOCUS" group. > > To unsubscribe from this group and stop receiving emails from it, send > an email to numfocus+unsubscribe at googlegroups.com. > > For more options, visit https://groups.google.com/d/optout. 
> > -- > You received this message because you are subscribed to the Google Groups > "NumFOCUS" group. > To unsubscribe from this group and stop receiving emails from it, send an > email to numfocus+unsubscribe at googlegroups.com. > For more options, visit https://groups.google.com/d/optout. > -- Dr. Gina Helfrich Communications Director, NumFOCUS gina at numfocus.org 512-222-5449 -- You received this message because you are subscribed to the Google Groups "NumFOCUS" group. To unsubscribe from this group and stop receiving emails from it, send an email to numfocus+unsubscribe at googlegroups.com. For more options, visit https://groups.google.com/d/optout. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Mar 27 05:16:02 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 27 Mar 2017 22:16:02 +1300 Subject: [SciPy-Dev] GSoC applications - next steps Message-ID: Hi prospective GSoC participants, The GSoC application deadline is just over a week away, so I thought it'd be useful to give a short overview of what will happen over the next few weeks. Until April 3 we'll do our best to help you write your proposals. Some of you have already shared a draft proposal; I strongly encourage the rest of you to do the same. If you have questions, please make them as concrete as possible (showing that you've done your own reading/research on the topics helps for getting answers). The week after proposal submission we'll interview you (over Skype or Google Hangouts) to help us pick the best applicants. If you are unavailable that week, please let me know now and we'll try to schedule an interview earlier (later is difficult, because we need to indicate how many slots we want to the PSF around the 11th). You'll find out on May 4th if your application was successful. 
If so, the "community bonding" period starts the next day - this is the time to get more familiar with the project and community: read the developer guide, start interacting on the mailing list, and ensure you're fully set up to hit the ground running on May 30th. April 3: student application deadline April 4-11: SciPy mentors interview students May 4: accepted students announced May 5-30: community bonding period If you have any questions about this, please do ask! Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Mar 27 06:39:51 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 27 Mar 2017 23:39:51 +1300 Subject: [SciPy-Dev] GSoC Proposal: Circular statistics In-Reply-To: References: Message-ID: On Sat, Mar 18, 2017 at 12:59 AM, Vladislav Iakovlev wrote: > Thanks to Josef and other colleagues for productive discussion! It > clarified a lot for me! > > This is a first iteration of my GSoC proposal. > > > > The goal of the project is to implement circular statistics functionality > in scipy.stats. > > > > Motivation: > > There exist several Python packages related to this theme, but most of > them has some shortcomings. > Would be good to list packages and shortcomings. > Also, it will be very conveniently to have this functionality in SciPy. It > will simplify users work because they will not have to search and install > other packages. > > > > What will be done: > > > > Base class rv_circular that will provide infrastructure for distributions > characteristics. List of supposed characteristics: > > pdf, cdf, moments, mean, dispersion, standard deviation, percentiles, > median, entropy, kurtosis, skewness. > > > > Derived classes for distributions. Methods for calculating values of > characteristics can be redefined in this classes. 
List of supposed > distributions: > > Uniform distribution, > > Triangular distribution, > > Cardioid distribution, > > Wrapped distributions: Cauchy, Normal, Von Mises. > Seems like a reasonable list. Triangular may be a bit less common. There's also https://en.wikipedia.org/wiki/Wrapped_exponential_distribution and https://en.wikipedia.org/wiki/Wrapped_asymmetric_Laplace_distribution, and the R package "circular" (useful for testing against) has even more. > > Functions for statistical tests: > > Tests of uniformity and Goodness-of-Fit. > > Tests related to Von Mises distribution: tests for mean and concentration > parameters, multi-sample tests. > > Some non-parametric tests. > > > > I have developed a little "demo-version" of circular stats toolbox: > > https://github.com/yakovlevvs/Circular-Statistics > > It is very raw and will be rewritten; I created this just to demonstrate > my vision of how it will be arranged. > That seems quite sensible. It matches your rv_circular and derived classes description above. Did you think about whether to implement frozen versions, like rv_continuous and rv_discrete do? Probably makes sense to do this, if only for design symmetry. > > Approximate timeline: > > April: Reading books, understanding related mathematics; > > May - June, 20: Implementing rv_circular class, documentation and tests > for it; > > June, 20 - July, 10: Implementing distribution classes; > > July, 10 - August, 20: Implementing statistical tests and point > estimations; > This will need a bit more details, normally you write something with granularity of one or two weeks per task. It's important to get that right, both because it helps you think about how much you can promise to do in 12 weeks and because there's an evaluation point halfway through the program. @Evgeni: were you interested in mentoring this topic? I could too potentially. Cheers, Ralf > > Related literature: > > Kanti V. Mardia, Peter E. Jupp: Directional statistics, 2000. 
> > Jammalamadaka, S. Rao. Topics in circular statistics / S. Rao > Jammalamadaka, A. SenGupta. > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Mon Mar 27 13:50:56 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Mon, 27 Mar 2017 20:50:56 +0300 Subject: [SciPy-Dev] GSoC Proposal: Circular statistics In-Reply-To: References: Message-ID: >> >> Thanks to Josef and other colleagues for productive discussion! It clarified a lot for me! >> >> This is a first iteration of my GSoC proposal. >> >> >> >> The goal of the project is to implement circular statistics functionality in scipy.stats. >> >> >> >> Motivation: >> >> There exist several Python packages related to this theme, but most of them has some shortcomings. > > > Would be good to list packages and shortcomings. > >> >> Also, it will be very conveniently to have this functionality in SciPy. It will simplify users work because they will not have to search and install other packages. >> >> >> >> What will be done: >> >> >> >> Base class rv_circular that will provide infrastructure for distributions characteristics. List of supposed characteristics: >> >> pdf, cdf, moments, mean, dispersion, standard deviation, percentiles, median, entropy, kurtosis, skewness. >> >> >> >> Derived classes for distributions. Methods for calculating values of characteristics can be redefined in this classes. List of supposed distributions: >> >> Uniform distribution, >> >> Triangular distribution, >> >> Cardioid distribution, >> >> Wrapped distributions: Cauchy, Normal, Von Mises. > > > Seems like a reasonable list. Triangular may be a bit less common. 
There's also https://en.wikipedia.org/wiki/Wrapped_exponential_distribution and https://en.wikipedia.org/wiki/Wrapped_asymmetric_Laplace_distribution, and the R package "circular" (useful for testing against) has even more. > >> >> >> Functions for statistical tests: >> >> Tests of uniformity and Goodness-of-Fit. >> >> Tests related to Von Mises distribution: tests for mean and concentration parameters, multi-sample tests. >> >> Some non-parametric tests. >> >> >> >> I have developed a little "demo-version" of circular stats toolbox: >> >> https://github.com/yakovlevvs/Circular-Statistics >> >> It is very raw and will be rewritten; I created this just to demonstrate my vision of how it will be arranged. > > That seems quite sensible. It matches your rv_circular and derived classes description above. Did you think about whether to implement frozen versions, like rv_continuous and rv_discrete do? Probably makes sense to do this, if only for design symmetry. > >> >> >> Approximate timeline: >> >> April: Reading books, understanding related mathematics; >> >> May - June, 20: Implementing rv_circular class, documentation and tests for it; >> >> June, 20 - July, 10: Implementing distribution classes; >> >> July, 10 - August, 20: Implementing statistical tests and point estimations; > > This will need a bit more details, normally you write something with granularity of one or two weeks per task. It's important to get that right, both because it helps you think about how much you can promise to do in 12 weeks and because there's an evaluation point halfway through the program. > > @Evgeni: were you interested in mentoring this topic? I could too potentially. I am, yes. Would be good to have a co-mentor though, so it's not an inside offline job. > Cheers, > Ralf > > >> >> Related literature: >> >> Kanti V. Mardia, Peter E. Jupp: Directional statistics, 2000. >> >> Jammalamadaka, S. Rao. Topics in circular statistics / S. Rao Jammalamadaka, A. SenGupta. 
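On the sample-statistics side of the circular proposal discussed above, the basic quantities come from the mean resultant vector (as in Mardia & Jupp); scipy.stats also ships circmean/circvar, though normalization conventions can differ. A minimal sketch of the idea:

```python
import numpy as np

def circmean(theta):
    """Direction of the mean resultant vector of angles (radians)."""
    return np.angle(np.mean(np.exp(1j * np.asarray(theta))))

def circvar(theta):
    """Circular variance: 1 - mean resultant length R, in [0, 1]."""
    return 1.0 - np.abs(np.mean(np.exp(1j * np.asarray(theta))))

# Data clustered at the wrap-around point +/- pi: the arithmetic mean
# misleadingly reports ~0, while the circular mean reports the cluster
# sitting at pi.
wrapped = [np.pi - 0.1, -np.pi + 0.1]
```

This wrap-around example is exactly why an rv_circular base class cannot reuse the moments machinery of rv_continuous unchanged.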
>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-dev >> > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev >
From xabart at gmail.com Mon Mar 27 18:53:42 2017 From: xabart at gmail.com (Xavier Barthelemy) Date: Tue, 28 Mar 2017 09:53:42 +1100 Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com> Message-ID: Thanks!! I didn't know this trick! Is it a recent one? All the best Xavier 2017-03-25 19:30 GMT+11:00 Evgeni Burovski : > > > On Sat, Mar 25, 2017 at 9:23 AM, Xavier Barthelemy > wrote: > >> Dear everyone, >> >> While we are talking about splining, I would like to add a request if >> possible. >> I am a heavy user of the splprep and co routines, and many times I need >> to get the local polynomial coefficients to compute the derivatives, and >> find the analytical zeros. >> I had found a recipe on Stack Overflow years ago, on how to get back to >> local coefficients from the spline description. >> >> >> Maybe a method, or an option, returning the local polynomial coefficients >> (and the interval?) could be implemented? >> it's my 2 cents as a user >> >> > You can convert to piecewise polynomials via > > >>> tck = splrep(x, y) > >>> p = PPoly.from_spline(tck) > > > > >> thanks for considering it and please, discard it if you think it's not a fit. >> >> All the best >> Xavier >> >> 2017-03-25 15:29 GMT+11:00 Aman Pratik : >> >>> Seems interesting.
Though adding this would require doing away with some >>> of the other part(s) of the project. I will do some research on it. >>> Meanwhile, let's wait for feedback from the mentors. >>> >>> >>> >>> >>> On 25 March 2017 at 04:43, Stefan van der Walt >>> wrote: >>> >>>> On Fri, Mar 24, 2017, at 12:19, Aman Pratik wrote: >>>> >>>> Hello, >>>> This is my second Draft, Proposal : Second Draft >>>> >>>> >>>> >>>> Would this be a good opportunity to add subdivision splines to SciPy? >>>> Their implementation (at least in 1D) is simple, and they give good >>>> locality around nodes (i.e., yield predictable curves, useful for spline >>>> design). >>>> >>>> An example is given here: >>>> >>>> https://gist.github.com/stefanv/a28d61694fe8f3ef2325b23cbc51312a >>>> >>>> Stéfan >>>> >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>> >>>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >>> >> >> >> -- >> « Quand le gouvernement viole les droits du peuple, l'insurrection est, >> pour le peuple et pour chaque portion du peuple, le plus sacré des droits >> et le plus indispensable des devoirs » >> >> Déclaration des droits de l'homme et du citoyen, article 35, 1793 >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -- « Quand le gouvernement viole les droits du peuple, l'insurrection est, pour le peuple et pour chaque portion du peuple, le plus sacré des droits et le plus indispensable des devoirs »
Déclaration des droits de l'homme et du citoyen, article 35, 1793
From ashwin.pathak at students.iiit.ac.in Tue Mar 28 12:10:27 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Tue, 28 Mar 2017 21:40:27 +0530 Subject: [SciPy-Dev] GSoC 2017 : Implementation of scipy.diff Message-ID: <2b4781672bfc729a71dca93f63c89b89@students.iiit.ac.in> Hello Developers, I am applying for GSoC 2017 under the project: Implementation of scipy.diff. Here is the first iteration of my proposal: https://github.com/ashwinpathak20/scipy/wiki/GSoc-2017:-Implement-scipy.diff-(numerical-differentiation) Please have a look at it and point out all the shortcomings and changes to be included. Thanks in advance
From ralf.gommers at gmail.com Wed Mar 29 04:41:39 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 29 Mar 2017 21:41:39 +1300 Subject: [SciPy-Dev] GSoC 2017 : Implementation of scipy.diff In-Reply-To: <2b4781672bfc729a71dca93f63c89b89@students.iiit.ac.in> References: <2b4781672bfc729a71dca93f63c89b89@students.iiit.ac.in> Message-ID: Hi Ashwin, That looks like a good start. It's clear that in the first sections you've drawn quite a bit on Maniteja's draft from two years ago, but that's fine. There are some places where more detail and proper referencing is required though. I'll comment first on the structure of your approach and timeline, that will need some work. Would be good if you can get a reworked version together with a few days left before the submission deadline. Overall I'd say you need to focus on the two finite differences methods, those will take the bulk of your time. With the rest being taken up by building a test set of problems to verify accuracy of the implementation, and working out the API.
Detailed comments on timeline: Week 1-4: this initially confused me, can you please rename those to something else so "week 1" is the first week of the coding period (that's how all proposals are numbered)? Week 2: some of this should be part of your proposal, the idea is that you have something concrete ("algorithm X as described in paper/book Y") to implement in your proposal. Week 3: automatic differentiation, no need to decide on that - that is out of scope for this project. API definition is the first part of your proposed schedule that does fit in the coding period; it should not be at the beginning though. You start with working on one method, the API can be fine-tuned later. Week 4: not quite sure what that means. Overall for the community bonding period: I think you can remove some code/API things, but add a few community related items. For example: set up your blog and do a first blog post, then share that on the mailing list to get more feedback from the community. Week 5-6: Things like "Checking of appropriate conditions ..." don't need to be in your proposal, it's pretty meaningless as written. Instead of putting such tasks in your timeline, just put a short description somewhere on when/how you plan to write tests, documentation and benchmarks. Why do you propose an adaptation of scipy.misc.derivative? I'm not sure there's much of value to be rescued from that function (but I haven't checked for a while). What I am missing is a way to validate your implementation. Can you be more concrete about that? Will you use functions with known analytical derivatives only, is there a standard benchmark set that's commonly used in papers that you can use, etc.? Week 7: don't do that, always write tests and docstrings as you go. You can reserve some days to, say, write an extended tutorial, but not docs/tests. Week 8: some more detail on the complex step size algorithm is needed. And this needs referencing (applies to the adaptive step size algorithm too).
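[For reference, the complex-step scheme under discussion fits in a few lines. Because it involves no subtraction of nearly equal values, the step can be made tiny without cancellation error. This is a generic sketch, not a proposed scipy API; it assumes the target function is implemented with complex-safe operations:]

```python
import numpy as np

def complex_step(f, x, h=1e-20):
    # f(x + i*h) ~ f(x) + i*h*f'(x); the imaginary part isolates f'(x),
    # with no subtractive cancellation, so h can be absurdly small.
    return f(x + 1j * h).imag / h

def forward_diff(f, x, h=1e-8):
    # Classic forward difference; accuracy is limited by cancellation
    # in f(x + h) - f(x), so h cannot be made arbitrarily small.
    return (f(x + h) - f(x)) / h

d_cs = complex_step(np.sin, 1.0)   # agrees with cos(1.0) to machine precision
d_fd = forward_diff(np.sin, 1.0)   # only about 8 correct digits
```

The main restriction is the complex-safety requirement, which is one reason the finite-difference methods still need to carry the bulk of the work.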
Timing: you need to reserve more time for this algorithm, and include testing/integration. Week 10-11: adding a buffer period is fine, but midterm evaluation can't be in your schedule (it doesn't take time). Week 12-13: no autodiff please Week 14-15: I like the cross-testing with other implementations. I'd like to see some more detail here. You could also use packages from other languages to test against if that makes sense. Cheers, Ralf On Wed, Mar 29, 2017 at 5:10 AM, ashwin.pathak < ashwin.pathak at students.iiit.ac.in> wrote: > Hello Developers, > I am applying for GSoC 2017 under the project : Implementation of > scipy.diff. Here is my first iteration of my proposal : > https://github.com/ashwinpathak20/scipy/wiki/GSoc-2017:- > Implement-scipy.diff-(numerical-differentiation) > > Please have a look at it and suggest me all the short-comings and > changes to be included. > > Thanks in advance > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Mar 29 05:38:28 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 29 Mar 2017 22:38:28 +1300 Subject: [SciPy-Dev] CUTEst in Scipy In-Reply-To: <15b0ac4a303.123ac38ca14424.451023420470098371@zoho.com> References: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com> <15b0ac4a303.123ac38ca14424.451023420470098371@zoho.com> Message-ID: On Mon, Mar 27, 2017 at 2:18 AM, Nikolay Mayorov wrote: > Hi, Ralf! > > Why do you say that? The optimize benchmarks are quite extensive, and came > from http://infinity77.net/global_optimization/index.html which is doing > exactly what you say our asv benchmarks are not adequate for. If it's about > usability/runtime, how about just fixing that? That plus adding some > problems relevant for Antonio's GSoC project seems to make the most sense > to me. 
> Ralf > > > I thought about it later and realized that I didn't think through my > answer properly (but later neglected to update it). I see that a large > collection of global optimization problems was introduced, so the asv approach > works. I just see optimization benchmarks somewhat differently. I believe > they should be more user facing and enhance the scipy.optimize module itself by > providing examples and the ability to quickly run different methods with different > settings. Right now the benchmarks are a completely internal thing and not very > exciting for people to try (in my opinion). Additionally they are not > flexible as they fix most of the method parameters, i.e. it's better to > decouple problem formulations from optimization algorithms and their > settings. > That's a good point. It would indeed be nice to take a representative subset of problem functions and expose them as `optimize.something`. Then they can be used from both the asv benchmarks and by users. > Other than that, if I'm not mistaken, ASV still requires running benchmarks > multiple times to measure different metrics (like number of function > evaluations and Jacobian evaluations); it is probably the main obvious > shortcoming. > This is about `result_type = ["average time", "nfev", "success"]` right? That construction is a bit odd, there probably isn't a fundamental reason why that can't be fixed. Also I'm not sure how "alive" the benchmarks code is and how often it is used, > for example BenchGlobal doesn't seem to work for me with the current > ASV master. > The problem with the optimize benchmarks is that they're so extensive that they take forever, and hence are excluded from the default benchmark run (IIRC).
Specifically for scipy.optimize I suggest to > have a submodule like "Benchmarks" or "Test problems" or "Example problems" > which will contain problem definitions and some tools to run a selected > subset for specific solvers and options. These problems can be used then in > unit tests and maybe even in ASV benchmarks. There is "Rosenbrock function" > already, but it is just not enough. My guess that it will be more > convenient (compared to ASV) when making enhancement to the old code or > working on new solvers and additionally will serve as a documentation > improvement. I believe it will be good to have such thing. > Agreed. The functions should be selected carefully, it can't be 100+ funcs I think (too much work/maintenance). Ralf > What do you think? > > Nikolay. > > > (when replying to a thread, manually change scipy-dev at scipy.org to > scipy-dev at python.org) > > > > On Tue, Mar 21, 2017 at 8:26 AM, Nikolay Mayorov > wrote: > > > Hi! > > I heard about CUTEst, but it looked quite complex so I decided not to deal > with it. To your options: > > 1) It might be the most practical option, considering you'll have other > (main) parts of work to do. Also it's not clear whether to include test > problems in scipy or store them somewhere else. My opinion is that having a > collection of problems right in scipy.optimize is good. I don't exactly > remember why we decided not to include test problems for least_squares, I > guess because it required many considerations. In theory there are ASV > benchmarks in scipy, but they don't feel adequate for benchmarking > optimization problems. > > > Why do you say that? The optimize benchmarks are quite extensive, and came > from http://infinity77.net/global_optimization/index.html which is doing > exactly what you say our asv benchmarks are not adequate for. If it's about > usability/runtime, how about just fixing that? 
That plus adding some > problems relevant for Antonio's GSoC project seems to make the most sense > to me. > Ralf > > So developing benchmarking infrastructure in scipy.optimize might be a > good side project, my concern is that it will distract you from the main > path greatly. > > To sum up: the simplest way is to pick some problems, write your own > benchmark suite and store it somewhere else. This way you will be able to > focus on the algorithms. Other options are possible if you have a really > good idea how to do them and are ready to devote time to it. > > 2) My understanding is that you don't need any permissions as you won't > use CUTEst code (someone could correct me). As I see it: you add an > interface into scipy, a user installs CUTEst and uses this interface to work > with CUTEst. It doesn't seem very practical, because installing CUTEst > looks like a pain. Do you agree, or am I mistaken? > > 3) I guess you can ask an author to reuse his code in scipy to implement > an interface to CUTEst. > > Generally about 2 and 3: it might be a good idea to use CUTEst for your > internal benchmarking during the development, and optionally you can add an > interface to CUTEst into scipy. But I think it will be hardly ever used > after that. > > To sum up the idea: adding a CUTEst dependency for scipy.optimize benchmarking > seems impractical. It would be useful if you actually try to work with > CUTEst; maybe it will turn out to be not that difficult. In that case many > of my points are not valid. > > I hope you'll be able to make some sense of my rambling. > > Nikolay. > > > ---- On Mon, 20 Mar 2017 23:24:14 +0500 *Antonio Ribeiro > >* wrote ---- > > Hello, > > I am developing my Google Summer of Code proposal about constrained > optimisation in Scipy > and it will be very important to have a good set of benchmarks. > > There is a great library with diverse optimisation benchmarks called > CUTEst . It is > under the LGPL 2.1 license.
> > This CUTEst library includes a huge number of problem sets > and is often used in optimisation papers. Furthermore, much of the > available > optimisation software provides some interface to it. I am interested in > using problems from this set in my project and I want to ask how > should I proceed? > > 1) Select a subset of tests from the CUTEst library and implement them in > native Python under scipy. > > 2) Include some interface to CUTEst in scipy. From what I understand, the LGPL > license is > more restrictive than the BSD-3 used by Scipy. In this case, could we ask for > permission? > > 3) There is an interface for CUTEst implemented in the library pyOpus > (under the GPL3 license) > . Since this is a > library external to Scipy (with an incompatible license), how should I > proceed in this case: can I make my benchmarks dependent on it? Can we ask > permission to include this interface in Scipy? > > Any suggestions on how to proceed and what to include in my proposal are > very welcome. > > António > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed...
URL:
From andyfaff at gmail.com Wed Mar 29 06:13:05 2017 From: andyfaff at gmail.com (Andrew Nelson) Date: Wed, 29 Mar 2017 21:13:05 +1100 Subject: [SciPy-Dev] CUTEst in Scipy In-Reply-To: References: <15aed2fc740.ec8f556f6477.4506434177608850793@zoho.com> <15b0ac4a303.123ac38ca14424.451023420470098371@zoho.com> Message-ID: When any benchmark functions are added it would be good to spend some thought on standardising an interface to such classes/functions, including: (1) There are least squares problems where one might want to return residuals for some solvers, but a single value for other solvers. (2) There are problems which just have to minimize a single value, so they are suitable for some solvers (optimize.minimize), but not least squares solvers. (3) How does one specify bounds for the problem - a problem would have to advertise what level of bounds it provides so a solver can be chosen. (4) It can be difficult to standardise documentation for such a wide range of functions. Can the documentation be autogenerated from the class or function? (5) I found that several global benchmark functions from the literature had incorrect solutions/minima (varying from article to article), and the function definition can vary between articles. It's a good idea to provide a literature reference for the function. (6) Different ways of choosing starting values are useful. These include certified starting values (à la NIST test problems), or randomised starting values. If they're going to go into scipy.optimize then perhaps they would deserve their own submodule. On 29 March 2017 at 20:38, Ralf Gommers wrote: > [...] -- _____________________________________ Dr. Andrew Nelson _____________________________________
From evgeny.burovskiy at gmail.com Wed Mar 29 07:17:25 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 29 Mar 2017 14:17:25 +0300 Subject: [SciPy-Dev] GSoC 2017 : Implementation of scipy.diff In-Reply-To: <2b4781672bfc729a71dca93f63c89b89@students.iiit.ac.in> References: <2b4781672bfc729a71dca93f63c89b89@students.iiit.ac.in> Message-ID: On Tue, Mar 28, 2017 at 7:10 PM, ashwin.pathak wrote: > Hello Developers, > I am applying for GSoC 2017 under the project : Implementation of > scipy.diff. Here is my first iteration of my proposal : > > https://github.com/ashwinpathak20/scipy/wiki/GSoc-2017:-Implement-scipy.diff-(numerical-differentiation) > > Please have a look at it and suggest me all the short-comings and changes > to be included. > > Thanks in advance Something happened to the proposal, the link above gives 404.
A couple of very general comments: * `scipy.optimize._numdiff.approx_derivative`, from https://github.com/scipy/scipy/blob/master/scipy/optimize/_numdiff.py implements several finite-difference schemes, including the complex-step differentiation. * It would be good to spec out the n-dim behavior from the very beginning. IIRC, one issue is the distinction between a Jacobian of a vector function of a vector argument and a vectorized evaluation of a derivative of a scalar function of a scalar argument. `approx_derivative` is designed for the former use case in mind, while e.g. `scipy.misc.derivative` is meant for the latter. * `scipy.misc.derivative` and `scipy.optimize.approx_fprime` are semi-broken. It would be good to unify them and the new functionality, fix what can be fixed and deprecate what cannot be. As a first step, it'd be nice to clean up the internal usage of `misc.derivative` and `approx_fprime`. Evgeni From ashwin.pathak at students.iiit.ac.in Wed Mar 29 08:56:23 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Wed, 29 Mar 2017 18:26:23 +0530 Subject: [SciPy-Dev] GSoC 2017 : Implementation of scipy.diff In-Reply-To: References: <2b4781672bfc729a71dca93f63c89b89@students.iiit.ac.in> Message-ID: <9b28c84d20ef33e73bb081aeba7ea170@students.iiit.ac.in> On 2017-03-29 16:47, Evgeni Burovski wrote: > On Tue, Mar 28, 2017 at 7:10 PM, ashwin.pathak > wrote: >> Hello Developers, >> I am applying for GSoC 2017 under the project : Implementation of >> scipy.diff. Here is my first iteration of my proposal : >> >> >> https://github.com/ashwinpathak20/scipy/wiki/GSoc-2017:-Implement-scipy.diff-(numerical-differentiation) >> >> Please have a look at it and suggest me all the short-comings and >> changes >> to be included. >> >> Thanks in advance > > > Something happened to the proposal, the link above gives 404. 
A > couple > of very general comments: > > * `scipy.optimize._numdiff.approx_derivative`, from > https://github.com/scipy/scipy/blob/master/scipy/optimize/_numdiff.py > implements several finite-difference schemes, including the > complex-step differentiation. > > * It would be good to spec out the n-dim behavior from the very > beginning. IIRC, one issue is the distinction between a Jacobian of a > vector function of a vector argument and a vectorized evaluation of a > derivative of a scalar function of a scalar argument. > `approx_derivative` is designed for the former use case in mind, > while > e.g. `scipy.misc.derivative` is meant for the latter. > > * `scipy.misc.derivative` and `scipy.optimize.approx_fprime` are > semi-broken. It would be good to unify them and the new > functionality, > fix what can be fixed and deprecate what cannot be. As a first step, > it'd be nice to clean up the internal usage of `misc.derivative` and > `approx_fprime`. > > Evgeni > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev The link is still there but apparently you need to also include the '(numerical-differentiation)' part just after the hyperlink. I also noticed this issue after posting it. Sorry for the inconvenience. Please suggest me the changes. Thank you From evgeny.burovskiy at gmail.com Wed Mar 29 11:38:27 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 29 Mar 2017 18:38:27 +0300 Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> Message-ID: Hello Aman, Hello, > This is my second Draft, Proposal : Second Draft > > > I'd echo Ralf's comments on that the proposal needs some more focus. 
Several further comments: * Removing FITPACK is not a goal in itself. It would be better to focus a bit more on adding what is not available otherwise. In particular: - spending two weeks on reimplementing the knot insertion is not very useful, I'd say. - ditto for spending a whole week on the root finding. Sure, root-finding for non-cubic splines is something we do not have currently, so adding it could be useful to have and fun to work on. However it's lower priority IMO. * If you say your main focus is cardinal splines, then just as Ralf said, it's better to start working on them earlier. However just adding a BSpline subclass would not be enough: there are some b-splines in scipy.signal, and some of that code is not in great shape. It would be nice to ask scipy.signal folks for feedback to drive your design: both on cleaning up scipy.signal, and on what sort of API / functionality they would find useful. * I would suggest expanding the algorithmic part of the proposal a bit. From the current state of the proposal it's not even clear that you know or will learn how to implement the algorithms you are discussing (conversions between bases, periodic interpolation). * From my point of view, the project could/should lean a bit more towards writing algorithmic code (this is what Nikolay was saying a while ago in a parallel thread). -- Rereading the thread, I now find that I missed a prompt for comments from Nikolay: > For "Spline fitting to data" I'm somewhat confused, because make_lsq_spline and make_interp_spline > seem to already solve fitting problems without relying on FITPACK. Evgeni, what is the situation here? * make_interp_spline needs an algorithmic improvement to handle periodic boundary conditions. It uses banded linear algebra, but with periodic BC, the collocation matrix has additional off-band elements, because the first and last data points are close to each other.
Chuck was saying at some point that a while ago he did something like that with a variant of the Sherman-Morrison formula for cubic splines. * make_lsq_spline is basically a stub: (i) it only works with user-supplied knots. Knot selection is a major thing which FITPACK does, and we don't have any alternatives. IMO, looking into alternative algorithms could be a major improvement from this sort of project. I'm not aware of *the* method, so possibly there's room for offering several alternatives and/or better user control. (cf. https://github.com/scipy/scipy/issues/2579, https://github.com/scipy/scipy/issues/1753) (ii) it explicitly forms the normal equations (A.T @ A, where A is the collocation matrix). A better way would be to do some form of banded SVD or QR. I just recently learned (from a student) that LAPACK has routines for that (*gbbrd, *bdsqr), and we even expose them via cython_lapack. To sum up, I see at least three directions the project can focus on: - Low-level algorithms: some of the knot selection, periodic interpolation, banded SVD etc. - Cardinal b-splines and their uses in scipy.signal - What Stefan suggested upthread. Any of these can be coupled with easier-fix tasks for the first few weeks. Cheers, Evgeni > This is somewhat incomplete in detailing. It will take me some time to > make it more clear and obvious as to what I am going to do. > I am facing a problem though, in getting my hands on the book "A Practical > Guide to Splines" by Carl R de Boor. Also, I need some more time to get the > algorithms clear in my head. So, please bear with me. Looking forward to > your feedback. > > Thank You > > > > > On 24 March 2017 at 23:12, Evgeni Burovski > wrote: >> >> >> On Fri, Mar 24, 2017 at 2:15 PM, Nikolay Mayorov < >> nikolay.mayorov at zoho.com> wrote: >> >>> Resent my answer to the new address. 
>>> >>> >>> ============ Forwarded message ============ >>> From : Nikolay Mayorov >>> To : "SciPy Developers List" >>> Date : Fri, 24 Mar 2017 01:07:10 +0500 >>> Subject : Re: [SciPy-Dev] GSoC 2017: BSpline support >>> ============ Forwarded message ============ >>> >>> De Boor, "Practical guide to splines", Chapter IX should be the place >>> for you. Likely you will need to thoroughly study several chapters in it; >>> it's not an easy book to quickly pick up a recipe from. >>> >>> Nikolay. >>> >>> >>> ---- On Thu, 23 Mar 2017 19:31:32 +0500 *Aman Pratik >>> >* wrote ---- >>> >>> >>> >>> Hello, >>> I need some help with *"**Convert a CubicSpline/PPoly object to a >>> BSpline". *I am unable to find suitable literature for it, or maybe I >>> am overlooking something. I would be really grateful if you could guide me >>> to some reading material or code samples. >>> >>> >> >> Lyche and Morken, http://www.uio.no/studier/emner/matnat/ifi/INF-MAT5340/v05/undervisningsmateriale/ >> can be an easier first read than de Boor (which is, indeed, *the* >> reference). >> >> (Hat tip to Pauli, who suggested this to me a while ago). >> >> Cheers, >> >> Evgeni >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From evgeny.burovskiy at gmail.com Wed Mar 29 11:43:39 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 29 Mar 2017 18:43:39 +0300 Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> <1490397186.1364376.922758360.0A878D64@webmail.messagingengine.com> Message-ID: On Tue, Mar 28, 2017 at 1:53 AM, Xavier Barthelemy wrote: > Thanks!! > I didn't know this trick! Is it a recent one? > > Yeah, PPoly is missing the `.. versionadded::` note --- it is new in scipy 0.14.0. (this and many other things, Pauli FTW) > All the best > Xavier > > 2017-03-25 19:30 GMT+11:00 Evgeni Burovski : > >> >> >> On Sat, Mar 25, 2017 at 9:23 AM, Xavier Barthelemy >> wrote: >> >>> Dear everyone, >>> >>> While we are talking about splining, I would like to add a request if >>> possible. >>> I am a heavy user of the splprep and co routines, and many times I need >>> to get the local polynomial coefficients to compute the derivatives, and >>> find the analytical zeros. >>> I found a recipe on Stack Overflow years ago, on how to get back the >>> local coefficients from the spline description. >>> >>> >>> Maybe a method, or an option, returning the local polynomial >>> coefficients (and the interval?) could be implemented? >>> it's my 2 cents as a user >>> >>> >> You can convert to piecewise polynomials via >> >> >>> tck = splrep(x, y) >> >>> p = PPoly.from_spline(tck) >> >> >> >> >>> thanks for considering it, and please discard it if you think it doesn't fit. >>> >>> All the best >>> Xavier >>> >>> 2017-03-25 15:29 GMT+11:00 Aman Pratik : >>> >>>> Seems interesting. Though adding this would require doing away with >>>> some of the other part(s) of the project. I will do some research on it. >>>> Meanwhile, let's wait for feedback from the mentors. 
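The `PPoly.from_spline` snippet quoted above covers the use case Xavier describes end to end; a slightly fuller sketch (`splrep`, `PPoly`, `derivative` and `roots` are existing scipy APIs — the sin data and tolerances here are purely illustrative):

```python
import numpy as np
from scipy.interpolate import splrep, PPoly

# Fit an interpolating cubic spline and convert it to piecewise-polynomial form.
x = np.linspace(0, 10, 50)
y = np.sin(x)
tck = splrep(x, y)
p = PPoly.from_spline(tck)      # local coefficients in p.c, breakpoints in p.x

# Derivatives and their analytic zeros then come for free on the PPoly side.
dp = p.derivative()
crit = dp.roots(extrapolate=False)   # approximate zeros of cos(x) on [0, 10]
```

`crit` contains the critical points of the fitted spline, close to pi/2, 3*pi/2, ... for this data.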
>>>> >>>> >>>> >>>> On 25 March 2017 at 04:43, Stefan van der Walt >>>> wrote: >>>> >>>>> On Fri, Mar 24, 2017, at 12:19, Aman Pratik wrote: >>>>> >>>>> Hello, >>>>> This is my second Draft, Proposal : Second Draft >>>>> >>>>> >>>>> >>>>> Would this be a good opportunity to add subdivision splines to SciPy? >>>>> Their implementation (at least in 1D) is simple, and they give good >>>>> locality around nodes (i.e., yield predictable curves, useful for spline >>>>> design). >>>>> >>>>> An example is given here: >>>>> >>>>> https://gist.github.com/stefanv/a28d61694fe8f3ef2325b23cbc51312a >>>>> >>>>> Stéfan >>>>> >>>>> >>>>> _______________________________________________ >>>>> SciPy-Dev mailing list >>>>> SciPy-Dev at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>>> >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>> >>>> >>> >>> >>> -- >>> « Quand le gouvernement viole les droits du peuple, l'insurrection est, >>> pour le peuple et pour chaque portion du peuple, le plus sacré des droits >>> et le plus indispensable des devoirs » >>> >>> Déclaration des droits de l'homme et du citoyen, article 35, 1793 >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > > > -- > « Quand le gouvernement viole les droits du peuple, l'insurrection est, > pour le peuple et pour chaque portion du peuple, le plus sacré des droits > et le plus indispensable des devoirs » 
> > Déclaration des droits de l'homme et du citoyen, article 35, 1793 > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashwin.pathak at students.iiit.ac.in Wed Mar 29 13:17:25 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Wed, 29 Mar 2017 22:47:25 +0530 Subject: [SciPy-Dev] Github account got deleted Message-ID: I don't know how, but my GitHub account got deleted without my consent. I have reported this issue to GitHub and I am waiting for their response. Since the GSoC deadline is also fast approaching, can anyone suggest alternatives for submitting my proposal, or help me get the account back up and running as soon as possible? From amanpratik10 at gmail.com Wed Mar 29 13:40:40 2017 From: amanpratik10 at gmail.com (Aman Pratik) Date: Wed, 29 Mar 2017 23:10:40 +0530 Subject: [SciPy-Dev] Fwd: Re: GSoC 2017: BSpline support In-Reply-To: References: <15ae89d14e6.f79c012d28922.1571864151183624651@zoho.com> <15afcc7b472.e71d6ecb20535.1486332461443151429@zoho.com> <15b000755f1.c4355fc52947.1795299634416699239@zoho.com> Message-ID: Hi Evgeni, After looking into the issues page, I found this issue I could work on for 'scipy.signal': https://github.com/scipy/scipy/issues/6393 Is there anything else you want me to have a look at, apart from the comments for 'scipy.signal' which I will be collecting in some time? Thank You On 29 March 2017 at 21:08, Evgeni Burovski wrote: > > Hello Aman, > > > Hello, >> This is my second Draft, Proposal : Second Draft >> >> >> > I'd echo Ralf's comment that the proposal needs some more focus. > Several further comments: > > * Removing FITPACK is not a goal in itself. It would be better to focus a > bit more on adding what is not available otherwise. 
In particular: > > - spending two weeks on reimplementing the knot insertion is not very > useful, I'd say. > - ditto for spending a whole week on the root finding. Sure, root-finding > for non-cubic splines is something we do not have currently, so adding it > could be useful to have and fun to work on. However, it's lower priority IMO. > > * If you say your main focus is cardinal splines, then just as Ralf said, > it's better to start working on them earlier. However, just adding a BSpline > subclass would not be enough: there are some b-splines in scipy.signal, and > some of that code is not in great shape. It would be nice to ask the > scipy.signal folks for feedback to drive your design: both on cleaning up > scipy.signal, and on what sort of API / functionality they would find > useful. > > * I would suggest expanding the algorithmic part of the proposal a bit. > From the current state of the proposal it's not even clear that you know, or > will learn, how to implement the algorithms you are discussing (conversions > between bases, periodic interpolation). > > * In my view, the project could/should lean a bit more towards writing > algorithmic code (this is what Nikolay was saying a while ago in a parallel > thread). > > -- Rereading the thread, I now find that I missed a prompt for comments > from Nikolay: > > > For "Spline fitting to data" I'm somewhat confused, because > make_lsq_spline and make_interp_spline > > seem to already solve fitting problems without relying on FITPACK. > Evgeni, what is the situation here? > > * make_interp_spline needs an algorithmic improvement to handle periodic > boundary conditions. It uses banded linear algebra, but with periodic BC, > the collocation matrix has additional off-band elements, because the first and > last data points are close to each other. 
> > * make_lsq_spline is basically a stub: > > (i) it only works with user-supplied knots. Knot selection is a major > thing which FITPACK does, and we don't have any alternatives. IMO, looking > into alternative algorithms could be a major improvement from this sort of > project. I'm not aware of *the* method, so possibly there's room for > offering several alternatives and/or better user control. (cf. > https://github.com/scipy/scipy/issues/2579, > https://github.com/scipy/scipy/issues/1753) > > (ii) it explicitly forms the normal equations (A.T @ A, where A is the > collocation matrix). A better way would be to do some form of banded SVD or > QR. I just recently learned (from a student) that LAPACK has routines for > that (*gbbrd, *bdsqr), and we even expose them via cython_lapack. > > To sum up, I see at least three directions the project can focus on: > > - Low-level algorithms: some of the knot selection, periodic > interpolation, banded SVD etc. > - Cardinal b-splines and their uses in scipy.signal > - What Stefan suggested upthread. > > Any of these can be coupled with easier-fix tasks for the first few > weeks. > > Cheers, > > Evgeni > > > >> This is somewhat incomplete in detailing. It will take me some time to >> make it more clear and obvious as to what I am going to do. >> I am facing a problem though, in getting my hands on the book "A >> Practical Guide to Splines" by Carl R de Boor. Also, I need some more time >> to get the algorithms clear in my head. So, please bear with me. Looking >> forward to your feedback. >> >> Thank You >> >> >> >> >> On 24 March 2017 at 23:12, Evgeni Burovski >> wrote: >> >>> >>> >>> On Fri, Mar 24, 2017 at 2:15 PM, Nikolay Mayorov < >>> nikolay.mayorov at zoho.com> wrote: >>> >>>> Resent my answer to the new address. 
>>>> >>>> >>>> ============ Forwarded message ============ >>>> From : Nikolay Mayorov >>>> To : "SciPy Developers List" >>>> Date : Fri, 24 Mar 2017 01:07:10 +0500 >>>> Subject : Re: [SciPy-Dev] GSoC 2017: BSpline support >>>> ============ Forwarded message ============ >>>> >>>> De Boor, "Practical guide to splines", Chapter IX should be the place >>>> for you. Likely you will need to thoroughly study several chapters in it; >>>> it's not an easy book to quickly pick up a recipe from. >>>> >>>> Nikolay. >>>> >>>> >>>> ---- On Thu, 23 Mar 2017 19:31:32 +0500 *Aman Pratik >>>> >* wrote ---- >>>> >>>> >>>> >>>> Hello, >>>> I need some help with *"**Convert a CubicSpline/PPoly object to a >>>> BSpline". *I am unable to find suitable literature for it, or maybe I >>>> am overlooking something. I would be really grateful if you could guide me >>>> to some reading material or code samples. >>>> >>>> >>> >>> Lyche and Morken, http://www.uio.no/studier/emner/matnat/ifi/INF-MAT5340/v05/undervisningsmateriale/ >>> can be an easier first read than de Boor (which is, indeed, *the* >>> reference). >>> >>> (Hat tip to Pauli, who suggested this to me a while ago). >>> >>> Cheers, >>> >>> Evgeni >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
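The normal-equations remark in Evgeni's comments above can be illustrated numerically: forming A.T @ A squares the condition number, which is why a QR/SVD-based solve is preferred. A small dense sketch (the actual scipy work would use banded LAPACK routines; the Vandermonde matrix here is just an illustrative stand-in for a collocation matrix):

```python
import numpy as np

# Stand-in for an ill-conditioned least-squares design matrix.
x = np.linspace(0, 1, 50)
A = np.vander(x, 8)
b = np.exp(x)

# An SVD-based solver works on A directly; its sensitivity scales with cond(A).
c_svd, *_ = np.linalg.lstsq(A, b, rcond=None)

# Explicitly forming the normal equations squares the condition number,
# roughly halving the number of accurate digits in the solution.
c_ne = np.linalg.solve(A.T @ A, A.T @ b)

print(np.linalg.cond(A))        # cond(A)
print(np.linalg.cond(A.T @ A))  # ~ cond(A)**2
```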
URL: From amanpratik10 at gmail.com Wed Mar 29 13:46:50 2017 From: amanpratik10 at gmail.com (Aman Pratik) Date: Wed, 29 Mar 2017 23:16:50 +0530 Subject: [SciPy-Dev] 'scipy.signal' : Issues/Comments (for GSoC 2017) Message-ID: Hello developers, I am working on improving support and features for BSpline in 'scipy.interpolate'. I would like to know about any issues or difficulties you face when working with b-splines in 'scipy.signal'. If there are any changes or modifications related to b-splines, specifically in 'scipy.signal', that you would like to see, please comment on this thread. Thank you -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Mar 29 15:17:31 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 30 Mar 2017 08:17:31 +1300 Subject: [SciPy-Dev] Github account got deleted In-Reply-To: References: Message-ID: On Thu, Mar 30, 2017 at 6:17 AM, ashwin.pathak < ashwin.pathak at students.iiit.ac.in> wrote: > I don't know how, but my GitHub account got deleted without my consent. Sorry to hear that. > I have reported this issue to GitHub and I am waiting for their response. > Since the GSoC deadline is also fast approaching, can anyone suggest > alternatives for submitting my proposal, or help me get the account back > up and running as soon as possible? > You can put your proposal on https://github.com/scipy/scipy/wiki, or on Google Docs for example. Ralf > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From pav at iki.fi Wed Mar 29 15:23:28 2017 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 29 Mar 2017 21:23:28 +0200 Subject: [SciPy-Dev] Github account got deleted In-Reply-To: References: Message-ID: <87a2ca6b-6ca3-3dd5-4b21-a2895a10d0cb@iki.fi> 29.03.2017, 19:17, ashwin.pathak wrote: > I don't know how, but my GitHub account got deleted without my consent. I > have reported this issue to GitHub and I am waiting for their response. > Since the GSoC deadline is also fast approaching, can anyone suggest > alternatives for submitting my proposal, or help me get the account > back up and running as soon as possible? In addition to what Ralf wrote: The final proposal needs to be submitted according to the official Google SoC instructions. You don't need a Github account to do this, but be sure to read Google's instructions. -- Pauli Virtanen From ralf.gommers at gmail.com Thu Mar 30 06:24:34 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 30 Mar 2017 23:24:34 +1300 Subject: [SciPy-Dev] 'scipy.signal' : Issues/Comments (for GSoC 2017) In-Reply-To: References: Message-ID: On Thu, Mar 30, 2017 at 6:46 AM, Aman Pratik wrote: > > Hello developers, > > I am working on improving support and features for BSpline in > 'scipy.interpolate'. I would like to know about any issues or difficulties > you face when working with b-splines in 'scipy.signal'. If there are any > changes or modifications related to b-splines, specifically in > 'scipy.signal', that you would like to see, please comment on this thread. > I suspect that no one will respond to this question with useful feedback, because almost no one uses the current B-spline stuff in scipy.signal. I see that Evgeni's description on https://github.com/scipy/scipy/issues/6730 mentions signal, but I don't think it's very relevant. From this piece of text: "Cardinal splines (splines with equidistant knots). We need representation, evaluation and construction. scipy.signal has some. 
The task here is to have an interface which is both compatible with regular b-splines and piecewise polynomials and is useful for the signal processing folks." you can ignore the "scipy.signal has some". The rest seems valid. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From iavlaserg at gmail.com Thu Mar 30 16:59:36 2017 From: iavlaserg at gmail.com (Vladislav Iakovlev) Date: Thu, 30 Mar 2017 23:59:36 +0300 Subject: [SciPy-Dev] Fwd: GSoC Proposal: Circular statistics In-Reply-To: References: Message-ID: Hello, I have improved my proposal; it is now here: https://github.com/yakovlevvs/scipy/wiki/GSoC-proposal:-Implementation-of-circular-statistics If there are any shortcomings, please let me know. ---------- Forwarded message ---------- From: Vladislav Iakovlev Date: 2017-03-17 14:59 GMT+03:00 Subject: GSoC Proposal: Circular statistics To: scipy-dev at scipy.org Thanks to Josef and other colleagues for the productive discussion! It clarified a lot for me! This is the first iteration of my GSoC proposal. The goal of the project is to implement circular statistics functionality in scipy.stats. Motivation: There exist several Python packages related to this theme, but most of them have some shortcomings. Also, it will be very convenient to have this functionality in SciPy. It will simplify users' work because they will not have to search for and install other packages. What will be done: Base class rv_circular that will provide infrastructure for distribution characteristics. List of supposed characteristics: pdf, cdf, moments, mean, dispersion, standard deviation, percentiles, median, entropy, kurtosis, skewness. Derived classes for distributions. Methods for calculating values of characteristics can be redefined in these classes. List of supposed distributions: Uniform distribution, Triangular distribution, Cardioid distribution, Wrapped distributions: Cauchy, Normal, Von Mises. 
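To make the "wrapped distributions" item above concrete: a wrapped pdf is the linear pdf summed over shifts of 2*pi. A minimal sketch for a wrapped normal (an illustration of the idea only, not the proposed rv_circular API — the function name and the truncation at `n_wraps` terms are my own):

```python
import numpy as np
from scipy.stats import norm

def wrapped_normal_pdf(theta, mu=0.0, sigma=1.0, n_wraps=50):
    """pdf of a normal distribution wrapped onto the circle [0, 2*pi):
    sum the linear normal pdf over shifts of 2*pi*k, truncated at n_wraps."""
    theta = np.asarray(theta, dtype=float)[..., None]
    k = np.arange(-n_wraps, n_wraps + 1)
    return norm.pdf(theta + 2 * np.pi * k, loc=mu, scale=sigma).sum(axis=-1)

theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
p = wrapped_normal_pdf(theta, mu=np.pi, sigma=0.8)
# p is a valid density on the circle: non-negative and integrating to 1 over one period.
```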
Functions for statistical tests: Tests of uniformity and Goodness-of-Fit. Tests related to the Von Mises distribution: tests for mean and concentration parameters, multi-sample tests. Some non-parametric tests. I have developed a little "demo-version" of a circular stats toolbox: https://github.com/yakovlevvs/Circular-Statistics It is very raw and will be rewritten; I created this just to demonstrate my vision of how it will be arranged. Approximate timeline: April: Reading books, understanding related mathematics; May – June 20: Implementing the rv_circular class, documentation and tests for it; June 20 – July 10: Implementing distribution classes; July 10 – August 20: Implementing statistical tests and point estimations; Related literature: Kanti V. Mardia, Peter E. Jupp: Directional statistics, 2000. Jammalamadaka, S. Rao. Topics in circular statistics / S. Rao Jammalamadaka, A. SenGupta. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Fri Mar 31 03:18:02 2017 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 31 Mar 2017 20:18:02 +1300 Subject: [SciPy-Dev] Fwd: GSoC Proposal: Circular statistics In-Reply-To: References: Message-ID: Hi Vladislav, On Fri, Mar 31, 2017 at 9:59 AM, Vladislav Iakovlev wrote: > Hello, > > I have improved my proposal, now it is placed here: > https://github.com/yakovlevvs/scipy/wiki/GSoC-proposal:-Implementation-of-circular-statistics > The reasons you list for ignoring the three existing packages don't look convincing when I read the text. They're also all license-compatible. So I think you want to explain that your proposed implementation will have more functionality or have certain advantages (beyond being in SciPy), and how you can use those packages (for testing against, or taking some code from them, or ...?). Where you say "This is a short draft of the future functionality", please be clear what that is and is not (e.g. 
a quick and dirty prototype of the API, not meant to be a basis for future work); you don't want a proposal reviewer to judge that code on its quality. And a trivial but nevertheless important comment: please spell check your proposal before submission. week 1: "initiation method"? not clear what you mean, __init__, or an initial implementation? week 4: it's fine to take time out to prepare for and do your exams, but it's then expected that you make up that time - for example by starting the coding period earlier. Overall comments on the Planned functionality and Timeline sections: there are too many bullet points, it doesn't read well and there aren't many details. You may well understand all the details, but that won't be clear to a reviewer. You should insert a few details about key concepts. For example, for someone who doesn't know scipy.stats in depth, what rv_circular is supposed to be will be unclear. What is special about rv_* classes or what functionality do they provide? Cheers, Ralf > If there are any shortcomings, please, let me know. > > ---------- Forwarded message ---------- > From: Vladislav Iakovlev > Date: 2017-03-17 14:59 GMT+03:00 > Subject: GSoC Proposal: Circular statistics > To: scipy-dev at scipy.org > > > Thanks to Josef and other colleagues for the productive discussion! It > clarified a lot for me! > > This is the first iteration of my GSoC proposal. > > > > The goal of the project is to implement circular statistics functionality > in scipy.stats. > > > > Motivation: > > There exist several Python packages related to this theme, but most of > them have some shortcomings. Also, it will be very convenient to have this > functionality in SciPy. It will simplify users' work because they will not > have to search for and install other packages. > > > > What will be done: > > > > Base class rv_circular that will provide infrastructure for distribution > characteristics. 
List of supposed characteristics: > > pdf, cdf, moments, mean, dispersion, standard deviation, percentiles, > median, entropy, kurtosis, skewness. > > > > Derived classes for distributions. Methods for calculating values of > characteristics can be redefined in these classes. List of supposed > distributions: > > Uniform distribution, > > Triangular distribution, > > Cardioid distribution, > > Wrapped distributions: Cauchy, Normal, Von Mises. > > > > Functions for statistical tests: > > Tests of uniformity and Goodness-of-Fit. > > Tests related to the Von Mises distribution: tests for mean and concentration > parameters, multi-sample tests. > > Some non-parametric tests. > > > > I have developed a little "demo-version" of a circular stats toolbox: > > https://github.com/yakovlevvs/Circular-Statistics > > It is very raw and will be rewritten; I created this just to demonstrate > my vision of how it will be arranged. > > > > Approximate timeline: > > April: Reading books, understanding related mathematics; > > May – June 20: Implementing the rv_circular class, documentation and tests > for it; > > June 20 – July 10: Implementing distribution classes; > > July 10 – August 20: Implementing statistical tests and point > estimations; > > > > Related literature: > > Kanti V. Mardia, Peter E. Jupp: Directional statistics, 2000. > > Jammalamadaka, S. Rao. Topics in circular statistics / S. Rao > Jammalamadaka, A. SenGupta. > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From perimosocordiae at gmail.com Fri Mar 31 11:55:59 2017 From: perimosocordiae at gmail.com (CJ Carey) Date: Fri, 31 Mar 2017 11:55:59 -0400 Subject: [SciPy-Dev] Are you using Bento to build scipy? 
In-Reply-To: References: Message-ID: With the 1.0 release coming up, I figured this is a good time to take the pulse of the community regarding Bento. The project seems to have been abandoned before reaching stability, and it seems that the core problems it was trying to address have been at least partially fixed by the more official build systems. If you use Bento and would like to keep using it post 1.0, or if my previous sentence is wrong, please let me know! -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashwin.pathak at students.iiit.ac.in Fri Mar 31 18:23:25 2017 From: ashwin.pathak at students.iiit.ac.in (ashwin.pathak) Date: Sat, 01 Apr 2017 03:53:25 +0530 Subject: [SciPy-Dev] GSoC 2017 : Implementation of scipy.diff Message-ID: Hello developers, I have made the suggested changes to my proposal. Here is the link to the second iteration of my proposal. Please suggest further changes to the proposal. https://docs.google.com/document/d/1WQwpD4VU3cewBH99a_2-3CcTmhtbclgS6JR1lVXmcYA/edit?usp=sharing Thanks in advance
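As a technical footnote to the scipy.diff discussions earlier in this thread: the complex-step scheme Evgeni mentioned fits in a few lines. A minimal sketch for a scalar real-analytic function (`scipy.optimize._numdiff.approx_derivative` exposes a variant of the same idea for vector functions; the helper name below is my own):

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-20):
    """f'(x) ~= Im(f(x + i*h)) / h for a real-analytic f.

    There is no subtractive cancellation (unlike forward differences),
    so h can be taken tiny and the result is accurate to machine precision.
    """
    return np.imag(f(x + 1j * h)) / h

d = complex_step_derivative(np.sin, 1.0)   # essentially cos(1.0)
```

This is why complex-step differentiation is attractive for a scipy.diff module: it gives derivative accuracy near machine epsilon without the step-size tuning that plagues finite differences.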