From ralf.gommers at gmail.com Mon Jul 2 13:18:30 2018
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Mon, 2 Jul 2018 10:18:30 -0700
Subject: [SciPy-Dev] welcome Christoph to the core team
Message-ID:

Hi all,

On behalf of the SciPy developers I'd like to welcome Christoph Baumgarten as a member of the core dev team.

Christoph has been contributing to scipy.stats for half a year; among other things he has added several new distributions: https://github.com/scipy/scipy/pulls/chrisb83. He has also been quite active in maintenance & PR review.

I'm looking forward to his continued contributions!

Cheers,
Ralf

From warren.weckesser at gmail.com Mon Jul 2 13:39:45 2018
From: warren.weckesser at gmail.com (Warren Weckesser)
Date: Mon, 2 Jul 2018 13:39:45 -0400
Subject: [SciPy-Dev] welcome Christoph to the core team
In-Reply-To:
References:
Message-ID:

On Mon, Jul 2, 2018 at 1:18 PM, Ralf Gommers wrote:
[quoted text trimmed]

Welcome, Christoph! Thanks for the great work you've done so far.

Warren

_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at python.org
https://mail.python.org/mailman/listinfo/scipy-dev

From evgeny.burovskiy at gmail.com Mon Jul 2 13:48:26 2018
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Mon, 2 Jul 2018 20:48:26 +0300
Subject: [SciPy-Dev] welcome Christoph to the core team
In-Reply-To:
References:
Message-ID:

Welcome!
On Mon, Jul 2, 2018, 8:40 PM Warren Weckesser wrote:
[quoted text trimmed]

From dieter at werthmuller.org Mon Jul 2 14:03:36 2018
From: dieter at werthmuller.org (Dieter Werthmüller)
Date: Mon, 2 Jul 2018 13:03:36 -0500
Subject: [SciPy-Dev] np.any(), np.all()
Message-ID: <0e0563dd-fd67-fd51-cdae-0b9a0fcb8448@werthmuller.org>

Dear devs,

Sorry for posting a numpy issue on the scipy list. I am not (yet) subscribed to the numpy list, but I believe the two communities have a big enough overlap that it shouldn't matter too much where I post it.

I recently found that np.any(x) is very slow for big arrays, even when every element is non-zero and I only need a True/False answer. My thinking was that as soon as np.any(x) encounters the first non-zero value it should simply return True, which would take essentially no time at all when every element in the array is non-zero.

Searching for this, I found the following two old numpy issues:

1.
https://github.com/numpy/numpy/issues/2269

It is an issue from 2010, but it got some traffic again in 2016 and 2017.

2. https://github.com/numpy/numpy/issues/3446

This is a related issue from 2013, reporting a potential performance regression between numpy 1.6.2 and 1.7.0, which also got some traffic again in 2016.

I just wanted to ask for the opinions of devs more familiar with these two functions (np.all(), np.any()). Would there be better ways to check whether any element in a big (1D or higher-dimensional) array is non-zero (or the reverse with np.all)?

Thanks,
Dieter

From haberland at ucla.edu Mon Jul 2 14:18:30 2018
From: haberland at ucla.edu (Matt Haberland)
Date: Mon, 2 Jul 2018 11:18:30 -0700
Subject: [SciPy-Dev] welcome Christoph to the core team
In-Reply-To:
References:
Message-ID:

Hi Christoph!

On Mon, Jul 2, 2018 at 10:48 AM, Evgeni Burovski wrote:
[quoted text trimmed]

--
Matt Haberland
Assistant Adjunct Professor in the Program in Computing
Department of Mathematics
6617A Math Sciences Building, UCLA

From ilhanpolat at gmail.com Mon Jul 2 14:48:27 2018
From: ilhanpolat at gmail.com (Ilhan Polat)
Date: Mon, 2 Jul 2018 20:48:27 +0200
Subject: [SciPy-Dev] welcome Christoph to the core team
In-Reply-To:
References:
Message-ID:

Welcome Christoph.

On Mon, Jul 2, 2018 at 8:18 PM, Matt Haberland wrote:
[quoted text trimmed]

From insertinterestingnamehere at gmail.com Mon Jul 2 14:50:28 2018
From: insertinterestingnamehere at gmail.com (Ian Henriksen)
Date: Mon, 2 Jul 2018 13:50:28 -0500
Subject: [SciPy-Dev] welcome Christoph to the core team
In-Reply-To:
References:
Message-ID:

Welcome!

On Mon, Jul 2, 2018 at 1:48 PM Ilhan Polat wrote:
[quoted text trimmed]

From phillip.m.feldman at gmail.com Wed Jul 4 00:39:36 2018
From: phillip.m.feldman at gmail.com (Phillip Feldman)
Date: Tue, 3 Jul 2018 21:39:36 -0700
Subject: [SciPy-Dev] np.any(), np.all()
In-Reply-To: <0e0563dd-fd67-fd51-cdae-0b9a0fcb8448@werthmuller.org>
References: <0e0563dd-fd67-fd51-cdae-0b9a0fcb8448@werthmuller.org>
Message-ID:

Performing a test (to determine whether the loop should be aborted) after looking at each element would cause `np.any` to run slower if all elements are zero, or if the first non-zero element is near the end of the array. Possibly a separate method that operates in this fashion is needed?

Phillip

On Mon, Jul 2, 2018 at 11:03 AM, Dieter Werthmüller wrote:
[quoted text trimmed]
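A common workaround discussed in the linked numpy issues is to scan the array in chunks, so the scan can stop at the first chunk containing a non-zero element. A minimal illustrative sketch (the function name `chunked_any` and the chunk size are arbitrary choices, not from this thread):

```python
import numpy as np

def chunked_any(x, chunk_size=4096):
    """Return True as soon as a non-zero element is found.

    Scans the flattened array chunk by chunk, so arrays whose first
    non-zero element appears early are checked in near-constant time,
    at the cost of a small Python-loop overhead when all elements are
    zero (the worst case Phillip describes above).
    """
    flat = x.ravel()
    for start in range(0, flat.size, chunk_size):
        if flat[start:start + chunk_size].any():
            return True
    return False

x = np.ones(10_000_000)
print(chunked_any(x))   # stops after the first chunk
```

This is only a sketch of the short-circuit idea; the Cython/C approaches mentioned later in the thread avoid the chunking overhead entirely.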
From benny.malengier at gmail.com Wed Jul 4 03:43:26 2018
From: benny.malengier at gmail.com (Benny Malengier)
Date: Wed, 4 Jul 2018 09:43:26 +0200
Subject: [SciPy-Dev] np.any(), np.all()
In-Reply-To:
References: <0e0563dd-fd67-fd51-cdae-0b9a0fcb8448@werthmuller.org>
Message-ID:

There is an open issue for this since 2012: https://github.com/numpy/numpy/issues/2269

Some Python and C solutions are suggested in that thread.

On Wed, 4 Jul 2018 at 06:44, Phillip Feldman wrote:
[quoted text trimmed]

From benny.malengier at gmail.com Wed Jul 4 03:56:30 2018
From: benny.malengier at gmail.com (Benny Malengier)
Date: Wed, 4 Jul 2018 09:56:30 +0200
Subject: [SciPy-Dev] np.any(), np.all()
In-Reply-To:
References: <0e0563dd-fd67-fd51-cdae-0b9a0fcb8448@werthmuller.org>
Message-ID:

Sorry, I see you already mention that issue. Anyway, I would suggest going for a Cython or C workaround for now and commenting on the tickets. As Phillip says, a separate method would be needed, since any and all do a reduce. Something like firstnotvalue and firstvalue methods, returning the index of the first element that differs from, or matches, a given value, like https://pypi.org/project/py_find_1st/ mentioned in the ticket.

For multidimensional arrays, since this operates on 1D arrays, perhaps firstvalue could be applied per axis...

On Wed, 4 Jul 2018 at 09:43, Benny Malengier wrote:
[quoted text trimmed]

From dieter at werthmuller.org Wed Jul 4 08:36:24 2018
From: dieter at werthmuller.org (Dieter Werthmüller)
Date: Wed, 4 Jul 2018 07:36:24 -0500
Subject: [SciPy-Dev] np.any(), np.all()
In-Reply-To:
References: <0e0563dd-fd67-fd51-cdae-0b9a0fcb8448@werthmuller.org>
Message-ID:

Thanks Phillip and Benny for your replies.

Yes, I specifically saw the issue you mentioned, Benny. It is actually from 2010, and was moved to GitHub in 2012. This is why I thought I would quickly ask on the mailing list whether anything has happened or is planned in this regard, because I imagine that very old issues are easily forgotten.

It looks like this hasn't changed in the last 8 years, and probably won't in the (near) future. Which is fine; I just wanted to check before I search for other solutions.

Thanks,
Dieter

On 04/07/18 02:56, Benny Malengier wrote:
[quoted text trimmed]

From sivsankar977 at gmail.com Thu Jul 5 18:24:44 2018
From: sivsankar977 at gmail.com (Siva Sankar)
Date: Fri, 6 Jul 2018 01:24:44 +0300
Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy
Message-ID:

Hello,

Automatic differentiation of noisy data has not been very accurate, and there are not many algorithms that give consistent and accurate results.

I have added an algorithm that estimates the derivative of noisy data using linear Gaussian state-space smoothing and square-root formulas. Link to the pull request: https://github.com/scipy/scipy/pull/9004

The algorithm in its current state provides the smoothed signal from the noisy measurements and estimates of the first- and second-order derivatives, and can optionally also provide dense output. It accepts non-equally spaced data abscissas and is able to compute the state parameters between the abscissas, hence being able to provide the dense output.

The algorithm was tested with data from bioanalytics and provides equal or better accuracy of the derivatives compared to other automatic derivative algorithms.

Real-life readings and measurements of data in any field are prone to noise, making the usual differentiation algorithms less reliable. Having an algorithm like this can significantly help users from a variety of fields who need a good derivative estimate without having to tweak parameters to differentiate their data.
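The message describes the approach only at a high level. As an illustration of the general technique (not the PR's implementation), the sketch below fits a linear Gaussian state-space model with a constant-velocity (integrated Wiener process) prior, running a Kalman filter forward and a Rauch-Tung-Striebel smoother backward; the second state component is the derivative estimate. The function name, the two-state prior, and the hand-fixed noise parameters `q` and `r` are assumptions for this sketch — the PR uses square-root formulas and also estimates second derivatives.

```python
import numpy as np

def smooth_derivative(t, y, q=1.0, r=1.0):
    """Estimate a noisy signal and its first derivative with a linear
    Gaussian state-space smoother (Kalman filter + RTS smoother).

    Illustrative sketch only: standard covariance recursions, not the
    square-root formulas of the PR; q (process noise) and r
    (measurement noise variance) are fixed by hand.
    """
    n = len(y)
    m = np.zeros((n, 2))            # filtered means [signal, derivative]
    P = np.zeros((n, 2, 2))         # filtered covariances
    mk = np.array([float(y[0]), 0.0])
    Pk = np.eye(2)
    H = np.array([[1.0, 0.0]])      # we observe the signal, not the derivative
    for k in range(n):
        if k > 0:
            dt = t[k] - t[k - 1]    # non-equally spaced abscissas are fine
            A = np.array([[1.0, dt], [0.0, 1.0]])
            # process noise of the integrated Wiener process
            Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                              [dt**2 / 2, dt]])
            mk = A @ mk
            Pk = A @ Pk @ A.T + Q
        # measurement update
        S = float(H @ Pk @ H.T) + r
        K = (Pk @ H.T) / S          # Kalman gain, shape (2, 1)
        mk = mk + K[:, 0] * (y[k] - mk[0])
        Pk = Pk - K @ H @ Pk
        m[k], P[k] = mk, Pk
    # RTS backward smoothing pass
    ms = m.copy()
    for k in range(n - 2, -1, -1):
        dt = t[k + 1] - t[k]
        A = np.array([[1.0, dt], [0.0, 1.0]])
        Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
        Pp = A @ P[k] @ A.T + Q
        G = P[k] @ A.T @ np.linalg.inv(Pp)
        ms[k] = m[k] + G @ (ms[k + 1] - A @ m[k])
    return ms[:, 0], ms[:, 1]       # smoothed signal, first derivative

# Example: derivative of a noisy sine; deriv should roughly track cos(t).
t = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(0)
y = np.sin(t) + 0.05 * rng.standard_normal(t.size)
signal, deriv = smooth_derivative(t, y, q=1.0, r=0.05**2)
```

In practice the noise parameters would be estimated from the data (e.g. by maximizing the marginal likelihood) rather than fixed by hand.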
Best Regards,
Siva Sankar Kannan.

From ralf.gommers at gmail.com Fri Jul 6 01:43:06 2018
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Thu, 5 Jul 2018 22:43:06 -0700
Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy
In-Reply-To:
References:
Message-ID:

On Thu, Jul 5, 2018 at 3:24 PM, Siva Sankar wrote:
[quoted text trimmed]

Hi Siva, thank you for trying to improve the situation with differentiation functionality. In SciPy we aim to include algorithms that are well known and have good performance - ideally that means a paper with enough citations to show real-world value. Have you based this on a publication?

Cheers,
Ralf

From sivsankar977 at gmail.com Fri Jul 6 05:07:41 2018
From: sivsankar977 at gmail.com (Siva Sankar)
Date: Fri, 6 Jul 2018 12:07:41 +0300
Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy
In-Reply-To:
References:
Message-ID:

Hello,

Thanks for the swift response. Yes, I have based this on a paper by the professor I am working for. The paper is not yet published in a journal, but it is available on arXiv.

The algorithm was tested with real-world and synthetic data, comparing against other algorithms such as Woltring's B-spline (GCVSPL) and Savitzky-Golay with manual parameter tuning. The real-world test data included inertial data readings from an XSENS IMU, compared against those from a VectorNav IMU, which has significantly more accurate readings. The XSENS IMU, not being very accurate, simulated the noisy measurements, while the VectorNav unit's measurements were taken as the reference. The algorithm was then used to estimate the derivatives, and the results were compared with the reference values.

Best Regards,
Siva Sankar Kannan.

On Fri, Jul 6, 2018 at 8:43 AM Ralf Gommers wrote:
[quoted text trimmed]

From ralf.gommers at gmail.com Fri Jul 6 12:19:39 2018
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Fri, 6 Jul 2018 09:19:39 -0700
Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy
In-Reply-To:
References:
Message-ID:

On Fri, Jul 6, 2018 at 2:07 AM, Siva Sankar wrote:

> The paper, however, is not yet published in a Journal yet but is available in arXiv.

Can you share a link to the paper?

> The algorithm was tested with real-world data and synthetic data, comparing against other algorithms such as Woltring's B-Spline (GCVSPL) and Savitzky Golay with manual parameter tuning.

Your PR adds code to scipy.misc, which is not the right place (we'll get rid of misc soon). It sounds more like it belongs in scipy.signal. That said, don't do a lot of work to move things now - first we need to decide whether this makes sense to include in SciPy. At the moment I'd say it's too early; we probably want to wait a year or two until it's clear that the paper has been accepted and gets citations that show the algorithm is valuable.

Cheers,
Ralf

[rest of quoted text trimmed]
URL: From sivsankar977 at gmail.com Fri Jul 6 17:17:51 2018 From: sivsankar977 at gmail.com (Siva Sankar) Date: Sat, 7 Jul 2018 00:17:51 +0300 Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy In-Reply-To: References: Message-ID: The paper can be found here. https://arxiv.org/abs/1610.04397 Ok, I will wait before I do anything more. BR, Siva. On Fri, Jul 6, 2018 at 7:20 PM Ralf Gommers wrote: > On Fri, Jul 6, 2018 at 2:07 AM, Siva Sankar > wrote: > >> Hello, >> >> Thanks for the swift response. Yes, I have based this on a paper that the >> professor I am working for was working on. The paper, however, is not yet >> published in a Journal yet but is available in arXiv. >> > > Can you share a link to the paper? > > >> The algorithm was tested with real-world data and synthetic data, >> comparing against other algorithms such as Woltring's B-Spline (GCVSPL) and >> Savitzky Golay with manual parameter tuning. >> > > Your PR adds code to scipy.misc, which is not the right place (we'll get > rid of misc soon). It sounds more like it belongs in scipy.signal. That > said, don't do a lot of work to move things now - first we need to decide > whether this makes sense to include in SciPy. At the moment I'd say it's > too early; we probably want to wait a year or two until it's clear that the > paper has been accepted and gets citations that show the algorithm is > valuable. > > Cheers, > Ralf > > >> The real world test data included using an XSENS IMU to get inertial data >> readings and then comparing them against those from a VectorNAV imu which >> has significantly accurate readings. The XSENS IMU that was used not being >> very accurate simulated the noisy measurements, while the VectorNAV unit >> measurements were considered as the reference measurements. The algorithm >> was then used to estimate the derivates and then results were compared with >> the reference values. >> >> Best Regards, >> Siva Sankar Kannan. 
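One of the comparison baselines mentioned in this thread, Savitzky-Golay differentiation with hand-tuned parameters, is already available in SciPy as `scipy.signal.savgol_filter`. A minimal sketch of that baseline (on noiseless data so the result can be checked exactly; `window_length` and `polyorder` are the parameters that need manual tuning on real, noisy data):

```python
import numpy as np
from scipy.signal import savgol_filter

# Equally spaced samples of a smooth signal; in practice y would be noisy.
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
y = x**2  # true derivative is 2*x

# deriv=1 requests the first derivative; delta is the sample spacing.
# window_length and polyorder are the hand-tuned smoothing parameters.
dy = savgol_filter(y, window_length=11, polyorder=3, deriv=1, delta=dx)
```

For polynomial data of degree at most `polyorder` the filter reproduces the derivative exactly; with noisy measurements, the window length trades off noise suppression against bias, which is exactly the tuning burden the proposed state-space algorithm aims to avoid.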
>> >> On Fri, Jul 6, 2018 at 8:43 AM Ralf Gommers >> wrote: >> >>> On Thu, Jul 5, 2018 at 3:24 PM, Siva Sankar >>> wrote: >>> >>>> Hello, >>>> >>>> Automatic differentiation of noisy data has not been very accurate and >>>> does not have a lot of different algorithms that give consistent and >>>> accurate results. >>>> >>>> I have added an algorithm that estimates the derivative of noisy data >>>> using linear Gaussian state-space smoothing and square root formulas. >>>> >>>> Link to the pull request. https://github.com/scipy/scipy/pull/9004 >>>> >>>> The algorithm in its current state provides the smoothed signal from >>>> the noisy measurements, estimation of the first and second order >>>> derivatives and optionally also provides the dense output for the same. It >>>> accepts non-equally spaced data abscissas and is able to compute the state >>>> parameters between the abscissas, hence being able to provide the dense >>>> output. >>>> >>>> The algorithm was tested with data from bioanalytics and provides equal >>>> or better accuracy of the derivatives compared to the other automatic >>>> derivative algorithms. >>>> >>>> Real life readings and measurements of data in any field are prone to >>>> noise, thereby making the normal differentiation algorithms less reliable. >>>> Having an algorithm like this can significantly help users from a variety >>>> of fields where they need a good differentiation estimation without having >>>> to tweak with the parameters to differentiate data. >>>> >>> >>> Hi Siva, thank you for trying to improve the situation with >>> differentiation functionality. In SciPy we aim to include algorithms that >>> are well known and have good performance - that means ideally a paper with >>> enough citations showing real-world value. Have you based this on a >>> publication? 
>>> >>> Cheers, >>> Ralf >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Jul 7 12:55:47 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 7 Jul 2018 09:55:47 -0700 Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy In-Reply-To: References: Message-ID: On Fri, Jul 6, 2018 at 2:17 PM, Siva Sankar wrote: > The paper can be found here. https://arxiv.org/abs/1610.04397 > Thanks Siva. Given that that paper is quite technical and terse, that it's 2 years old with only one self-citation, and that it doesn't fit nicely in a scipy submodule, my opinion is that we unfortunately cannot accept your PR. Please don't let that discourage you though - I would suggest releasing your algorithm standalone - if it matures and gets uptake we can reconsider at some point. Cheers, Ralf > > Ok, I will wait before I do anything more. > > BR, Siva. > > On Fri, Jul 6, 2018 at 7:20 PM Ralf Gommers > wrote: >> On Fri, Jul 6, 2018 at 2:07 AM, Siva Sankar >> wrote: >>> Hello, >>> >>> Thanks for the swift response. Yes, I have based this on a paper that >>> the professor I am working for was working on. The paper, however, is not >>> yet published in a journal but is available on arXiv. >>> >> >> Can you share a link to the paper?
>> >> >>> The algorithm was tested with real-world data and synthetic data, >>> comparing against other algorithms such as Woltring's B-Spline (GCVSPL) and >>> Savitzky Golay with manual parameter tuning. >>> >> >> Your PR adds code to scipy.misc, which is not the right place (we'll get >> rid of misc soon). It sounds more like it belongs in scipy.signal. That >> said, don't do a lot of work to move things now - first we need to decide >> whether this makes sense to include in SciPy. At the moment I'd say it's >> too early; we probably want to wait a year or two until it's clear that the >> paper has been accepted and gets citations that show the algorithm is >> valuable. >> >> Cheers, >> Ralf >> >> >>> The real world test data included using an XSENS IMU to get inertial >>> data readings and then comparing them against those from a VectorNAV imu >>> which has significantly accurate readings. The XSENS IMU that was used not >>> being very accurate simulated the noisy measurements, while the VectorNAV >>> unit measurements were considered as the reference measurements. The >>> algorithm was then used to estimate the derivates and then results were >>> compared with the reference values. >>> >>> Best Regards, >>> Siva Sankar Kannan. >>> >>> On Fri, Jul 6, 2018 at 8:43 AM Ralf Gommers >>> wrote: >>> >>>> On Thu, Jul 5, 2018 at 3:24 PM, Siva Sankar >>>> wrote: >>>> >>>>> Hello, >>>>> >>>>> Automatic differentiation of noisy data has not been very accurate and >>>>> does not have a lot of different algorithms that give consistent and >>>>> accurate results. >>>>> >>>>> I have added an algorithm that estimates the derivative of noisy data >>>>> using linear Gaussian state-space smoothing and square root formulas. >>>>> >>>>> Link to the pull request. 
https://github.com/scipy/scipy/pull/9004 >>>>> >>>>> The algorithm in its current state provides the smoothed signal from >>>>> the noisy measurements, estimation of the first and second order >>>>> derivatives and optionally also provides the dense output for the same. It >>>>> accepts non-equally spaced data abscissas and is able to compute the state >>>>> parameters between the abscissas, hence being able to provide the dense >>>>> output. >>>>> >>>>> The algorithm was tested with data from bioanalytics and provides >>>>> equal or better accuracy of the derivatives compared to the other automatic >>>>> derivative algorithms. >>>>> >>>>> Real life readings and measurements of data in any field are prone to >>>>> noise, thereby making the normal differentiation algorithms less reliable. >>>>> Having an algorithm like this can significantly help users from a variety >>>>> of fields where they need a good differentiation estimation without having >>>>> to tweak with the parameters to differentiate data. >>>>> >>>> >>>> Hi Siva, thank you for trying to improve the situation with >>>> differentiation functionality. In SciPy we aim to include algorithms that >>>> are well known and have good performance - that means ideally a paper with >>>> enough citations showing real-world value. Have you based this on a >>>> publication? 
>>>> >>>> Cheers, >>>> Ralf >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >>> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sivsankar977 at gmail.com Sun Jul 8 08:14:05 2018 From: sivsankar977 at gmail.com (Siva Sankar) Date: Sun, 8 Jul 2018 15:14:05 +0300 Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy In-Reply-To: References: Message-ID: Ok, thanks for the response. I will release it as a standalone algorithm somewhere. BR, Siva. On Sat, Jul 7, 2018 at 7:56 PM Ralf Gommers wrote: > On Fri, Jul 6, 2018 at 2:17 PM, Siva Sankar > wrote: > >> The paper can be found here. https://arxiv.org/abs/1610.04397 >> > > Thanks Siva. Given that that paper is quite technical and terse, that it's > 2 years old with only one self-citation, and that it doesn't fit nicely in > a scipy submodule, my opinion is that we unfortunately cannot accept your > PR. Please don't let that discourage you though - I would suggest to > release your algorithm standalone - if it matures and gets update we can > reconsider at some point. > > Cheers, > Ralf > > >> >> Ok, I will wait before I do anything more. >> >> BR, Siva. >> >> On Fri, Jul 6, 2018 at 7:20 PM Ralf Gommers >> wrote: >> >>> On Fri, Jul 6, 2018 at 2:07 AM, Siva Sankar >>> wrote: >>> >>>> Hello, >>>> >>>> Thanks for the swift response. 
Yes, I have based this on a paper that >>>> the professor I am working for was working on. The paper, however, is not >>>> yet published in a Journal yet but is available in arXiv. >>>> >>> >>> Can you share a link to the paper? >>> >>> >>>> The algorithm was tested with real-world data and synthetic data, >>>> comparing against other algorithms such as Woltring's B-Spline (GCVSPL) and >>>> Savitzky Golay with manual parameter tuning. >>>> >>> >>> Your PR adds code to scipy.misc, which is not the right place (we'll get >>> rid of misc soon). It sounds more like it belongs in scipy.signal. That >>> said, don't do a lot of work to move things now - first we need to decide >>> whether this makes sense to include in SciPy. At the moment I'd say it's >>> too early; we probably want to wait a year or two until it's clear that the >>> paper has been accepted and gets citations that show the algorithm is >>> valuable. >>> >>> Cheers, >>> Ralf >>> >>> >>>> The real world test data included using an XSENS IMU to get inertial >>>> data readings and then comparing them against those from a VectorNAV imu >>>> which has significantly accurate readings. The XSENS IMU that was used not >>>> being very accurate simulated the noisy measurements, while the VectorNAV >>>> unit measurements were considered as the reference measurements. The >>>> algorithm was then used to estimate the derivates and then results were >>>> compared with the reference values. >>>> >>>> Best Regards, >>>> Siva Sankar Kannan. >>>> >>>> On Fri, Jul 6, 2018 at 8:43 AM Ralf Gommers >>>> wrote: >>>> >>>>> On Thu, Jul 5, 2018 at 3:24 PM, Siva Sankar >>>>> wrote: >>>>> >>>>>> Hello, >>>>>> >>>>>> Automatic differentiation of noisy data has not been very accurate >>>>>> and does not have a lot of different algorithms that give consistent and >>>>>> accurate results. 
>>>>>> >>>>>> I have added an algorithm that estimates the derivative of noisy data >>>>>> using linear Gaussian state-space smoothing and square root formulas. >>>>>> >>>>>> Link to the pull request. https://github.com/scipy/scipy/pull/9004 >>>>>> >>>>>> The algorithm in its current state provides the smoothed signal from >>>>>> the noisy measurements, estimation of the first and second order >>>>>> derivatives and optionally also provides the dense output for the same. It >>>>>> accepts non-equally spaced data abscissas and is able to compute the state >>>>>> parameters between the abscissas, hence being able to provide the dense >>>>>> output. >>>>>> >>>>>> The algorithm was tested with data from bioanalytics and provides >>>>>> equal or better accuracy of the derivatives compared to the other automatic >>>>>> derivative algorithms. >>>>>> >>>>>> Real life readings and measurements of data in any field are prone to >>>>>> noise, thereby making the normal differentiation algorithms less reliable. >>>>>> Having an algorithm like this can significantly help users from a variety >>>>>> of fields where they need a good differentiation estimation without having >>>>>> to tweak with the parameters to differentiate data. >>>>>> >>>>> >>>>> Hi Siva, thank you for trying to improve the situation with >>>>> differentiation functionality. In SciPy we aim to include algorithms that >>>>> are well known and have good performance - that means ideally a paper with >>>>> enough citations showing real-world value. Have you based this on a >>>>> publication? 
>>>>> >>>>> Cheers, >>>>> Ralf >>>>> >>>>> _______________________________________________ >>>>> SciPy-Dev mailing list >>>>> SciPy-Dev at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-dev >>>> >>>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at python.org >>> https://mail.python.org/mailman/listinfo/scipy-dev >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jjstickel at gmail.com Sun Jul 8 16:36:58 2018 From: jjstickel at gmail.com (Jonathan Stickel) Date: Sun, 8 Jul 2018 14:36:58 -0600 Subject: [SciPy-Dev] Adding algorithm for automatic differentiation of noisy data to SciPy In-Reply-To: References: Message-ID: <2ca7da43-2a34-a7f6-55fd-ee3df9fb6967@gmail.com> Siva Some years ago I created "scikit-datasmooth": https://github.com/jjstickel/scikit-datasmooth It has contained smoothing (and differentiation) by the regularization method, but I always hoped it would generate interest and that more methods would be added. You would be welcome to add your method there. We can discuss off list. Regards, Jonathan On 7/8/18 06:14 , Siva Sankar wrote: > Ok, thanks for the response. I will release it as a standalone algorithm > somewhere. > > BR, > Siva. > > On Sat, Jul 7, 2018 at 7:56 PM Ralf Gommers > wrote: > > On Fri, Jul 6, 2018 at 2:17 PM, Siva Sankar > wrote: > > The paper can be found here. https://arxiv.org/abs/1610.04397 > > > Thanks Siva. 
Given that that paper is quite technical and terse, > that it's 2 years old with only one self-citation, and that it > doesn't fit nicely in a scipy submodule, my opinion is that we > unfortunately cannot accept your PR. Please don't let that > discourage you though - I would suggest to release your algorithm > standalone - if it matures and gets update we can reconsider at some > point. > > Cheers, > Ralf > > > > Ok, I will wait before I do anything more. > > BR, Siva. > > On Fri, Jul 6, 2018 at 7:20 PM Ralf Gommers > > wrote: > > On Fri, Jul 6, 2018 at 2:07 AM, Siva Sankar > > wrote: > > Hello, > > Thanks for the swift response. Yes, I have based this on > a paper that the professor I am working for was working > on. The paper, however, is not yet published in a > Journal yet but is available in arXiv. > > > Can you share a link to the paper? > > The algorithm was tested with real-world data and > synthetic data, comparing against other algorithms such > as Woltring's B-Spline (GCVSPL) and Savitzky Golay with > manual parameter tuning. > > > Your PR adds code to scipy.misc, which is not the right > place (we'll get rid of misc soon). It sounds more like it > belongs in scipy.signal. That said, don't do a lot of work > to move things now - first we need to decide whether this > makes sense to include in SciPy. At the moment I'd say it's > too early; we probably want to wait a year or two until it's > clear that the paper has been accepted and gets citations > that show the algorithm is valuable. > > Cheers, > Ralf > > > The real world test data included using an XSENS IMU to > get inertial data readings and then comparing them > against those from a VectorNAV imu which has > significantly accurate readings. The XSENS IMU that was > used not being very accurate simulated the noisy > measurements, while the VectorNAV unit measurements were > considered as the reference measurements. 
The algorithm > was then used to estimate the derivates and then results > were compared with the reference values. > > Best Regards, > Siva Sankar Kannan. > > On Fri, Jul 6, 2018 at 8:43 AM Ralf Gommers > > > wrote: > > On Thu, Jul 5, 2018 at 3:24 PM, Siva Sankar > > wrote: > > Hello, > > Automatic differentiation of noisy data has not > been very accurate and does not have a lot of > different algorithms that give consistent and > accurate results. > > I have added an algorithm that estimates the > derivative of noisy data using linear Gaussian > state-space smoothing and square root formulas. > > Link to the?pull request. > https://github.com/scipy/scipy/pull/9004 > > The algorithm in its current state provides the > smoothed signal from the noisy measurements, > estimation of the first and second order > derivatives and optionally also provides the > dense output for the same. It accepts > non-equally spaced data abscissas and is able to > compute the state parameters between the > abscissas, hence being able to provide the dense > output. > > The algorithm was tested with data from > bioanalytics and provides equal or better > accuracy of the derivatives compared to the > other automatic derivative algorithms. > > Real life readings and measurements of data in > any field are prone to noise, thereby making the > normal differentiation algorithms less reliable. > Having an algorithm like this can significantly > help users from a variety of fields where they > need a good differentiation estimation without > having to tweak with the parameters to > differentiate data. > > > Hi Siva, thank you for trying to improve the > situation with differentiation functionality. In > SciPy we aim to include algorithms that are well > known and have good performance - that means ideally > a paper with enough citations showing real-world > value. Have you based this on a publication? 
> > Cheers, > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > From charlesr.harris at gmail.com Mon Jul 9 18:08:12 2018 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 9 Jul 2018 16:08:12 -0600 Subject: [SciPy-Dev] NumPy 1.15.0rc2 released. Message-ID: Hi All, On behalf of the NumPy team I'm pleased to announce the release of NumPy 1.15.0rc2. This release has an unusual number of cleanups, many deprecations of old functions, and improvements to many existing functions. A total of 435 pull requests were merged for this release; please see the release notes for details. Some highlights are: - NumPy has switched to pytest for testing. - A new `numpy.printoptions` context manager. - Many improvements to the histogram functions. - Support for unicode field names in Python 2.7. - Improved support for PyPy. - Fixes and improvements to `numpy.einsum`. The Python versions supported by this release are 2.7, 3.4-3.7. The wheels are linked with OpenBLAS v0.3.0, which should fix some of the linalg problems reported for NumPy 1.14.
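As a quick illustration of one of the highlights, the new `numpy.printoptions` context manager temporarily overrides the array print settings and restores the previous ones on exit:

```python
import numpy as np

a = np.array([0.123456789, 1e-8])

# Inside the context the overridden options apply: floats are rounded
# to 3 decimals and very small values are suppressed to zero display.
with np.printoptions(precision=3, suppress=True):
    print(a)

# On exit the previous settings (precision=8 by default) are restored.
print(np.get_printoptions()['precision'])
```

This avoids the usual `set_printoptions` / save-and-restore dance when you only want different formatting for one block of output.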
Wheels for this release can be downloaded from PyPI; source archives are available from GitHub. A total of 131 people contributed to this release. People with a "+" by their names contributed a patch for the first time. - Aaron Critchley + - Aarthi + - Aarthi Agurusa + - Alex Thomas + - Alexander Belopolsky - Allan Haldane - Anas Khan + - Andras Deak - Andrey Portnoy + - Anna Chiara - Aurelien Jarno + - Baurzhan Muftakhidinov - Berend Kapelle + - Bernhard M. Wiedemann - Bjoern Thiel + - Bob Eldering - Cenny Wenner + - Charles Harris - ChloeColeongco + - Chris Billington + - Christopher + - Chun-Wei Yuan + - Claudio Freire + - Daniel Smith - Darcy Meyer + - David Abdurachmanov + - David Freese - Deepak Kumar Gouda + - Dennis Weyland + - Derrick Williams + - Dmitriy Shalyga + - Eric Cousineau + - Eric Larson - Eric Wieser - Evgeni Burovski - Frederick Lefebvre + - Gaspar Karm + - Geoffrey Irving - Gerhard Hobler + - Gerrit Holl - Guo Ci + - Hameer Abbasi + - Han Shen - Hiroyuki V. Yamazaki + - Hong Xu - Ihor Melnyk + - Jaime Fernandez - Jake VanderPlas + - James Tocknell + - Jarrod Millman - Jeff VanOss + - John Kirkham - Jonas Rauber + - Jonathan March + - Joseph Fox-Rabinovitz - Julian Taylor - Junjie Bai + - Juris Bogusevs + - Jörg Döpfert - Kenichi Maehashi + - Kevin Sheppard - Kimikazu Kato + - Kirit Thadaka + - Kritika Jalan + - Lakshay Garg + - Lars G + - Licht Takeuchi - Louis Potok + - Luke Zoltan Kelley - MSeifert04 + - Mads R. B. Kristensen + - Malcolm Smith + - Mark Harfouche + - Marten H. van Kerkwijk + - Marten van Kerkwijk - Matheus Vieira Portela + - Mathieu Lamarre - Mathieu Sornay + - Matthew Brett - Matthew Rocklin + - Matthias Bussonnier - Matti Picus - Michael Droettboom - Miguel Sánchez de León Peque + - Mike Toews + - Milo + - Nathaniel J.
Smith - Nelle Varoquaux - Nicholas Nadeau, P.Eng., AVS + - Nick Minkyu Lee + - Nikita + - Nikita Kartashov + - Nils Becker + - Oleg Zabluda - Orestis Floros + - Pat Gunn + - Paul van Mulbregt + - Pauli Virtanen - Pierre Chanial + - Ralf Gommers - Raunak Shah + - Robert Kern - Russell Keith-Magee + - Ryan Soklaski + - Samuel Jackson + - Sebastian Berg - Siavash Eliasi + - Simon Conseil - Simon Gibbons - Stefan Krah + - Stefan van der Walt - Stephan Hoyer - Subhendu + - Subhendu Ranjan Mishra + - Tai-Lin Wu + - Tobias Fischer + - Toshiki Kataoka + - Tyler Reddy + - Unknown + - Varun Nayyar - Victor Rodriguez + - Warren Weckesser - William D. Irons + - Zane Bradley + - fo40225 + - lapack_lite code generator + - lumbric + - luzpaz + - mamrehn + - tynn + - xoviat Cheers Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From sseibert at anaconda.com Mon Jul 9 21:56:28 2018 From: sseibert at anaconda.com (Stanley Seibert) Date: Mon, 9 Jul 2018 20:56:28 -0500 Subject: [SciPy-Dev] Recent Numba updates of interest Message-ID: Hi all, Following up on the conversation in March about Numba's progress toward being a dependency for SciPy-related projects, I wanted to give an update on some recent progress on items of interest. Some of these items will not be available until Numba 0.39, which is tagged now and will be officially released in the next day or so: - ppc64le (POWER8 and 9) is now supported with the release of LLVM 6.0.1. (This may be of interest to those of you applying for time on the Summit and Sierra supercomputers in the US.) Since there is no wheel standard for ppc64le, I would recommend using the conda packages in the numba channel on anaconda.org. - Testing Numba's ARMv7 (Raspberry Pi) support flushed out some bugs in NumPy on ARM (related to alignment) and OpenBLAS. NumPy 1.16 should have the fixes that Numba needs to pass its unit tests on ARM.
Once that happens, we should have some Berryconda-compatible conda packages in the numba channel. - We've made a variety of changes to compiler error messages to hide giant tracebacks, provide suggestions and links to relevant documentation, and optionally colorize errors similar to gcc. There's much more to do here, and we'll be adding more heuristics to make error messages more helpful. - Thanks to Matthias Bussonnier, you can now see an HTML annotated version of your function in a Jupyter notebook with the `inspect_types(pretty=True)` method. We plan to use this as the core of an improved Numba annotation feature that will give you more info about how your function was optimized (or not) by ParallelAccelerator and LLVM in the future. - We have an open PR for support for accessing the attributes of a scipy.LowLevelCallable (https://github.com/numba/numba/pull/2999) in nopython mode, but we're not sure we understand how it is being used in practice. Anyone who can comment on that PR with example code would be much appreciated. - There is now a documentation page on how to use and combine the various performance features in Numba: https://numba.pydata.org/numba-doc/dev/user/performance-tips.html Those of you at SciPy 2018 this week can see Siu's 3-minute talk on Numba in the Tools plenary session on Thursday, where he will speedrun all the other Numba improvements that have happened lately. -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Tue Jul 10 05:59:17 2018 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 10 Jul 2018 11:59:17 +0200 Subject: [SciPy-Dev] Recent Numba updates of interest In-Reply-To: References: Message-ID: On Mon, 2018-07-09 at 20:56 -0500, Stanley Seibert wrote: [clip] > - We have an open PR for support for accessing the attributes of a > scipy.LowLevelCallable (https://github.com/numba/numba/pull/2999) in > nopython mode, but we're not sure we understand how it is being used > in practice.
Anyone who can comment on that PR with example code > would be much appreciated. The discussion seems somewhat unclear on the use case of LowLevelCallable. What it does may become more clear by reading https://github.com/scipy/scipy/blob/master/scipy/_lib/src/ccallback.h The point is to provide a lowest-common-denominator compatibility wrapper for several Python FFI libraries plus Cython (with possibility to avoid ctypes/cffi speed costs), for shipping around callable function pointers that have fixed signatures from a small set, for use also from hand-written C without code generation. (The last point means some things make less sense in the Numba context.) There's no intended way to call a generic LowLevelCallable. You are supposed to cast it to a function pointer with signature(s) you know beforehand (and you can check if the LowLevelCallable in question satisfies the signature). Going the other direction of trying to dynamically find the function pointer signature given a LowLevelCallable is not in the design. What Numba in principle might do is to provide a function that casts a LowLevelCallable to a function pointer(s) with given signature(s), checking the signature (which is intended to contain C type names, with integer types usually not unique). The converse operation of casting a numba.cfunc to a LowLevelCallable works currently via the .ctypes attribute. In principle, scipy.LowLevelCallable could be made to understand numba.cfunc directly (PR welcome). 
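As a concrete sketch of the pattern described above - the caller wraps a function pointer whose signature it already knows, and hands it to a consumer that expects that signature - here is ctypes-based usage with `scipy.integrate.quad` (the system libm lookup is an assumption about the host machine):

```python
import ctypes
import ctypes.util
from scipy import LowLevelCallable, integrate

# Load the C math library and declare the exact signature of sin:
# double sin(double). The caller must know the signature up front;
# there is no way to query it from a generic LowLevelCallable.
libm = ctypes.CDLL(ctypes.util.find_library('m'))  # assumes a system libm
libm.sin.restype = ctypes.c_double
libm.sin.argtypes = (ctypes.c_double,)

# Wrap the function pointer; quad accepts the double (double) signature.
llc = LowLevelCallable(libm.sin)

val, err = integrate.quad(llc, 0.0, 3.141592653589793)
print(val)  # integral of sin over [0, pi], i.e. close to 2.0
```

The numba route works the same way: as noted above, a `numba.cfunc` exposes a ctypes wrapper through its `.ctypes` attribute, which can be passed to `LowLevelCallable` just like `libm.sin` here.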
Pauli From pav at iki.fi Tue Jul 10 06:08:51 2018 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 10 Jul 2018 12:08:51 +0200 Subject: [SciPy-Dev] Recent Numba updates of interest In-Reply-To: References: Message-ID: <49aa131e4ebbe28717e17d23d274b461a826105a.camel@iki.fi> On Tue, 2018-07-10 at 11:59 +0200, Pauli Virtanen wrote: > On Mon, 2018-07-09 at 20:56 -0500, Stanley Seibert wrote: > [clip] > > - We have an open PR for support for accessing the attributes of a > > scipy.LowLevelCallable (https://github.com/numba/numba/pull/2999) > > in > > nopython mode, but we're not sure we understand how it is being > > used > > in practice. Anyone who can comment on that PR with example code > > would be much appreciated. > > The discussion seems somewhat unclear on the use case of > LowLevelCallable. What it does may become more clear by reading > https://github.com/scipy/scipy/blob/master/scipy/_lib/src/ccallback.h Note also that the "function" attribute of LowLevelCallable exists mainly for object life cycle management --- cffi/ctypes/Cython inputs are normalized to a function pointer + signature in a PyCapsule, and the C code only operates on that. Pauli From ralf.gommers at gmail.com Fri Jul 13 19:31:35 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Fri, 13 Jul 2018 18:31:35 -0500 Subject: [SciPy-Dev] sprint at SciPy'18 tomorrow (14 July) Message-ID: Hi all, This is a bit last minute: we've decided to participate in the SciPy'18 sprints that are happening this weekend. We'll share a room with NumPy and other projects. I'll be around on Saturday 9am-2pm to get people started and help out as needed. If anyone has bandwidth to contribute remotely (e.g. PR review, commenting on issues/emails), that could be quite useful too.
I've written a summary for how to get started if you're new to the project or to sprints or open source development here: https://github.com/scipy/scipy/issues/9030 Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.vanmulbregt at comcast.net Sun Jul 15 23:24:31 2018 From: p.vanmulbregt at comcast.net (Paul van Mulbregt) Date: Sun, 15 Jul 2018 23:24:31 -0400 Subject: [SciPy-Dev] Adding TOMS748 Algorithm for 1-d root-solving. Message-ID: <7103E143-1E3B-4341-80E1-F0C0F5B96170@comcast.net> Hi all, I propose adding Algorithm 748 of Alefeld, Potra and Shi [APS1995] for root-finding in 1 dimension. The algorithm is also referred to as TOMS748. TOMS748 uses a mixture of inverse cubic interpolation and Newton-quadratic steps to find a root of a function inside an interval. For functions with 4 continuous derivatives, the asymptotic efficiency index is about 1.65, making it converge more quickly than Brent's method, and it may even converge in situations where Brent's method has difficulty. This would be a new function toms748(f, a, b, ...) in scipy.optimize, sitting alongside optimize.{bisect,brentq,brenth,ridder,newton}. External to SciPy, this algorithm is used by Boost as its default 1-d root-finding algorithm. (See https://www.boost.org/doc/libs/1_63_0/libs/math/doc/html/math_toolkit/roots/roots_noderiv/bracket_solve.html) The relevant PR is https://github.com/scipy/scipy/pull/8876, which also includes the addition of 15 test families used in the [APS1995] paper, and two additional test families defined on the complex plane. Reference .. [APS1995] Alefeld, G. E. and Potra, F. A. and Shi, Yixun, *Algorithm 748: Enclosing Zeros of Continuous Functions*, ACM Trans. Math. Softw. 21(3), 1995. doi:10.1145/210089.210111 -Paul -------------- next part -------------- An HTML attachment was scrubbed...
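Assuming the PR is merged with the signature proposed in the message above, usage would mirror the existing bracketing solvers; a small sketch finding the real root of x^3 - 1 on the bracket [0.5, 2]:

```python
from scipy.optimize import toms748  # proposed addition (PR 8876)

def f(x):
    """Cubic with a simple real root at x = 1."""
    return x**3 - 1.0

# As with bisect/brentq, the bracket [a, b] must satisfy f(a)*f(b) < 0.
root = toms748(f, 0.5, 2.0)
print(root)  # close to 1.0
```

Like brentq, it would raise if the bracket does not straddle a sign change, so callers keep the same error handling as with the existing solvers.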
URL: From ralf.gommers at gmail.com Mon Jul 16 02:13:12 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 16 Jul 2018 01:13:12 -0500 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> Message-ID: On Wed, Apr 5, 2017 at 3:18 PM, Ralf Gommers wrote: > > > On Thu, Apr 6, 2017 at 6:27 AM, Stefan van der Walt > wrote: > >> On Wed, Apr 5, 2017, at 04:05, Evgeni Burovski wrote: >> > > I am using it all the time, but I suspect I'm the only one. Bento is >> way >> > >> > I seem to remember Stefan van der Walt saying the same thing :-). >> >> So, I guess there are two of us ;) >> >> Ralf wrote: >> >> > But it is hard to install because it depends on a specific version of >> Waf (which is not on PyPI) and has some unresolved issues on Python 3.x. >> >> How hard would it be to vendor Waf into a pip package? If this worked >> out of the box, it would be great for development (repeated builds execute >> much faster). >> > > You mean vendor Waf in Bento and make a Bento pip-installable PyPI package > right? David and I discussed that already 2 years ago, that'd be the way > to go. Shouldn't be too hard, Waf is designed to be vendored. > > What's a little more work is fixing some obvious issues: it doesn't work > on Python 3.x right now, there was a recurring issue with -fPIC going > missing, finish the open PR on producing wheels. > Okay, these fixes are probably not going to materialize. And I've finally ditched Python 2.7 for good a few months ago, and have dealt with my distutils aversion by getting better hardware. I haven't maintained the Bento build over the last few months, so it's not quite working anymore. Therefore I now propose to remove Bento support for v1.2.0. Related: scikit-build seems to have quite a bit of momentum.
I've never used CMake before, and while I know opinions on CMake aren't 100% positive, scikit-build looks like a significant improvement over distutils. I think we should try to integrate it at some point in the coming year. Thoughts? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Mon Jul 16 02:31:23 2018 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 16 Jul 2018 16:31:23 +1000 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> Message-ID: > Related: scikit-build seems to have quite a bit of momentum. I've never > used CMake before, and while I know opinions on CMake aren't 100% positive, > scikit-build looks like a significant improvement over distutils. I think > we should try to integrate it at some point in the coming year. > It looks like you have to have CMake installed though ( http://scikit-build.readthedocs.io/en/latest/installation.html). This is a hurdle in itself. -------------- next part -------------- An HTML attachment was scrubbed... URL: From einstein.edison at gmail.com Mon Jul 16 09:49:16 2018 From: einstein.edison at gmail.com (Hameer Abbasi) Date: Mon, 16 Jul 2018 06:49:16 -0700 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> Message-ID: Hi! I'll be honest. One thing that has prevented me from getting into the C/C++ meat of NumPy/SciPy is exactly what scikit-build is trying to solve. I'm all for extra dev dependencies if they make development easier. Best Regards, Hameer Abbasi Sent from Astro for iOS On 16.
Jul 2018 at 01:13, Ralf Gommers wrote: On Wed, Apr 5, 2017 at 3:18 PM, Ralf Gommers wrote: > > > On Thu, Apr 6, 2017 at 6:27 AM, Stefan van der Walt > wrote: > >> On Wed, Apr 5, 2017, at 04:05, Evgeni Burovski wrote: >> > > I am using it all the time, but I suspect I'm the only one. Bento is >> way >> > >> > I seem to remember Stefan van der Walt saying the same thing :-). >> >> So, I guess there are two of us ;) >> >> Ralf wrote: >> >> > But it is hard to install because it depends on a specific version of >> Waf (which is not on PyPI) and has some unresolved issues on Python 3.x. >> >> How hard would it be to vendor Waf into a pip package? If this worked >> out the box, it would be great to development (repeated builds execute >> much faster). >> > > You mean vendor Waf in Bento and make a Bento pip-installable PyPI package > right ? David and I discussed that already 2 years ago, that'd be the way > to go. Shouldn't be too hard, Waf is designed to be vendored. > > What's a little more work is fixing some obvious issues: it doesn't work > on Python 3.x right now, there was a recurring issue with -fPIC going > missing, finish the open PR on producing wheels. > Okay, these fixes are probably not going to materialize. And I've finally ditched Python 2.7 for good a few months ago, and have dealt with my distutils aversion by getting better hardware. I haven't maintained the Bento build over the last few months, so it's not quite working anymore. Therefore I now propose to remove Bento support for v1.2.0 Related: scikit-build seems to have quite a bit of momentum. I've never used CMake before, and while I know opinions on CMake aren't 100% positive, scikit-build looks like a significant improvement over distutils. I think we should try to integrate it at some point in the coming year. Thoughts? 
Ralf _______________________________________________ SciPy-Dev mailing list SciPy-Dev at python.org https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Jul 16 10:45:49 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 16 Jul 2018 09:45:49 -0500 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> Message-ID: On Mon, Jul 16, 2018 at 1:31 AM, Andrew Nelson wrote: > > Related: scikit-build seems to have quite a bit of momentum. I've never >> used CMake before, and while I know opinions on CMake aren't 100% positive, >> scikit-build looks like a significant improvement over distutils. I think >> we should try to integrate it at some point in the coming year. >> > > It looks like you have to have CMake installed though ( > http://scikit-build.readthedocs.io/en/latest/installation.html). This is > a hurdle in itself. > There are binary wheels (https://pypi.org/project/cmake) and it's packaged for conda, plus good binary installers - for a development setup that's about as easy as it is going to get. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Mon Jul 16 11:22:07 2018 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 16 Jul 2018 17:22:07 +0200 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> Message-ID: <82d03fea4953e006c26e42c8d52891ac04c581aa.camel@iki.fi> Hi, ma, 2018-07-16 kello 06:49 -0700, Hameer Abbasi kirjoitti: > I?ll be honest. One thing that has prevented me from getting into the > C/C++ > meat of NumPy/SciPy is exactly what scikit-build is trying to solve. > I?m > all for extra dev dependencies if they make development easier. 
Can you (and maybe Ralf) be more specific --- in what way there are problems? What advantages scikit-build (which glues CMake and setuptools together) brings? Typically, `python runtests.py` or `python setup.py build_ext -i` works out of the box. Of course, distutils forces you to do CFLAGS/etc customization via environment variables, and BLAS/LAPACK selection via site.cfg which is not so convenient. Pauli > > Best Regards, > Hameer Abbasi > Sent from Astro for iOS > > > On 16. Jul 2018 at 01:13, Ralf Gommers > wrote: > > > > > On Wed, Apr 5, 2017 at 3:18 PM, Ralf Gommers > wrote: > > > > > > > On Thu, Apr 6, 2017 at 6:27 AM, Stefan van der Walt > ey.edu> > > wrote: > > > > > On Wed, Apr 5, 2017, at 04:05, Evgeni Burovski wrote: > > > > > I am using it all the time, but I suspect I'm the only one. > > > > > Bento is > > > > > > way > > > > > > > > I seem to remember Stefan van der Walt saying the same thing :- > > > > ). > > > > > > So, I guess there are two of us ;) > > > > > > Ralf wrote: > > > > > > > But it is hard to install because it depends on a specific > > > > version of > > > > > > Waf (which is not on PyPI) and has some unresolved issues on > > > Python 3.x. > > > > > > How hard would it be to vendor Waf into a pip package? If this > > > worked > > > out the box, it would be great to development (repeated builds > > > execute > > > much faster). > > > > > > > You mean vendor Waf in Bento and make a Bento pip-installable PyPI > > package > > right ? David and I discussed that already 2 years ago, that'd be > > the way > > to go. Shouldn't be too hard, Waf is designed to be vendored. > > > > What's a little more work is fixing some obvious issues: it doesn't > > work > > on Python 3.x right now, there was a recurring issue with -fPIC > > going > > missing, finish the open PR on producing wheels. > > > > Okay, these fixes are probably not going to materialize. 
And I've > finally > ditched Python 2.7 for good a few months ago, and have dealt with my > distutils aversion by getting better hardware. I haven't maintained > the > Bento build over the last few months, so it's not quite working > anymore. > Therefore I now propose to remove Bento support for v1.2.0 > > Related: scikit-build seems to have quite a bit of momentum. I've > never > used CMake before, and while I know opinions on CMake aren't 100% > positive, > scikit-build looks like a significant improvement over distutils. I > think > we should try to integrate it at some point in the coming year. > > Thoughts? > > Ralf > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev From einstein.edison at gmail.com Mon Jul 16 11:26:51 2018 From: einstein.edison at gmail.com (Hameer Abbasi) Date: Mon, 16 Jul 2018 08:26:51 -0700 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> Message-ID: Related: scikit-build seems to have quite a bit of momentum. I've never used CMake before, and while I know opinions on CMake aren't 100% positive, scikit-build looks like a significant improvement over distutils. I think we should try to integrate it at some point in the coming year. Hi! As a potential contributor to SciPy I would say that one of the things I'm most excited about, and one of the things that has kept me from getting too deeply involved in the C/C++ parts of NumPy/SciPy is the lack of proper IDE support. CMake solves this to an extent. I've found myself to be relatively comfortable with projects that used CMake rather than (for example) GNU Make or Autotools for the same reason.
Best Regards, Hameer Abbasi Sent from Astro for Mac -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.m.mccormick at gmail.com Mon Jul 16 13:42:26 2018 From: matthew.m.mccormick at gmail.com (Matthew McCormick) Date: Mon, 16 Jul 2018 13:42:26 -0400 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> Message-ID: > It looks like you have to have CMake installed though (http://scikit-build.readthedocs.io/en/latest/installation.html). This is a hurdle in itself. The docs are a little out-of-date. This patch will bring them up to speed: https://github.com/scikit-build/scikit-build/pull/339 With pip 10, there is no need to worry about installing CMake since there is a cmake PyPI package and the dependency can be added to pyproject.toml. > There are binary wheels (https://pypi.org/project/cmake) and it's packaged for conda, plus good binary installers - for a development setup that's about as easy as it is going to get. Yes, and quite a few folks are successfully using the cmake PyPI package, over 8,000 a month according to current stats: https://pypistats.org/packages/cmake > Can you (and maybe Ralf) be more specific --- in what way there are problems? > > What advantages scikit-build (which glues CMake and setuptools together) brings? A good summary is given in the recent SciPy 2018 Conference talk by @jcfr (in CC): https://www.youtube.com/watch?v=QVkg-cC5oe4 > Of course, distutils forces you to do CFLAGS/etc customization via > environment variables, and BLAS/LAPACK selection via site.cfg which is > not so convenient. Yes -- scikit-build supports CFLAGS, etc. environmental variables but you can also pass flags and build configuration options more generally and more explicitly in the `python setup.py [...]` call. 
CMake has good discovery of BLAS/LAPACK, but you can also easily and robustly override what would be discovered through system introspection. Reference LAPACK uses CMake for its build system, so it fits in nicely. > Related: scikit-build seems to have quite a bit of momentum. I've never used CMake before, and while I know opinions on CMake aren't 100% positive, scikit-build looks like a significant improvement over distutils. I think we should try to integrate it at some point in the coming year. > > > Hi! As a potential contributor to SciPy I would say that one of the things I?m most excited about, and one of the things that has kept me from getting too deeply involved in the C/C++ parts of NumPy/SciPy is the lack of proper IDE support. CMake solves this to an extent. I?ve found myself to be relatively comfortable with projects that used CMake rather than (for example) GNU Make or Autotools for the same reason. Yes, with CMake you can generate a build system or IDE of preference, i.e. Visual Studio, XCode, Sublime Text, Eclipse, various Makefile's, Ninja files, ... The full list can be found here: https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html Hope this helps, Matt From ralf.gommers at gmail.com Mon Jul 16 15:47:15 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 16 Jul 2018 14:47:15 -0500 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: <82d03fea4953e006c26e42c8d52891ac04c581aa.camel@iki.fi> References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> <82d03fea4953e006c26e42c8d52891ac04c581aa.camel@iki.fi> Message-ID: On Mon, Jul 16, 2018 at 10:22 AM, Pauli Virtanen wrote: > Hi, > > ma, 2018-07-16 kello 06:49 -0700, Hameer Abbasi kirjoitti: > > I?ll be honest. One thing that has prevented me from getting into the > > C/C++ > > meat of NumPy/SciPy is exactly what scikit-build is trying to solve. > > I?m > > all for extra dev dependencies if they make development easier. 
> > Can you (and maybe Ralf) be more specific --- in what way there are > problems? > The most painful problem for me is the long build time - that (plus inertia) is what kept me developing on Python 2.7 for ages just so I could use Bento, while as a user I had already moved to Python 3. Proper parallel builds and incremental rebuilds are very nice to have. > What advantages scikit-build (which glues CMake and setuptools > together) brings? > If I understand correctly, scikit-build doesn't use anything from setuptools - it just copied the setup.py user interface. That's still a significant mistake imho, it should have a saner UI (declarative). But that's still minor compared to the advantages it has. Beyond better builds and IDE support, also support for cross-compiling could be valuable. Finally, on a bigger picture level - distutils is badly maintained and good patches can go unmerged, while numpy.distutils is a pain to maintain. An actively developed sane build tool will be valuable to both us and the wider ecosystem. > Typically, `python runtests.py` or `python setup.py build_ext -i` works > out of the box. > > Of course, distutils forces you to do CFLAGS/etc customization via > environment variables, and BLAS/LAPACK selection via site.cfg which is > not so convenient. > Yes these are issues as well. And of course we'd start with it as an optional second build system, and I volunteer to do most of the required maintenance. Ralf > > Pauli > > > > > > Best Regards, > > Hameer Abbasi > > Sent from Astro for iOS > > > > > > On 16. Jul 2018 at 01:13, Ralf Gommers > > wrote: > > > > > > > > > > On Wed, Apr 5, 2017 at 3:18 PM, Ralf Gommers > > wrote: > > > > > > > > > > > On Thu, Apr 6, 2017 at 6:27 AM, Stefan van der Walt > > ey.edu> > > > wrote: > > > > > > > On Wed, Apr 5, 2017, at 04:05, Evgeni Burovski wrote: > > > > > > I am using it all the time, but I suspect I'm the only one. 
> > > > > > Bento is > > > > > > > > way > > > > > > > > > > I seem to remember Stefan van der Walt saying the same thing :- > > > > > ). > > > > > > > > So, I guess there are two of us ;) > > > > > > > > Ralf wrote: > > > > > > > > > But it is hard to install because it depends on a specific > > > > > version of > > > > > > > > Waf (which is not on PyPI) and has some unresolved issues on > > > > Python 3.x. > > > > > > > > How hard would it be to vendor Waf into a pip package? If this > > > > worked > > > > out the box, it would be great to development (repeated builds > > > > execute > > > > much faster). > > > > > > > > > > You mean vendor Waf in Bento and make a Bento pip-installable PyPI > > > package > > > right ? David and I discussed that already 2 years ago, that'd be > > > the way > > > to go. Shouldn't be too hard, Waf is designed to be vendored. > > > > > > What's a little more work is fixing some obvious issues: it doesn't > > > work > > > on Python 3.x right now, there was a recurring issue with -fPIC > > > going > > > missing, finish the open PR on producing wheels. > > > > > > > Okay, these fixes are probably not going to materialize. And I've > > finally > > ditched Python 2.7 for good a few months ago, and have dealt with my > > distutils aversion by getting better hardware. I haven't maintained > > the > > Bento build over the last few months, so it's not quite working > > anymore. > > Therefore I now propose to remove Bento support for v1.2.0 > > > > Related: scikit-build seems to have quite a bit of momentum. I've > > never > > used CMake before, and while I know opinions on CMake aren't 100% > > positive, > > scikit-build looks like a significant improvement over distutils. I > > think > > we should try to integrate it at some point in the coming year. > > > > Thoughts? 
> > > > Ralf > > > > > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at python.org > > https://mail.python.org/mailman/listinfo/scipy-dev > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at python.org > > https://mail.python.org/mailman/listinfo/scipy-dev > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Mon Jul 16 16:22:24 2018 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 16 Jul 2018 13:22:24 -0700 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> <82d03fea4953e006c26e42c8d52891ac04c581aa.camel@iki.fi> Message-ID: On Mon, Jul 16, 2018, 12:53 Ralf Gommers wrote: > >> What advantages scikit-build (which glues CMake and setuptools >> together) brings? >> > > If I understand correctly, scikit-build doesn't use anything from > setuptools - it just copied the setup.py user interface. That's still a > significant mistake imho, it should have a saner UI (declarative). But > that's still minor compared to the advantages it has. > For a bit of background here... When you run 'pip install ', and pip can't find a wheel, then it automatically downloads the source release and then invokes setup.py, while making various undocumented assumptions about exactly how setup.py works. So alternative build systems actually *have* to provide some kind of awkward setup.py interface to keep 'pip install' working. This is a significant part of why so few people try to replace distutils. (I think.) > Beyond better builds and IDE support, also support for cross-compiling > could be valuable. 
> > Finally, on a bigger picture level - distutils is badly maintained and > good patches can go unmerged, while numpy.distutils is a pain to maintain. > An actively developed sane build tool will be valuable to both us and the > wider ecosystem. > The way distutils plugin system works, almost any changes can break people, plus the biggest issues are with the fundamental architecture. So yeah, it's pretty much impossible to make distutils better. Instead, the general goal of the packaging maintainers at this point is to get rid of distutils. The big thing that will enable this is PEP 517, which replaces setup.py with a much smaller and properly-specified interface for new build systems to implement. The hope is that if creating alternative build systems becomes an easy and supported thing to do, then better options will emerge, and we'll get out of this situation where there's one tool everyone has to use and which isn't really good for anyone. Unfortunately, support for this hasn't quite landed in pip yet. But that's the direction that things are going. -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Jul 16 16:33:01 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 16 Jul 2018 15:33:01 -0500 Subject: [SciPy-Dev] Are you using Bento to build scipy? In-Reply-To: References: <1491416843.651959.935389952.0AFAB016@webmail.messagingengine.com> <82d03fea4953e006c26e42c8d52891ac04c581aa.camel@iki.fi> Message-ID: On Mon, Jul 16, 2018 at 3:22 PM, Nathaniel Smith wrote: > On Mon, Jul 16, 2018, 12:53 Ralf Gommers wrote: > >> >>> What advantages scikit-build (which glues CMake and setuptools >>> together) brings? >>> >> >> If I understand correctly, scikit-build doesn't use anything from >> setuptools - it just copied the setup.py user interface. That's still a >> significant mistake imho, it should have a saner UI (declarative). But >> that's still minor compared to the advantages it has. 
>> > > For a bit of background here... > > When you run 'pip install ', and pip can't find a wheel, then it > automatically downloads the source release and then invokes setup.py, while > making various undocumented assumptions about exactly how setup.py works. > So alternative build systems actually *have* to provide some kind of > awkward setup.py interface to keep 'pip install' working. > Indeed. But this is way nicer, just a minimal shim: https://cournape.github.io/Bento/html/transition.html#adding-bento-based-setup-py-for-compatibility-with-pip-etc Anyway, I don't have time to spend on this, so I'm not worrying about it until after we have PEP 517. This is a significant part of why so few people try to replace distutils. > (I think.) > > >> Beyond better builds and IDE support, also support for cross-compiling >> could be valuable. >> >> Finally, on a bigger picture level - distutils is badly maintained and >> good patches can go unmerged, while numpy.distutils is a pain to maintain. >> An actively developed sane build tool will be valuable to both us and the >> wider ecosystem. >> > > The way distutils plugin system works, almost any changes can break > people, plus the biggest issues are with the fundamental architecture. So > yeah, it's pretty much impossible to make distutils better. > Agreed. I was more thinking about obviously correct 1 line patches that go unmerged. Ralf > Instead, the general goal of the packaging maintainers at this point is to > get rid of distutils. The big thing that will enable this is PEP 517, which > replaces setup.py with a much smaller and properly-specified interface for > new build systems to implement. The hope is that if creating alternative > build systems becomes an easy and supported thing to do, then better > options will emerge, and we'll get out of this situation where there's one > tool everyone has to use and which isn't really good for anyone. > > Unfortunately, support for this hasn't quite landed in pip yet. 
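For background, the PEP 517 interface mentioned here is deliberately small: the frontend (e.g. pip) imports a backend module named in pyproject.toml and calls a handful of hooks. A bare-bones skeleton with the hook names and signatures fixed by the PEP (the bodies are placeholders, not a working backend):

```python
# Skeleton of a PEP 517 build backend module. A frontend imports this
# module and calls the hooks directly; no setup.py is involved.

def get_requires_for_build_wheel(config_settings=None):
    # Additional requirements the frontend must install before building.
    return []

def build_wheel(wheel_directory, config_settings=None,
                metadata_directory=None):
    # A real backend writes a .whl into wheel_directory and returns its
    # basename; this sketch only shows the required signature.
    raise NotImplementedError("placeholder backend")

def build_sdist(sdist_directory, config_settings=None):
    # Likewise for the source distribution.
    raise NotImplementedError("placeholder backend")
```

A build tool wrapping CMake, Meson, or Waf would implement just these hooks instead of emulating the whole undocumented setup.py command set.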
But that's > the direction that things are going. > > -n > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Wed Jul 18 16:14:47 2018 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 18 Jul 2018 21:14:47 +0100 Subject: [SciPy-Dev] Link error on the website Message-ID: Hi all, It seems that the link http://www.scipy.org/Tentative_NumPy_Tutorial is redirecting to a wrong page. I would suppose it's supposed to be https://docs.scipy.org/doc/numpy/user/quickstart.html instead of https://docs.scipy.org/doc/numpy-dev/user/quickstart.html Can someone fix it? Cheers, Matthieu -- Quantitative analyst, Ph.D. Blog: http://blog.audio-tk.com/ LinkedIn: http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefanv at berkeley.edu Wed Jul 18 19:39:25 2018 From: stefanv at berkeley.edu (Stefan van der Walt) Date: Wed, 18 Jul 2018 16:39:25 -0700 Subject: [SciPy-Dev] Link error on the website In-Reply-To: References: Message-ID: <20180718233925.cad452jibfhzicvp@carbo> On Wed, 18 Jul 2018 21:14:47 +0100, Matthieu Brucher wrote: > It seems that the link http://www.scipy.org/Tentative_NumPy_Tutorial is > redirecting to a wrong page. I would suppose it's supposed to be > https://docs.scipy.org/doc/numpy/user/quickstart.html instead of > https://docs.scipy.org/doc/numpy-dev/user/quickstart.html > > Can someone fix it? 
This should do the trick: https://github.com/scipy/scipy.org/pull/256 Best regards, Stéfan From matthieu.brucher at gmail.com Thu Jul 19 02:39:21 2018 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 19 Jul 2018 07:39:21 +0100 Subject: [SciPy-Dev] Link error on the website In-Reply-To: <20180718233925.cad452jibfhzicvp@carbo> References: <20180718233925.cad452jibfhzicvp@carbo> Message-ID: Thanks a lot Stefan! Cheers, Matthieu On Thu, 19 Jul 2018 at 00:39, Stefan van der Walt wrote: > On Wed, 18 Jul 2018 21:14:47 +0100, Matthieu Brucher wrote: > > It seems that the link http://www.scipy.org/Tentative_NumPy_Tutorial is > > redirecting to a wrong page. I would suppose it's supposed to be > > https://docs.scipy.org/doc/numpy/user/quickstart.html instead of > > https://docs.scipy.org/doc/numpy-dev/user/quickstart.html > > > > Can someone fix it? > > This should do the trick: > > https://github.com/scipy/scipy.org/pull/256 > > Best regards, > Stéfan > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -- Quantitative analyst, Ph.D. Blog: http://blog.audio-tk.com/ LinkedIn: http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From dickson41152 at protonmail.com Sun Jul 22 02:08:04 2018 From: dickson41152 at protonmail.com (dickson41152) Date: Sun, 22 Jul 2018 02:08:04 -0400 Subject: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex Algorithm Message-ID: Hi all, I have been working on optimisation problems with expensive black-box functions. As part of this work, I have made two modifications to the Nelder-Mead Simplex solver implementation provided by Scipy. These are: 1. Allowing bound constraints in the same manner as NLopt's Nelder-Mead implementation which is itself originally based upon the method of satisfying bound constraints in Box's Complex solver [1]. 2.
Allowing the user to specify initial objective function values for any or all points in a user's initial simplex. I am also considering adding a further modification: 3. Checking the objective function evaluation count against the user-specified evaluation budget at each and every function evaluation in the solver, then terminating if the budget has been reached. Modification 1 moves a candidate point to within user-specified bounds by simply "clipping" the infeasible variable value to lie on the bound, e.g. if a candidate variable value of 10 is given for a problem with an upper bound constraint of 5 on that variable, then the candidate variable value is "clipped" to 5. This is how I believe NLopt's Nelder-Mead implementation works, and I believe that this is similar to the method in [1] which moves the value slightly within the bound by a very small magnitude. Modification 2 simply allows the solver to avoid having to evaluate all the points in the user-specified initial simplex if the user already has that information at hand. In my own work, I am performing experimental designs within the problem bound constraints before handing the best points from the design evaluations to the Nelder-Mead as an initial simplex. Since I already had the function evaluations, it made sense to give these to the solver rather than repeating the evaluations. In my use case this saves a lot of computational time since I am looking at expensive black-boxes. Modification 3 would ensure that the Nelder-Mead algorithm can only run one objective function evaluation before a check is made to see if the count of function evaluations has matched an evaluation budget. Currently, more than one function evaluation can be performed during a single solver iteration; further, no evaluation checks are made before the first iteration, i.e. when evaluating the user-specified initial simplex.
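The clipping step in Modification 1 is essentially a componentwise projection onto the box bounds; a minimal sketch with NumPy (the helper name `clip_to_bounds` is made up for illustration):

```python
import numpy as np

def clip_to_bounds(x, lower, upper):
    """Project an infeasible candidate point onto the box bounds."""
    return np.clip(x, lower, upper)

# The example from the proposal: a candidate value of 10 against an
# upper bound of 5 is simply "clipped" to lie on the bound.
candidate = np.array([10.0, -2.0])
lower = np.array([0.0, 0.0])
upper = np.array([5.0, 5.0])
print(clip_to_bounds(candidate, lower, upper))  # -> [5. 0.]
```

Note that SciPy's existing Nelder-Mead already accepts an `initial_simplex` option (the simplex geometry only, since SciPy 1.0); Modification 2 would extend that idea with precomputed function values.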
My use case demands that the Nelder-Mead solver tightly conforms to a user-specified evaluation budget and no more. This is a time-saving measure to avoid excess expensive black-box evaluations which will be rejected anyway since my other work demands tight conformity with the evaluation budget. I am a software development amateur so contributing to Scipy would be a brand new experience. However I am prepared to try out adding these modifications to Scipy, if people believe that these modifications may be useful to others. Please share your thoughts on these proposals! I look forward to your insight. Many thanks, colvin4993 [1] M. J. Box; A New Method of Constrained Optimization and a Comparison With Other Methods, The Computer Journal, Volume 8, Issue 1, 1 April 1965, Pages 42-52, https://doi.org/10.1093/comjnl/8.1.42 -------------- next part -------------- An HTML attachment was scrubbed... URL: From phillip.m.feldman at gmail.com Mon Jul 23 03:17:24 2018 From: phillip.m.feldman at gmail.com (Phillip Feldman) Date: Mon, 23 Jul 2018 00:17:24 -0700 Subject: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex Algorithm In-Reply-To: References: Message-ID: It is very impressive that you are taking this on. Re. clipping the infeasible variable to lie on the bound: If the bound is an arbitrary function of the variables, it might be non-trivial to find the nearest point that lies on the bound. A loosely-related item: It should be possible to speed up the Nelder-Mead algorithm by dividing calculations over multiple cores on a multi-core computer. Phillip On Sat, Jul 21, 2018 at 11:08 PM, dickson41152 wrote: > Hi all, > > > I have been working on optimisation problems with expensive black-box > functions. As part of this work, I have made two modifications to the > Nelder-Mead Simplex solver implementation provided by Scipy. These are: > > > 1.
Allowing bound constraints in the same manner as NLopt's Nelder-Mead > implementation which is itself originally based upon the method of > satisfying bound constraints in Box's Complex solver [1]. > > 2. Allowing the user to specify initial objective function values for any > or all points in a user's initial simplex. > > I am also considering adding a further modification: > > 3. Checking the objective function evaluation count against the > user-specified evaluation budget at each and every function evaluation in > the solver, then terminating if the budget has been reached. > > Modification 1 moves a candidate point to within user-specified bounds by > simply "clipping" the infeasible variable value to lie on the bound, e.g. if > a candidate variable value of 10 is given for a problem with an upper bound > constraint of 5 on that variable, then the candidate variable value is > "clipped" to 5. This is how I believe NLopt's Nelder-Mead implementation > works, and I believe that this is similar to the method in [1] which moves > the value slightly within the bound by a very small magnitude. > > Modification 2 simply allows the solver to avoid having to evaluate all > the points in the user-specified initial simplex if the user already has > that information at hand. In my own work, I am performing experimental > designs within the problem bound constraints before handing the best points > from the design evaluations to the Nelder-Mead as an initial simplex. Since > I already had the function evaluations, it made sense to give these to the > solver rather than repeating the evaluations. In my use case this saves a > lot of computational time since I am looking at expensive black-boxes. > > Modification 3 would ensure that the Nelder-Mead algorithm can only run > one objective function evaluation before a check is made to see if the > count of function evaluations has matched an evaluation budget. 
Currently, > more than one function evaluation can be performed during a single solver > iteration; further, no evaluation checks are made before the first > iteration i.e. when evaluating the user-specified initial simplex. My use > case demands that the Nelder-Mead solver tightly conforms to a user-specified > evaluation budget and no more. This is a time-saving measure to > avoid excess expensive black-box evaluations which will be rejected anyway > since my other work demands tight conformity with the evaluation budget. > > I am a software development amateur so contributing to Scipy would be a > brand new experience. However I am prepared to try out adding these > modifications to Scipy, if people believe that these modifications may be > useful to others. > > Please share your thoughts on these proposals! I look forward to your > insight. > > Many thanks, > colvin4993 > > [1] M. J. Box; A New Method of Constrained Optimization and a Comparison > With Other Methods, *The Computer Journal*, Volume 8, Issue 1, 1 April > 1965, Pages 42–52, https://doi.org/10.1093/comjnl/8.1.42 > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From b.rosa at unistra.fr Mon Jul 23 03:37:48 2018 From: b.rosa at unistra.fr (Benoit Rosa) Date: Mon, 23 Jul 2018 09:37:48 +0200 Subject: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex Algorithm In-Reply-To: References: Message-ID: <4033d974-4c49-55ff-0893-81cd83cd9284@unistra.fr> Hi, There is an implementation in Matlab, both with bounds and constraints, for the Nelder-Mead simplex: https://fr.mathworks.com/matlabcentral/fileexchange/8277-fminsearchbnd-fminsearchcon The implementation uses a transformation method to enforce bounds, and an implicit penalization approach for the nonlinear constraints. 
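As a rough sketch of that transformation idea for a variable with both a lower and an upper bound (assuming the commonly used sin()-based form; fminsearchbnd's exact expressions may differ slightly):

```python
import numpy as np

# Sketch of a sin()-based bound transform (assumed form): an unconstrained
# "internal" variable u is mapped onto a bounded "external" variable x.
def to_external(u, lb, ub):
    # Any real u lands inside [lb, ub] because sin(u) lies in [-1, 1].
    return lb + (ub - lb) * (np.sin(u) + 1.0) / 2.0

def to_internal(x, lb, ub):
    # Inverse map: a bounded external value back to an internal value.
    return np.arcsin(2.0 * (x - lb) / (ub - lb) - 1.0)

lb, ub = 0.0, 5.0
u = 1.3                     # any real number is feasible internally
x = to_external(u, lb, ub)  # always lands inside [lb, ub]
assert lb <= x <= ub
assert np.isclose(to_external(to_internal(x, lb, ub), lb, ub), x)
```

The optimiser then works on the unconstrained internal variables, so no modification of the solver itself is needed.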
Perhaps worth having a look to compare both approaches? Note: this is not an official Matlab function, it is user-supplied. I did not author that code, but I did use it and it seems to work well. I did not, however, run comparative benchmarks. Benoît On 23/07/2018 09:17, Phillip Feldman wrote: > It is very impressive that you are taking this on. > > Re. clipping the infeasible variable to lie on the bound: If the bound > is an arbitrary function of the variables, it might be non-trivial to > find the nearest point that lies on the bound. > > A loosely-related item: It should be possible to speed up the > Nelder-Mead algorithm by dividing calculations over multiple cores on > a multi-core computer. > > Phillip > > On Sat, Jul 21, 2018 at 11:08 PM, dickson41152 > > wrote: > > Hi all, > > > I have been working on optimisation problems with expensive > black-box functions. As part of this work, I have made two > modifications to the Nelder-Mead Simplex solver implementation > provided by Scipy. These are: > > > 1. Allowing bound constraints in the same manner as NLopt's > Nelder-Mead implementation which is itself originally based upon > the method of satisfying bound constraints in Box's Complex solver > [1]. > > 2. Allowing the user to specify initial objective function values > for any or all points in a user's initial simplex. > > > I am also considering adding a further modification: > > 3. Checking the objective function evaluation count against the > user-specified evaluation budget at each and every function > evaluation in the solver, then terminating if the budget has been > reached. > > Modification 1 moves a candidate point to within user-specified > bounds by simply "clipping" the infeasible variable value to lie > on the bound, e.g. if a candidate variable value of 10 is given for > a problem with an upper bound constraint of 5 on that variable, > then the candidate variable value is "clipped" to 5. 
This is how I > believe NLopt's Nelder-Mead implementation works, and I believe > that this is similar to the method in [1] which moves the value > slightly within the bound by a very small magnitude. > > Modification 2 simply allows the solver to avoid having to > evaluate all the points in the user-specified initial simplex if > the user already has that information at hand. In my own work, I > am performing experimental designs within the problem bound > constraints before handing the best points from the design > evaluations to the Nelder-Mead as an initial simplex. Since I > already had the function evaluations, it made sense to give these > to the solver rather than repeating the evaluations. In my use > case this saves a lot of computational time since I am looking at > expensive black-boxes. > > Modification 3 would ensure that the Nelder-Mead algorithm can > only run one objective function evaluation before a check is made > to see if the count of function evaluations has matched an > evaluation budget. Currently, more than one function evaluation > can be performed during a single solver iteration; further, no > evaluation checks are made before the first iteration i.e. when > evaluating the user-specified initial simplex. My use case demands > that the Nelder-Mead solver tightly conforms to a user-specified > evaluation budget and no more. This is a time-saving measure to > avoid excess expensive black-box evaluations which will be > rejected anyway since my other work demands tight conformity with > the evaluation budget. > > I am a software development amateur so contributing to Scipy would > be a brand new experience. However I am prepared to try out adding > these modifications to Scipy, if people believe that these > modifications may be useful to others. > > Please share your thoughts on these proposals! I look forward to > your insight. > > Many thanks, > colvin4993 > > [1] M. J. 
Box; A New Method of Constrained Optimization and a > Comparison With Other Methods, /The Computer Journal/, Volume 8, > Issue 1, 1 April 1965, Pages 42–52, > https://doi.org/10.1093/comjnl/8.1.42 > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Mon Jul 23 03:47:46 2018 From: andyfaff at gmail.com (Andrew Nelson) Date: Mon, 23 Jul 2018 17:47:46 +1000 Subject: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex Algorithm In-Reply-To: References: Message-ID: Some good suggestions. I'm not sure about the best way to enforce constraints; there are differing ways of doing so. You mention one, but there are also maths transforms that one can use, such as those used in the lmfit project. See https://lmfit.github.io/lmfit-py/bounds.html for further details. These kinds of modifications have to be considered in a module-wide context for consistency. For specification of bounds the public interface would need to use a similar approach to that used by other minimizers, such as LBFGSB. As regards the internal implementation, it may be that a wrapper function could be written that would be useable across the module. Modification 2 could be done by using an f_simplex keyword (or something along those lines, good keyword choice is important), and have the same size as 'initial_simplex'. I'm more circumspect about modification 3. One way of achieving this would be for your objective function (or a scipy wrapper function) to raise a StopIteration error, and for the minimizer to be able to gracefully handle this. This would require porting to other minimizers to make worthwhile. 
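A minimal sketch of that wrapper idea, with hypothetical names (note that the minimizer itself would still need to be taught to catch the signal, which is the porting work mentioned above):

```python
class BudgetExceeded(Exception):
    # Hypothetical signal raised once the evaluation budget is spent;
    # a StopIteration could serve the same role, as suggested above.
    pass

def with_budget(func, maxfev):
    # Wrap an objective so every call is counted against a hard budget.
    count = {"nfev": 0}
    def wrapped(x):
        if count["nfev"] >= maxfev:
            raise BudgetExceeded(count["nfev"])
        count["nfev"] += 1
        return func(x)
    wrapped.count = count
    return wrapped

# The minimizer (or an outer driver) would need to catch the signal:
f = with_budget(lambda x: (x - 3.0) ** 2, maxfev=5)
try:
    for x in range(10):
        f(float(x))
except BudgetExceeded:
    pass
assert f.count["nfev"] == 5  # exactly the budget, never more
```

Because the counting lives in the wrapper rather than in any one solver, the same mechanism could be shared across methods once they handle the exception gracefully.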
For interest this point is one of the reasons for pull request #8552, https://github.com/scipy/scipy/pull/8552. W.r.t parallelisation/vectorisation, there is another PR that is considering this at the moment, and it would be good to see how that goes. It's a good idea to raise these points for discussion on the mailing list before going too far down the implementation route. That way you know if any mods are going to have a good chance of getting in. All mods need to have comprehensive testing. Andrew. On Mon, 23 Jul 2018 at 17:18, Phillip Feldman wrote: > It is very impressive that you are taking this on. > > Re. clipping the infeasible variable to lie on the bound: If the bound is > an arbitrary function of the variables, it might be non-trivial to find the > nearest point that lies on the bound. > > A loosely-related item: It should be possible to speed up the Nelder-Mead > algorithm by dividing calculations over multiple cores on a multi-core > computer. > > Phillip > > On Sat, Jul 21, 2018 at 11:08 PM, dickson41152 < > dickson41152 at protonmail.com> wrote: > >> Hi all, >> >> >> I have been working on optimisation problems with expensive black-box >> functions. As part of this work, I have made two modifications to the >> Nelder-Mead Simplex solver implementation provided by Scipy. These are: >> >> >> 1. Allowing bound constraints in the same manner as NLopt's Nelder-Mead >> implementation which is itself originally based upon the method of >> satisfying bound constraints in Box's Complex solver [1]. >> >> 2. Allowing the user to specify initial objective function values for any >> or all points in a user's initial simplex. >> >> I am also considering adding a further modification: >> >> 3. Checking the objective function evaluation count against the >> user-specified evaluation budget at each and every function evaluation in >> the solver, then terminating if the budget has been reached. 
>> >> Modification 1 moves a candidate point to within user-specified bounds by >> simply "clipping" the infeasible variable value to lie on the bound, e.g. if >> a candidate variable value of 10 is given for a problem with an upper bound >> constraint of 5 on that variable, then the candidate variable value is >> "clipped" to 5. This is how I believe NLopt's Nelder-Mead implementation >> works, and I believe that this is similar to the method in [1] which moves >> the value slightly within the bound by a very small magnitude. >> >> Modification 2 simply allows the solver to avoid having to evaluate all >> the points in the user-specified initial simplex if the user already has >> that information at hand. In my own work, I am performing experimental >> designs within the problem bound constraints before handing the best points >> from the design evaluations to the Nelder-Mead as an initial simplex. Since >> I already had the function evaluations, it made sense to give these to the >> solver rather than repeating the evaluations. In my use case this saves a >> lot of computational time since I am looking at expensive black-boxes. >> >> Modification 3 would ensure that the Nelder-Mead algorithm can only run >> one objective function evaluation before a check is made to see if the >> count of function evaluations has matched an evaluation budget. Currently, >> more than one function evaluation can be performed during a single solver >> iteration; further, no evaluation checks are made before the first >> iteration i.e. when evaluating the user-specified initial simplex. My >> use case demands that the Nelder-Mead solver tightly conforms to a user-specified >> evaluation budget and no more. This is a time-saving measure to >> avoid excess expensive black-box evaluations which will be rejected anyway >> since my other work demands tight conformity with the evaluation budget. 
>> >> I am a software development amateur so contributing to Scipy would be a >> brand new experience. However I am prepared to try out adding these >> modifications to Scipy, if people believe that these modifications may be >> useful to others. >> >> Please share your thoughts on these proposals! I look forward to your >> insight. >> >> Many thanks, >> colvin4993 >> >> [1] M. J. Box; A New Method of Constrained Optimization and a Comparison >> With Other Methods, *The Computer Journal*, Volume 8, Issue 1, 1 April >> 1965, Pages 42–52, https://doi.org/10.1093/comjnl/8.1.42 >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at python.org >> https://mail.python.org/mailman/listinfo/scipy-dev >> >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Mon Jul 23 13:21:05 2018 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 23 Jul 2018 11:21:05 -0600 Subject: [SciPy-Dev] NumPy 1.15.0 released. Message-ID: Hi All, On behalf of the NumPy team I'm pleased to announce the release of NumPy 1.15.0rc2. This release has an unusual number of cleanups, many deprecations of old functions, and improvements to many existing functions. A total of 438 pull requests were merged for this release, please look at the release notes for details. Some highlights are: - NumPy has switched to pytest for testing. - A new `numpy.printoptions` context manager. - Many improvements to the histogram functions. - Support for unicode field names in python 2.7. - Improved support for PyPy. - Fixes and improvements to `numpy.einsum`. The Python versions supported by this release are 2.7, 3.4-3.7. 
The wheels are linked with OpenBLAS v0.3.0, which should fix some of the linalg problems reported for NumPy 1.14. Wheels for this release can be downloaded from PyPI, source archives are available from Github. *Contributors* A total of 133 people contributed to this release. People with a "+" by their names contributed a patch for the first time. * Aaron Critchley + * Aarthi + * Aarthi Agurusa + * Alex Thomas + * Alexander Belopolsky * Allan Haldane * Anas Khan + * Andras Deak * Andrey Portnoy + * Anna Chiara * Aurelien Jarno + * Baurzhan Muftakhidinov * Berend Kapelle + * Bernhard M. Wiedemann * Bjoern Thiel + * Bob Eldering * Cenny Wenner + * Charles Harris * ChloeColeongco + * Chris Billington + * Christopher + * Chun-Wei Yuan + * Claudio Freire + * Daniel Smith * Darcy Meyer + * David Abdurachmanov + * David Freese * Deepak Kumar Gouda + * Dennis Weyland + * Derrick Williams + * Dmitriy Shalyga + * Eric Cousineau + * Eric Larson * Eric Wieser * Evgeni Burovski * Frederick Lefebvre + * Gaspar Karm + * Geoffrey Irving * Gerhard Hobler + * Gerrit Holl * Guo Ci + * Hameer Abbasi + * Han Shen * Hiroyuki V. Yamazaki + * Hong Xu * Ihor Melnyk + * Jaime Fernandez * Jake VanderPlas + * James Tocknell + * Jarrod Millman * Jeff VanOss + * John Kirkham * Jonas Rauber + * Jonathan March + * Joseph Fox-Rabinovitz * Julian Taylor * Junjie Bai + * Juris Bogusevs + * Jörg Döpfert * Kenichi Maehashi + * Kevin Sheppard * Kimikazu Kato + * Kirit Thadaka + * Kritika Jalan + * Kyle Sunden + * Lakshay Garg + * Lars G + * Licht Takeuchi * Louis Potok + * Luke Zoltan Kelley * MSeifert04 + * Mads R. B. Kristensen + * Malcolm Smith + * Mark Harfouche + * Marten H. van Kerkwijk + * Marten van Kerkwijk * Matheus Vieira Portela + * Mathieu Lamarre * Mathieu Sornay + * Matthew Brett * Matthew Rocklin + * Matthias Bussonnier * Matti Picus * Michael Droettboom * Miguel Sánchez de León Peque + * Mike Toews + * Milo + * Nathaniel J. 
Smith * Nelle Varoquaux * Nicholas Nadeau, P.Eng., AVS + * Nick Minkyu Lee + * Nikita + * Nikita Kartashov + * Nils Becker + * Oleg Zabluda * Orestis Floros + * Pat Gunn + * Paul van Mulbregt + * Pauli Virtanen * Pierre Chanial + * Ralf Gommers * Raunak Shah + * Robert Kern * Russell Keith-Magee + * Ryan Soklaski + * Samuel Jackson + * Sebastian Berg * Siavash Eliasi + * Simon Conseil * Simon Gibbons * Stefan Krah + * Stefan van der Walt * Stephan Hoyer * Subhendu + * Subhendu Ranjan Mishra + * Tai-Lin Wu + * Tobias Fischer + * Toshiki Kataoka + * Tyler Reddy + * Unknown + * Varun Nayyar * Victor Rodriguez + * Warren Weckesser * William D. Irons + * Zane Bradley + * cclauss + * fo40225 + * lapack_lite code generator + * lumbric + * luzpaz + * mamrehn + * tynn + * xoviat Cheers, Charles Harris -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Mon Jul 23 13:23:11 2018 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 23 Jul 2018 11:23:11 -0600 Subject: [SciPy-Dev] NumPy 1.15.0 released. In-Reply-To: References: Message-ID: On Mon, Jul 23, 2018 at 11:21 AM, Charles R Harris < charlesr.harris at gmail.com> wrote: > Hi All, > > On behalf of the NumPy team I'm pleased to announce the release of NumPy > 1.15.0rc2. > Oops, NumPy 1.15.0. Oh well ... This release has an unusual number of cleanups, many deprecations of old > functions, > and improvements to many existing functions. A total of 438 pull requests > were merged > for this release, please look at the release notes > for details. Some > highlights are: > > - NumPy has switched to pytest for testing. > - A new `numpy.printoptions` context manager. > - Many improvements to the histogram functions. > - Support for unicode field names in python 2.7. > - Improved support for PyPy. > - Fixes and improvements to `numpy.einsum`. > > The Python versions supported by this release are 2.7, 3.4-3.7. 
The > wheels are linked with > OpenBLAS v0.3.0, which should fix some of the linalg problems reported for > NumPy 1.14. > > Wheels for this release can be downloaded from PyPI > , source archives are available > from Github . > > > *Contributors* > > A total of 133 people contributed to this release. People with a "+" by > their > names contributed a patch for the first time. > > * Aaron Critchley + > * Aarthi + > * Aarthi Agurusa + > * Alex Thomas + > * Alexander Belopolsky > * Allan Haldane > * Anas Khan + > * Andras Deak > * Andrey Portnoy + > * Anna Chiara > * Aurelien Jarno + > * Baurzhan Muftakhidinov > * Berend Kapelle + > * Bernhard M. Wiedemann > * Bjoern Thiel + > * Bob Eldering > * Cenny Wenner + > * Charles Harris > * ChloeColeongco + > * Chris Billington + > * Christopher + > * Chun-Wei Yuan + > * Claudio Freire + > * Daniel Smith > * Darcy Meyer + > * David Abdurachmanov + > * David Freese > * Deepak Kumar Gouda + > * Dennis Weyland + > * Derrick Williams + > * Dmitriy Shalyga + > * Eric Cousineau + > * Eric Larson > * Eric Wieser > * Evgeni Burovski > * Frederick Lefebvre + > * Gaspar Karm + > * Geoffrey Irving > * Gerhard Hobler + > * Gerrit Holl > * Guo Ci + > * Hameer Abbasi + > * Han Shen > * Hiroyuki V. Yamazaki + > * Hong Xu > * Ihor Melnyk + > * Jaime Fernandez > * Jake VanderPlas + > * James Tocknell + > * Jarrod Millman > * Jeff VanOss + > * John Kirkham > * Jonas Rauber + > * Jonathan March + > * Joseph Fox-Rabinovitz > * Julian Taylor > * Junjie Bai + > * Juris Bogusevs + > * Jörg Döpfert > * Kenichi Maehashi + > * Kevin Sheppard > * Kimikazu Kato + > * Kirit Thadaka + > * Kritika Jalan + > * Kyle Sunden + > * Lakshay Garg + > * Lars G + > * Licht Takeuchi > * Louis Potok + > * Luke Zoltan Kelley > * MSeifert04 + > * Mads R. B. Kristensen + > * Malcolm Smith + > * Mark Harfouche + > * Marten H. 
van Kerkwijk + > * Marten van Kerkwijk > * Matheus Vieira Portela + > * Mathieu Lamarre > * Mathieu Sornay + > * Matthew Brett > * Matthew Rocklin + > * Matthias Bussonnier > * Matti Picus > * Michael Droettboom > * Miguel Sánchez de León Peque + > * Mike Toews + > * Milo + > * Nathaniel J. Smith > * Nelle Varoquaux > * Nicholas Nadeau, P.Eng., AVS + > * Nick Minkyu Lee + > * Nikita + > * Nikita Kartashov + > * Nils Becker + > * Oleg Zabluda > * Orestis Floros + > * Pat Gunn + > * Paul van Mulbregt + > * Pauli Virtanen > * Pierre Chanial + > * Ralf Gommers > * Raunak Shah + > * Robert Kern > * Russell Keith-Magee + > * Ryan Soklaski + > * Samuel Jackson + > * Sebastian Berg > * Siavash Eliasi + > * Simon Conseil > * Simon Gibbons > * Stefan Krah + > * Stefan van der Walt > * Stephan Hoyer > * Subhendu + > * Subhendu Ranjan Mishra + > * Tai-Lin Wu + > * Tobias Fischer + > * Toshiki Kataoka + > * Tyler Reddy + > * Unknown + > * Varun Nayyar > * Victor Rodriguez + > * Warren Weckesser > * William D. Irons + > * Zane Bradley + > * cclauss + > * fo40225 + > * lapack_lite code generator + > * lumbric + > * luzpaz + > * mamrehn + > * tynn + > * xoviat > > Cheers, > > Charles Harris > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dickson41152 at protonmail.com Mon Jul 23 22:33:55 2018 From: dickson41152 at protonmail.com (dickson41152) Date: Mon, 23 Jul 2018 22:33:55 -0400 Subject: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex Algorithm In-Reply-To: References: Message-ID: Thank you for your interest Phillip. > Re. clipping the infeasible variable to lie on the bound: If the bound is an arbitrary function of the variables, it might be non-trivial to find the nearest point that lies on the bound. May I ask if you can explain this comment in more detail? dickson -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dickson41152 at protonmail.com Mon Jul 23 23:02:42 2018 From: dickson41152 at protonmail.com (dickson41152) Date: Mon, 23 Jul 2018 23:02:42 -0400 Subject: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex Algorithm In-Reply-To: References: Message-ID: Thank you for your interest Benoît. It looks like the variable transformation for a variable with both upper and lower bounds is the same method as that shown on the web page Andrew linked to. However I think from a quick glance the upper-bound-only and lower-bound-only transformations using quadratic terms are subtly different from the web page Andrew linked. dickson -------------- next part -------------- An HTML attachment was scrubbed... URL: From dickson41152 at protonmail.com Mon Jul 23 23:36:37 2018 From: dickson41152 at protonmail.com (dickson41152) Date: Mon, 23 Jul 2018 23:36:37 -0400 Subject: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex Algorithm In-Reply-To: References: Message-ID: <5xgSJzGhTqz3FgLpZlSQCqNBm-v5aytNC9kOBDasi_Mus8NNwoCKjPVEmrVqOpIn2B22oT1r0jOc8cNWJaBxRicbjjkAClPDRUwCRdPgAXs=@protonmail.com> Thank you for your interest Andrew. > variable transformations to form unconstrained problems from bound-constrained problems I am considering writing a new transformation function to calculate the equations on that web page that you linked to discussing variable transformations. If it is as straight-forward as I think it will be to write, then I may also do some comparisons between the "clipping" method and the sin() / arcsin() / sqrt() method you highlighted. As an aside, I was told informally about the existence of that method you referenced during a conversation I had last week, but I have been finding it very difficult to find any written discussion (i.e. actual analytical expressions) about that method via web searches before I made the proposals. 
So thank you very much for linking that web page - it is very useful information for my own personal work as well as the proposed Scipy modifications! > user-specified objective function values for points in the user-specified initial simplex I have indeed used a keyword in the manner you describe (and therefore similar in style to the initial_simplex keyword) - it seemed to me to be a "clean" way of implementing this modification and very straightforward to implement since it is so similar to the initial_simplex handling. > evaluation budget count check I am thinking of writing checks which will effectively skip the execution of the remainder of the solver algorithm if the budget has been met, before finally sorting the current simplex of evaluations and returning the best. My inclination is to avoid using StopIteration since it is an exception i.e. it suggests some kind of problem has occurred, whereas my proposed idea is to handle evaluations up to the exact budget as a feature. Anyway I haven't yet made an attempt on this modification, so my thoughts are tentative and may change - especially if it turns out that I am trying to write spaghetti. dickson -------------- next part -------------- An HTML attachment was scrubbed... URL: From newville at cars.uchicago.edu Thu Jul 26 10:47:17 2018 From: newville at cars.uchicago.edu (Matt Newville) Date: Thu, 26 Jul 2018 16:47:17 +0200 Subject: [SciPy-Dev] SciPy-Dev Digest, Vol 177, Issue 21 In-Reply-To: References: Message-ID: Hi Andrew, Dickson, All, Sorry for barging in rather late to this discussion, and for responding to a digest, and if this appears to be too negative. 
> Date: Mon, 23 Jul 2018 23:36:37 -0400 > From: dickson41152 > To: Andrew Nelson > Cc: scipy-dev > Subject: Re: [SciPy-Dev] Proposed Modifications to Nelder-Mead Simplex > Algorithm > Message-ID: > > <5xgSJzGhTqz3FgLpZlSQCqNBm-v5aytNC9kOBDasi_Mus8NNwoCKjPVEmrVqOpIn2B22oT1r0jOc8cNWJaBxRicbjjkAClPDRUwCRdPgAXs=@ > protonmail.com> > > Content-Type: text/plain; charset="utf-8" > > Thank you for your interest Andrew. > > > variable transformations to form unconstrained problems from bound > constrained problems > > I am considering writing a new transformation function to calculate the > equations on that web page that you linked to discussing variable > transformations. If it is as straight-forward as I think it will be to > write, then I may also do some comparisons between the "clipping" method > and the sin() / arcsin() / sqrt() method you highlighted. > > As an aside, I was told informally about the existence of that method you > referenced during a conversation I had last week, but I have been finding > it very difficult to find any written discussion (i.e. actual analytical > expressions) about that method via web searches before I made the > proposals. So thank you very much for linking that web page - it is very > useful information for my own personal work as well as the proposed Scipy > modifications! > > > user-specified objective function values for points in the > user-specified initial simplex > > I have indeed used a keyword in the manner you describe (and therefore > similar in style to the initial_simplex keyword) - it seemed to me to be a > "clean" way of implementing this modification and very straightforward to > implement since it is so similar to the initial_simplex handling. > > > evaluation budget count check > > I am thinking of writing checks which will effectively skip the execution > of the remainder of the solver algorithm if the budget has been met, before > finally sorting the current simplex of evaluations and returning the best. 
> My inclination is to avoid using StopIteration since it is an exception > i.e. it suggests some kind of problem has occurred, whereas my proposed > idea is to handle evaluations up to the exact budget as a feature. > > Anyway I haven't yet made an attempt on this modification, so my thoughts > are tentative and may change - especially if it turns out that I am trying > to write spaghetti. > > dickson I think that adding bounds to Nelder-Mead Simplex is an interesting idea. Using the Minuit-style bounds - using smooth functions to transform infinite "internal" values to finite "external" values - would definitely be an easy and stable way to do that. But, with all due respect, I also think it is the wrong approach ;). The main point of the lmfit package that Andrew linked to is that the fitting *algorithm* should not support bounds on variable values. I do understand that other scipy.optimize algorithms support bounds, and yes, I do think that is an unfortunate choice ;). Instead, the *parameters* should have bounds (and perhaps other properties), and the algorithm should work on these parameter objects. In this way one does not add bounds to Nelder-Mead, and then add bounds to Powell's method, and then add bounds to Levenberg-Marquardt, etc. Rather, one adds bounds to parameters, and has the methods use these parameters instead of scalar values, and thus one adds bounds to *all* methods in a consistent and transferable way. A few months ago Andrew suggested a class-based approach to implementing fitting algorithms. I think this idea has a lot of merit to it. I also believe that lmfit successfully demonstrates that having parameter objects as described briefly above does better encapsulate the desired properties (bounds, etc) *and* makes the code implementing the algorithms clearer and simpler. In fact, if I understand the original goals here to be: 1. Allowing bound constraints 2. 
Allowing the user to specify initial objective function values for any or all points in a user's initial simplex. 3. Checking the objective function evaluation count against the user-specified evaluation budget at each function evaluation. I would point out that lmfit has a Nelder-Mead solver that already does #1 and #3 and that (as with bounds), #3 may be best done not within the solver, but in outer code that calls the solver, so that this may be available in a consistent manner to multiple methods. If #2 means to memoize each step in the process to avoid repeated calculation, this does not currently exist within lmfit. Such a goal might really be best done in the solver method itself. It may also be doable using a per-iteration callback (which lmfit does have), though I don't know that anyone has ever tried this. I apologize if this appears to be an outsider yelling "you're doing it all wrong". I rely heavily on scipy.optimize for my daily work, and would like to be able to make these tools better but I am over-committed and cannot contribute significantly to the scipy ecosystem myself other than try to support lmfit as best I can. Cheers, --Matt -------------- next part -------------- An HTML attachment was scrubbed... URL: From garyfallidis at gmail.com Sun Jul 29 21:17:48 2018 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Sun, 29 Jul 2018 21:17:48 -0400 Subject: [SciPy-Dev] [Cython] [Scipy-Dev] FFT from Cython Message-ID: Hello all, For a project in DIPY (http://dipy.org) we need to be calling 1D and 2D FFTs and iFFTs very often. For this reason the most efficient way seems to be through Cython. It would be great if we could call scipy's fft or numpy's fft without the overhead. I would like to ask if someone has worked on this and can share some of her/his experience or code. What we are trying to do is avoid increasing our dependencies. 
For example we prefer not using FFTW if this is not available in scipy/numpy. Finally, many thanks to scipy devs and cython devs for making available BLAS and LAPACK functions from Cython. This has been super useful for our project. I hope we can do something similar for FFT. Let me know if you have any ideas. Can we for example access the FFT functions of scipy directly from Cython? Has someone done this? Best regards, Eleftherios -------------- next part -------------- An HTML attachment was scrubbed... URL: From ilhanpolat at gmail.com Mon Jul 30 12:04:44 2018 From: ilhanpolat at gmail.com (Ilhan Polat) Date: Mon, 30 Jul 2018 18:04:44 +0200 Subject: [SciPy-Dev] [Cython] [Scipy-Dev] FFT from Cython In-Reply-To: References: Message-ID: What you can do is, since you are using Cython already, to replicate the signatures in DIPY similarly to the LAPACK functions in SciPy. You can copy the machinery, say, from https://github.com/scipy/scipy/blob/master/scipy/linalg/_cython_signature_generator.py and include the FFT-related functions. That's unless someone sends a PR that does this on SciPy directly. Best, ilhan On Mon, Jul 30, 2018 at 3:17 AM, Eleftherios Garyfallidis < garyfallidis at gmail.com> wrote: > Hello all, > > For a project in DIPY (http://dipy.org) we need to be calling 1D and 2D > FFTs and iFFTs very often. For this reason the most efficient way seems to > be through Cython. > > It would be great if we could call scipy's fft or numpy's fft without the > overhead. I would like to ask if someone has worked on this and can share > some of hers/his experience or code. > > What we are trying to do is avoid increasing our dependencies. For example > we prefer not using FFTW if this is not available in scipy/numpy. > > Finally, many thanks to scipy devs and cython devs for making available > BLAS and LAPACK functions from Cython. This has been super useful for our > project. I hope we can do something similar for FFT. > > Let me know if you have any ideas.
Can we for example access the FFT > functions of scipy directly from Cython. Has someone done this? > > Best regards, > Eleftherios > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sseibert at anaconda.com Mon Jul 30 12:12:25 2018 From: sseibert at anaconda.com (Stanley Seibert) Date: Mon, 30 Jul 2018 11:12:25 -0500 Subject: [SciPy-Dev] [Cython] [Scipy-Dev] FFT from Cython In-Reply-To: References: Message-ID: This would also be useful to Numba to enable automatic usage of optimized FFT functions from nopython mode. We already use this technique with the BLAS/LAPACK exports from SciPy to great benefit. On Sun, Jul 29, 2018 at 8:17 PM, Eleftherios Garyfallidis < garyfallidis at gmail.com> wrote: > Hello all, > > For a project in DIPY (http://dipy.org) we need to be calling 1D and 2D > FFTs and iFFTs very often. For this reason the most efficient way seems to > be through Cython. > > It would be great if we could call scipy's fft or numpy's fft without the > overhead. I would like to ask if someone has worked on this and can share > some of hers/his experience or code. > > What we are trying to do is avoid increasing our dependencies. For example > we prefer not using FFTW if this is not available in scipy/numpy. > > Finally, many thanks to scipy devs and cython devs for making available > BLAS and LAPACK functions from Cython. This has been super useful for our > project. I hope we can do something similar for FFT. > > Let me know if you have any ideas. Can we for example access the FFT > functions of scipy directly from Cython. Has someone done this? 
> > Best regards, > Eleftherios > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Mon Jul 30 12:45:11 2018 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 30 Jul 2018 18:45:11 +0200 Subject: [SciPy-Dev] [Cython] [Scipy-Dev] FFT from Cython In-Reply-To: References: Message-ID: <5552575b558ec6d48dc9e32c95d783ee6a9c6f78.camel@iki.fi> Mon, 2018-07-30 at 11:12 -0500, Stanley Seibert wrote: > This would also be useful to Numba to enable automatic usage of > optimized > FFT functions from nopython mode. We already use this technique with > the > BLAS/LAPACK exports from SciPy to great benefit. As a general comment, the "optimization" situation with FFT functions is somewhat different from BLAS/LAPACK, as the FFT library in Scipy is not state of the art. There are faster alternatives out there, however to my knowledge not with BSD-compatible licenses (at least a few years ago, so this information may be outdated). -- Pauli Virtanen From sseibert at anaconda.com Mon Jul 30 12:56:31 2018 From: sseibert at anaconda.com (Stanley Seibert) Date: Mon, 30 Jul 2018 11:56:31 -0500 Subject: [SciPy-Dev] [Cython] [Scipy-Dev] FFT from Cython In-Reply-To: <5552575b558ec6d48dc9e32c95d783ee6a9c6f78.camel@iki.fi> References: <5552575b558ec6d48dc9e32c95d783ee6a9c6f78.camel@iki.fi> Message-ID: True, but something that avoids having to go back into the Python interpreter and matches SciPy behavior would still be helpful here. If SciPy ever made the FFT backend swappable, then this would also ensure that Numba would track it as well. And, yes, there doesn't seem to be an actively maintained FFT library with broad platform support that has a BSD license. FFTS seemed promising, but there haven't been any GitHub updates for 2 years and the homepage is no longer up.
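For readers unfamiliar with the BLAS/LAPACK export mechanism mentioned in this thread: SciPy's scipy.linalg.cython_blas and scipy.linalg.cython_lapack modules publish their C-level function pointers as PyCapsules in the standard Cython `__pyx_capi__` dict, which is how Cython and Numba pick them up without going through the interpreter. A minimal sketch of inspecting those exports from plain Python (assumes SciPy is installed; a hypothetical FFT export would presumably look the same):

```python
# Inspect the C-level entry points SciPy exports for Cython/Numba use.
# Cython modules expose exported C functions as PyCapsules in
# __pyx_capi__; the capsule name records each function's C signature.
import scipy.linalg.cython_blas as cython_blas
import scipy.linalg.cython_lapack as cython_lapack

print('ddot' in cython_blas.__pyx_capi__)     # double-precision dot product
print('dgesv' in cython_lapack.__pyx_capi__)  # general linear solver
print(len(cython_blas.__pyx_capi__))          # number of exported BLAS routines
```

From Cython one would instead `cimport` these functions directly (e.g. `from scipy.linalg.cython_blas cimport ddot`), which resolves through the same capsules at compile time.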
On Mon, Jul 30, 2018 at 11:45 AM, Pauli Virtanen wrote: > Mon, 2018-07-30 at 11:12 -0500, Stanley Seibert wrote: > > This would also be useful to Numba to enable automatic usage of > > optimized > > FFT functions from nopython mode. We already use this technique with > > the > > BLAS/LAPACK exports from SciPy to great benefit. > > As a general comment, the "optimization" situation with FFT functions > is somewhat different from BLAS/LAPACK, as the FFT library in Scipy is > not state of the art. There are faster alternatives out there, however > to my knowledge not with BSD-compatible licenses (at least a few years > ago, so this information may be outdated). > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at python.org > https://mail.python.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rlucente at pipeline.com Mon Jul 30 13:10:58 2018 From: rlucente at pipeline.com (Robert Lucente) Date: Mon, 30 Jul 2018 13:10:58 -0400 (GMT-04:00) Subject: [SciPy-Dev] Google Optimization Tools Message-ID: <29795504.6328.1532970658917@wamui-marley.atl.sa.earthlink.net> An HTML attachment was scrubbed... URL: From insertinterestingnamehere at gmail.com Mon Jul 30 18:21:02 2018 From: insertinterestingnamehere at gmail.com (Ian Henriksen) Date: Mon, 30 Jul 2018 17:21:02 -0500 Subject: [SciPy-Dev] [cython-users] [Cython] [Scipy-Dev] FFT from Cython In-Reply-To: References: Message-ID: Actually, this seems like a pretty reasonable SciPy feature request. Feel free to open an issue there. Thanks, Ian Henriksen On Mon, Jul 30, 2018 at 3:18 PM 'Chris Barker' via cython-users < cython-users at googlegroups.com> wrote: > BTW, > > Please don't spam so many lists with the same question. cython-users is > the right one for this.
> > -CHB > > > > On Mon, Jul 30, 2018 at 10:50 AM, Chris Barker > wrote: > >> On Sun, Jul 29, 2018 at 6:17 PM, Eleftherios Garyfallidis < >> garyfallidis at gmail.com> wrote: >> >>> For a project in DIPY (http://dipy.org) we need to be calling 1D and 2D >>> FFTs and iFFTs very often. For this reason the most efficient way seems to >>> be through Cython. >>> >>> It would be great if we could call scipy's fft or numpy's fft without >>> the overhead. >>> >> >> have you profiled the overhead? Is it really significant? I'd expect the >> answer would only be yes if you were doing a lot of really small arrays -- >> is that the case? >> >> -CHB >> >> >> -- >> >> Christopher Barker, Ph.D. >> Oceanographer >> >> Emergency Response Division >> NOAA/NOS/OR&R (206) 526-6959 voice >> 7600 Sand Point Way NE >> >> (206) 526-6329 fax >> Seattle, WA 98115 (206) 526-6317 main reception >> >> Chris.Barker at noaa.gov >> > > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE > > (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > > -- > > --- > You received this message because you are subscribed to the Google Groups > "cython-users" group. > To unsubscribe from this group and stop receiving emails from it, send an > email to cython-users+unsubscribe at googlegroups.com. > For more options, visit https://groups.google.com/d/optout. > -------------- next part -------------- An HTML attachment was scrubbed... URL:
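Chris Barker's profiling question can be answered with a quick benchmark: time a tiny transform, where fixed Python-level call overhead dominates, against a large one, where the O(n log n) transform itself dominates. A sketch using only NumPy (timings are machine-dependent; the array sizes here are arbitrary illustrations):

```python
# Estimate the fixed per-call cost of np.fft.fft by comparing a
# 16-point transform (overhead-dominated) with a 2**20-point
# transform (compute-dominated). Only a rough sketch.
import timeit
import numpy as np

small = np.ones(16)
large = np.ones(2 ** 20)

t_small = timeit.timeit(lambda: np.fft.fft(small), number=5000) / 5000
t_large = timeit.timeit(lambda: np.fft.fft(large), number=20) / 20

print(f"16-point FFT:   {t_small * 1e6:10.2f} us/call")
print(f"2**20-point FFT:{t_large * 1e6:10.2f} us/call")
```

If the small-array time is dominated by a few microseconds of fixed dispatch cost repeated millions of times, a C-level entry point callable from Cython is what would remove it; if the small-array time is already negligible for the workload, the Cython route buys little.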