[SciPy-Dev] SciPy-Dev Digest, Vol 192, Issue 3

rlucas7 at vt.edu rlucas7 at vt.edu
Wed Oct 2 12:45:37 EDT 2019


> On Oct 2, 2019, at 12:00 PM, scipy-dev-request at python.org wrote:
> 
> Send SciPy-Dev mailing list submissions to
>    scipy-dev at python.org
> 
> To subscribe or unsubscribe via the World Wide Web, visit
>    https://mail.python.org/mailman/listinfo/scipy-dev
> or, via email, send a message with subject or body 'help' to
>    scipy-dev-request at python.org
> 
> You can reach the person managing the list at
>    scipy-dev-owner at python.org
> 
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of SciPy-Dev digest..."
> 
> 
> Today's Topics:
> 
>   1. Directory for performance analysis (Todd)
> 
> 
> ----------------------------------------------------------------------
> 
> Message: 1
> Date: Wed, 2 Oct 2019 09:40:51 -0400
> From: Todd <toddrjen at gmail.com>
> To: SciPy Developers List <scipy-dev at python.org>
> Subject: [SciPy-Dev] Directory for performance analysis
> Message-ID:
>    <CAFpSVpJ6MoNmvvLYJJJO3dRySLHiQFhKPK=Y2dQVG1tQCjtG2Q at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> Although scipy has a directory ("benchmarks") for automatically testing how
> the performance of individual functions has changed over time, as far as I
> can find there is no directory for analyses of how particular approaches to
> the same problem compare.  I think it would be worthwhile to have one.
> 
> For example, the "scipy.signal.convolve" function has code to automatically
> choose between the conventional convolution algorithm and the FFT-based
> algorithm, based on an analysis someone did.  However, I cannot find the
> code that actually carried out that analysis.  This is important because I
> want to extend that analysis to include another algorithm.
> 
> There are a number of other situations where particular approaches are
> recommended as being faster, or faster by certain amounts, under certain
> conditions.
> 
> So I think that having a directory inside the scipy source tree that
> includes analyses like that would be worthwhile.  I was thinking something
> like "benchmarks/analysis", although it could also be part of the
> documentation proper.
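(For reference, the algorithm choice described above is exposed publicly as scipy.signal.choose_conv_method, and convolve applies the same heuristic when called with method='auto'. A minimal sketch of querying it; the array sizes here are just illustrative:)

```python
import numpy as np
from scipy.signal import choose_conv_method, convolve

rng = np.random.default_rng(0)
a = rng.standard_normal(10000)
b = rng.standard_normal(1000)

# Ask SciPy which algorithm its heuristic would pick for these inputs.
method = choose_conv_method(a, b, mode='full')
print(method)  # either 'fft' or 'direct'

# The same choice happens implicitly with method='auto' (the default).
result = convolve(a, b, mode='full', method='auto')
print(result.shape)  # full convolution length: len(a) + len(b) - 1
```

Passing measure=True to choose_conv_method times both methods on the actual inputs instead of using the precomputed heuristic, which is closer in spirit to the analysis being asked about.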

Hi Todd, 
This sounds interesting. One place where such analyses could be useful for releases is in determining defaults. 

Is this the use you have in mind, or is there another use that I’m not thinking of at the moment?

One downside: if the analyses lead to changed defaults, those changes could impact end users of SciPy.
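(A hypothetical sketch of the kind of crossover analysis such a directory could hold; the script, sizes, and repetition count are illustrative, not an existing SciPy benchmark:)

```python
# Time direct vs. FFT convolution across input sizes to locate the
# crossover point where the FFT method starts to win.
import timeit
import numpy as np
from scipy.signal import convolve

rng = np.random.default_rng(1)
for n in (10, 100, 1000):
    a = rng.standard_normal(n)
    b = rng.standard_normal(n)
    t_direct = timeit.timeit(
        lambda: convolve(a, b, method='direct'), number=5)
    t_fft = timeit.timeit(
        lambda: convolve(a, b, method='fft'), number=5)
    print(f"n={n}: direct={t_direct:.5f}s  fft={t_fft:.5f}s")
```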

> 

