From denis.akhiyarov at gmail.com Mon May 1 03:44:57 2017 From: denis.akhiyarov at gmail.com (Denis Akhiyarov) Date: Mon, 01 May 2017 07:44:57 +0000 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: I still suggest Intel+MSVC compilers, since you can use a trial version or request a license for open-source projects from Intel. This is what the Anaconda team is using. Also this is what Christoph Gohlke's wheels are based on: http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy If you end up with m2w64, here is lapack for conda; you may still have to modify paths: https://anaconda.org/conda-forge/lapack And blas: https://anaconda.org/search?q=Blas On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher wrote: > Why do you want to pay Intel? You can install the MKL and develop with it, > no sweat. > > 2017-04-30 22:41 GMT+01:00 David Hagen : > >> > Welcome to the world of pain with building scientific packages from >> source on Windows! >> >> I am beginning to feel it. >> >> > You need Fortran and C/C++ compilers on Windows to build scipy from >> source >> >> I have MinGW-w64 installed, which seems to be the recommended method. >> >> > I'm pretty sure that anaconda does not come with the development files >> for MKL, only the runtime files. >> >> I understand now. It looks like MKL is not the way to go unless I want to >> pay Intel. >> >> > If you don't need mkl and lapack/blas is good enough, then >> m2w64-toolchain from conda should have all necessary dependencies for >> building scipy. >> >> My only goal is to install and use Scipy master somewhere where it won't >> break my stable installation. I thought Anaconda would be a good place to >> start because once I activate an Anaconda environment, I should be able to >> treat it like a normal Python installation and follow the normal >> install-from-source instructions. I went ahead and installed that >> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK.
Maybe I >> should change my question to: how do I install Scipy on Windows from >> source? Though when I search for this specifically on the web, the answer >> seems to be "Don't.". It seems that MinGW-w64 and ATLAS are recommended by >> Scipy. Do you know of a conda/pip package that provides ATLAS for building >> Scipy or another more suitable BLAS/LAPACK? >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > > -- > Information System Engineer, Ph.D. > Blog: http://blog.audio-tk.com/ > LinkedIn: http://www.linkedin.com/in/matthieubrucher > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mohamed.ibnalkadi at gmail.com Mon May 1 03:51:40 2017 From: mohamed.ibnalkadi at gmail.com (Mohamed IBN AL KADI) Date: Mon, 1 May 2017 09:51:40 +0200 Subject: [SciPy-User] SciPy-User Digest, Vol 165, Issue 1 In-Reply-To: References: Message-ID: I want to unsubscribe, please. On 1 May 2017 9:47 a.m., wrote: > Send SciPy-User mailing list submissions to > scipy-user at python.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.python.org/mailman/listinfo/scipy-user > or, via email, send a message with subject or body 'help' to > scipy-user-request at python.org > > You can reach the person managing the list at > scipy-user-owner at python.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of SciPy-User digest..." > > > Today's Topics: > > 1. Re: Install Scipy with Anaconda's MKL libraries (David Hagen) > 2. Re: Install Scipy with Anaconda's MKL libraries (Matthieu Brucher) > 3.
Re: Install Scipy with Anaconda's MKL libraries (Denis Akhiyarov) > > ---------------------------------------------------------------------- > -------------- next part -------------- > An HTML attachment was scrubbed...
> URL: attachments/20170501/f5de028e/attachment.html> > > ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > > ------------------------------ > > End of SciPy-User Digest, Vol 165, Issue 1 > ****************************************** > -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Mon May 1 19:40:19 2017 From: david at drhagen.com (David Hagen) Date: Mon, 1 May 2017 19:40:19 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: I'll try to stick with MinGW-w64 for now, but I don't even get to the compilation phase. If I install lapack and blas from conda-forge, it still says that lapack/blas are not found, but you indicated that I need to set some paths. Are there instructions for this? I have no idea what environment variables to set in order to tell Scipy to use these packages. On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov wrote: > I still suggest Intel+MSVC compilers, since you can use trial version or > request license for open-source projects from Intel. This is what Anaconda > team is using. Also this is what Christoph Gohlke wheels are based on: > > http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy > > If you end up with m2w64, here is lapack for conda, you may still have to > modify paths: > > https://anaconda.org/conda-forge/lapack > > And blas: > > https://anaconda.org/search?q=Blas > > On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher > wrote: > >> Why do you want to pay Intel? You can install the MKL and develop with >> it, no sweat. >> >> 2017-04-30 22:41 GMT+01:00 David Hagen : >> >>> > Welcome to the world of pain with building scientific packages from >>> source on Windows! >>> >>> I am beginning to feel it. 
>>> >>> > You need Fortran and C/C++ compilers on Windows to build scipy from >>> source >>> >>> I have MinGW-w64 installed, which seems to be the recommended method. >>> >>> > I?m pretty sure that anaconda does not come with the development files >>> for MKL, only the runtime files. >>> >>> I understand now. It looks like MKL is not the way to go unless I want >>> to pay Intel. >>> >>> > If you don't need mkl and lapack/blas is good enough, then >>> m2w64-toolchain from conda should have all necessary dependencies for >>> building scipy. >>> >>> My only goal is to install and use Scipy master somewhere where it won't >>> break my stable installation. I thought Anaconda would be a good place to >>> start because once I activate an Anaconda environment, I should be able to >>> treat like a normal Python installation and follow the normal >>> install-from-source instructions. I went ahead and installed that >>> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK. Maybe I >>> should change my question to: how do I install Scipy on Windows from >>> source? Though when I search for this specifically on the web, the answer >>> seems to be "Don't.". It seems that MinGW-w64 and ATLAS are recommended by >>> Scipy. Do you know of a conda/pip package that provides ATLAS for building >>> Scipy or another more suitable BLAS/LAPACK? >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> >> >> -- >> Information System Engineer, Ph.D. 
>> Blog: http://blog.audio-tk.com/ >> LinkedIn: http://www.linkedin.com/in/matthieubrucher >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From immudzen at gmail.com Tue May 2 04:00:09 2017 From: immudzen at gmail.com (William Heymann) Date: Tue, 2 May 2017 10:00:09 +0200 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: Intel has made MKL, TBB, and a few other things completely free to use, even in a commercial project. Visual Studio is also free unless you are a very large company. https://software.intel.com/en-us/articles/free-mkl I have been using that for other projects without any problems and compiling with Visual Studio has been very easy. On Tue, May 2, 2017 at 1:40 AM, David Hagen wrote: > I'll try to stick with MinGW-w64 for now, but I don't even get to the > compilation phase. If I install lapack and blas from conda-forge, it still > says that lapack/blas are not found, but you indicated that I need to set > some paths. Are there instructions for this? I have no idea what > environment variables to set in order to tell Scipy to use these packages. > > On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov > wrote: > >> I still suggest Intel+MSVC compilers, since you can use trial version or >> request license for open-source projects from Intel. This is what Anaconda >> team is using. 
Also this is what Christoph Gohlke wheels are based on: >> >> http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy >> >> If you end up with m2w64, here is lapack for conda, you may still have to >> modify paths: >> >> https://anaconda.org/conda-forge/lapack >> >> And blas: >> >> https://anaconda.org/search?q=Blas >> >> On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher < >> matthieu.brucher at gmail.com> wrote: >> >>> Why do you want to pay Intel? You can install the MKL and develop with >>> it, no sweat. >>> >>> 2017-04-30 22:41 GMT+01:00 David Hagen : >>> >>>> > Welcome to the world of pain with building scientific packages from >>>> source on Windows! >>>> >>>> I am beginning to feel it. >>>> >>>> > You need Fortran and C/C++ compilers on Windows to build scipy from >>>> source >>>> >>>> I have MinGW-w64 installed, which seems to be the recommended method. >>>> >>>> > I?m pretty sure that anaconda does not come with the development >>>> files for MKL, only the runtime files. >>>> >>>> I understand now. It looks like MKL is not the way to go unless I want >>>> to pay Intel. >>>> >>>> > If you don't need mkl and lapack/blas is good enough, then >>>> m2w64-toolchain from conda should have all necessary dependencies for >>>> building scipy. >>>> >>>> My only goal is to install and use Scipy master somewhere where it >>>> won't break my stable installation. I thought Anaconda would be a good >>>> place to start because once I activate an Anaconda environment, I should be >>>> able to treat like a normal Python installation and follow the normal >>>> install-from-source instructions. I went ahead and installed that >>>> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK. Maybe I >>>> should change my question to: how do I install Scipy on Windows from >>>> source? Though when I search for this specifically on the web, the answer >>>> seems to be "Don't.". It seems that MinGW-w64 and ATLAS are recommended by >>>> Scipy. 
Do you know of a conda/pip package that provides ATLAS for building >>>> Scipy or another more suitable BLAS/LAPACK? >>>> >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>>> >>> >>> >>> -- >>> Information System Engineer, Ph.D. >>> Blog: http://blog.audio-tk.com/ >>> LinkedIn: http://www.linkedin.com/in/matthieubrucher >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gideon.simpson at gmail.com Tue May 2 22:22:19 2017 From: gideon.simpson at gmail.com (Gideon Simpson) Date: Tue, 2 May 2017 22:22:19 -0400 Subject: [SciPy-User] scipy.optimize and vectorization (or not) Message-ID: <516A9270-5E48-4674-A868-F37A8515AA13@gmail.com> I have a scalar valued function that I'd like to minimize, but, by default, it's not readily vectorized. In particular, it depends parametrically on the solution of an ODE, and vaguely looks like:

def f(x, ode_soln):
    ...
    val = scipy.integrate.quad(lambda t: sin(x * t) * ode_soln.sol(t), 0, 1)
    return val

By default, if I try to do

scipy.optimize.minimize(lambda x: f(x, ode_soln))

I get an error that indicates that minimize thinks f accepts arrays of data. I can remedy this by writing a little looped version,

def f_vec(xvec, ode_soln):
    fvals = np.zeros_like(xvec)
    for j in range(xvec.size):
        fvals[j] = f(xvec[j], ode_soln)
    return fvals

but I'm wondering if:

1. Is there a smarter/more elegant solution to handling the vectorized input?
2. Is there a way to just tell minimize that it's going to have to evaluate f one point at a time, rather than writing some other function?

-gideon

From david at drhagen.com Tue May 2 22:12:15 2017 From: david at drhagen.com (David Hagen) Date: Tue, 2 May 2017 22:12:15 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: Is there a recipe for this combination? I installed MKL from that link and the latest Visual Studio. Scipy did not find MKL on its own. I'm sure there's some environment variables that need to be set, but I don't know what they are. On Tue, May 2, 2017 at 4:00 AM, William Heymann wrote: > Intel has made MKL, TBB, and a few other things completely free to use, > even in a commercial project. Visual Studio is also free unless you are a > very large company. > > https://software.intel.com/en-us/articles/free-mkl > > I have been using that for other projects without any problems and > compiling with Visual Studio has been very easy. > > On Tue, May 2, 2017 at 1:40 AM, David Hagen wrote: > >> I'll try to stick with MinGW-w64 for now, but I don't even get to the >> compilation phase. If I install lapack and blas from conda-forge, it still >> says that lapack/blas are not found, but you indicated that I need to set >> some paths. Are there instructions for this? I have no idea what >> environment variables to set in order to tell Scipy to use these packages. >> >> On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov < >> denis.akhiyarov at gmail.com> wrote: >> >>> I still suggest Intel+MSVC compilers, since you can use trial version or >>> request license for open-source projects from Intel. This is what Anaconda >>> team is using.
Also this is what Christoph Gohlke wheels are based on: >>> >>> http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy >>> >>> If you end up with m2w64, here is lapack for conda, you may still have >>> to modify paths: >>> >>> https://anaconda.org/conda-forge/lapack >>> >>> And blas: >>> >>> https://anaconda.org/search?q=Blas >>> >>> On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher < >>> matthieu.brucher at gmail.com> wrote: >>> >>>> Why do you want to pay Intel? You can install the MKL and develop with >>>> it, no sweat. >>>> >>>> 2017-04-30 22:41 GMT+01:00 David Hagen : >>>> >>>>> > Welcome to the world of pain with building scientific packages from >>>>> source on Windows! >>>>> >>>>> I am beginning to feel it. >>>>> >>>>> > You need Fortran and C/C++ compilers on Windows to build scipy from >>>>> source >>>>> >>>>> I have MinGW-w64 installed, which seems to be the recommended method. >>>>> >>>>> > I?m pretty sure that anaconda does not come with the development >>>>> files for MKL, only the runtime files. >>>>> >>>>> I understand now. It looks like MKL is not the way to go unless I want >>>>> to pay Intel. >>>>> >>>>> > If you don't need mkl and lapack/blas is good enough, then >>>>> m2w64-toolchain from conda should have all necessary dependencies for >>>>> building scipy. >>>>> >>>>> My only goal is to install and use Scipy master somewhere where it >>>>> won't break my stable installation. I thought Anaconda would be a good >>>>> place to start because once I activate an Anaconda environment, I should be >>>>> able to treat like a normal Python installation and follow the normal >>>>> install-from-source instructions. I went ahead and installed that >>>>> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK. Maybe I >>>>> should change my question to: how do I install Scipy on Windows from >>>>> source? Though when I search for this specifically on the web, the answer >>>>> seems to be "Don't.". 
It seems that MinGW-w64 and ATLAS are recommended by >>>>> Scipy. Do you know of a conda/pip package that provides ATLAS for building >>>>> Scipy or another more suitable BLAS/LAPACK? >>>>> >>>>> _______________________________________________ >>>>> SciPy-User mailing list >>>>> SciPy-User at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>> >>>>> >>>> >>>> >>>> -- >>>> Information System Engineer, Ph.D. >>>> Blog: http://blog.audio-tk.com/ >>>> LinkedIn: http://www.linkedin.com/in/matthieubrucher >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From denis.akhiyarov at gmail.com Tue May 2 23:15:06 2017 From: denis.akhiyarov at gmail.com (Denis Akhiyarov) Date: Wed, 03 May 2017 03:15:06 +0000 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: Did you follow these instructions? https://software.intel.com/en-us/articles/numpyscipy-with-intel-mkl On Tue, May 2, 2017, 10:00 PM David Hagen wrote: > Is there a recipe for this combination? I installed MKL from that link and > the latest Visual Studio. Scipy did not find MKL on its own. I'm sure > there's some environment variables that need to be set, but I don't know > what they are. 
> > On Tue, May 2, 2017 at 4:00 AM, William Heymann > wrote: > >> Intel has made MKL, TBB, and a few other things completely free to use, >> even in a commercial project. Visual Studio is also free unless you are a >> very large company. >> >> https://software.intel.com/en-us/articles/free-mkl >> >> I have been using that for other projects without any problems and >> compiling with Visual Studio has been very easy. >> >> On Tue, May 2, 2017 at 1:40 AM, David Hagen wrote: >> >>> I'll try to stick with MinGW-w64 for now, but I don't even get to the >>> compilation phase. If I install lapack and blas from conda-forge, it still >>> says that lapack/blas are not found, but you indicated that I need to set >>> some paths. Are there instructions for this? I have no idea what >>> environment variables to set in order to tell Scipy to use these packages. >>> >>> On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov < >>> denis.akhiyarov at gmail.com> wrote: >>> >>>> I still suggest Intel+MSVC compilers, since you can use trial version >>>> or request license for open-source projects from Intel. This is what >>>> Anaconda team is using. Also this is what Christoph Gohlke wheels are based >>>> on: >>>> >>>> http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy >>>> >>>> If you end up with m2w64, here is lapack for conda, you may still have >>>> to modify paths: >>>> >>>> https://anaconda.org/conda-forge/lapack >>>> >>>> And blas: >>>> >>>> https://anaconda.org/search?q=Blas >>>> >>>> On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher < >>>> matthieu.brucher at gmail.com> wrote: >>>> >>>>> Why do you want to pay Intel? You can install the MKL and develop with >>>>> it, no sweat. >>>>> >>>>> 2017-04-30 22:41 GMT+01:00 David Hagen : >>>>> >>>>>> > Welcome to the world of pain with building scientific packages from >>>>>> source on Windows! >>>>>> >>>>>> I am beginning to feel it. 
>>>>>> >>>>>> > You need Fortran and C/C++ compilers on Windows to build scipy from >>>>>> source >>>>>> >>>>>> I have MinGW-w64 installed, which seems to be the recommended method. >>>>>> >>>>>> > I?m pretty sure that anaconda does not come with the development >>>>>> files for MKL, only the runtime files. >>>>>> >>>>>> I understand now. It looks like MKL is not the way to go unless I >>>>>> want to pay Intel. >>>>>> >>>>>> > If you don't need mkl and lapack/blas is good enough, then >>>>>> m2w64-toolchain from conda should have all necessary dependencies for >>>>>> building scipy. >>>>>> >>>>>> My only goal is to install and use Scipy master somewhere where it >>>>>> won't break my stable installation. I thought Anaconda would be a good >>>>>> place to start because once I activate an Anaconda environment, I should be >>>>>> able to treat like a normal Python installation and follow the normal >>>>>> install-from-source instructions. I went ahead and installed that >>>>>> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK. Maybe I >>>>>> should change my question to: how do I install Scipy on Windows from >>>>>> source? Though when I search for this specifically on the web, the answer >>>>>> seems to be "Don't.". It seems that MinGW-w64 and ATLAS are recommended by >>>>>> Scipy. Do you know of a conda/pip package that provides ATLAS for building >>>>>> Scipy or another more suitable BLAS/LAPACK? >>>>>> >>>>>> _______________________________________________ >>>>>> SciPy-User mailing list >>>>>> SciPy-User at python.org >>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> Information System Engineer, Ph.D. 
>>>>> Blog: http://blog.audio-tk.com/ >>>>> LinkedIn: http://www.linkedin.com/in/matthieubrucher >>>>> _______________________________________________ >>>>> SciPy-User mailing list >>>>> SciPy-User at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>>> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From johnl at crumpington.com Wed May 3 07:11:16 2017 From: johnl at crumpington.com (David Lee) Date: Wed, 3 May 2017 13:11:16 +0200 Subject: [SciPy-User] Batch / HTC system testing Message-ID: <2c55e634-f21e-bb10-0cfa-e899de3275d2@crumpington.com> Hi, I hope this isn't too off-topic. I'm developing an on-demand, auto-scaling batch/HTC system, and I'm interested in finding a tester or two. Please get in touch privately if this is something that you might be interested in. I can provide some free compute time and support. Take care, David -- J. David Lee From newville at cars.uchicago.edu Wed May 3 13:41:14 2017 From: newville at cars.uchicago.edu (Matt Newville) Date: Wed, 3 May 2017 12:41:14 -0500 Subject: [SciPy-User] scipy.optimize and vectorization (or not) In-Reply-To: <516A9270-5E48-4674-A868-F37A8515AA13@gmail.com> References: <516A9270-5E48-4674-A868-F37A8515AA13@gmail.com> Message-ID: On Tue, May 2, 2017 at 9:22 PM, Gideon Simpson wrote: > I have a scalar valued function that I'd like to minimize, but, by > default, it's not readily vectorized. In particular, it depends > parametrically on the solution of an ODE, and vaguely looks like: > > def f(x, ode_soln): > ... > val = scipy.integrate.quad(lambda t: sin(x * t) * > ode_soln.sol(t),0,1) > > return val > > By default, if I try to do > > scipy.optimize.minimize(lambda x: f(x,ode_soln)) > > I get an error that indicates that minimize thinks f accepts arrays of > data. minimize() takes two arguments: a callable function that takes an array of values to be adjusted, and an array of starting values. Presumably, `x` is the array of values that you would like optimized, but you have to provide starting values. I don't think you need the lambda, but you might want something like:

scipy.optimize.minimize(f, x0, args=(ode_soln,))

where x0 is an array of starting values. > I can remedy this by writing a little looped version, > > def f_vec(xvec, ode_soln): > fvals = np.zeros_like(xvec) > > for j in range(xvec.size): > fvals[j] = f(xvec[j], ode_soln) > > return fvals > > but I'm wondering if: > > 1. Is there a smarter/more elegant solution to handling the vectorized > input? > It does handle vectorized input, or perhaps I'm not understanding your question. > > 2. Is there a way to just tell minimize that it's going to have to > evaluate f one point at a time, rather than writing some other function? > No, or rather it depends on what you mean by "one point at a time".
The objective function provided should take an array of candidate values for the parameters and return either an array to be minimized in the least-squares sense or the scalar cost value. Hope that helps, --Matt -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Wed May 3 21:26:42 2017 From: david at drhagen.com (David Hagen) Date: Wed, 3 May 2017 21:26:42 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: So those instructions get me farther than I have gotten before. Is this actually necessary?: Note: You will need to rebuild Python from source files. This is because the downloaded Python binary on Windows might be incompatible with the Visual Studio version you used. Otherwise you will encounter a runtime crash when running the numpy or scipy tests. This is what I have done so far: 1) Download the Scipy zip file from the GitHub master branch 2) Unzip the file 3) Add a site.cfg file with these contents: [mkl] library_dirs = C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\windows\mkl\lib\intel64_win;C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\windows\compiler\lib\intel64_win include_dirs = C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\windows\mkl\include mkl_libs = mkl_lapack95_lp64,mkl_blas95_lp64,mkl_intel_lp64,mkl_intel_thread,mkl_core,libiomp5md lapack_libs = mkl_lapack95_lp64,mkl_blas95_lp64,mkl_intel_lp64,mkl_intel_thread,mkl_core,libiomp5md 4) cd into that directory 5) run: python setup.py config --compiler=intelemw --fcompiler=intelvem build_clib --compiler=intelemw --fcompiler=intelvem build_ext --compiler=intelemw --fcompiler=intelvem install It gets really far into the build, but this is the error that results: building extension "scipy.spatial.qhull" sources creating build\src.win-amd64-3.6\scipy\spatial Could not locate executable icc Could not locate executable ecc Traceback (most recent
call last): File "setup.py", line 417, in setup_package() File "setup.py", line 413, in setup_package setup(**metadata) File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\core.py", line 166, in setup return old_setup(**new_attr) File "C:\Anaconda3\envs\scipy_master\lib\distutils\core.py", line 148, in setup dist.run_commands() File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 955, in run_commands self.run_command(cmd) File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command\build_clib.py", line 74, in run self.run_command('build_src') File "C:\Anaconda3\envs\scipy_master\lib\distutils\cmd.py", line 313, in run_command self.distribution.run_command(command) File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command\build_src.py", line 148, in run self.build_sources() File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command\build_src.py", line 165, in build_sources self.build_extension_sources(ext) File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command\build_src.py", line 324, in build_extension_sources sources = self.generate_sources(sources, ext) File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command\build_src.py", line 377, in generate_sources source = func(extension, build_dir) File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command\config.py", line 312, in check_func self._check_compiler() File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command\config.py", line 51, in _check_compiler self.compiler.initialize() File
"C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\intelccompiler.py", line 86, in initialize MSVCCompiler.initialize(self, plat_name) File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\msvc9compiler.py", line 53, in initialize os.environ['lib'] = _merge(environ_lib, os.environ['lib']) File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\msvc9compiler.py", line 32, in _merge if new in old: TypeError: argument of type 'NoneType' is not iterable I notice that this is calling into Numpy. Do I have to modify Numpy in order to compile Scipy? I skipped the Numpy-specific steps in the instructions because I don't need the development version of Numpy. On Tue, May 2, 2017 at 11:15 PM, Denis Akhiyarov wrote: > Did you follow these instructions? > > https://software.intel.com/en-us/articles/numpyscipy-with-intel-mkl > > On Tue, May 2, 2017, 10:00 PM David Hagen wrote: >> Is there a recipe for this combination? I installed MKL from that link >> and the latest Visual Studio. Scipy did not find MKL on its own. I'm sure >> there's some environment variables that need to be set, but I don't know >> what they are. >> >> On Tue, May 2, 2017 at 4:00 AM, William Heymann >> wrote: >>> Intel has made MKL, TBB, and a few other things completely free to use, >>> even in a commercial project. Visual Studio is also free unless you are a >>> very large company. >>> >>> https://software.intel.com/en-us/articles/free-mkl >>> >>> I have been using that for other projects without any problems and >>> compiling with Visual Studio has been very easy. >>> >>> On Tue, May 2, 2017 at 1:40 AM, David Hagen wrote: >>>> I'll try to stick with MinGW-w64 for now, but I don't even get to the >>>> compilation phase. If I install lapack and blas from conda-forge, it still >>>> says that lapack/blas are not found, but you indicated that I need to set >>>> some paths. Are there instructions for this?
I have no idea what >>>> environment variables to set in order to tell Scipy to use these packages. >>>> >>>> On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov < >>>> denis.akhiyarov at gmail.com> wrote: >>>> >>>>> I still suggest Intel+MSVC compilers, since you can use trial version >>>>> or request license for open-source projects from Intel. This is what >>>>> Anaconda team is using. Also this is what Christoph Gohlke wheels are based >>>>> on: >>>>> >>>>> http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy >>>>> >>>>> If you end up with m2w64, here is lapack for conda, you may still have >>>>> to modify paths: >>>>> >>>>> https://anaconda.org/conda-forge/lapack >>>>> >>>>> And blas: >>>>> >>>>> https://anaconda.org/search?q=Blas >>>>> >>>>> On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher < >>>>> matthieu.brucher at gmail.com> wrote: >>>>> >>>>>> Why do you want to pay Intel? You can install the MKL and develop >>>>>> with it, no sweat. >>>>>> >>>>>> 2017-04-30 22:41 GMT+01:00 David Hagen : >>>>>> >>>>>>> > Welcome to the world of pain with building scientific packages >>>>>>> from source on Windows! >>>>>>> >>>>>>> I am beginning to feel it. >>>>>>> >>>>>>> > You need Fortran and C/C++ compilers on Windows to build scipy >>>>>>> from source >>>>>>> >>>>>>> I have MinGW-w64 installed, which seems to be the recommended method. >>>>>>> >>>>>>> > I?m pretty sure that anaconda does not come with the development >>>>>>> files for MKL, only the runtime files. >>>>>>> >>>>>>> I understand now. It looks like MKL is not the way to go unless I >>>>>>> want to pay Intel. >>>>>>> >>>>>>> > If you don't need mkl and lapack/blas is good enough, then >>>>>>> m2w64-toolchain from conda should have all necessary dependencies for >>>>>>> building scipy. >>>>>>> >>>>>>> My only goal is to install and use Scipy master somewhere where it >>>>>>> won't break my stable installation. 
I thought Anaconda would be a good >>>>>>> place to start because once I activate an Anaconda environment, I should be >>>>>>> able to treat like a normal Python installation and follow the normal >>>>>>> install-from-source instructions. I went ahead and installed that >>>>>>> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK. Maybe I >>>>>>> should change my question to: how do I install Scipy on Windows from >>>>>>> source? Though when I search for this specifically on the web, the answer >>>>>>> seems to be "Don't.". It seems that MinGW-w64 and ATLAS are recommended by >>>>>>> Scipy. Do you know of a conda/pip package that provides ATLAS for building >>>>>>> Scipy or another more suitable BLAS/LAPACK? >>>>>>> >>>>>>> _______________________________________________ >>>>>>> SciPy-User mailing list >>>>>>> SciPy-User at python.org >>>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> Information System Engineer, Ph.D. 
>>>>>> Blog: http://blog.audio-tk.com/ >>>>>> LinkedIn: http://www.linkedin.com/in/matthieubrucher >>>>>> _______________________________________________ >>>>>> SciPy-User mailing list >>>>>> SciPy-User at python.org >>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>> >>>>> >>>>> _______________________________________________ >>>>> SciPy-User mailing list >>>>> SciPy-User at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>> >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>>> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From denis.akhiyarov at gmail.com Thu May 4 02:58:34 2017 From: denis.akhiyarov at gmail.com (Denis Akhiyarov) Date: Thu, 4 May 2017 01:58:34 -0500 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: Ok, I followed instructions on this page: https://software.intel.com/en-us/articles/building-numpyscipy-with-intel-mkl-and-intel-fortran-on-windows Installed parallel_studio_xe_2016_update4 with mkl, ifort, and icc. 1. 
I compiled numpy (python setup.py install) from source with this site.cfg file: [mkl] include_dirs = C:\Program Files (x86)\IntelSWTools\parallel_studio_xe_2016.4.062\compilers_and_libraries_2016\windows\mkl\include library_dirs = C:\Program Files (x86)\IntelSWTools\parallel_studio_xe_2016.4.062\compilers_and_libraries_2016\windows\mkl\lib\intel64;C:\Program Files (x86)\IntelSWTools\parallel_studio_xe_2016.4.062\compilers_and_libraries_2016\windows\compiler\lib\intel64 mkl_libs = mkl_core_dll, mkl_intel_lp64_dll, mkl_intel_thread_dll lapack_libs = mkl_lapack95_lp64 2. I replaced site.cfg file in ..\Lib\site-packages\numpy\distutils with the one above, since this is used by scipy in step 3. 3. I compiled scipy (python setup.py install) from source. Compilation went without errors, both packages import without errors in python. On Wed, May 3, 2017 at 8:26 PM, David Hagen wrote: > So those instructions get me farther than I have gotten before. > > Is this actually necessary?: > > Note: You will need to rebuild Python from source files. This is due to > the downloaded Python binary on Windows might be incompatible with the > Visual Studio version you used. Otherwise you will encounter runtime crash > when run numpy or script tests. 
> > This is what I have done so far: > > 1) Download Scipy zip file from Github mater branch > 2) Unzip file > 3) Add a site.cfg file with these contents: > [mkl] > library_dirs = C:\Program Files (x86)\IntelSWTools\compilers_ > and_libraries_2017.2.187\windows\mkl\lib\intel64_win;C:\Program Files > (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\ > windows\compiler\lib\intel64_win > include_dirs = C:\Program Files (x86)\IntelSWTools\compilers_ > and_libraries_2017.2.187\windows\mkl\include > mkl_libs = mkl_lapack95_lp64,mkl_blas95_lp64,mkl_intel_lp64,mkl_intel_ > thread,mkl_core,libiomp5md > lapack_libs = mkl_lapack95_lp64,mkl_blas95_lp64,mkl_intel_lp64,mkl_intel_ > thread,mkl_core,libiomp5md > > 4) cd into that directory > 5) run: python setup.py config --compiler=intelemw --fcompiler=intelvem > build_clib --compiler=intelemw --fcompiler=intelvem build_ext > --compiler=intelemw --fcompiler=intelvem install > > It gets really far into the build but this is the error that results: > > building extension "scipy.spatial.qhull" sources > creating build\src.win-amd64-3.6\scipy\spatial > Could not locate executable icc > Could not locate executable ecc > Traceback (most recent call last): > File "setup.py", line 417, in > setup_package() > File "setup.py", line 413, in setup_package > setup(**metadata) > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\core.py > ", line 166, in setup > return old_setup(**new_attr) > File "C:\Anaconda3\envs\scipy_master\lib\distutils\core.py", line 148, > in setu > p > dist.run_commands() > File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 955, > in run_ > commands > self.run_command(cmd) > File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 974, > in run_ > command > cmd_obj.run() > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\command > \build_clib.py", line 74, in run > self.run_command('build_src') > File 
"C:\Anaconda3\envs\scipy_master\lib\distutils\cmd.py", line 313, > in run_c > ommand > self.distribution.run_command(command) > File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 974, > in run_ > command > cmd_obj.run() > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\command > \build_src.py", line 148, in run > self.build_sources() > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\command > \build_src.py", line 165, in build_sources > self.build_extension_sources(ext) > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\command > \build_src.py", line 324, in build_extension_sources > sources = self.generate_sources(sources, ext) > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\command > \build_src.py", line 377, in generate_sources > source = func(extension, build_dir) > File "scipy\spatial\setup.py", line 35, in get_qhull_misc_config > if config_cmd.check_func('open_memstream', decl=True, call=True): > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\command > \config.py", line 312, in check_func > self._check_compiler() > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\command > \config.py", line 51, in _check_compiler > self.compiler.initialize() > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\intelcc > ompiler.py", line 86, in initialize > MSVCCompiler.initialize(self, plat_name) > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\msvc9co > mpiler.py", line 53, in initialize > os.environ['lib'] = _merge(environ_lib, os.environ['lib']) > File "C:\Anaconda3\envs\scipy_master\lib\site-packages\ > numpy\distutils\msvc9co > mpiler.py", line 32, in _merge > if new in old: > TypeError: argument of type 'NoneType' is not iterable > > I notice that this is calling into Numpy. Do I have modify Numpy in order > to compile Scipy? 
I skipped the Numpy specific steps in the instructions > because I don't need the development version of Numpy. > > On Tue, May 2, 2017 at 11:15 PM, Denis Akhiyarov < > denis.akhiyarov at gmail.com> wrote: > >> Did you follow these instructions? >> >> https://software.intel.com/en-us/articles/numpyscipy-with-intel-mkl >> >> On Tue, May 2, 2017, 10:00 PM David Hagen wrote: >> >>> Is there a recipe for this combination? I installed MKL from that link >>> and the latest Visual Studio. Scipy did not find MKL on its own. I'm sure >>> there's some environment variables that need to be set, but I don't know >>> what they are. >>> >>> On Tue, May 2, 2017 at 4:00 AM, William Heymann >>> wrote: >>> >>>> Intel has made MKL, TBB, and a few other things completely free to use, >>>> even in a commercial project. Visual Studio is also free unless you are a >>>> very large company. >>>> >>>> https://software.intel.com/en-us/articles/free-mkl >>>> >>>> I have been using that for other projects without any problems and >>>> compiling with Visual Studio has been very easy. >>>> >>>> On Tue, May 2, 2017 at 1:40 AM, David Hagen wrote: >>>> >>>>> I'll try to stick with MinGW-w64 for now, but I don't even get to the >>>>> compilation phase. If I install lapack and blas from conda-forge, it still >>>>> says that lapack/blas are not found, but you indicated that I need to set >>>>> some paths. Are there instructions for this? I have no idea what >>>>> environment variables to set in order to tell Scipy to use these packages. >>>>> >>>>> On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov < >>>>> denis.akhiyarov at gmail.com> wrote: >>>>> >>>>>> I still suggest Intel+MSVC compilers, since you can use trial version >>>>>> or request license for open-source projects from Intel. This is what >>>>>> Anaconda team is using. 
Also this is what Christoph Gohlke wheels are based >>>>>> on: >>>>>> >>>>>> http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy >>>>>> >>>>>> If you end up with m2w64, here is lapack for conda, you may still >>>>>> have to modify paths: >>>>>> >>>>>> https://anaconda.org/conda-forge/lapack >>>>>> >>>>>> And blas: >>>>>> >>>>>> https://anaconda.org/search?q=Blas >>>>>> >>>>>> On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher < >>>>>> matthieu.brucher at gmail.com> wrote: >>>>>> >>>>>>> Why do you want to pay Intel? You can install the MKL and develop >>>>>>> with it, no sweat. >>>>>>> >>>>>>> 2017-04-30 22:41 GMT+01:00 David Hagen : >>>>>>> >>>>>>>> > Welcome to the world of pain with building scientific packages >>>>>>>> from source on Windows! >>>>>>>> >>>>>>>> I am beginning to feel it. >>>>>>>> >>>>>>>> > You need Fortran and C/C++ compilers on Windows to build scipy >>>>>>>> from source >>>>>>>> >>>>>>>> I have MinGW-w64 installed, which seems to be the recommended >>>>>>>> method. >>>>>>>> >>>>>>>> > I?m pretty sure that anaconda does not come with the development >>>>>>>> files for MKL, only the runtime files. >>>>>>>> >>>>>>>> I understand now. It looks like MKL is not the way to go unless I >>>>>>>> want to pay Intel. >>>>>>>> >>>>>>>> > If you don't need mkl and lapack/blas is good enough, then >>>>>>>> m2w64-toolchain from conda should have all necessary dependencies for >>>>>>>> building scipy. >>>>>>>> >>>>>>>> My only goal is to install and use Scipy master somewhere where it >>>>>>>> won't break my stable installation. I thought Anaconda would be a good >>>>>>>> place to start because once I activate an Anaconda environment, I should be >>>>>>>> able to treat like a normal Python installation and follow the normal >>>>>>>> install-from-source instructions. I went ahead and installed that >>>>>>>> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK. 
Maybe I >>>>>>>> should change my question to: how do I install Scipy on Windows from >>>>>>>> source? Though when I search for this specifically on the web, the answer >>>>>>>> seems to be "Don't.". It seems that MinGW-w64 and ATLAS are recommended by >>>>>>>> Scipy. Do you know of a conda/pip package that provides ATLAS for building >>>>>>>> Scipy or another more suitable BLAS/LAPACK? >>>>>>>> >>>>>>>> _______________________________________________ >>>>>>>> SciPy-User mailing list >>>>>>>> SciPy-User at python.org >>>>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> Information System Engineer, Ph.D. >>>>>>> Blog: http://blog.audio-tk.com/ >>>>>>> LinkedIn: http://www.linkedin.com/in/matthieubrucher >>>>>>> _______________________________________________ >>>>>>> SciPy-User mailing list >>>>>>> SciPy-User at python.org >>>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>>> >>>>>> >>>>>> _______________________________________________ >>>>>> SciPy-User mailing list >>>>>> SciPy-User at python.org >>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>> >>>>>> >>>>> >>>>> _______________________________________________ >>>>> SciPy-User mailing list >>>>> SciPy-User at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>> >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > 
https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Thu May 4 08:16:13 2017 From: david at drhagen.com (David Hagen) Date: Thu, 4 May 2017 08:16:13 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: Am I correct that there is no free version of ifort and icc? Will this break after the 30-day trial ends? On Thu, May 4, 2017 at 2:58 AM, Denis Akhiyarov wrote: > Ok, I followed instructions on this page: > > https://software.intel.com/en-us/articles/building- > numpyscipy-with-intel-mkl-and-intel-fortran-on-windows > > Installed parallel_studio_xe_2016_update4 with mkl, ifort, and icc. > > 1. I compiled numpy (python setup.py install) from source with this > site.cfg file: > > [mkl] > include_dirs = C:\Program Files (x86)\IntelSWTools\parallel_ > studio_xe_2016.4.062\compilers_and_libraries_2016\windows\mkl\include > library_dirs = C:\Program Files (x86)\IntelSWTools\parallel_ > studio_xe_2016.4.062\compilers_and_libraries_2016\ > windows\mkl\lib\intel64;C:\Program Files (x86)\IntelSWTools\parallel_ > studio_xe_2016.4.062\compilers_and_libraries_2016\ > windows\compiler\lib\intel64 > mkl_libs = mkl_core_dll, mkl_intel_lp64_dll, mkl_intel_thread_dll > lapack_libs = mkl_lapack95_lp64 > > > 2. I replaced the site.cfg file in ..\Lib\site-packages\numpy\distutils with > the one above, since this is used by scipy in step 3. > > 3. I compiled scipy (python setup.py install) from source. > > Compilation went without errors, both packages import without errors in > python. > > > On Wed, May 3, 2017 at 8:26 PM, David Hagen wrote: > >> So those instructions get me farther than I have gotten before. >> >> Is this actually necessary?: >> >> Note: You will need to rebuild Python from source files. This is due to >> the downloaded Python binary on Windows might be incompatible with the >> Visual Studio version you used.
Otherwise you will encounter runtime crash >> when run numpy or script tests. >> >> This is what I have done so far: >> >> 1) Download Scipy zip file from Github mater branch >> 2) Unzip file >> 3) Add a site.cfg file with these contents: >> [mkl] >> library_dirs = C:\Program Files (x86)\IntelSWTools\compilers_a >> nd_libraries_2017.2.187\windows\mkl\lib\intel64_win;C:\Program Files >> (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\window >> s\compiler\lib\intel64_win >> include_dirs = C:\Program Files (x86)\IntelSWTools\compilers_a >> nd_libraries_2017.2.187\windows\mkl\include >> mkl_libs = mkl_lapack95_lp64,mkl_blas95_lp64,mkl_intel_lp64,mkl_intel_t >> hread,mkl_core,libiomp5md >> lapack_libs = mkl_lapack95_lp64,mkl_blas95_l >> p64,mkl_intel_lp64,mkl_intel_thread,mkl_core,libiomp5md >> >> 4) cd into that directory >> 5) run: python setup.py config --compiler=intelemw --fcompiler=intelvem >> build_clib --compiler=intelemw --fcompiler=intelvem build_ext >> --compiler=intelemw --fcompiler=intelvem install >> >> It gets really far into the build but this is the error that results: >> >> building extension "scipy.spatial.qhull" sources >> creating build\src.win-amd64-3.6\scipy\spatial >> Could not locate executable icc >> Could not locate executable ecc >> Traceback (most recent call last): >> File "setup.py", line 417, in >> setup_package() >> File "setup.py", line 413, in setup_package >> setup(**metadata) >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\core.py >> ", line 166, in setup >> return old_setup(**new_attr) >> File "C:\Anaconda3\envs\scipy_master\lib\distutils\core.py", line 148, >> in setu >> p >> dist.run_commands() >> File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 955, >> in run_ >> commands >> self.run_command(cmd) >> File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 974, >> in run_ >> command >> cmd_obj.run() >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> 
distutils\command >> \build_clib.py", line 74, in run >> self.run_command('build_src') >> File "C:\Anaconda3\envs\scipy_master\lib\distutils\cmd.py", line 313, >> in run_c >> ommand >> self.distribution.run_command(command) >> File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", line 974, >> in run_ >> command >> cmd_obj.run() >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\command >> \build_src.py", line 148, in run >> self.build_sources() >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\command >> \build_src.py", line 165, in build_sources >> self.build_extension_sources(ext) >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\command >> \build_src.py", line 324, in build_extension_sources >> sources = self.generate_sources(sources, ext) >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\command >> \build_src.py", line 377, in generate_sources >> source = func(extension, build_dir) >> File "scipy\spatial\setup.py", line 35, in get_qhull_misc_config >> if config_cmd.check_func('open_memstream', decl=True, call=True): >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\command >> \config.py", line 312, in check_func >> self._check_compiler() >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\command >> \config.py", line 51, in _check_compiler >> self.compiler.initialize() >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\intelcc >> ompiler.py", line 86, in initialize >> MSVCCompiler.initialize(self, plat_name) >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\msvc9co >> mpiler.py", line 53, in initialize >> os.environ['lib'] = _merge(environ_lib, os.environ['lib']) >> File "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\ >> distutils\msvc9co >> mpiler.py", line 32, in _merge >> if new in old: >> TypeError: argument of type 
'NoneType' is not iterable >> >> I notice that this is calling into Numpy. Do I have modify Numpy in order >> to compile Scipy? I skipped the Numpy specific steps in the instructions >> because I don't need the development version of Numpy. >> >> On Tue, May 2, 2017 at 11:15 PM, Denis Akhiyarov < >> denis.akhiyarov at gmail.com> wrote: >> >>> Did you follow these instructions? >>> >>> https://software.intel.com/en-us/articles/numpyscipy-with-intel-mkl >>> >>> On Tue, May 2, 2017, 10:00 PM David Hagen wrote: >>> >>>> Is there a recipe for this combination? I installed MKL from that link >>>> and the latest Visual Studio. Scipy did not find MKL on its own. I'm sure >>>> there's some environment variables that need to be set, but I don't know >>>> what they are. >>>> >>>> On Tue, May 2, 2017 at 4:00 AM, William Heymann >>>> wrote: >>>> >>>>> Intel has made MKL, TBB, and a few other things completely free to >>>>> use, even in a commercial project. Visual Studio is also free unless you >>>>> are a very large company. >>>>> >>>>> https://software.intel.com/en-us/articles/free-mkl >>>>> >>>>> I have been using that for other projects without any problems and >>>>> compiling with Visual Studio has been very easy. >>>>> >>>>> On Tue, May 2, 2017 at 1:40 AM, David Hagen wrote: >>>>> >>>>>> I'll try to stick with MinGW-w64 for now, but I don't even get to the >>>>>> compilation phase. If I install lapack and blas from conda-forge, it still >>>>>> says that lapack/blas are not found, but you indicated that I need to set >>>>>> some paths. Are there instructions for this? I have no idea what >>>>>> environment variables to set in order to tell Scipy to use these packages. >>>>>> >>>>>> On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov < >>>>>> denis.akhiyarov at gmail.com> wrote: >>>>>> >>>>>>> I still suggest Intel+MSVC compilers, since you can use trial >>>>>>> version or request license for open-source projects from Intel. This is >>>>>>> what Anaconda team is using. 
Also this is what Christoph Gohlke wheels are >>>>>>> based on: >>>>>>> >>>>>>> http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy >>>>>>> >>>>>>> If you end up with m2w64, here is lapack for conda, you may still >>>>>>> have to modify paths: >>>>>>> >>>>>>> https://anaconda.org/conda-forge/lapack >>>>>>> >>>>>>> And blas: >>>>>>> >>>>>>> https://anaconda.org/search?q=Blas >>>>>>> >>>>>>> On Sun, Apr 30, 2017, 5:22 PM Matthieu Brucher < >>>>>>> matthieu.brucher at gmail.com> wrote: >>>>>>> >>>>>>>> Why do you want to pay Intel? You can install the MKL and develop >>>>>>>> with it, no sweat. >>>>>>>> >>>>>>>> 2017-04-30 22:41 GMT+01:00 David Hagen : >>>>>>>> >>>>>>>>> > Welcome to the world of pain with building scientific packages >>>>>>>>> from source on Windows! >>>>>>>>> >>>>>>>>> I am beginning to feel it. >>>>>>>>> >>>>>>>>> > You need Fortran and C/C++ compilers on Windows to build scipy >>>>>>>>> from source >>>>>>>>> >>>>>>>>> I have MinGW-w64 installed, which seems to be the recommended >>>>>>>>> method. >>>>>>>>> >>>>>>>>> > I?m pretty sure that anaconda does not come with the development >>>>>>>>> files for MKL, only the runtime files. >>>>>>>>> >>>>>>>>> I understand now. It looks like MKL is not the way to go unless I >>>>>>>>> want to pay Intel. >>>>>>>>> >>>>>>>>> > If you don't need mkl and lapack/blas is good enough, then >>>>>>>>> m2w64-toolchain from conda should have all necessary dependencies for >>>>>>>>> building scipy. >>>>>>>>> >>>>>>>>> My only goal is to install and use Scipy master somewhere where it >>>>>>>>> won't break my stable installation. I thought Anaconda would be a good >>>>>>>>> place to start because once I activate an Anaconda environment, I should be >>>>>>>>> able to treat like a normal Python installation and follow the normal >>>>>>>>> install-from-source instructions. I went ahead and installed that >>>>>>>>> m2w64-toolchain package, but it still doesn't find any BLAS/LAPACK. 
Maybe I >>>>>>>>> should change my question to: how do I install Scipy on Windows from >>>>>>>>> source? Though when I search for this specifically on the web, the answer >>>>>>>>> seems to be "Don't.". It seems that MinGW-w64 and ATLAS are recommended by >>>>>>>>> Scipy. Do you know of a conda/pip package that provides ATLAS for building >>>>>>>>> Scipy or another more suitable BLAS/LAPACK? >>>>>>>>> >>>>>>>>> _______________________________________________ >>>>>>>>> SciPy-User mailing list >>>>>>>>> SciPy-User at python.org >>>>>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> Information System Engineer, Ph.D. >>>>>>>> Blog: http://blog.audio-tk.com/ >>>>>>>> LinkedIn: http://www.linkedin.com/in/matthieubrucher >>>>>>>> _______________________________________________ >>>>>>>> SciPy-User mailing list >>>>>>>> SciPy-User at python.org >>>>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>>>> >>>>>>> >>>>>>> _______________________________________________ >>>>>>> SciPy-User mailing list >>>>>>> SciPy-User at python.org >>>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>>> >>>>>>> >>>>>> >>>>>> _______________________________________________ >>>>>> SciPy-User mailing list >>>>>> SciPy-User at python.org >>>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>>> >>>>>> >>>>> >>>>> _______________________________________________ >>>>> SciPy-User mailing list >>>>> SciPy-User at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>> >>>>> >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> >> _______________________________________________ >> SciPy-User 
mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.molnar at sbcglobal.net Thu May 4 09:20:53 2017 From: s.molnar at sbcglobal.net (Stephen P. Molnar) Date: Thu, 4 May 2017 09:20:53 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: Message-ID: <590B2AB5.9080505@sbcglobal.net> On 05/04/2017 08:16 AM, David Hagen wrote: > Am I correct that there is no free version of ifort and icc? Will this > break after the 30-trial ends? > > On Thu, May 4, 2017 at 2:58 AM, Denis Akhiyarov > > wrote: > > Ok, I followed instructions on this page: > > https://software.intel.com/en-us/articles/building-numpyscipy-with-intel-mkl-and-intel-fortran-on-windows > > > Installed parallel_studio_xe_2016_update4 with mkl, ifort, and icc. > > 1. I compiled numpy (python setup.py install) from source with this > site.cfg file: > > [mkl] > include_dirs = C:\Program Files > (x86)\IntelSWTools\parallel_studio_xe_2016.4.062\compilers_and_libraries_2016\windows\mkl\include > library_dirs = C:\Program Files > (x86)\IntelSWTools\parallel_studio_xe_2016.4.062\compilers_and_libraries_2016\windows\mkl\lib\intel64;C:\Program > Files > (x86)\IntelSWTools\parallel_studio_xe_2016.4.062\compilers_and_libraries_2016\windows\compiler\lib\intel64 > mkl_libs = mkl_core_dll, mkl_intel_lp64_dll, mkl_intel_thread_dll > lapack_libs = mkl_lapack95_lp64 > > > 2. I replaced site.cfg file in ..\Lib\site-packages\numpy\distutils > with the one above, since this is used by scipy in step 3. > > 3. I compiled scipy (python setup.py install) from source. > > Compilation went without errors, both packages import without errors > in python. 
> > > On Wed, May 3, 2017 at 8:26 PM, David Hagen > wrote: > > So those instructions get me farther than I have gotten before. > > Is this actually necessary?: > > Note: You will need to rebuild Python from source files. This is > due to the downloaded Python binary on Windows might > be incompatible with the Visual Studio version you used. > Otherwise you will encounter runtime crash when run numpy or > script tests. > > This is what I have done so far: > > 1) Download Scipy zip file from Github mater branch > 2) Unzip file > 3) Add a site.cfg file with these contents: > [mkl] > library_dirs = C:\Program Files > (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\windows\mkl\lib\intel64_win;C:\Program > Files > (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\windows\compiler\lib\intel64_win > include_dirs = C:\Program Files > (x86)\IntelSWTools\compilers_and_libraries_2017.2.187\windows\mkl\include > mkl_libs = > mkl_lapack95_lp64,mkl_blas95_lp64,mkl_intel_lp64,mkl_intel_thread,mkl_core,libiomp5md > lapack_libs = > mkl_lapack95_lp64,mkl_blas95_lp64,mkl_intel_lp64,mkl_intel_thread,mkl_core,libiomp5md > > 4) cd into that directory > 5) run: python setup.py config --compiler=intelemw > --fcompiler=intelvem build_clib --compiler=intelemw > --fcompiler=intelvem build_ext --compiler=intelemw > --fcompiler=intelvem install > > It gets really far into the build but this is the error that > results: > > building extension "scipy.spatial.qhull" sources > creating build\src.win-amd64-3.6\scipy\spatial > Could not locate executable icc > Could not locate executable ecc > Traceback (most recent call last): > File "setup.py", line 417, in > setup_package() > File "setup.py", line 413, in setup_package > setup(**metadata) > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\core.py > ", line 166, in setup > return old_setup(**new_attr) > File "C:\Anaconda3\envs\scipy_master\lib\distutils\core.py", > line 148, in setu > p > dist.run_commands() > File 
"C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", > line 955, in run_ > commands > self.run_command(cmd) > File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", > line 974, in run_ > command > cmd_obj.run() > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command > \build_clib.py", line 74, in run > self.run_command('build_src') > File "C:\Anaconda3\envs\scipy_master\lib\distutils\cmd.py", > line 313, in run_c > ommand > self.distribution.run_command(command) > File "C:\Anaconda3\envs\scipy_master\lib\distutils\dist.py", > line 974, in run_ > command > cmd_obj.run() > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command > \build_src.py", line 148, in run > self.build_sources() > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command > \build_src.py", line 165, in build_sources > self.build_extension_sources(ext) > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command > \build_src.py", line 324, in build_extension_sources > sources = self.generate_sources(sources, ext) > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command > \build_src.py", line 377, in generate_sources > source = func(extension, build_dir) > File "scipy\spatial\setup.py", line 35, in get_qhull_misc_config > if config_cmd.check_func('open_memstream', decl=True, > call=True): > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command > \config.py", line 312, in check_func > self._check_compiler() > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\command > \config.py", line 51, in _check_compiler > self.compiler.initialize() > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\intelcc > ompiler.py", line 86, in initialize > MSVCCompiler.initialize(self, plat_name) > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\msvc9co > mpiler.py", line 53, in initialize > 
os.environ['lib'] = _merge(environ_lib, os.environ['lib']) > File > "C:\Anaconda3\envs\scipy_master\lib\site-packages\numpy\distutils\msvc9co > mpiler.py", line 32, in _merge > if new in old: > TypeError: argument of type 'NoneType' is not iterable > > I notice that this is calling into Numpy. Do I have modify Numpy > in order to compile Scipy? I skipped the Numpy specific steps in > the instructions because I don't need the development version of > Numpy. > > On Tue, May 2, 2017 at 11:15 PM, Denis Akhiyarov > > > wrote: > > Did you follow these instructions? > > https://software.intel.com/en-us/articles/numpyscipy-with-intel-mkl > > > > On Tue, May 2, 2017, 10:00 PM David Hagen > wrote: > > Is there a recipe for this combination? I installed MKL > from that link and the latest Visual Studio. Scipy did > not find MKL on its own. I'm sure there's some > environment variables that need to be set, but I don't > know what they are. > > On Tue, May 2, 2017 at 4:00 AM, William Heymann > > wrote: > > Intel has made MKL, TBB, and a few other things > completely free to use, even in a commercial > project. Visual Studio is also free unless you are a > very large company. > > https://software.intel.com/en-us/articles/free-mkl > > > I have been using that for other projects without > any problems and compiling with Visual Studio has > been very easy. > > On Tue, May 2, 2017 at 1:40 AM, David Hagen > > wrote: > > I'll try to stick with MinGW-w64 for now, but I > don't even get to the compilation phase. If I > install lapack and blas from conda-forge, it > still says that lapack/blas are not found, but > you indicated that I need to set some paths. Are > there instructions for this? I have no idea what > environment variables to set in order to tell > Scipy to use these packages. > > On Mon, May 1, 2017 at 3:44 AM, Denis Akhiyarov > > wrote: > > I still suggest Intel+MSVC compilers, since > you can use trial version or request license > for open-source projects from Intel. 
This is > what Anaconda team is using. Also this is > what Christoph Gohlke wheels are based on: > > http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy > > > If you end up with m2w64, here is lapack for > conda, you may still have to modify paths: > > https://anaconda.org/conda-forge/lapack > > > And blas: > > https://anaconda.org/search?q=Blas > > > > On Sun, Apr 30, 2017, 5:22 PM Matthieu > Brucher > wrote: > > Why do you want to pay Intel? You can > install the MKL and develop with it, no > sweat. > > 2017-04-30 22:41 GMT+01:00 David Hagen > >: > > > Welcome to the world of pain with building scientific packages from source on Windows! > > I am beginning to feel it. > > > You need Fortran and C/C++ compilers on Windows to build scipy from source > > I have MinGW-w64 installed, which > seems to be the recommended method. > > > I?m pretty sure that anaconda does not come with the development files for MKL, only the runtime files. > > I understand now. It looks like MKL > is not the way to go unless I want > to pay Intel. > > > If you don't need mkl and lapack/blas is good enough, then m2w64-toolchain from conda should have all necessary dependencies for building scipy. > > My only goal is to install and use > Scipy master somewhere where it > won't break my stable installation. > I thought Anaconda would be a good > place to start because once I > activate an Anaconda environment, I > should be able to treat like a > normal Python installation and > follow the normal > install-from-source instructions. I > went ahead and installed that > m2w64-toolchain package, but it > still doesn't find any BLAS/LAPACK. > Maybe I should change my question > to: how do I install Scipy on > Windows from source? Though when I > search for this specifically on the > web, the answer seems to be > "Don't.". It seems that MinGW-w64 > and ATLAS are recommended by Scipy. > Do you know of a conda/pip package > that provides ATLAS for building > Scipy or another more suitable > BLAS/LAPACK? 
> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > > https://mail.python.org/mailman/listinfo/scipy-user > > -- > Information System Engineer, Ph.D. > Blog: http://blog.audio-tk.com/ > LinkedIn: > http://www.linkedin.com/in/matthieubrucher > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > > https://mail.python.org/mailman/listinfo/scipy-user No, you are not correct about ifort. It's still available. You just won't receive support. -- Stephen P. Molnar, Ph.D.
Life is a fuzzy set www.molecular-modeling.net Stochastic and multivariate (614)312-7528 (c) Skype: smolnar1 From david at drhagen.com Thu May 4 10:54:41 2017 From: david at drhagen.com (David Hagen) Date: Thu, 4 May 2017 10:54:41 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: <590B2AB5.9080505@sbcglobal.net> References: <590B2AB5.9080505@sbcglobal.net> Message-ID: > No, you are not correct about ifort. It's wstill available. ayou just won't receive support. I must be unable to Google correctly. I can find free MKL for everyone. I can find free ifort and icc for students. I can find free ifort and icc for Linux for open source developers. But I can't find free ifort and icc for Windows for open source developers. This is what I am looking at: https://software.intel.com/en-us/articles/free-ipsxe-tools-and-libraries -------------- next part -------------- An HTML attachment was scrubbed... URL: From denis.akhiyarov at gmail.com Thu May 4 13:04:44 2017 From: denis.akhiyarov at gmail.com (Denis Akhiyarov) Date: Thu, 4 May 2017 12:04:44 -0500 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> Message-ID: You can sign-up for beta program which includes Windows ifort compiler. https://software.intel.com/en-us/articles/intel-parallel-studio-xe-2018-beta "The beta program officially ends *June 22nd, 2017*. The beta license provided will expire October 12, 2017." On Thu, May 4, 2017 at 9:54 AM, David Hagen wrote: > > No, you are not correct about ifort. It's wstill available. ayou just > won't receive support. > > I must be unable to Google correctly. I can find free MKL for everyone. I > can find free ifort and icc for students. I can find free ifort and icc for > Linux for open source developers. But I can't find free ifort and icc for > Windows for open source developers. 
This is what I am looking at: > https://software.intel.com/en-us/articles/free-ipsxe-tools-and-libraries > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.molnar at sbcglobal.net Thu May 4 13:17:58 2017 From: s.molnar at sbcglobal.net (Stephen P. Molnar) Date: Thu, 4 May 2017 13:17:58 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> Message-ID: <590B6246.6090607@sbcglobal.net> On 05/04/2017 10:54 AM, David Hagen wrote: > > No, you are not correct about ifort. It's still available. You > just won't receive support. > > I must be unable to Google correctly. I can find free MKL for everyone. > I can find free ifort and icc for students. I can find free ifort and > icc for Linux for open source developers. But I can't find free ifort > and icc for Windows for open source developers. This is what I am > looking at: > https://software.intel.com/en-us/articles/free-ipsxe-tools-and-libraries > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > Check out: https://software.intel.com/en-us/free-software -- Stephen P. Molnar, Ph.D. Life is a fuzzy set www.molecular-modeling.net Stochastic and multivariate (614)312-7528 (c) Skype: smolnar1 From noflaco at gmail.com Fri May 5 19:18:02 2017 From: noflaco at gmail.com (Carlton Banks) Date: Sat, 6 May 2017 01:18:02 +0200 Subject: [SciPy-User] multiple cores Message-ID: <8E823DFF-2412-4E9D-A8D5-B3D4B910D755@gmail.com> Anyone aware of whether SciPy uses multiple cores? If not, is there a possibility of this? -------------- next part -------------- An HTML attachment was scrubbed...
URL: From eulergaussriemann at gmail.com Fri May 5 20:27:58 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Fri, 5 May 2017 17:27:58 -0700 Subject: [SciPy-User] multiple cores In-Reply-To: <8E823DFF-2412-4E9D-A8D5-B3D4B910D755@gmail.com> References: <8E823DFF-2412-4E9D-A8D5-B3D4B910D755@gmail.com> Message-ID: Hi, Carlton. This is not an area of expertise of mine, but I took the liberty of Googling the topic and here's the top hit: http://scipy.github.io/old-wiki/pages/ParallelProgramming. The top line is: "This is an archival dump of old wiki content --- see scipy.org for current material," so I clicked on the link to go to the scipy home page and repeated my search there...and the top hit was the same page. :-( Scrolling down in these search results, I see some more-or-less relevant links, but nothing general, so I started reading the "old" page. These are the highlights, IMO: "numpy/scipy are not perfect in this area, but there are some things you can do." "The best way to make use of a parallel processing system depend on the task you're doing and on the parallel system you're using." (I regard this as particularly pertinent, absent knowledge of your system, as one thing I do know is that Windows is, or at least was--I stopped using Windoze with the introduction of 8--particularly "protective" of the allocation of processors; I'm less familiar with the "friendliness" of Posix flavors, including Linux and Mac.) And finally: ""premature optimization is the root of all evil "...Get your code working first, before even thinking about parallelization. Then ask yourself whether your code actually needs to be any faster. Don't embark on the bug-strewn path of parallelization unless you have to." Wiser words are rarely spoken. 
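For what it's worth, the simplest situation, the "embarrassingly parallel" one where you have many independent scipy calls and no shared state, needs no special support from scipy at all: the standard library's multiprocessing module covers it. A minimal sketch (the integrand and the parameter values here are invented purely for illustration):

```python
import numpy as np
from multiprocessing import Pool
from scipy.integrate import quad

def integral(a):
    # One independent unit of work: a 1-D integral for one parameter value.
    val, _ = quad(lambda t: np.exp(-a * t ** 2), 0.0, 10.0)
    return val

params = [0.5, 1.0, 2.0, 4.0]
# On Windows (and anywhere the "spawn" start method is used) this needs
# to sit under an `if __name__ == "__main__":` guard.
with Pool(processes=2) as pool:
    results = pool.map(integral, params)

# For reference: each integral is very nearly 0.5*sqrt(pi/a), since the
# integrand is negligible beyond t = 10.
exact = [0.5 * np.sqrt(np.pi / a) for a in params]
```

Anything fancier (shared arrays, parallel linear algebra, and so on) is where the caveats on that page start to bite.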
Nevertheless, though billed as old, that document _appears_ to be the most current, general document on parallel programming w/ scipy, so it appears to be the place to start (and it _does_ have specific suggestions on how and when to parallelize scipy code). I'd hope that more expert voices will chime in, but _sometimes_ that takes a little while, so I wanted to give you something to mull in the meantime. FWIW, DLG On Fri, May 5, 2017 at 4:18 PM, Carlton Banks wrote: > Anyone are of whether SciPy > use > multiple cores? > > if not is there a possibility of this? > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Sat May 6 10:24:26 2017 From: david at drhagen.com (David Hagen) Date: Sat, 6 May 2017 10:24:26 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: I noticed that I had two versions of Visual Studio installed: Microsoft Visual C++ 14.0 and Microsoft Visual Studio 2017. I thought that this might be confusing the installer, so I uninstalled the older Microsoft Visual C++ 14.0. However, the Numpy installation now gives a new error: error: Microsoft Visual C++ 14.0 is required. Get it with "Microsoft Visual C++ Build Tools": http://landinghub.visualstudio.com/visual-cpp-build-tools This seems to make it clear that uninstalling was a mistake, but the Numpy build instructions say to specifically get the latest version of Visual Studio. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david at drhagen.com Sat May 6 10:09:02 2017 From: david at drhagen.com (David Hagen) Date: Sat, 6 May 2017 10:09:02 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: <590B6246.6090607@sbcglobal.net> References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: I am following Stephen's instructions with the Beta distribution of Intel Parallel Studio. The compilation of Numpy fails with this error:

LINK : fatal error LNK1158: cannot run 'rc.exe'

It also has this message during setup:

customize IntelVisualFCompiler
Could not locate executable ifort
Could not locate executable ifl

and this:

Found executable C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe

Do I need to configure Numpy to use the Parallel Studio compilers rather than whatever else I may have installed? -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Sat May 6 11:23:20 2017 From: david at drhagen.com (David Hagen) Date: Sat, 6 May 2017 11:23:20 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: I reinstalled Microsoft Visual C++ 14.0. Then following various advice on the web, I copied rc.exe and rcdll.exe from C:\Program Files (x86)\Windows Kits\8.1\bin\x64 into C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\x86_amd64 It seems that MSVC++ 14.0 and that weird file copying is necessary for compiling Numpy. I can now install Numpy from source using python setup.py install. I then copied the site.cfg into every numpy/distutils directory I could find.
However, compiling Scipy gives a LNK2001 linking error with some _gfortran symbols:

mach.lib(d1mach.o) : error LNK2001: unresolved external symbol _gfortran_st_write
mach.lib(d1mach.o) : error LNK2001: unresolved external symbol _gfortran_st_write_done
mach.lib(d1mach.o) : error LNK2001: unresolved external symbol _gfortran_stop_numeric_f08
mach.lib(d1mach.o) : error LNK2001: unresolved external symbol _gfortran_transfer_character_write
mach.lib(d1mach.o) : error LNK2001: unresolved external symbol _gfortran_transfer_integer_write
mach.lib(d1mach.o) : error LNK2001: unresolved external symbol _gfortran_stop_string

-------------- next part -------------- An HTML attachment was scrubbed... URL: From eulergaussriemann at gmail.com Sat May 6 11:25:46 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Sat, 06 May 2017 15:25:46 +0000 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: Hi, David. You (re-?)learned a frustrating lesson: when testing the hypothesis that an installation is "in the way," before/instead of uninstalling, rename/move. Now that I've moralized, with evidence of a "bug in the doc", my belief is that you will be asked to file a ticket (doc bugs are considered just as legitimate as code bugs). The only exception to this I can see is if someone else chimes in with a reasonable explanation as to why that's a "feature," not a bug, but in this instance, given what you say, that doesn't seem likely. DLG On Sat, May 6, 2017 at 7:50 AM David Hagen wrote: > I noticed that I had two versions of Visual Studio installed: Microsoft > Visual C++ 14.0 and Microsoft Visual Studio 2017. I thought that this might > be confusing the installer, so I uninstalled the older Microsoft Visual C++ > 14.0. However, the Numpy installation now gives a new error: > > error: Microsoft Visual C++ 14.0 is required.
Get it with "Microsoft > Visual C++ > Build Tools": http://landinghub.visualstudio.com/visual-cpp-build-tools > > This seems to make it clear that uninstalling was a mistake, but the Numpy > build instructions > say to specifically get the latest version of Visual Studio. > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jaredvacanti at gmail.com Sun May 7 14:45:17 2017 From: jaredvacanti at gmail.com (Jared Vacanti) Date: Sun, 7 May 2017 13:45:17 -0500 Subject: [SciPy-User] Fitting Polynomial With Shape Restrictions Message-ID: I am trying to fit a polynomial to observational data with shape restrictions - in this particular case monotonicity (decreasing) of the function and an always positive second derivative. Some of the interpolation classes have a mathematical "built-in" restriction - like scipy.interpolate.Rbf's thin-plate roughness penalty imposes some restrictions, but it's not explicit or adjustable. What are my options for imposing boundary conditions or shape restrictions on the spline? I have sample data here: import pandas as pd df = pd.read_csv("https://bpaste.net/raw/3e20878b5237") or available independently at https://bpaste.net/raw/3e20878b5237 I have tried using a interior point convex optimization solver, but the results seem to be numerically finicky. Are there other alternatives? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From josef.pktd at gmail.com Sun May 7 15:02:22 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 7 May 2017 15:02:22 -0400 Subject: [SciPy-User] Fitting Polynomial With Shape Restrictions In-Reply-To: References: Message-ID: On Sun, May 7, 2017 at 2:45 PM, Jared Vacanti wrote: > I am trying to fit a polynomial to observational data with shape > restrictions - in this particular case monotonicity (decreasing) of the > function and an always positive second derivative. > > Some of the interpolation classes have a mathematical "built-in" > restriction - like scipy.interpolate.Rbf's thin-plate roughness penalty > imposes some restrictions, but it's not explicit or adjustable. > > What are my options for imposing boundary conditions or shape restrictions > on the spline? > > I have sample data here: > > import pandas as pd > df = pd.read_csv("https://bpaste.net/raw/3e20878b5237") > > or available independently at https://bpaste.net/raw/3e20878b5237 > > I have tried using a interior point convex optimization solver, but the > results seem to be numerically finicky. Are there other alternatives? > As far as I know, pchip is the only one with monotonicity constraints https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.pchip_interpolate.html Josef > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jaredvacanti at gmail.com Sun May 7 15:12:32 2017 From: jaredvacanti at gmail.com (Jared Vacanti) Date: Sun, 7 May 2017 14:12:32 -0500 Subject: [SciPy-User] Fitting Polynomial With Shape Restrictions In-Reply-To: References: Message-ID: Are there other techniques of fitting splines (perhaps using scipy.optimize ?) where I can impose boundary conditions myself, even outside of the options provided in scipy.interpolate? 
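To make the pchip suggestion above concrete: PchipInterpolator (the class interface behind pchip_interpolate) builds a C1 interpolant that preserves the monotonicity of the data, so it never overshoots, though it interpolates rather than smooths and makes no promise about the second derivative. A minimal sketch, with toy decreasing data standing in for the real observations:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Toy strictly decreasing data, a stand-in for the real observations.
x = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
y = np.array([10.0, 7.0, 5.5, 3.0, 1.0])

f = PchipInterpolator(x, y)
xs = np.linspace(0.0, 4.0, 401)
ys = f(xs)

# Because the input is decreasing, the PCHIP interpolant is decreasing
# everywhere, not just at the given points.
assert np.all(np.diff(ys) <= 1e-12)
```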
On Sun, May 7, 2017 at 2:02 PM, wrote: > > > On Sun, May 7, 2017 at 2:45 PM, Jared Vacanti > wrote: > >> I am trying to fit a polynomial to observational data with shape >> restrictions - in this particular case monotonicity (decreasing) of the >> function and an always positive second derivative. >> >> Some of the interpolation classes have a mathematical "built-in" >> restriction - like scipy.interpolate.Rbf's thin-plate roughness penalty >> imposes some restrictions, but it's not explicit or adjustable. >> >> What are my options for imposing boundary conditions or shape >> restrictions on the spline? >> >> I have sample data here: >> >> import pandas as pd >> df = pd.read_csv("https://bpaste.net/raw/3e20878b5237") >> >> or available independently at https://bpaste.net/raw/3e20878b5237 >> >> I have tried using a interior point convex optimization solver, but the >> results seem to be numerically finicky. Are there other alternatives? >> > > As far as I know, pchip is the only one with monotonicity constraints > https://docs.scipy.org/doc/scipy/reference/generated/ > scipy.interpolate.pchip_interpolate.html > > Josef > > > >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Sun May 7 15:34:12 2017 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Sun, 7 May 2017 22:34:12 +0300 Subject: [SciPy-User] Fitting Polynomial With Shape Restrictions In-Reply-To: References: Message-ID: On Sun, May 7, 2017 at 10:12 PM, Jared Vacanti wrote: > Are there other techniques of fitting splines (perhaps using scipy.optimize > ?) 
where I can impose boundary conditions myself, even outside of the > options provided in scipy.interpolate? FITPACK has some spline fitting with convexity constraints, but I don't think it's exposed to python: https://github.com/scipy/scipy/blob/master/scipy/interpolate/fitpack/concon.f https://github.com/scipy/scipy/blob/master/scipy/interpolate/fitpack/cocosp.f Hyman et al have some discussion of constructing local interpolants with convexity constraints: http://math.lanl.gov/~mac/papers/numerics/DEH89.pdf and references therein. > On Sun, May 7, 2017 at 2:02 PM, wrote: >> >> >> >> On Sun, May 7, 2017 at 2:45 PM, Jared Vacanti >> wrote: >>> >>> I am trying to fit a polynomial to observational data with shape >>> restrictions - in this particular case monotonicity (decreasing) of the >>> function and an always positive second derivative. >>> >>> Some of the interpolation classes have a mathematical "built-in" >>> restriction - like scipy.interpolate.Rbf's thin-plate roughness penalty >>> imposes some restrictions, but it's not explicit or adjustable. >>> >>> What are my options for imposing boundary conditions or shape >>> restrictions on the spline? >>> >>> I have sample data here: >>> >>> import pandas as pd >>> df = pd.read_csv("https://bpaste.net/raw/3e20878b5237") >>> >>> or available independently at https://bpaste.net/raw/3e20878b5237 >>> >>> I have tried using a interior point convex optimization solver, but the >>> results seem to be numerically finicky. Are there other alternatives? 
>> >> As far as I know, pchip is the only one with monotonicity constraints >> https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.pchip_interpolate.html >> >> Josef >> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > From eulergaussriemann at gmail.com Sun May 7 16:01:51 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Sun, 07 May 2017 20:01:51 +0000 Subject: [SciPy-User] Fitting Polynomial With Shape Restrictions In-Reply-To: References: Message-ID: I'd suggest doing some advance pencil-and-paper work to determine what your constraints imply about the relationships among the coefficients. For example, you mention that the second derivative should always be positive: you can readily formalize this by writing down the second derivative of your model, adding a slack variable, which you constrain to be strictly negative, and then setting that second-derivative-plus-slack expression equal to zero: voilà, an additional equation--which can be added to the matrix representation of your minimization problem--to rigorously represent your second derivative constraint! The monotonicity requirement can be treated similarly, as a limitation on the first derivative, though it may require introduction of additional parameters, perhaps in the form of bounds on the interval over which you plan to use the model. But in any event, with a little pre-programming effort, you should be able to readily solve your problem without having to resort to any programming "tricks" or "workarounds."
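That pencil-and-paper recipe can be sketched with scipy.optimize, which accepts the inequality form directly (SLSQP keeps the slack variables implicit). Everything below is illustrative: the data are synthetic stand-ins for real observations, and the derivative conditions are imposed on a dense grid of sample points rather than symbolically:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic stand-in for the observations (decreasing, convex, noisy).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.exp(-2.0 * x) + 0.01 * rng.standard_normal(x.size)

deg = 4
A = np.vander(x, deg + 1)            # design matrix: columns x^4 ... x^0

def sum_sq(c):
    r = A @ c - y
    return r @ r

# Impose the shape conditions on a dense grid of sample points:
# p'(g) <= 0 (monotone decreasing) and p''(g) >= 0 (convex).
g = np.linspace(x.min(), x.max(), 200)
cons = [
    {"type": "ineq", "fun": lambda c: -np.polyval(np.polyder(c, 1), g)},
    {"type": "ineq", "fun": lambda c: np.polyval(np.polyder(c, 2), g)},
]

c0 = np.polyfit(x, y, deg)           # unconstrained fit as the starting point
res = minimize(sum_sq, c0, constraints=cons)   # SLSQP is chosen automatically
p1 = np.polyval(np.polyder(res.x, 1), g)       # fitted first derivative
p2 = np.polyval(np.polyder(res.x, 2), g)       # fitted second derivative
```

The grid version only enforces the constraints at the sampled points; a denser grid, or a symbolic argument about where p' and p'' can change sign, tightens the guarantee.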
(I was once charged with writing a program to explore the nine-dimensional parameter space of eighth-order polynomials to look for sub-spaces with particular properties; too late was I informed that there were additional "pre-constraints," e.g., monotonicity such as you describe, and strict positiveness of the polynomial output--that I could have used to significantly reduce both the dimensionality of and the ranges in the space I needed to search. This was in the area of plasma confinement simulation: may I ask, in what area you are working?) HTH, DLG On Sun, May 7, 2017 at 12:13 PM Jared Vacanti wrote: > Are there other techniques of fitting splines (perhaps using > scipy.optimize ?) where I can impose boundary conditions myself, even > outside of the options provided in scipy.interpolate? > > On Sun, May 7, 2017 at 2:02 PM, wrote: > >> >> >> On Sun, May 7, 2017 at 2:45 PM, Jared Vacanti >> wrote: >> >>> I am trying to fit a polynomial to observational data with shape >>> restrictions - in this particular case monotonicity (decreasing) of the >>> function and an always positive second derivative. >>> >>> Some of the interpolation classes have a mathematical "built-in" >>> restriction - like scipy.interpolate.Rbf's thin-plate roughness penalty >>> imposes some restrictions, but it's not explicit or adjustable. >>> >>> What are my options for imposing boundary conditions or shape >>> restrictions on the spline? >>> >>> I have sample data here: >>> >>> import pandas as pd >>> df = pd.read_csv("https://bpaste.net/raw/3e20878b5237") >>> >>> or available independently at https://bpaste.net/raw/3e20878b5237 >>> >>> I have tried using an interior point convex optimization solver, but the >>> results seem to be numerically finicky. Are there other alternatives?
>>> >> >> As far as I know, pchip is the only one with monotonicity constraints >> https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.pchip_interpolate.html >> >> Josef >> >> >> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sun May 7 16:36:46 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 7 May 2017 16:36:46 -0400 Subject: [SciPy-User] Fitting Polynomial With Shape Restrictions In-Reply-To: References: Message-ID: On Sun, May 7, 2017 at 3:34 PM, Evgeni Burovski wrote: > On Sun, May 7, 2017 at 10:12 PM, Jared Vacanti > wrote: > > Are there other techniques of fitting splines (perhaps using > scipy.optimize > > ?) where I can impose boundary conditions myself, even outside of the > > options provided in scipy.interpolate? > > FITPACK has some spline fitting with convexity constraints, but I > don't think it's exposed to python: > > https://github.com/scipy/scipy/blob/master/scipy/interpolate/fitpack/concon.f > https://github.com/scipy/scipy/blob/master/scipy/interpolate/fitpack/cocosp.f > > Hyman et al have some discussion of constructing local interpolants > with convexity constraints: > http://math.lanl.gov/~mac/papers/numerics/DEH89.pdf > and references therein. > > I looked into monotonicity and similar constraints for splines several years ago. 
I gave up because adding all the constraints that are implied on the coefficients (values and derivatives at the knots) looked difficult, and it is difficult to get the relevant parameters directly from the scipy splines. AFAIR and I was hoping for somebody to come along with more background and more incentive than I had. Josef > > > > On Sun, May 7, 2017 at 2:02 PM, wrote: > >> > >> > >> > >> On Sun, May 7, 2017 at 2:45 PM, Jared Vacanti > >> wrote: > >>> > >>> I am trying to fit a polynomial to observational data with shape > >>> restrictions - in this particular case monotonicity (decreasing) of the > >>> function and an always positive second derivative. > >>> > >>> Some of the interpolation classes have a mathematical "built-in" > >>> restriction - like scipy.interpolate.Rbf's thin-plate roughness penalty > >>> imposes some restrictions, but it's not explicit or adjustable. > >>> > >>> What are my options for imposing boundary conditions or shape > >>> restrictions on the spline? > >>> > >>> I have sample data here: > >>> > >>> import pandas as pd > >>> df = pd.read_csv("https://bpaste.net/raw/3e20878b5237") > >>> > >>> or available independently at https://bpaste.net/raw/3e20878b5237 > >>> > >>> I have tried using a interior point convex optimization solver, but the > >>> results seem to be numerically finicky. Are there other alternatives? 
> >> > >> As far as I know, pchip is the only one with monotonicity constraints > >> https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.pchip_interpolate.html > >> > >> Josef > >> > >> > >>> > >>> > >>> _______________________________________________ > >>> SciPy-User mailing list > >>> SciPy-User at python.org > >>> https://mail.python.org/mailman/listinfo/scipy-user > >>> > >> > >> > >> _______________________________________________ > >> SciPy-User mailing list > >> SciPy-User at python.org > >> https://mail.python.org/mailman/listinfo/scipy-user > >> > > > > > > _______________________________________________ > > SciPy-User mailing list > > SciPy-User at python.org > > https://mail.python.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From eulergaussriemann at gmail.com Sun May 7 16:52:51 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Sun, 07 May 2017 20:52:51 +0000 Subject: [SciPy-User] Fitting Polynomial With Shape Restrictions In-Reply-To: References: Message-ID: Hi, Josef. If you mean having generic code that automates the process, then you are probably correct: there are just too many degrees of freedom to account for all possibilities in a generic program (at least that's the way it appears to me, too). But using known math to work out, "by hand," these implications in specific cases prior to using the generic code tools we do have, largely thanks to you, is eminently "do-able." Thanks for all you've done for scipy, lo these many years! 
DLG On Sun, May 7, 2017 at 1:37 PM wrote: > On Sun, May 7, 2017 at 3:34 PM, Evgeni Burovski < > evgeny.burovskiy at gmail.com> wrote: > >> On Sun, May 7, 2017 at 10:12 PM, Jared Vacanti >> wrote: >> > Are there other techniques of fitting splines (perhaps using >> scipy.optimize >> > ?) where I can impose boundary conditions myself, even outside of the >> > options provided in scipy.interpolate? >> >> FITPACK has some spline fitting with convexity constraints, but I >> don't think it's exposed to python: >> >> >> https://github.com/scipy/scipy/blob/master/scipy/interpolate/fitpack/concon.f >> >> https://github.com/scipy/scipy/blob/master/scipy/interpolate/fitpack/cocosp.f >> >> Hyman et al have some discussion of constructing local interpolants >> with convexity constraints: >> http://math.lanl.gov/~mac/papers/numerics/DEH89.pdf >> and references therein. >> >> > I looked into monotonicity and similar constraints for splines several > years ago. > I gave up because adding all the constraints that are implied on the > coefficients (values and derivatives at the knots) looked difficult, and it > is difficult to get the relevant parameters directly from the scipy splines. > > AFAIR > and I was hoping for somebody to come along with more background and more > incentive than I had. > > Josef > > > >> >> >> > On Sun, May 7, 2017 at 2:02 PM, wrote: >> >> >> >> >> >> >> >> On Sun, May 7, 2017 at 2:45 PM, Jared Vacanti >> >> wrote: >> >>> >> >>> I am trying to fit a polynomial to observational data with shape >> >>> restrictions - in this particular case monotonicity (decreasing) of >> the >> >>> function and an always positive second derivative. >> >>> >> >>> Some of the interpolation classes have a mathematical "built-in" >> >>> restriction - like scipy.interpolate.Rbf's thin-plate roughness >> penalty >> >>> imposes some restrictions, but it's not explicit or adjustable. 
>> >>> >> >>> What are my options for imposing boundary conditions or shape >> >>> restrictions on the spline? >> >>> >> >>> I have sample data here: >> >>> >> >>> import pandas as pd >> >>> df = pd.read_csv("https://bpaste.net/raw/3e20878b5237") >> >>> >> >>> or available independently at https://bpaste.net/raw/3e20878b5237 >> >>> >> >>> I have tried using a interior point convex optimization solver, but >> the >> >>> results seem to be numerically finicky. Are there other alternatives? >> >> >> >> >> >> As far as I know, pchip is the only one with monotonicity constraints >> >> >> https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.pchip_interpolate.html >> >> >> >> Josef >> >> >> >> >> >>> >> >>> >> >>> _______________________________________________ >> >>> SciPy-User mailing list >> >>> SciPy-User at python.org >> >>> https://mail.python.org/mailman/listinfo/scipy-user >> >>> >> >> >> >> >> >> _______________________________________________ >> >> SciPy-User mailing list >> >> SciPy-User at python.org >> >> https://mail.python.org/mailman/listinfo/scipy-user >> >> >> > >> > >> > _______________________________________________ >> > SciPy-User mailing list >> > SciPy-User at python.org >> > https://mail.python.org/mailman/listinfo/scipy-user >> > >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From denis.akhiyarov at gmail.com Tue May 9 00:25:58 2017 From: denis.akhiyarov at gmail.com (Denis Akhiyarov) Date: Mon, 8 May 2017 23:25:58 -0500 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: 1. The documentation definitely needs improvements. The difference here is that CPython requirements for MSVC compilers from Visual Studio are not the same as ifort requirements for Visual Studio. For example MSVC compiler for Python 2.7 (aka.ms/vcpython27) is likely not enough to install ifort. So 2.7 - MSVC 2008, 3.4 - MSVC 2010, 3.5-3.6 MSVC 2015/2017: https://wiki.python.org/moin/WindowsCompilers Here are ifort requirements for VS: https://software.intel.com/en-us/articles/troubleshooting-fortran-integration-issues-with-visual-studio 2. VS 2017 is supported by setuptools 34.4.0+ 3. Anyway I followed Intel article about scipy and ifort using VS 2015 with C++ compiler installed. In python 3.5, 64-bit both scipy and numpy compile without any issues even with beta 2018 ifort compiler. Don't forget to use the command prompt provided by Intel, Step 3 - Configuration: https://software.intel.com/en-us/articles/building-numpyscipy-with-intel-mkl-and-intel-fortran-on-windows On Sat, May 6, 2017 at 10:25 AM, David Goldsmith < eulergaussriemann at gmail.com> wrote: > Hi, David. You (re-?)learned a frustrating lesson: when testing the > hypothesis that an installation is "in the way," before/instead of > uninstalling, rename/move. > > Now that I've moralized, with evidence of a "bug in the doc", my belief is > that you will be asked to file a ticket (doc bugs are considered just as > legitimate as code bugs). The only exception to this I can see is if > someone else chimes in with a reasonable explanation as to why that's a > "feature," not a bug, but in this instance, given what you say, that > doesn't seem likely. 
> > DLG > > On Sat, May 6, 2017 at 7:50 AM David Hagen wrote: > >> I noticed that I had two versions of Visual Studio installed: Microsoft >> Visual C++ 14.0 and Microsoft Visual Studio 2017. I thought that this might >> be confusing the installer, so I uninstalled the older Microsoft Visual C++ >> 14.0. However, the Numpy installation now gives a new error: >> >> error: Microsoft Visual C++ 14.0 is required. Get it with "Microsoft >> Visual C++ >> Build Tools": http://landinghub.visualstudio.com/visual-cpp-build-tools >> >> This seems to make it clear that uninstalling was a mistake, but the Numpy >> build instructions >> say to specifically get the latest version of Visual Studio. >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Tue May 9 06:44:57 2017 From: david at drhagen.com (David Hagen) Date: Tue, 9 May 2017 06:44:57 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: > The difference here is that CPython requirements for MSVC compilers from Visual Studio are not the same as ifort requirements for Visual Studio. I currently have Python 3.6.1. I have both Visual Studio C++ 14.0 (I think this is the same thing as VS2015) and Visual Studio 2017 installed. I have Parallel Studio XE 2018 Beta installed. I am not sure if that is all compatible because 2018 Beta isn't on there. Should I uninstall VS 2017? > VS 2017 is supported by setuptools 34.4.0+ Should I upgrade from 27.2.0, which for whatever reason is the version that conda installs with Python 3.6.1? 
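For reference, one quick way to see which setuptools version a given environment actually uses (a generic sketch; run it inside the activated conda environment, and treat the upgrade command as an example rather than a prescribed fix):

```shell
# Print the setuptools version seen by the current Python interpreter.
python -c "import setuptools; print(setuptools.__version__)"

# If it reports something older than 34.4.0 (needed for VS 2017 support),
# it can be upgraded in place, e.g.:
#   pip install --upgrade "setuptools>=34.4.0"
```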
> Anyway I followed Intel article about scipy and ifort using VS 2015 with C++ compiler installed. I am able to run plain python setup.py install on the latest version of Numpy and it installs fine. (Doing the same with Scipy errors out with missing symbols described previously.) However, running the install command given for Intel64 in the link fails. I followed everything except for recompiling Python. It gets to here: running config running build_clib running build_src build_src building py_modules sources building library "npymath" sources Could not locate executable icc Could not locate executable ecc And then throws a NoneType is not iterable exception: ... File "c:\numpy-1.12.1\numpy\distutils\msvc9compiler.py", line 53, in initialize os.environ['lib'] = _merge(environ_lib, os.environ['lib']) File "c:\numpy-1.12.1\numpy\distutils\msvc9compiler.py", line 32, in _merge if new in old: TypeError: argument of type 'NoneType' is not iterable This happens both with and without modifying the compile options. It looks like it can't find icc, but without a real error, it is difficult to tell if that was a notice, warning, or error. It installs with just python setup.py install. But if I do that, is that the reason I get symbol errors trying to install Scipy? Is it just managing to muddle through with one of my other compilers but not be linkable by Scipy? I have Intel C++ Compiler installed as part of the Parallel Studio install. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From robert.pollak at jku.at Wed May 10 03:14:42 2017 From: robert.pollak at jku.at (Robert Pollak) Date: Wed, 10 May 2017 09:14:42 +0200 Subject: [SciPy-User] EuroSciPy 2017 In-Reply-To: <20170208180203.GJ14363@pi-x230> References: <0a08235d-eb73-56ee-b937-df840d4b4176@jku.at> <20170208180203.GJ14363@pi-x230> Message-ID: <8a4db233-6ada-77c7-985c-9104e24f51c7@jku.at> Dear EuroSciPy organizers, as it's May already and I don't see anything announced, I would like to ask again about (the date of) this year's conference. Best regards, Robert On 2017-02-08 19:02, Pierre de Buyl wrote to scipy-user: > Dear Robert, > > The official contact information for EuroSciPy is euroscipy-org at python.org or > https://twitter.com/EuroSciPy > > The dates are not set yet as we are trying to arrange for the venue (the > previous venue could not be taken at our typical dates). > > The official announcement will be posted on our Twitter account and also on the > NumPy and SciPy lists. > > Best regards, > > Pierre > > > On Tue, Feb 07, 2017 at 04:28:01PM +0100, Robert Pollak wrote to scipy-user: >> Dear EuroSciPy team, >> >> I have noticed that EuroSciPy 2017 in Erlangen is announced as >> "upcoming" on https://conference.scipy.org/ . Is there already a date fixed? >> >> Best regards, >> Robert From charlesr.harris at gmail.com Wed May 10 21:48:34 2017 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 10 May 2017 19:48:34 -0600 Subject: [SciPy-User] NumPy v1.13.0rc1 released. Message-ID: Hi All, I'm pleased to announce the NumPy 1.13.0rc1 release. This release supports Python 2.7 and 3.4-3.6 and contains many new features. It is one of the most ambitious releases in the last several years. Some of the highlights and new functions are *Highlights* - Operations like ``a + b + c`` will reuse temporaries on some platforms, resulting in less memory use and faster execution. - Inplace operations check if inputs overlap outputs and create temporaries to avoid problems. 
- New __array_ufunc__ attribute provides improved ability for classes to override default ufunc behavior. - New np.block function for creating blocked arrays. *New functions* - New ``np.positive`` ufunc. - New ``np.divmod`` ufunc provides more efficient divmod. - New ``np.isnat`` ufunc tests for NaT special values. - New ``np.heaviside`` ufunc computes the Heaviside function. - New ``np.isin`` function, improves on ``in1d``. - New ``np.block`` function for creating blocked arrays. - New ``PyArray_MapIterArrayCopyIfOverlap`` added to NumPy C-API. Wheels for the pre-release are available on PyPI. Source tarballs, zipfiles, release notes, and the Changelog are available on github. A total of 100 people contributed to this release. People with a "+" by their names contributed a patch for the first time. - A. Jesse Jiryu Davis + - Alessandro Pietro Bardelli + - Alex Rothberg + - Alexander Shadchin - Allan Haldane - Andres Guzman-Ballen + - Antoine Pitrou - Antony Lee - B R S Recht + - Baurzhan Muftakhidinov + - Ben Rowland - Benda Xu + - Blake Griffith - Bradley Wogsland + - Brandon Carter + - CJ Carey - Charles Harris - Danny Hermes + - Duke Vijitbenjaronk + - Egor Klenin + - Elliott Forney + - Elliott M Forney + - Endolith - Eric Wieser - Erik M. Bray - Eugene + - Evan Limanto + - Felix Berkenkamp + - François Bissey + - Frederic Bastien - Greg Young - Gregory R. Lee - Importance of Being Ernest + - Jaime Fernandez - Jakub Wilk + - James Cowgill + - James Sanders - Jean Utke + - Jesse Thoren + - Jim Crist + - Joerg Behrmann + - John Kirkham - Jonathan Helmus - Jonathan L Long - Jonathan Tammo Siebert + - Joseph Fox-Rabinovitz - Joshua Loyal + - Juan Nunez-Iglesias + - Julian Taylor - Kirill Balunov + - Likhith Chitneni + - Loïc Estève - Mads Ohm Larsen - Marein Könings + - Marten van Kerkwijk - Martin Thoma - Martino Sorbaro + - Marvin Schmidt + - Matthew Brett - Matthias Bussonnier + - Matthias C. M. 
Troffaes + - Matti Picus - Michael Seifert - Mikhail Pak + - Mortada Mehyar - Nathaniel J. Smith - Nick Papior - Oscar Villellas + - Pauli Virtanen - Pavel Potocek - Pete Peeradej Tanruangporn + - Philipp A + - Ralf Gommers - Robert Kern - Roland Kaufmann + - Ronan Lamy - Sami Salonen + - Sanchez Gonzalez Alvaro - Sebastian Berg - Shota Kawabuchi - Simon Gibbons - Stefan Otte - Stefan Peterson + - Stephan Hoyer - Søren Fuglede Jørgensen + - Takuya Akiba - Tom Boyd + - Ville Skyttä + - Warren Weckesser - Wendell Smith - Yu Feng - Zixu Zhao + - Zé Vinícius + - aha66 + - davidjn + - drabach + - drlvk + - jsh9 + - solarjoe + - zengi + Cheers, Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Wed May 10 22:15:56 2017 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 10 May 2017 19:15:56 -0700 Subject: [SciPy-User] [Numpy-discussion] NumPy v1.13.0rc1 released. In-Reply-To: References: Message-ID: On Wed, May 10, 2017 at 7:06 PM, Nathan Goldbaum wrote: > Hi Chuck, > > Is there a docs build for this release somewhere? I'd like to find an > authoritative reference about __array_ufunc__, which I'd hesitated on > looking into until now for fear about the API changing. A sort-of-rendered version of the end-user docs can be seen here: https://github.com/numpy/numpy/blob/master/doc/source/reference/arrays.classes.rst And the NEP has been updated to hopefully provide a more spec-like description of the final version: https://github.com/numpy/numpy/blob/master/doc/neps/ufunc-overrides.rst Note that the API is "provisional" in 1.13, i.e. it *might* change in backwards-incompatible ways: https://docs.python.org/3/glossary.html#term-provisional-api -n -- Nathaniel J. 
Smith -- https://vorpus.org From kenny.vermeers at gmail.com Thu May 11 06:12:05 2017 From: kenny.vermeers at gmail.com (Kenny Vermeers) Date: Thu, 11 May 2017 12:12:05 +0200 Subject: [SciPy-User] Arc length of parametric spline Message-ID: Hi, I have a curve given by x,y points and interpolated it using splines with parameter t. t is in [0-1] by default but I would like to remap this to [0-arclength]. My plan was to calculate the arc length analytically or numerically using the light blue highlighted formula in http://tutorial.math.lamar.edu/Classes/CalcII/ParaArcLength.aspx i.e. L = int_0^1 sqrt((dx/dt)^2 + (dy/dt)^2) dt In order to do this I need the splines dx/dt and dy/dt. I was going to do this as follows tck, u = interpolate.splprep([x, y]) dspl = interpolate.splder(tck) hoping to get dx/dt and dy/dt and then solving the integral. Unfortunately that doesn't work since splder doesn't seem to accept parametric splines. Do you have any suggestions on how to get the spline parameter from 0 to arclen? Thanks! Kenny -------------- next part -------------- An HTML attachment was scrubbed... URL: From nadavh at visionsense.com Thu May 11 09:09:51 2017 From: nadavh at visionsense.com (Nadav Horesh) Date: Thu, 11 May 2017 13:09:51 +0000 Subject: [SciPy-User] [Numpy-discussion] NumPy v1.13.0rc1 released. In-Reply-To: References: Message-ID: There is a change to the "expand_dims" function, in that it now does not allow axis = a.ndims. This affects the matplotlib function get_bending_matrices in triinterpolate.py Nadav ________________________________ From: NumPy-Discussion on behalf of Charles R Harris Sent: 11 May 2017 04:48:34 To: numpy-discussion; SciPy-User; SciPy Developers List; python-announce-list at python.org Subject: [Numpy-discussion] NumPy v1.13.0rc1 released. Hi All, I'm pleased to announce the NumPy 1.13.0rc1 release. This release supports Python 2.7 and 3.4-3.6 and contains many new features. It is one of the most ambitious releases in the last several years. 
Some of the highlights and new functions are Highlights * Operations like ``a + b + c`` will reuse temporaries on some platforms, resulting in less memory use and faster execution. * Inplace operations check if inputs overlap outputs and create temporaries to avoid problems. * New __array_ufunc__ attribute provides improved ability for classes to override default ufunc behavior. * New np.block function for creating blocked arrays. New functions * New ``np.positive`` ufunc. * New ``np.divmod`` ufunc provides more efficient divmod. * New ``np.isnat`` ufunc tests for NaT special values. * New ``np.heaviside`` ufunc computes the Heaviside function. * New ``np.isin`` function, improves on ``in1d``. * New ``np.block`` function for creating blocked arrays. * New ``PyArray_MapIterArrayCopyIfOverlap`` added to NumPy C-API. Wheels for the pre-release are available on PyPI. Source tarballs, zipfiles, release notes, and the Changelog are available on github. A total of 100 people contributed to this release. People with a "+" by their names contributed a patch for the first time. * A. Jesse Jiryu Davis + * Alessandro Pietro Bardelli + * Alex Rothberg + * Alexander Shadchin * Allan Haldane * Andres Guzman-Ballen + * Antoine Pitrou * Antony Lee * B R S Recht + * Baurzhan Muftakhidinov + * Ben Rowland * Benda Xu + * Blake Griffith * Bradley Wogsland + * Brandon Carter + * CJ Carey * Charles Harris * Danny Hermes + * Duke Vijitbenjaronk + * Egor Klenin + * Elliott Forney + * Elliott M Forney + * Endolith * Eric Wieser * Erik M. Bray * Eugene + * Evan Limanto + * Felix Berkenkamp + * Fran?ois Bissey + * Frederic Bastien * Greg Young * Gregory R. 
Lee * Importance of Being Ernest + * Jaime Fernandez * Jakub Wilk + * James Cowgill + * James Sanders * Jean Utke + * Jesse Thoren + * Jim Crist + * Joerg Behrmann + * John Kirkham * Jonathan Helmus * Jonathan L Long * Jonathan Tammo Siebert + * Joseph Fox-Rabinovitz * Joshua Loyal + * Juan Nunez-Iglesias + * Julian Taylor * Kirill Balunov + * Likhith Chitneni + * Lo?c Est?ve * Mads Ohm Larsen * Marein K?nings + * Marten van Kerkwijk * Martin Thoma * Martino Sorbaro + * Marvin Schmidt + * Matthew Brett * Matthias Bussonnier + * Matthias C. M. Troffaes + * Matti Picus * Michael Seifert * Mikhail Pak + * Mortada Mehyar * Nathaniel J. Smith * Nick Papior * Oscar Villellas + * Pauli Virtanen * Pavel Potocek * Pete Peeradej Tanruangporn + * Philipp A + * Ralf Gommers * Robert Kern * Roland Kaufmann + * Ronan Lamy * Sami Salonen + * Sanchez Gonzalez Alvaro * Sebastian Berg * Shota Kawabuchi * Simon Gibbons * Stefan Otte * Stefan Peterson + * Stephan Hoyer * S?ren Fuglede J?rgensen + * Takuya Akiba * Tom Boyd + * Ville Skytt? + * Warren Weckesser * Wendell Smith * Yu Feng * Zixu Zhao + * Z? Vin?cius + * aha66 + * davidjn + * drabach + * drlvk + * jsh9 + * solarjoe + * zengi + Cheers, Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From shoyer at gmail.com Thu May 11 12:23:20 2017 From: shoyer at gmail.com (Stephan Hoyer) Date: Thu, 11 May 2017 09:23:20 -0700 Subject: [SciPy-User] [Numpy-discussion] NumPy v1.13.0rc1 released. In-Reply-To: References: Message-ID: Also, as friendly reminder, GitHub is a better place for bug reports than mailing lists with hundreds of subscribers :). On Thu, May 11, 2017 at 6:56 AM, Eric Wieser wrote: > Nadav: Can you provide a testcase that fails? > > I don't think you're correct - it works just fine when `axis = a.ndims` - > the issue arises when `axis > a.ndims`, but I'd argue that in that case an > error is correct behaviour. 
But still a change, so perhaps needs a release > note entry > > On Thu, 11 May 2017 at 14:25 Nadav Horesh wrote: > >> There is a change to "expand_dims" function, that it is now does not >> allow axis = a.ndims. >> >> This influences matplotlib function get_bending_matrices in >> triinterpolate.py >> >> >> Nadav >> ------------------------------ >> *From:* NumPy-Discussion > visionsense.com at python.org> on behalf of Charles R Harris < >> charlesr.harris at gmail.com> >> *Sent:* 11 May 2017 04:48:34 >> *To:* numpy-discussion; SciPy-User; SciPy Developers List; >> python-announce-list at python.org >> *Subject:* [Numpy-discussion] NumPy v1.13.0rc1 released. >> >> Hi All, >> >> I'm please to announce the NumPy 1.13.0rc1 release. This release supports >> Python 2.7 and 3.4-3.6 and contains many new features. It is one of the >> most ambitious releases in the last several years. Some of the highlights >> and new functions are >> >> *Highlights* >> >> - Operations like ``a + b + c`` will reuse temporaries on some >> platforms, resulting in less memory use and faster execution. >> - Inplace operations check if inputs overlap outputs and create >> temporaries to avoid problems. >> - New __array_ufunc__ attribute provides improved ability for classes >> to override default ufunc behavior. >> - New np.block function for creating blocked arrays. >> >> >> *New functions* >> >> - New ``np.positive`` ufunc. >> - New ``np.divmod`` ufunc provides more efficient divmod. >> - New ``np.isnat`` ufunc tests for NaT special values. >> - New ``np.heaviside`` ufunc computes the Heaviside function. >> - New ``np.isin`` function, improves on ``in1d``. >> - New ``np.block`` function for creating blocked arrays. >> - New ``PyArray_MapIterArrayCopyIfOverlap`` added to NumPy C-API. >> >> Wheels for the pre-release are available on PyPI. Source tarballs, >> zipfiles, release notes, and the Changelog are available on github >> . >> >> A total of 100 people contributed to this release. 
People with a "+" by >> their >> names contributed a patch for the first time. >> >> - A. Jesse Jiryu Davis + >> - Alessandro Pietro Bardelli + >> - Alex Rothberg + >> - Alexander Shadchin >> - Allan Haldane >> - Andres Guzman-Ballen + >> - Antoine Pitrou >> - Antony Lee >> - B R S Recht + >> - Baurzhan Muftakhidinov + >> - Ben Rowland >> - Benda Xu + >> - Blake Griffith >> - Bradley Wogsland + >> - Brandon Carter + >> - CJ Carey >> - Charles Harris >> - Danny Hermes + >> - Duke Vijitbenjaronk + >> - Egor Klenin + >> - Elliott Forney + >> - Elliott M Forney + >> - Endolith >> - Eric Wieser >> - Erik M. Bray >> - Eugene + >> - Evan Limanto + >> - Felix Berkenkamp + >> - Fran?ois Bissey + >> - Frederic Bastien >> - Greg Young >> - Gregory R. Lee >> - Importance of Being Ernest + >> - Jaime Fernandez >> - Jakub Wilk + >> - James Cowgill + >> - James Sanders >> - Jean Utke + >> - Jesse Thoren + >> - Jim Crist + >> - Joerg Behrmann + >> - John Kirkham >> - Jonathan Helmus >> - Jonathan L Long >> - Jonathan Tammo Siebert + >> - Joseph Fox-Rabinovitz >> - Joshua Loyal + >> - Juan Nunez-Iglesias + >> - Julian Taylor >> - Kirill Balunov + >> - Likhith Chitneni + >> - Lo?c Est?ve >> - Mads Ohm Larsen >> - Marein K?nings + >> - Marten van Kerkwijk >> - Martin Thoma >> - Martino Sorbaro + >> - Marvin Schmidt + >> - Matthew Brett >> - Matthias Bussonnier + >> - Matthias C. M. Troffaes + >> - Matti Picus >> - Michael Seifert >> - Mikhail Pak + >> - Mortada Mehyar >> - Nathaniel J. Smith >> - Nick Papior >> - Oscar Villellas + >> - Pauli Virtanen >> - Pavel Potocek >> - Pete Peeradej Tanruangporn + >> - Philipp A + >> - Ralf Gommers >> - Robert Kern >> - Roland Kaufmann + >> - Ronan Lamy >> - Sami Salonen + >> - Sanchez Gonzalez Alvaro >> - Sebastian Berg >> - Shota Kawabuchi >> - Simon Gibbons >> - Stefan Otte >> - Stefan Peterson + >> - Stephan Hoyer >> - S?ren Fuglede J?rgensen + >> - Takuya Akiba >> - Tom Boyd + >> - Ville Skytt? 
+ >> - Warren Weckesser >> - Wendell Smith >> - Yu Feng >> - Zixu Zhao + >> - Z? Vin?cius + >> - aha66 + >> - davidjn + >> - drabach + >> - drlvk + >> - jsh9 + >> - solarjoe + >> - zengi + >> >> Cheers, >> >> Chuck >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at python.org >> https://mail.python.org/mailman/listinfo/numpy-discussion >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at python.org > https://mail.python.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Thu May 11 15:05:08 2017 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 11 May 2017 21:05:08 +0200 Subject: [SciPy-User] Arc length of parametric spline In-Reply-To: References: Message-ID: <3100705a-1b26-ad1a-f2a9-6365cf403b34@iki.fi> Kenny Vermeers kirjoitti 11.05.2017 klo 12:12: > In order to do this I need the splines dx/dt and dy/dt,. I was going to > do this as follows > tck, u = interpolate.splprep([x, y]) > dspl = interpolate.splder(tck) > hoping to get dx/dt and dy/dt and then solving the integral. > Unfortunately that doesn't work since splder doesn't seem to accept > parametric splines. Do you have any suggestions on how to get the spline > parameter from 0 to arclen? You can grab the x(u) and y(u) splines from the output of splprep. It is a bit ugly because splprep has a different padding convention than splrep. 
t, c, k = tck cx = np.concatenate([c[0], np.zeros(len(t) - len(c[0]))]) cy = np.concatenate([c[1], np.zeros(len(t) - len(c[1]))]) tck_x = (t, cx, k) tck_y = (t, cy, k) From david at drhagen.com Sat May 13 07:50:51 2017 From: david at drhagen.com (David Hagen) Date: Sat, 13 May 2017 07:50:51 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: I uninstalled all versions of MSVS and installed Visual Studio 2015 update 3. This did not fix anything, but at least I ruled out conflicts there. To get past the NoneType error from my previous message I had to fix a bug in Numpy; namely, in the file numpy\distutils\msvc9compiler.py the _merge function fails if old is None. I merely swapped the two `if` statements and it got past. I will submit a pull request. Then the installer couldn't find icl.exe. I also had to add this to my PATH environment variable: C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2018.0.065\windows\bin\intel64 Then I got this error LINK : fatal error LNK1104: cannot open file 'libiomp5md.lib' which I tracked down to OpenMP. The instructions here had told me to add several compiler options in numpy/distutils/intelccompiler.py, one of which was openmp. Once I reverted the compiler options to their original state, this error went away. Then I got this error LINK : fatal error LNK1104: cannot open file 'libmmd.lib' I tracked this down to the compiler libraries not being on the library path, so I added this to the LIB environment variable: C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2018.0.065\windows\compiler\lib\intel64 Finally Numpy installed and imports. But while it was nice to learn how to install Numpy from source, the main goal was to install Scipy from source. I thought that installing Numpy from source would fix my earlier problems with unresolved external _gfortran symbols. 
Alas, I still get this error while installing Scipy: building 'scipy.integrate._quadpack' extension compiling C sources C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2018.0.065\windows\bin\intel64\icl.exe /c /nologo /O3 /MD /W3 /Qstd=c99 -DSCIPY_MKL_H -DHAVE_CBLAS - Iscipy\_lib\src -I"C:\Program Files (x86)\IntelSWTools\parallel_studio_xe_2018.0.019\compilers_and_libraries_2018\windows\mkl\include" -IC:\Anaconda3\envs\numpy _source\lib\site-packages\numpy-1.12.1-py3.6-win-amd64.egg\numpy\core\include -IC:\Anaconda3\envs\numpy_source\include -IC:\Anaconda3\envs\numpy_source\include /Tcscipy\integrate\_quadpackmodule.c /Fobuild\temp.win-amd64-3.6\Release\scipy\integrate\_quadpackmodule.obj could not find library 'quadpack' in directories ['C:\\Program Files (x86)\\Inte lSWTools\\parallel_studio_xe_2018.0.019\\compilers_and_libraries_2018\\windows\\ mkl\\lib\\intel64'] ... Creating library build\temp.win-amd64-3.6\Release\scipy\integrate\_quadpack.c p36-win_amd64.lib and object build\temp.win-amd64-3.6\Release\scipy\integrate\_quadpack.cp36-win_amd64.exp mach.lib(d1mach.o) : error LNK2019: unresolved external symbol _gfortran_st_write referenced in function d1mach_ mach.lib(d1mach.o) : error LNK2019: unresolved external symbol _gfortran_st_write_done referenced in function d1mach_ mach.lib(d1mach.o) : error LNK2019: unresolved external symbol _gfortran_stop_numeric_f08 referenced in function d1mach_ mach.lib(d1mach.o) : error LNK2019: unresolved external symbol _gfortran_transfer_character_write referenced in function d1mach_ mach.lib(d1mach.o) : error LNK2019: unresolved external symbol _gfortran_transfer_integer_write referenced in function d1mach_ mach.lib(d1mach.o) : error LNK2019: unresolved external symbol _gfortran_stop_string referenced in function d1mach_ build\lib.win-amd64-3.6\scipy\integrate\_quadpack.cp36-win_amd64.pyd : fatal error LNK1120: 6 unresolved externals *What does this even mean?* -------------- next part -------------- An HTML 
attachment was scrubbed... URL:

From david at drhagen.com  Sat May 13 20:37:16 2017
From: david at drhagen.com (David Hagen)
Date: Sat, 13 May 2017 20:37:16 -0400
Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries
In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net>
Message-ID:

Ok, I solved everything. Here's the solution to the unresolved external symbols error: *DELETE THE BUILD FOLDER*. If Scipy starts compiling with one compiler, it will use the cached binaries when linking later even if it is now using a different compiler. Deleting the build folder forces it to start over.

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From eulergaussriemann at gmail.com  Sat May 13 21:35:46 2017
From: eulergaussriemann at gmail.com (David Goldsmith)
Date: Sun, 14 May 2017 01:35:46 +0000
Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries
In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net>
Message-ID:

Ah yes, the old "start the build fresh each time" problem--been there, done that. Even if something says i shouldn't have to start over, i no longer trust that: always start with a fresh build.

DLG

On Sat, May 13, 2017 at 6:01 PM David Hagen wrote:
> Ok, I solved everything. Here's the solution to the unresolved external
> symbols error: *DELETE THE BUILD FOLDER*. If Scipy starts compiling with
> one compiler, it will used the cached binaries when linking later even if
> it is now using a different compiler. Deleting the build folder forces it
> to start over.
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at python.org
> https://mail.python.org/mailman/listinfo/scipy-user
> -------------- next part -------------- An HTML attachment was scrubbed...
URL: From david at drhagen.com Sun May 14 17:37:13 2017 From: david at drhagen.com (David Hagen) Date: Sun, 14 May 2017 17:37:13 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: I spoke too soon. I "successfully" compiled Numpy and Scipy from master and both can be imported. However, some Scipy items don't import. It doesn't really matter what exactly I import. If I run scipy.test() everything dies. Can someone interpret this error message? >>> from scipy.special import roots_legendre Traceback (most recent call last): File "__init__.pxd", line 987, in numpy.import_array (scipy\special\_ufuncs_cx x.cxx:3729) RuntimeError: module compiled against API version 0xb but this version of numpy is 0xa During handling of the above exception, another exception occurred: Traceback (most recent call last): File "", line 1, in File "C:\Anaconda3\envs\numpy_scipy\lib\site-packages\scipy- 1.0.0.dev0+unknown -py3.6-win-amd64.egg\scipy\special\__init__.py", line 640, in from ._ufuncs import * File "scipy\special\_ufuncs.pyx", line 1, in init scipy.special._ufuncs (scipy \special\_ufuncs.c:39403) File "scipy\special\_ufuncs_extra_code_common.pxi", line 34, in init scipy.spe cial._ufuncs_cxx (scipy\special\_ufuncs_cxx.cxx:4376) File "__init__.pxd", line 989, in numpy.import_array (scipy\special\_ufuncs_cx x.cxx:3772) ImportError: numpy.core.multiarray failed to import -------------- next part -------------- An HTML attachment was scrubbed... URL: From denis.akhiyarov at gmail.com Mon May 15 17:42:37 2017 From: denis.akhiyarov at gmail.com (Denis Akhiyarov) Date: Mon, 15 May 2017 16:42:37 -0500 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: It looks like you have multiple numpy versions installed. 
Did you uninstall numpy, scipy with both pip and conda before building and installing numpy and scipy from source? I presume that you also built numpy from clean source without cache? Since you are using conda environments, make sure that you activate it before the build process and before launching python interpreter. On Sun, May 14, 2017 at 4:37 PM, David Hagen wrote: > I spoke too soon. I "successfully" compiled Numpy and Scipy from master > and both can be imported. However, some Scipy items don't import. It > doesn't really matter what exactly I import. If I run scipy.test() > everything dies. Can someone interpret this error message? > > >>> from scipy.special import roots_legendre > Traceback (most recent call last): > File "__init__.pxd", line 987, in numpy.import_array > (scipy\special\_ufuncs_cx > x.cxx:3729) > RuntimeError: module compiled against API version 0xb but this version of > numpy > is 0xa > > During handling of the above exception, another exception occurred: > > Traceback (most recent call last): > File "", line 1, in > File "C:\Anaconda3\envs\numpy_scipy\lib\site-packages\scipy-1.0. > 0.dev0+unknown > -py3.6-win-amd64.egg\scipy\special\__init__.py", line 640, in > from ._ufuncs import * > File "scipy\special\_ufuncs.pyx", line 1, in init scipy.special._ufuncs > (scipy > \special\_ufuncs.c:39403) > File "scipy\special\_ufuncs_extra_code_common.pxi", line 34, in init > scipy.spe > cial._ufuncs_cxx (scipy\special\_ufuncs_cxx.cxx:4376) > File "__init__.pxd", line 989, in numpy.import_array > (scipy\special\_ufuncs_cx > x.cxx:3772) > ImportError: numpy.core.multiarray failed to import > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david at drhagen.com Mon May 15 20:48:03 2017 From: david at drhagen.com (David Hagen) Date: Mon, 15 May 2017 20:48:03 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: > It looks like you have multiple numpy versions installed. Did you uninstall numpy, scipy with both pip and conda before building and installing numpy and scipy from source? I presume that you also built numpy from clean source without cache? I have numpy installed in a separate conda environment because I don't want to lose my production environment while playing with Numpy and Scipy master. I got this error after creating a fresh conda environment, activating it, conda installing python and cython in it, and then running setup.py on freshly downloaded numpy and scipy master folders. How can I check if my other Numpy copies are visible in the search path for modules? Is there a command I can run to diagnose the version compatibility of the numpy and scipy modules that are imported and where they are located? -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Wed May 17 20:40:58 2017 From: david at drhagen.com (David Hagen) Date: Wed, 17 May 2017 20:40:58 -0400 Subject: [SciPy-User] Install Scipy with Anaconda's MKL libraries In-Reply-To: References: <590B2AB5.9080505@sbcglobal.net> <590B6246.6090607@sbcglobal.net> Message-ID: > It looks like you have multiple numpy versions installed. Did you uninstall numpy, scipy with both pip and conda before building and installing numpy and scipy from source? I presume that you also built numpy from clean source without cache? I can confirm it was another version of Numpy installed alongside. Anaconda had pulled it in when I installed matplotlib. Once I removed it and installed matplotlib from pip, then everything worked fine. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Thu May 18 17:42:23 2017 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 18 May 2017 15:42:23 -0600 Subject: [SciPy-User] NumPy 1.13.0rc2 released Message-ID: Hi All, I'm pleased to announce the NumPy 1.13.0rc2 release. This release supports Python 2.7 and 3.4-3.6 and contains many new features. It is one of the most ambitious releases in the last several years. Some of the highlights and new functions are *Highlights* - Operations like ``a + b + c`` will reuse temporaries on some platforms, resulting in less memory use and faster execution. - Inplace operations check if inputs overlap outputs and create temporaries to avoid problems. - New __array_ufunc__ attribute provides improved ability for classes to override default ufunc behavior. - New np.block function for creating blocked arrays. *New functions* - New ``np.positive`` ufunc. - New ``np.divmod`` ufunc provides more efficient divmod. - New ``np.isnat`` ufunc tests for NaT special values. - New ``np.heaviside`` ufunc computes the Heaviside function. - New ``np.isin`` function, improves on ``in1d``. - New ``np.block`` function for creating blocked arrays. - New ``PyArray_MapIterArrayCopyIfOverlap`` added to NumPy C-API. Wheels for the pre-release are available on PyPI. Source tarballs, zipfiles, release notes, and the changelog are available on github . A total of 102 people contributed to this release. People with a "+" by their names contributed a patch for the first time. - A. 
Jesse Jiryu Davis + - Alessandro Pietro Bardelli + - Alex Rothberg + - Alexander Shadchin - Allan Haldane - Andres Guzman-Ballen + - Antoine Pitrou - Antony Lee - B R S Recht + - Baurzhan Muftakhidinov + - Ben Rowland - Benda Xu + - Blake Griffith - Bradley Wogsland + - Brandon Carter + - CJ Carey - Charles Harris - Christoph Gohlke - Danny Hermes + - David Hagen + - David Nicholson + - Duke Vijitbenjaronk + - Egor Klenin + - Elliott Forney + - Elliott M Forney + - Endolith - Eric Wieser - Erik M. Bray - Eugene + - Evan Limanto + - Felix Berkenkamp + - François Bissey + - Frederic Bastien - Greg Young - Gregory R. Lee - Importance of Being Ernest + - Jaime Fernandez - Jakub Wilk + - James Cowgill + - James Sanders - Jean Utke + - Jesse Thoren + - Jim Crist + - Joerg Behrmann + - John Kirkham - Jonathan Helmus - Jonathan L Long - Jonathan Tammo Siebert + - Joseph Fox-Rabinovitz - Joshua Loyal + - Juan Nunez-Iglesias + - Julian Taylor - Kirill Balunov + - Likhith Chitneni + - Loïc Estève - Mads Ohm Larsen - Marein Könings + - Marten van Kerkwijk - Martin Thoma - Martino Sorbaro + - Marvin Schmidt + - Matthew Brett - Matthias Bussonnier + - Matthias C. M. Troffaes + - Matti Picus - Michael Seifert - Mikhail Pak + - Mortada Mehyar - Nathaniel J. Smith - Nick Papior - Oscar Villellas + - Pauli Virtanen - Pavel Potocek - Pete Peeradej Tanruangporn + - Philipp A + - Ralf Gommers - Robert Kern - Roland Kaufmann + - Ronan Lamy - Sami Salonen + - Sanchez Gonzalez Alvaro - Sebastian Berg - Shota Kawabuchi - Simon Gibbons - Stefan Otte - Stefan Peterson + - Stephan Hoyer - Søren Fuglede Jørgensen + - Takuya Akiba - Tom Boyd + - Ville Skyttä + - Warren Weckesser - Wendell Smith - Yu Feng - Zixu Zhao + - Zé Vinícius + - aha66 + - drabach + - drlvk + - jsh9 + - solarjoe + - zengi +

Cheers, Chuck

-------------- next part -------------- An HTML attachment was scrubbed...
URL:

From cimrman3 at ntc.zcu.cz  Fri May 19 04:52:37 2017
From: cimrman3 at ntc.zcu.cz (Robert Cimrman)
Date: Fri, 19 May 2017 10:52:37 +0200
Subject: [SciPy-User] ANN: SfePy 2017.2
Message-ID:

I am pleased to announce release 2017.2 of SfePy.

Description
-----------

SfePy (simple finite elements in Python) is software for solving systems of coupled partial differential equations by the finite element method or by isogeometric analysis (limited support). It is distributed under the new BSD license.

Home page: http://sfepy.org
Mailing list: https://mail.python.org/mm3/mailman3/lists/sfepy.python.org/
Git (source) repository, issue tracker: https://github.com/sfepy/sfepy

Highlights of this release
--------------------------

- simplified and unified implementation of some homogenized coefficients
- support for saving custom structured data to HDF5 files
- new tutorial on preparing meshes using FreeCAD/OpenSCAD and Gmsh

For full release notes see http://docs.sfepy.org/doc/release_notes.html#id1 (rather long and technical).

Cheers, Robert Cimrman

---

Contributors to this release in alphabetical order: Robert Cimrman Jan Heczko Lubos Kejzlar Vladimir Lukes Matyas Novak

From schilling.klaus at web.de  Fri May 19 12:59:20 2017
From: schilling.klaus at web.de (klaus schilling)
Date: Fri, 19 May 2017 18:59:20 +0200
Subject: [SciPy-User] Alergia algorithm
Message-ID: <87tw4gbx5j.fsf@web.de>

Is there already a module implementing, on top of SciPy, the Alergia algorithm for the construction of stochastic finite automata from prefix tree models?

Klaus Schilling

From kirillbalunov at gmail.com  Thu May 25 16:48:30 2017
From: kirillbalunov at gmail.com (Kirill Balunov)
Date: Thu, 25 May 2017 23:48:30 +0300
Subject: [SciPy-User] scipy optimization: inequality constraint type
Message-ID:

Hi, I've tried some scipy optimization routines; they work great!!! But I wondered: why, historically, was the type for inequality constraints chosen to be "greater than or equal"?
This is inconsistent with the classical formulation of non-linear programming problems. Thanks! -gdg -------------- next part -------------- An HTML attachment was scrubbed... URL: From eulergaussriemann at gmail.com Thu May 25 17:07:54 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Thu, 25 May 2017 21:07:54 +0000 Subject: [SciPy-User] scipy optimization: inequality constraint type In-Reply-To: References: Message-ID: I would assume that it's because the "or equal to" option allows greater flexibility: if that criterion is allowed by the problem, and the algorithm can find such a solution (e.g., by checking all such corner points), then that's better than not even providing the option, yes? And if you try to "rig" the strict inequality approach by allowing for a little extra room around the corner, then the exact solution might not be found, yes? Indeed, i'm no expert, but i did have a course in this, and IIRC, if your problem allows for equality, then you _must_ separately check all the corners, yes? (In other words, what you state about the "classical formulation" is not what i was taught: I was taught that the specifics of the problem dictate whether any given inequality should be strict or "weak.") DLG On Thu, May 25, 2017 at 1:49 PM Kirill Balunov wrote: > Hi, > I've tried some scipy optimization routines, they work great!!! But I > wondered, why historically for inequality constraints the type was chosen > to be "greater than or equal" type? This is inconsistent with the classical > formulation of non-linear programming problems. > > Thanks! > > -gdg > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... 
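As an aside for readers landing on this thread with the same convention question: `scipy.optimize.minimize` takes each inequality constraint as a dict whose `fun` must be non-negative at feasible points, so a classical `g(x) <= 0` constraint is passed by negating `g`. A minimal sketch (the helper name is mine, not part of scipy):

```python
def leq_constraint(g):
    """Turn a classical 'g(x) <= 0' constraint into the 'fun(x) >= 0'
    dict form that scipy.optimize.minimize expects."""
    return {"type": "ineq", "fun": lambda x: -g(x)}

# Encode x <= 3 as g(x) = x - 3 <= 0.
con = leq_constraint(lambda x: x - 3.0)
print(con["fun"](1.0))  # 2.0: non-negative, so x = 1 is feasible
print(con["fun"](5.0))  # -2.0: negative, so x = 5 violates x <= 3
```

The resulting dict can then be handed to minimize(..., method="SLSQP", constraints=[con]) unchanged, so the choice of sign convention costs only this one-line wrapper.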
URL:

From kirillbalunov at gmail.com  Thu May 25 17:28:45 2017
From: kirillbalunov at gmail.com (Kirill Balunov)
Date: Fri, 26 May 2017 00:28:45 +0300
Subject: [SciPy-User] scipy optimization: inequality constraint type
In-Reply-To: References: Message-ID:

I'm sorry, perhaps I should more clearly formulate the question. David, you are totally right. What I mean by classical is the "less than or equal" type. Of course it's a question of a sign, but still...

-gdg

2017-05-26 0:07 GMT+03:00 David Goldsmith :
> I would assume that it's because the "or equal to" option allows greater
> flexibility: if that criterion is allowed by the problem, and the algorithm
> can find such a solution (e.g., by checking all such corner points), then
> that's better than not even providing the option, yes? And if you try to
> "rig" the strict inequality approach by allowing for a little extra room
> around the corner, then the exact solution might not be found, yes?
> Indeed, i'm no expert, but i did have a course in this, and IIRC, if your
> problem allows for equality, then you _must_ separately check all the
> corners, yes?
>> >> -gdg >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From eulergaussriemann at gmail.com Thu May 25 17:36:01 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Thu, 25 May 2017 21:36:01 +0000 Subject: [SciPy-User] scipy optimization: inequality constraint type In-Reply-To: References: Message-ID: Ah, yes, that convention i am familiar with; maybe to accommodate the "inflexibility" of less numerate potential users (who may be fixated, e.g., on wanting to maximize profit or yield)? Of course, at some point such people may want to minimize something, so hopefully they have someone around to tell them to simply multiply by negative one. ;-) DLG On Thu, May 25, 2017 at 2:29 PM Kirill Balunov wrote: > I'm sorry, perhaps I should more clearly formulate the question. David you > are totally right. What I mean by classical: is "less than or equal" > type. Of course it's a question of a sign, but still... > > -gdg > > 2017-05-26 0:07 GMT+03:00 David Goldsmith : > >> I would assume that it's because the "or equal to" option allows greater >> flexibility: if that criterion is allowed by the problem, and the algorithm >> can find such a solution (e.g., by checking all such corner points), then >> that's better than not even providing the option, yes? And if you try to >> "rig" the strict inequality approach by allowing for a little extra room >> around the corner, then the exact solution might not be found, yes? >> Indeed, i'm no expert, but i did have a course in this, and IIRC, if your >> problem allows for equality, then you _must_ separately check all the >> corners, yes? 
(In other words, what you state about the "classical >> formulation" is not what i was taught: I was taught that the specifics of >> the problem dictate whether any given inequality should be strict or >> "weak.") >> >> DLG >> >> On Thu, May 25, 2017 at 1:49 PM Kirill Balunov >> wrote: >> >>> Hi, >>> I've tried some scipy optimization routines, they work great!!! But I >>> wondered, why historically for inequality constraints the type was chosen >>> to be "greater than or equal" type? This is inconsistent with the classical >>> formulation of non-linear programming problems. >>> >>> Thanks! >>> >>> -gdg >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From kirillbalunov at gmail.com Thu May 25 17:53:03 2017 From: kirillbalunov at gmail.com (Kirill Balunov) Date: Fri, 26 May 2017 00:53:03 +0300 Subject: [SciPy-User] scipy optimization: inequality constraint type In-Reply-To: References: Message-ID: No, no, the constraints only affect the feseable set of the problem. Min or Max depends on the sign of objective function. From mathemtical point of view, the problem is that the KKT conditions are derived for standard formulation (with "less than ..") of NLP. Cheers, -gdg 2017-05-26 0:36 GMT+03:00 David Goldsmith : > Ah, yes, that convention i am familiar with; maybe to accommodate the > "inflexibility" of less numerate potential users (who may be fixated, e.g., > on wanting to maximize profit or yield)? 
Of course, at some point such > people may want to minimize something, so hopefully they have someone > around to tell them to simply multiply by negative one. ;-) > > DLG > > On Thu, May 25, 2017 at 2:29 PM Kirill Balunov > wrote: > >> I'm sorry, perhaps I should more clearly formulate the question. David >> you are totally right. What I mean by classical: is "less than or equal" >> type. Of course it's a question of a sign, but still... >> >> -gdg >> >> 2017-05-26 0:07 GMT+03:00 David Goldsmith : >> >>> I would assume that it's because the "or equal to" option allows greater >>> flexibility: if that criterion is allowed by the problem, and the algorithm >>> can find such a solution (e.g., by checking all such corner points), then >>> that's better than not even providing the option, yes? And if you try to >>> "rig" the strict inequality approach by allowing for a little extra room >>> around the corner, then the exact solution might not be found, yes? >>> Indeed, i'm no expert, but i did have a course in this, and IIRC, if your >>> problem allows for equality, then you _must_ separately check all the >>> corners, yes? (In other words, what you state about the "classical >>> formulation" is not what i was taught: I was taught that the specifics of >>> the problem dictate whether any given inequality should be strict or >>> "weak.") >>> >>> DLG >>> >>> On Thu, May 25, 2017 at 1:49 PM Kirill Balunov >>> wrote: >>> >>>> Hi, >>>> I've tried some scipy optimization routines, they work great!!! But I >>>> wondered, why historically for inequality constraints the type was chosen >>>> to be "greater than or equal" type? This is inconsistent with the classical >>>> formulation of non-linear programming problems. >>>> >>>> Thanks! 
>>>> >>>> -gdg >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From eulergaussriemann at gmail.com Thu May 25 18:03:46 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Thu, 25 May 2017 22:03:46 +0000 Subject: [SciPy-User] scipy optimization: inequality constraint type In-Reply-To: References: Message-ID: The KKT reference exceeds my numeracy... Anyway, i doubt this is the case, but if it's really a problem, you can always write wrappers to automate the desired transformations, yes? DLG On The, May 25, 2017 at 2:53 PM Kirill Balunov wrote: > No, no, the constraints only affect the feseable set of the problem. Min > or Max depends on the sign of objective function. From mathemtical point of > view, the problem is that the KKT conditions are derived for standard > formulation (with "less than ..") of NLP. > > Cheers, > > -gdg > > > 2017-05-26 0:36 GMT+03:00 David Goldsmith : > >> Ah, yes, that convention i am familiar with; maybe to accommodate the >> "inflexibility" of less numerate potential users (who may be fixated, e.g., >> on wanting to maximize profit or yield)? Of course, at some point such >> people may want to minimize something, so hopefully they have someone >> around to tell them to simply multiply by negative one. 
;-) >> >> DLG >> >> On Thu, May 25, 2017 at 2:29 PM Kirill Balunov >> wrote: >> >>> I'm sorry, perhaps I should more clearly formulate the question. David >>> you are totally right. What I mean by classical: is "less than or equal" >>> type. Of course it's a question of a sign, but still... >>> >>> -gdg >>> >>> 2017-05-26 0:07 GMT+03:00 David Goldsmith : >>> >>>> I would assume that it's because the "or equal to" option allows >>>> greater flexibility: if that criterion is allowed by the problem, and the >>>> algorithm can find such a solution (e.g., by checking all such corner >>>> points), then that's better than not even providing the option, yes? And >>>> if you try to "rig" the strict inequality approach by allowing for a little >>>> extra room around the corner, then the exact solution might not be found, >>>> yes? Indeed, i'm no expert, but i did have a course in this, and IIRC, if >>>> your problem allows for equality, then you _must_ separately check all the >>>> corners, yes? (In other words, what you state about the "classical >>>> formulation" is not what i was taught: I was taught that the specifics of >>>> the problem dictate whether any given inequality should be strict or >>>> "weak.") >>>> >>>> DLG >>>> >>>> On Thu, May 25, 2017 at 1:49 PM Kirill Balunov >>>> wrote: >>>> >>>>> Hi, >>>>> I've tried some scipy optimization routines, they work great!!! But I >>>>> wondered, why historically for inequality constraints the type was chosen >>>>> to be "greater than or equal" type? This is inconsistent with the classical >>>>> formulation of non-linear programming problems. >>>>> >>>>> Thanks! 
>>>>> >>>>> -gdg >>>>> _______________________________________________ >>>>> SciPy-User mailing list >>>>> SciPy-User at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL:

From kirillbalunov at gmail.com  Fri May 26 07:18:26 2017
From: kirillbalunov at gmail.com (Kirill Balunov)
Date: Fri, 26 May 2017 14:18:26 +0300
Subject: [SciPy-User] scipy optimization: inequality constraint type
In-Reply-To: References: Message-ID:

David, KKT conditions are first-order necessary conditions which are applicable if some assumptions are satisfied (regularity, continuity, ...). The points which satisfy KKT are said to be stationary (or candidate) points (min, max, or inflection). Internally these conditions use the gradient of the Lagrange function -> c^T * x + lambda^T * h(x) + mu^T * g(x) (this is classical notation), where h(x) are equality and g(x) are inequality (g(x) <= 0) constraints, respectively. There is also a restriction, among others, that `mu` be non-negative. This is the classical formulation of NLP (which looks for a minimum). Of course, mathematics is simply a human game with some rules.
So you can choose from four cases (for the "inequality term"):

1) + mu^T * g(x), mu >= 0, g(x) <= 0 (classical)
2) - mu^T * g(x), mu >= 0, g(x) >= 0
3) + mu^T * g(x), mu <= 0, g(x) >= 0 (awkward)
4) - mu^T * g(x), mu >= 0, g(x) <= 0 (awkward)

The last two are awkward: they roughly break the dual problem principle (the related maximization problem, but in the Lagrange multipliers). The first two look good, but the second one is very unusual. That is why I ask about historical reasons: why was this form chosen?

-gdg

2017-05-26 1:03 GMT+03:00 David Goldsmith :
> The KKT reference exceeds my numeracy...
>
> Anyway, i doubt this is the case, but if it's really a problem, you can
> always write wrappers to automate the desired transformations, yes?
>
> DLG
> -------------- next part -------------- An HTML attachment was scrubbed... URL:

From kirillbalunov at gmail.com  Fri May 26 07:39:04 2017
From: kirillbalunov at gmail.com (Kirill Balunov)
Date: Fri, 26 May 2017 14:39:04 +0300
Subject: [SciPy-User] scipy optimization: inequality constraint type
In-Reply-To: References: Message-ID:

I'm sorry, I understand that's because the only solver for constrained optimization in scipy is *SLSQP*? But it is not a good idea to write about a particular case as if it were the general one -> In documentation -> "In general, the optimization problems are of the form: ..." https://docs.scipy.org/doc/scipy-0.18.1/reference/generated/scipy.optimize.minimize.html Which is not true in general :)

-gdg

2017-05-26 14:18 GMT+03:00 Kirill Balunov :
> David, KKT conditions are first-order necessary conditions which are
> applicable if some assumptions are satisfied (regularity, continuity,..).
> The points which satisfy KKT are said to be stationary (or candidate)
> points (min, max or inflation).
Internally this conditions use gradient of > Lagrange function -> c^T * x + lambda^T * h(x) + mu^T * g(x) (this is > classical notation), where h(x) are equality and g(x) are inequality (g(x) > <= 0) constraints respectively. Also there is a restriction among others > for `mu` to be non-negative. This is classical formulation of NLP (which > looks for minimum). Of course mathematics is simply a human game with some > rules. So you can choose from four cases (for "inequality term"): > 1) + mu^T * g(x), mu >= 0, g(x) <=0 (classical) > 2) - mu^T * g(x), mu >= 0, g(x) >= 0 > 3) + mu^T * g(x), mu <= 0, g(x) >= 0 ( awkward) > 4) - mu^T * g(x), mu => 0, g(x) <= 0 (awkward) > > T?e last two are awkward they roughly crash the dual problem principle > (the related maximization problem but in Lagrange multipliers). The first > two look good but the the second one is very unusual. That is why I ask > about historical reasons, why this form was chosen? > > -gdg > > > > 2017-05-26 1:03 GMT+03:00 David Goldsmith : > >> The KKT reference exceeds my numeracy... >> >> Anyway, i doubt this is the case, but if it's really a problem, you can >> always write wrappers to automate the desired transformations, yes? >> >> DLG >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From eulergaussriemann at gmail.com Fri May 26 11:15:11 2017 From: eulergaussriemann at gmail.com (David Goldsmith) Date: Fri, 26 May 2017 15:15:11 +0000 Subject: [SciPy-User] scipy optimization: inequality constraint type In-Reply-To: References: Message-ID: It looks like it could simply be a typo: could you be troubled to file a ticket? Worst case: someone will explain, in that record, why it isn't a typo; best case: it will get fixed. Thanks for your conscientiousness. DLG On Fri, May 26, 2017 at 4:40 AM Kirill Balunov wrote: > I'm sorry, I understand that's because the only solver for constrained > optimization in scipy is *SLSQP*? 
> > But it is not a good idea to describe a particular case as if it were the > general one -> > > In the documentation -> "In general, the optimization problems are of the > form: ..." > > https://docs.scipy.org/doc/scipy-0.18.1/reference/generated/scipy.optimize.minimize.html > > Which is not true in general :) > > > -gdg > > 2017-05-26 14:18 GMT+03:00 Kirill Balunov : > >> David, KKT conditions are first-order necessary conditions which are >> applicable if some assumptions are satisfied (regularity, continuity, ...). >> The points which satisfy KKT are said to be stationary (or candidate) >> points (min, max or inflection). Internally these conditions use the gradient of >> the Lagrange function -> c^T * x + lambda^T * h(x) + mu^T * g(x) (this is >> the classical notation), where h(x) are the equality and g(x) the inequality (g(x) >> <= 0) constraints, respectively. Also there is a restriction, among others, >> that `mu` be non-negative. This is the classical formulation of an NLP (which >> looks for a minimum). Of course mathematics is simply a human game with some >> rules. So you can choose from four cases (for the "inequality term"): >> 1) + mu^T * g(x), mu >= 0, g(x) <= 0 (classical) >> 2) - mu^T * g(x), mu >= 0, g(x) >= 0 >> 3) + mu^T * g(x), mu <= 0, g(x) >= 0 (awkward) >> 4) - mu^T * g(x), mu >= 0, g(x) <= 0 (awkward) >> >> The last two are awkward: they roughly break the dual problem principle >> (the related maximization problem, but in the Lagrange multipliers). The first >> two look good, but the second one is very unusual. That is why I ask >> about historical reasons: why was this form chosen? >> >> -gdg >> >> >> >> 2017-05-26 1:03 GMT+03:00 David Goldsmith : >> >>> The KKT reference exceeds my numeracy... >>> >>> Anyway, I doubt this is the case, but if it's really a problem, you can >>> always write wrappers to automate the desired transformations, yes? 
>>> >>> DLG >>> >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edwardlrichards at gmail.com Sat May 27 13:22:45 2017 From: edwardlrichards at gmail.com (Edward Richards) Date: Sat, 27 May 2017 10:22:45 -0700 Subject: [SciPy-User] low memory labeled component clustering Message-ID: <5929B5E5.8000408@gmail.com> I would like an LCC algorithm that produces sparse results. What I have in mind would be similar to MATLAB's bwconncomp and would return a sparse data structure instead of a dense label array. I am new to image processing, and have only a surface understanding of the LCC algorithm. I am hoping that the experts out there can let me know if my goal is tenable (or hopefully already implemented), before I dive too deeply into the ndimage.label source code. I am working on a detection algorithm, and the data is 3D and can be very large in shape but sparse. I would like to process a bunch of data at once to avoid edge effects. Currently I am maxing out my computer's RAM (32 GB) processing a (50, 200, 523264) element array. This array has only ~3 million non-zero elements, or 0.00056 sparsity. For each cluster I only need: the max value, its location, and the number of elements. Currently, the data is stored as a list of sparse matrices. Is it possible to have the only 3D representation of the data be a Boolean array that I pass to a clustering algorithm? For my problem, it is better to take reasonable performance hits than to hold temporary arrays in memory. Thank you, Ned From mainakjas at gmail.com Sun May 28 19:58:46 2017 From: mainakjas at gmail.com (Mainak Jas) Date: Mon, 29 May 2017 01:58:46 +0200 Subject: [SciPy-User] scale parameter at 0 Message-ID: Hi everyone, I was wondering if anyone had an insight into what happens in scipy when the scale parameter becomes 0. 
In particular, I was looking into the Lévy stable distribution. If we do in scipy: >>> from scipy.stats import levy_stable >>> levy_stable.rvs(alpha=0.99, beta=1, scale=0, loc=1, size=5) I get: >>> array([ 1., 1., 1., 1., 1.]) But when I look in Wikipedia, it tells me that the scale parameter should be greater than 0. Maybe it's a more general question for all distributions, but I'm interested in this one particularly. Why doesn't scipy throw an error when the scale parameter is 0? Thanks. Best regards, Mainak -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sun May 28 20:34:26 2017 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 28 May 2017 20:34:26 -0400 Subject: [SciPy-User] scale parameter at 0 In-Reply-To: References: Message-ID: On Sun, May 28, 2017 at 7:58 PM, Mainak Jas wrote: > Hi everyone, > > I was wondering if anyone had an insight into what happens in scipy when the > scale parameter becomes 0. In particular, I was looking into the Lévy stable > distribution. If we do in scipy: > >>>> from scipy.stats import levy_stable >>>> levy_stable.rvs(alpha=0.99, beta=1, scale=0, loc=1, size=5) > > I get: > >>>> array([ 1., 1., 1., 1., 1.]) > > But when I look in Wikipedia, it tells me that the scale parameter should be > greater than 0. Maybe it's a more general question for all distributions, > but I'm interested in this one particularly. Why doesn't scipy throw an > error when the scale parameter is 0? scale defines the spread of the distribution, and is equal to the standard deviation in the normal distribution. As the scale goes to zero, the distribution collapses to a single point, i.e. a Dirac measure. Whether that makes numerical sense for a distribution depends on the floating point computation details. In general it can still be useful to allow for the degenerate case with scale=0, but I don't remember much discussion about it. 
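[A quick sketch of that degenerate case for a well-behaved distribution; the normal distribution is used here as a stand-in, not the stable law under discussion:]

```python
# Sketch: with scale=0 the sampler degenerates to the location
# parameter, matching the scale -> 0 limit (a Dirac measure at loc).
import numpy as np
from scipy.stats import norm

samples = norm.rvs(loc=3.0, scale=0.0, size=4)
print(samples)  # [3. 3. 3. 3.] -- every draw collapses onto loc
```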
From what I remember, scale=0 produces for many methods the correct limit for scale -> 0, so there is no real reason to exclude it. (A long time ago there was a discussion about adding a degenerate one-point distribution, but it doesn't fit well in any of the current base classes.) Josef > > Thanks. > > Best regards, > Mainak > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > From mainakjas at gmail.com Mon May 29 11:03:16 2017 From: mainakjas at gmail.com (Mainak Jas) Date: Mon, 29 May 2017 17:03:16 +0200 Subject: [SciPy-User] scale parameter at 0 In-Reply-To: References: Message-ID: Hi Josef, scale defines the spread of the distribution, and is equal to the > standard deviation in the normal distribution. > > As the scale goes to zero, the distribution collapses to a single > point, i.e. a Dirac measure. Whether that makes numerical sense for a > distribution depends on the floating point computation details. > > In general it can still be useful to allow for the degenerate case > with scale=0, but I don't remember much discussion about it. From what > I remember, scale=0 produces for many methods the correct limit for > scale -> 0, so there is no real reason to exclude it. > (A long time ago there was a discussion about adding a degenerate one-point > distribution, but it doesn't fit well in any of the current base > classes.) > Thanks a lot for the explanation. However, in this particular case of the Lévy stable distribution, I am not sure if it produces the correct limit for scale -> 0. I worked out the math with a colleague, which I provide here: https://www.overleaf.com/read/qgwtvtrkcrqc. Please let me know if we missed something. Mainak > Josef > > > > > > Thanks. 
> > Best regards, > > Mainak > > > > _______________________________________________ > > SciPy-User mailing list > > SciPy-User at python.org > > https://mail.python.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From davidmenhur at gmail.com Mon May 29 11:33:44 2017 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Mon, 29 May 2017 17:33:44 +0200 Subject: [SciPy-User] scale parameter at 0 In-Reply-To: References: Message-ID: On 29 May 2017 at 17:03, Mainak Jas wrote: > I worked out the math with a colleague, which I provide here: > https://www.overleaf.com/read/qgwtvtrkcrqc. Please let me know if we > missed something. > You are using a property of the symmetric stable distribution, but you consider beta = 1, which is asymmetric. /David. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mainakjas at gmail.com Mon May 29 11:42:29 2017 From: mainakjas at gmail.com (Mainak Jas) Date: Mon, 29 May 2017 17:42:29 +0200 Subject: [SciPy-User] scale parameter at 0 In-Reply-To: References: Message-ID: On Mon, May 29, 2017 at 5:33 PM, Daπid wrote: > > On 29 May 2017 at 17:03, Mainak Jas wrote: > >> I worked out the math with a colleague, which I provide here: >> https://www.overleaf.com/read/qgwtvtrkcrqc. Please let me know if we >> missed something. >> > > You are using a property of the symmetric stable distribution, but you > consider beta = 1, which is asymmetric. > Yes, but the symmetric stable distribution can be expressed in terms of the positive stable distribution (which is asymmetric) according to Equation (2). Mainak > /David. 
> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL:
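[The disputed limit above can also be probed empirically. A sketch: the alpha and beta values are taken from the thread, while the shrinking-scale loop is an invented illustration. Since scipy generates variates as X = loc + scale * Z, the typical deviation of draws from loc should shrink in proportion to scale:]

```python
# Sketch: empirically probe the scale -> 0 limit of the asymmetric
# stable law from the thread (alpha=0.99, beta=1).  The median absolute
# deviation from loc should shrink roughly in proportion to scale.
import numpy as np
from scipy.stats import levy_stable

deviations = {}
for scale in (1.0, 0.1, 0.01):
    s = levy_stable.rvs(alpha=0.99, beta=1, loc=1.0, scale=scale,
                        size=500, random_state=0)
    deviations[scale] = np.median(np.abs(s - 1.0))
print(deviations)  # median deviation from loc decreases with scale
```

(The median is used rather than the mean because the draws are heavy-tailed.)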