From cjw at ncf.ca  Fri Jan  1 18:41:25 2016
From: cjw at ncf.ca (Colin W.)
Date: Fri, 1 Jan 2016 18:41:25 -0500
Subject: [Numpy-discussion] Unable to install a copy of Numpy, retrieved from the Git Repository.
Message-ID: <56870EA5.90901@ncf.ca>

An HTML attachment was scrubbed...
URL:

From charlesr.harris at gmail.com  Fri Jan  1 21:59:46 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Fri, 1 Jan 2016 19:59:46 -0700
Subject: [Numpy-discussion] Unable to install a copy of Numpy, retrieved from the Git Repository.
In-Reply-To: <56870EA5.90901@ncf.ca>
References: <56870EA5.90901@ncf.ca>
Message-ID:

On Fri, Jan 1, 2016 at 4:41 PM, Colin W. wrote:

> Advice sought: Numpy install failure, using Numpy from Git with
> `setup.py install`.
>
> numpy/__init__.py, around line 156:
>
>     if __NUMPY_SETUP__:
>         import sys as _sys
>         _sys.stderr.write('Running from numpy source directory.\n')
>         del _sys
>     else:
>         try:
>             from numpy.__config__ import show as show_config
>         except ImportError:
>             msg = """Error importing numpy: you should not try to import numpy from
>             its source directory; please exit the numpy source tree, and
>             relaunch your python interpreter from there."""
>             raise ImportError(msg)
>         from .version import git_revision as __git_revision__
>         from .version import version as __version__
>         from ._import_tools import PackageLoader
>
> Fails here (at the PackageLoader import). Reason unclear to me.
>

Can you be more specific about what you have done? Step by step, if you
will.

Chuck
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ralf.gommers at gmail.com  Sat Jan  2 03:20:25 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Sat, 2 Jan 2016 09:20:25 +0100
Subject: [Numpy-discussion] Windows build/distribute plan & MingwPy funding
Message-ID:

Hi all,

You probably know that building Numpy, Scipy and the rest of the Scipy
Stack on Windows is problematic. And that there are plans to adopt the
static MinGW-w64 based toolchain that Carl Kleffner has done a lot of work
on for the last two years to fix that situation.

The good news is: this has become a lot more concrete just now, with this
proposal for funding: http://mingwpy.github.io/proposal_december2015.html

Funding for phases 1 and 2 is already confirmed; the phase 3 part has been
submitted to the PSF. Phase 1 (of $1000) is funded by donations made to
Numpy and Scipy (through NumFOCUS), and phase 2 (of $4000) by NumFOCUS
directly. So a big thank you to everyone who made a donation to Numpy,
Scipy and NumFOCUS!

I hope that that proposal gives a clear idea of the work that's going to
be done over the next months. Note that the http://mingwpy.github.io site
contains a lot more background info, description of technical issues, etc.

Feedback & ideas very welcome of course!

Cheers,
Ralf
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
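A side note on the ImportError shown in the first thread above:
numpy/__init__.py raises it whenever Python resolves `import numpy` to the
source package rather than an installed one, which typically means the
interpreter was started inside the source tree. A minimal check, run from
outside the checkout (plain numpy, nothing assumed beyond an installed
copy):

    import numpy

    # If this prints a path inside the git checkout rather than
    # site-packages, the source tree is shadowing the installed numpy.
    print(numpy.__file__)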
From ralf.gommers at gmail.com  Sat Jan  2 03:23:14 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Sat, 2 Jan 2016 09:23:14 +0100
Subject: [Numpy-discussion] Numpy funding update
In-Reply-To:
References:
Message-ID:

On Fri, Jan 1, 2016 at 12:17 AM, Robert Kern wrote:

> On Wed, Dec 30, 2015 at 10:54 AM, Ralf Gommers wrote:
> >
> > Hi all,
> >
> > A quick good news message: OSDC has made a $5k contribution to
> > NumFOCUS, which is split between support for a women in technology
> > workshop and support for Numpy:
> > http://www.numfocus.org/blog/osdc-donates-5k-to-support-numpy-women-in-tech
> >
> > This was a very nice surprise to me, and a first sign that the FSA
> > (fiscal sponsorship agreement) we recently signed with NumFOCUS is
> > going to yield significant benefits for Numpy.
> >
> > NumFOCUS is also doing a special end-of-year fundraiser. Funds donated
> > (up to $5k) will be tripled by anonymous sponsors:
> > http://www.numfocus.org/blog/numfocus-end-of-year-fundraising-drive-5000-matching-gift-challenge
> >
> > So think of Numpy (or your other favorite NumFOCUS-sponsored project
> > of course) if you're considering a holiday season charitable gift!
>
> That sounds great! Do we have any concrete plans for spending that
> money, yet?

There are:

1. developer meetings
2. MingwPy
3. TBD - Numpy code/infrastructure work

Some initial plans were discussed and documented during the first Numpy
developer meeting at Scipy'15. See wiki page [1] and meeting minutes [2].
Funding for facilitating more developer meetings (~1 a year) is definitely
on the radar. Plans for funding development of new features or other
code/infrastructure work are still fuzzy (except for MingwPy, see my email
of today on that).

We need to transfer the relevant part of those meeting minutes [2] into
the docs, and expand upon those to get a clear roadmap online that we can
build on. This is work for the next months, I think.

Cheers,
Ralf

[1] https://github.com/numpy/numpy/wiki/SciPy-2015-developer-meeting
[2] https://docs.google.com/document/d/1IJcYdsHtk8MVAM4AZqFDBSf_nVG-mrB4Tv2bh9u1g4Y/edit?usp=sharing
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From charlesr.harris at gmail.com  Sat Jan  2 16:47:51 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Sat, 2 Jan 2016 14:47:51 -0700
Subject: [Numpy-discussion] Numpy 1.10.3 release.
Message-ID:

Hi All,

A significant segfault problem has been reported against Numpy 1.10.2 and
I want to make a quick 1.10.3 release to get it fixed. Two questions:

- What exactly is the release process that has been decided on? AFAIK, I
  should just do a source release on Sourceforge, ping Matthew to produce
  wheels for Mac and wait for him to put them on pypi, and then upload the
  sources to pypi. No windows binaries are to be produced.
- Is there anything else that needs fixing for 1.10.3?

Chuck
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From sebastian at sipsolutions.net  Sat Jan  2 17:45:15 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Sat, 02 Jan 2016 23:45:15 +0100
Subject: [Numpy-discussion] Proposal for a new function: np.moveaxis
In-Reply-To:
References:
Message-ID: <1451774715.18409.1.camel@sipsolutions.net>

Just a heads up, I am planning to put in Stephan's pull request (more
info, see original mail below) as soon as some minor things are cleared.
So if you have any objections or better ideas for the name, now is the
time.

- Sebastian
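For anyone who has not followed the thread, a minimal sketch of the
proposed semantics, assuming the pull request is merged under the name
np.moveaxis as in Stephan's mail quoted below (output shapes in the
comments):

    import numpy as np

    x = np.zeros((3, 4, 5))

    # Move one axis: put the first axis last. All remaining axes keep
    # their relative order, unlike a full transpose.
    np.moveaxis(x, 0, -1).shape             # (4, 5, 3)

    # Source and destination may also be sequences, which np.rollaxis
    # does not support.
    np.moveaxis(x, [0, 1], [-1, -2]).shape  # (5, 4, 3)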
On Mi, 2015-11-04 at 23:42 -0800, Stephan Hoyer wrote:
> I've put up a pull request implementing a new function, np.moveaxis,
> as an alternative to np.transpose and np.rollaxis:
> https://github.com/numpy/numpy/pull/6630
>
> This functionality has been discussed (even the exact function name)
> several times over the years, but it never made it into a pull
> request. The most pressing issue is that the behavior of np.rollaxis
> is not intuitive to most users:
> https://mail.scipy.org/pipermail/numpy-discussion/2010-September/052882.html
> https://github.com/numpy/numpy/issues/2039
> http://stackoverflow.com/questions/29891583/reason-why-numpy-rollaxis-is-so-confusing
>
> In this pull request, I also allow the source and destination axes to
> be sequences as well as scalars. This does not add much complexity to
> the code, solves some additional use cases and makes np.moveaxis a
> proper generalization of the other axes manipulation routines (see
> the pull requests for details).
>
> Best of all, it already works on ndarray duck types (like masked
> array and dask.array), because they have already implemented
> transpose.
>
> I think np.moveaxis would be a useful addition to NumPy -- I've found
> myself writing helper functions with a subset of its functionality
> several times over the past few years. What do you think?
>
> Cheers,
> Stephan
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL:

From matthew.brett at gmail.com  Sat Jan  2 18:51:14 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Sat, 2 Jan 2016 23:51:14 +0000
Subject: [Numpy-discussion] Numpy 1.10.3 release.
In-Reply-To:
References:
Message-ID:

Hi,

On Sat, Jan 2, 2016 at 9:47 PM, Charles R Harris wrote:
> Hi All,
>
> A significant segfault problem has been reported against Numpy 1.10.2 and I
> want to make a quick 1.10.3 release to get it fixed. Two questions
>
> What exactly is the release process that has been decided on? AFAIK, I
> should just do a source release on Sourceforge, ping Matthew to produce
> wheels for Mac and wait for him to put them on pypi, and then upload the
> sources to pypi.

You're welcome to ping me to do the Mac wheels, but I'd be even
happier if you could have a go at triggering the build, to see if you
can make it work:

https://github.com/MacPython/numpy-wheels

Cheers,

Matthew

From charlesr.harris at gmail.com  Sat Jan  2 19:01:50 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Sat, 2 Jan 2016 17:01:50 -0700
Subject: [Numpy-discussion] Numpy 1.10.3 release.
In-Reply-To:
References:
Message-ID:

On Sat, Jan 2, 2016 at 4:51 PM, Matthew Brett wrote:
> Hi,
>
> On Sat, Jan 2, 2016 at 9:47 PM, Charles R Harris wrote:
> > Hi All,
> >
> > A significant segfault problem has been reported against Numpy 1.10.2
> > and I
> > want to make a quick 1.10.3 release to get it fixed. Two questions
> >
> > What exactly is the release process that has been decided on? AFAIK, I
> > should just do a source release on Sourceforge, ping Matthew to produce
> > wheels for Mac and wait for him to put them on pypi, and then upload the
> > sources to pypi.
> > You're welcome to ping me to do the Mac wheels, but I'd be even > happier if you could have a go at triggering the build, to see if you > can make it work: > > https://github.com/MacPython/numpy-wheels Hmm... Since it needs to build off the tag to have the correct behavior, I assume that that needs to be set explicitly. Also, it didn't look as if your last pypi uploads were signed. Did you upload with twine? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Sat Jan 2 19:15:43 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Sun, 3 Jan 2016 00:15:43 +0000 Subject: [Numpy-discussion] Numpy 1.10.3 release. In-Reply-To: References: Message-ID: On Sun, Jan 3, 2016 at 12:01 AM, Charles R Harris wrote: > > > On Sat, Jan 2, 2016 at 4:51 PM, Matthew Brett > wrote: >> >> Hi, >> >> On Sat, Jan 2, 2016 at 9:47 PM, Charles R Harris >> wrote: >> > Hi All, >> > >> > A significant segfault problem has been reported against Numpy 1.10.2 >> > and I >> > want to make a quick 1.10.3 release to get it fixed. Two questions >> > >> > What exactly is the release process that has been decided on? AFAIK, I >> > should just do a source release on Sourceforge, ping Matthew to produce >> > wheels for Mac and wait for him to put them on pypi, and then upload the >> > sources to pypi. >> >> You're welcome to ping me to do the Mac wheels, but I'd be even >> happier if you could have a go at triggering the build, to see if you >> can make it work: >> >> https://github.com/MacPython/numpy-wheels > > > Hmm... Since it needs to build off the tag to have the correct behavior, I > assume that that needs to be set explicitly. No, the build system uses a script I wrote to find the tag closest in development history to master, by default - see : http://stackoverflow.com/a/24557377/1939576 So, if you push a tag, then trigger a build, there's a good chance that will do what you want. If not, then you can set the tag or commit explicitly at : https://github.com/MacPython/numpy-wheels/blob/master/.travis.yml#L5 > Also, it didn't look as if your > last pypi uploads were signed. Did you upload with twine? I did yes - with a custom uploader script [1], but I guess it would be better to download, sign, then upload again using some other mechanism that preserves the signature. Cheers, Matthew [1] https://github.com/MacPython/terryfy/blob/master/wheel-uploader From josef.pktd at gmail.com Sun Jan 3 00:05:19 2016 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 2 Jan 2016 21:05:19 -0800 Subject: [Numpy-discussion] Numpy 1.10.3 release. In-Reply-To: References: Message-ID: On Sat, Jan 2, 2016 at 4:47 PM, Charles R Harris wrote: > Hi All, > > A significant segfault problem has been reported against Numpy 1.10.2 and I > want to make a quick 1.10.3 release to get it fixed. Two questions > > What exactly is the release process that has been decided on? AFAIK, I > should just do a source release on Sourceforge, ping Matthew to produce > wheels for Mac and wait for him to put them on pypi, and then upload the > sources to pypi. No windows binaries are to be produced. > Is there anything else that needs fixing for 1.10.3? I'm running the 1.10.2 tests on Windows 10 in a virtualbox on Windows 8.1 using Gohlke binary for MKL on a fresh Python 3.5 This test "Test workarounds for 32-bit limited fwrite, fseek, and ftell ..." is taking a very long time. Is this expected? 
I get the following errors on Windows 10, and also on Windows 8.1 Winpython 3.4 (except for the last "Unable to find vcvarsall.bat" because it's set up for compiling with mingw) Earlier I also got a ref count error message but I don't see it anymore, so maybe I messed up when trying Ctrl+C to kill the tests. ====================================================================== ERROR: Failure: ImportError (cannot import name 'fib2') ---------------------------------------------------------------------- Traceback (most recent call last): File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", line 39, in runTest raise self.exc_val.with_traceback(self.tb) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", line 418, in loadTestsFromName addr.filename, addr.module) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 47, in importFromPath return self.importFromDir(dir_path, fqname) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 94, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 234, in load_module return load_source(name, filename, file) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 172, in load_source module = _load(spec) File "", line 693, in _load File "", line 673, in _load_unlocked File "", line 662, in exec_module File "", line 222, in _call_with_frames_removed File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\f2py_ext\tests\test_fib2.py", line 4, in from f2py_ext import fib2 ImportError: cannot import name 'fib2' ====================================================================== ERROR: Failure: ImportError (cannot import name 'foo') ---------------------------------------------------------------------- Traceback (most recent call last): File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", line 39, in runTest raise self.exc_val.with_traceback(self.tb) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", line 418, in loadTestsFromName addr.filename, addr.module) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 47, in importFromPath return self.importFromDir(dir_path, fqname) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 94, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 234, in load_module return load_source(name, filename, file) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 172, in load_source module = _load(spec) File "", line 693, in _load File "", line 673, in _load_unlocked File "", line 662, in exec_module File "", line 222, in _call_with_frames_removed File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\f2py_f90_ext\tests\test_foo.py", line 4, in from f2py_f90_ext import foo ImportError: cannot import name 'foo' ====================================================================== ERROR: Failure: ImportError (cannot import name 'fib3') ---------------------------------------------------------------------- Traceback (most recent call last): File 
"c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", line 39, in runTest raise self.exc_val.with_traceback(self.tb) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", line 418, in loadTestsFromName addr.filename, addr.module) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 47, in importFromPath return self.importFromDir(dir_path, fqname) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 94, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 234, in load_module return load_source(name, filename, file) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 172, in load_source module = _load(spec) File "", line 693, in _load File "", line 673, in _load_unlocked File "", line 662, in exec_module File "", line 222, in _call_with_frames_removed File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\gen_ext\tests\test_fib3.py", line 3, in from gen_ext import fib3 ImportError: cannot import name 'fib3' ====================================================================== ERROR: Failure: ImportError (No module named 'pyrex_ext.primes') ---------------------------------------------------------------------- Traceback (most recent call last): File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", line 39, in runTest raise self.exc_val.with_traceback(self.tb) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", line 418, in loadTestsFromName addr.filename, addr.module) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 47, in importFromPath return self.importFromDir(dir_path, fqname) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 94, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 234, in load_module return load_source(name, filename, file) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 172, in load_source module = _load(spec) File "", line 693, in _load File "", line 673, in _load_unlocked File "", line 662, in exec_module File "", line 222, in _call_with_frames_removed File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\pyrex_ext\tests\test_primes.py", line 4, in from pyrex_ext.primes import primes ImportError: No module named 'pyrex_ext.primes' ====================================================================== ERROR: Failure: ImportError (cannot import name 'example') ---------------------------------------------------------------------- Traceback (most recent call last): File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", line 39, in runTest raise self.exc_val.with_traceback(self.tb) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", line 418, in loadTestsFromName addr.filename, addr.module) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 47, in importFromPath return self.importFromDir(dir_path, fqname) File 
"c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 94, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 234, in load_module return load_source(name, filename, file) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 172, in load_source module = _load(spec) File "", line 693, in _load File "", line 673, in _load_unlocked File "", line 662, in exec_module File "", line 222, in _call_with_frames_removed File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\swig_ext\tests\test_example.py", line 4, in from swig_ext import example ImportError: cannot import name 'example' ====================================================================== ERROR: Failure: ImportError (cannot import name 'example2') ---------------------------------------------------------------------- Traceback (most recent call last): File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", line 39, in runTest raise self.exc_val.with_traceback(self.tb) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", line 418, in loadTestsFromName addr.filename, addr.module) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 47, in importFromPath return self.importFromDir(dir_path, fqname) File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", line 94, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 234, in load_module return load_source(name, filename, file) File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", line 172, in load_source module = _load(spec) File "", line 693, in _load File "", line 673, in _load_unlocked File "", line 662, in exec_module File "", line 222, in _call_with_frames_removed File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\swig_ext\tests\test_example2.py", line 4, in from swig_ext import example2 ImportError: cannot import name 'example2' ====================================================================== ERROR: test_compile1 (test_system_info.TestSystemInfoReading) ---------------------------------------------------------------------- Traceback (most recent call last): File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\test_system_info.py", line 182, in test_compile1 c.compile([os.path.basename(self._src1)], output_dir=self._dir1) File "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", line 317, in compile self.initialize() File "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", line 210, in initialize vc_env = _get_vc_env(plat_spec) File "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", line 85, in _get_vc_env raise DistutilsPlatformError("Unable to find vcvarsall.bat") distutils.errors.DistutilsPlatformError: Unable to find vcvarsall.bat Running it again for Windows 8.1 python 3.4 The above is with `nosetests numpy` Ran 5232 tests in 134.933s FAILED (SKIP=40, errors=16) if I run numpy.test() in the python session, I get Ran 5191 tests in 22.506s OK (KNOWNFAIL=10, SKIP=19) for Windows 10 python 3.5 
command line nosetests numpy
Ran 5615 tests in 423.469s
FAILED (SKIP=33, errors=15)

numpy.test()
Ran 5574 tests in 21.546s
FAILED (KNOWNFAIL=8, SKIP=12, errors=1)
error is only missing vcvarsall.bat

Why is there such a large difference between these two ways of running
the tests?

Related: The example from #6923 doesn't crash in my settings, with
neither python 3.4 nor 3.5.

Josef

>
> Chuck
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>

From josef.pktd at gmail.com  Sun Jan  3 11:54:24 2016
From: josef.pktd at gmail.com (josef.pktd at gmail.com)
Date: Sun, 3 Jan 2016 11:54:24 -0500
Subject: [Numpy-discussion] Numpy 1.10.3 release.
In-Reply-To:
References:
Message-ID:

On Sun, Jan 3, 2016 at 12:05 AM, wrote:
> On Sat, Jan 2, 2016 at 4:47 PM, Charles R Harris wrote:
>> Hi All,
>>
>> A significant segfault problem has been reported against Numpy 1.10.2 and I
>> want to make a quick 1.10.3 release to get it fixed. Two questions
>>
>> What exactly is the release process that has been decided on? AFAIK, I
>> should just do a source release on Sourceforge, ping Matthew to produce
>> wheels for Mac and wait for him to put them on pypi, and then upload the
>> sources to pypi. No windows binaries are to be produced.
>> Is there anything else that needs fixing for 1.10.3?
>
>
> I'm running the 1.10.2 tests on Windows 10 in a virtualbox on Windows 8.1
> using Gohlke's MKL binary on a fresh Python 3.5
>
> This test
> "Test workarounds for 32-bit limited fwrite, fseek, and ftell ..."
> is taking a very long time. Is this expected?
>
>
> I get the following errors on Windows 10, and also on Windows 8.1
> Winpython 3.4 (except for the last "Unable to find vcvarsall.bat"
> because it's set up for compiling with mingw)
>
> Earlier I also got a ref count error message but I don't see it
> anymore, so maybe I messed up when trying Ctrl+C to kill the tests.
> > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'fib2') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in _call_with_frames_removed > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\f2py_ext\tests\test_fib2.py", > line 4, in > from f2py_ext import fib2 > ImportError: cannot import name 'fib2' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'foo') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in _call_with_frames_removed > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\f2py_f90_ext\tests\test_foo.py", > line 4, in > from f2py_f90_ext import foo > ImportError: cannot import name 'foo' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'fib3') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File 
"c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in _call_with_frames_removed > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\gen_ext\tests\test_fib3.py", > line 3, in > from gen_ext import fib3 > ImportError: cannot import name 'fib3' > > ====================================================================== > ERROR: Failure: ImportError (No module named 'pyrex_ext.primes') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in _call_with_frames_removed > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\pyrex_ext\tests\test_primes.py", > line 4, in > from pyrex_ext.primes import primes > ImportError: No module named 'pyrex_ext.primes' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'example') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File 
"c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in _call_with_frames_removed > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\swig_ext\tests\test_example.py", > line 4, in > from swig_ext import example > ImportError: cannot import name 'example' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'example2') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in _call_with_frames_removed > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\swig_ext\tests\test_example2.py", > line 4, in > from swig_ext import example2 > ImportError: cannot import name 'example2' > > ====================================================================== > ERROR: test_compile1 (test_system_info.TestSystemInfoReading) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\test_system_info.py", > line 182, in test_compile1 > c.compile([os.path.basename(self._src1)], output_dir=self._dir1) > File "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", > line 317, in compile > self.initialize() > File "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", > line 210, in initialize > vc_env = _get_vc_env(plat_spec) > File "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", > line 85, in _get_vc_env > raise DistutilsPlatformError("Unable to find vcvarsall.bat") > distutils.errors.DistutilsPlatformError: Unable to find vcvarsall.bat > > > Running it again > > for Windows 8.1 python 3.4 > > The above is with `nosetests numpy` > Ran 5232 tests in 134.933s > FAILED 
(SKIP=40, errors=16)
>
> if I run numpy.test() in the python session, I get
> Ran 5191 tests in 22.506s
> OK (KNOWNFAIL=10, SKIP=19)
>
>
> for Windows 10 python 3.5
>
> command line nosetests numpy
> Ran 5615 tests in 423.469s
> FAILED (SKIP=33, errors=15)
>
> numpy.test()
> Ran 5574 tests in 21.546s
> FAILED (KNOWNFAIL=8, SKIP=12, errors=1)
> error is only missing vcvarsall.bat
>
>
> Why is there such a large difference between these two ways of running
> the tests?

(I'm getting old or forgetting non-stats things.) The default for
.test() is to skip the slow tests; nosetests doesn't skip them.

I'm getting the same results also on Winpython 3.4.3.6 on Windows 10.

But a python 3.5 install without a distribution is not working for me. I
get "DLL load failed" and missing-DLL errors all over the place with the
default PSF install when importing Gohlke's scipy. So, I'm back to
winpython and python 3.4

Josef

>
> Related: The example from #6923 doesn't crash in my settings, with
> neither python 3.4 nor 3.5.
>
> Josef
>
>> Chuck
>>
>> _______________________________________________
>> NumPy-Discussion mailing list
>> NumPy-Discussion at scipy.org
>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>>

From gfyoung17 at gmail.com  Sun Jan  3 18:51:28 2016
From: gfyoung17 at gmail.com (G Young)
Date: Sun, 3 Jan 2016 15:51:28 -0800
Subject: [Numpy-discussion] deprecate random.random_integers
Message-ID:

Hello all,

In light of the discussion in #6910, I have gone ahead and deprecated
random_integers in my most recent PR here. As this is an API change (sort
of), what are people's thoughts on this deprecation?

Thanks!

Greg
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From gfyoung17 at gmail.com  Sun Jan  3 18:54:27 2016
From: gfyoung17 at gmail.com (G Young)
Date: Sun, 3 Jan 2016 15:54:27 -0800
Subject: [Numpy-discussion] dtype random.rand
Message-ID:

Hello,

Issue #6790 had requested the enhancement of adding the dtype argument to
both random.randint and random.rand. With #6910 merged in, that addresses
the first half of the request. What do people think of adding a dtype
argument to random.rand?

Thanks!

Greg
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From erik.m.bray+numpy at gmail.com  Mon Jan  4 12:35:36 2016
From: erik.m.bray+numpy at gmail.com (Erik Bray)
Date: Mon, 4 Jan 2016 12:35:36 -0500
Subject: [Numpy-discussion] Windows build/distribute plan & MingwPy funding
Message-ID:

On Sat, Jan 2, 2016 at 3:20 AM, Ralf Gommers wrote:
> Hi all,
>
> You probably know that building Numpy, Scipy and the rest of the Scipy Stack
> on Windows is problematic. And that there are plans to adopt the static
> MinGW-w64 based toolchain that Carl Kleffner has done a lot of work on for
> the last two years to fix that situation.
>
> The good news is: this has become a lot more concrete just now, with this
> proposal for funding: http://mingwpy.github.io/proposal_december2015.html
>
> Funding for phases 1 and 2 is already confirmed; the phase 3 part has been
> submitted to the PSF. Phase 1 (of $1000) is funded by donations made to
> Numpy and Scipy (through NumFOCUS), and phase 2 (of $4000) by NumFOCUS
> directly. So a big thank you to everyone who made a donation to Numpy, Scipy
> and NumFOCUS!
>
> I hope that that proposal gives a clear idea of the work that's going to be
> done over the next months. Note that the http://mingwpy.github.io site
> contains a lot more background info, description of technical issues, etc.
>
> Feedback & ideas very welcome of course!
>
> Cheers,
> Ralf

Hi Ralf,

I've seen you drop hints about this recently, and am interested to follow
this work. I've been hired as part of the OpenDreamKit project to work, in
large part, on developing a sensible toolchain for building and
distributing Sage on Windows. I know you've been following the thread on
that too. Although the primary goal there is "whatever works", I'm
personally inclined to focus on the mingwpy / mingw-w64 approach, due in
large part to my past success with the MinGW 32-bit toolchain. (I
personally have a desire to improve support for building with MSVC as
well, but that's a less important goal as far as the funding is
concerned.)

So anyways, please keep me in the loop about this, as I will also be
putting effort into this over the next year as well. Has there been any
discussion about setting up a mailing list specifically for this project?

Thanks,
Erik

From matthew.brett at gmail.com  Mon Jan  4 13:38:51 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Mon, 4 Jan 2016 18:38:51 +0000
Subject: [Numpy-discussion] Windows build/distribute plan & MingwPy funding
In-Reply-To:
References:
Message-ID:

Hi,

On Mon, Jan 4, 2016 at 5:35 PM, Erik Bray wrote:
> On Sat, Jan 2, 2016 at 3:20 AM, Ralf Gommers wrote:
>> Hi all,
>>
>> You probably know that building Numpy, Scipy and the rest of the Scipy Stack
>> on Windows is problematic. And that there are plans to adopt the static
>> MinGW-w64 based toolchain that Carl Kleffner has done a lot of work on for
>> the last two years to fix that situation.
>>
>> The good news is: this has become a lot more concrete just now, with this
>> proposal for funding: http://mingwpy.github.io/proposal_december2015.html
>>
>> Funding for phases 1 and 2 is already confirmed; the phase 3 part has been
>> submitted to the PSF. Phase 1 (of $1000) is funded by donations made to
>> Numpy and Scipy (through NumFOCUS), and phase 2 (of $4000) by NumFOCUS
>> directly. So a big thank you to everyone who made a donation to Numpy, Scipy
>> and NumFOCUS!
>>
>> I hope that that proposal gives a clear idea of the work that's going to be
>> done over the next months. Note that the http://mingwpy.github.io site
>> contains a lot more background info, description of technical issues, etc.
>>
>> Feedback & ideas very welcome of course!
>>
>> Cheers,
>> Ralf
>
> Hi Ralf,
>
> I've seen you drop hints about this recently, and am interested to
> follow this work. I've been hired as part of the OpenDreamKit project
> to work, in large part, on developing a sensible toolchain for
> building and distributing Sage on Windows. I know you've been
> following the thread on that too. Although the primary goal there is
> "whatever works", I'm personally inclined to focus on the mingwpy /
> mingw-w64 approach, due in large part to my past success with the
> MinGW 32-bit toolchain. (I personally have a desire to improve
> support for building with MSVC as well, but that's a less important
> goal as far as the funding is concerned.)
>
> So anyways, please keep me in the loop about this, as I will also be
> putting effort into this over the next year as well. Has there been
> any discussion about setting up a mailing list specifically for this
> project?

Yes, it exists already, but not well advertised:

https://groups.google.com/forum/#!forum/mingwpy

It would be great to share work.

Cheers,

Matthew
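For context on the random_integers thread that follows: the two functions
differ in their interval convention, and randint has just gained a dtype
argument. A minimal sketch, assuming a numpy build with gh-6910 merged
(the values are illustrative):

    import numpy as np

    # random_integers draws from the closed interval [low, high] ...
    np.random.random_integers(1, 6, size=4)    # values 1..6, 6 included

    # ... while randint uses the half-open interval [low, high), like
    # the stdlib's random.randrange.
    np.random.randint(1, 7, size=4)            # same distribution as above

    # The dtype argument added by gh-6910 applies to randint only:
    np.random.randint(0, 256, size=4, dtype=np.uint8)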
From robert.kern at gmail.com  Mon Jan  4 14:41:45 2016
From: robert.kern at gmail.com (Robert Kern)
Date: Mon, 4 Jan 2016 19:41:45 +0000
Subject: [Numpy-discussion] deprecate random.random_integers
In-Reply-To:
References:
Message-ID:

On Sun, Jan 3, 2016 at 11:51 PM, G Young wrote:
>
> Hello all,
>
> In light of the discussion in #6910, I have gone ahead and deprecated
> random_integers in my most recent PR here. As this is an API change
> (sort of), what are people's thoughts on this deprecation?

I'm reasonably in favor. random_integers() with its closed-interval
convention only exists because it existed in Numeric's RandomArray module.
The closed-interval convention has broadly been considered a mistake,
introduced early in the stdlib random module and rectified with the
introduction and promotion of random.randrange() instead.

--
Robert Kern
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From charlesr.harris at gmail.com  Mon Jan  4 15:38:42 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Mon, 4 Jan 2016 13:38:42 -0700
Subject: [Numpy-discussion] Numpy 1.10.3 release.
In-Reply-To:
References:
Message-ID:

On Sat, Jan 2, 2016 at 10:05 PM, wrote:
> On Sat, Jan 2, 2016 at 4:47 PM, Charles R Harris wrote:
> > Hi All,
> >
> > A significant segfault problem has been reported against Numpy 1.10.2
> > and I
> > want to make a quick 1.10.3 release to get it fixed. Two questions
> >
> > What exactly is the release process that has been decided on? AFAIK, I
> > should just do a source release on Sourceforge, ping Matthew to produce
> > wheels for Mac and wait for him to put them on pypi, and then upload the
> > sources to pypi. No windows binaries are to be produced.
> > Is there anything else that needs fixing for 1.10.3?
>
>
> I'm running the 1.10.2 tests on Windows 10 in a virtualbox on Windows 8.1
> using Gohlke's MKL binary on a fresh Python 3.5
>
> This test
> "Test workarounds for 32-bit limited fwrite, fseek, and ftell ..."
> is taking a very long time. Is this expected?
>
>
> I get the following errors on Windows 10, and also on Windows 8.1
> Winpython 3.4 (except for the last "Unable to find vcvarsall.bat"
> because it's set up for compiling with mingw)
>
> Earlier I also got a ref count error message but I don't see it
> anymore, so maybe I messed up when trying Ctrl+C to kill the tests.
> > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'fib2') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in > _call_with_frames_removed > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\f2py_ext\tests\test_fib2.py", > line 4, in > from f2py_ext import fib2 > ImportError: cannot import name 'fib2' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'foo') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in > _call_with_frames_removed > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\f2py_f90_ext\tests\test_foo.py", > line 4, in > from f2py_f90_ext import foo > ImportError: cannot import name 'foo' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'fib3') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise 
self.exc_val.with_traceback(self.tb) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in > _call_with_frames_removed > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\gen_ext\tests\test_fib3.py", > line 3, in > from gen_ext import fib3 > ImportError: cannot import name 'fib3' > > ====================================================================== > ERROR: Failure: ImportError (No module named 'pyrex_ext.primes') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in > _call_with_frames_removed > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\pyrex_ext\tests\test_primes.py", > line 4, in > from pyrex_ext.primes import primes > ImportError: No module named 'pyrex_ext.primes' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'example') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > 
File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in > _call_with_frames_removed > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\swig_ext\tests\test_example.py", > line 4, in > from swig_ext import example > ImportError: cannot import name 'example' > > ====================================================================== > ERROR: Failure: ImportError (cannot import name 'example2') > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\failure.py", > line 39, in runTest > raise self.exc_val.with_traceback(self.tb) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\loader.py", > line 418, in loadTestsFromName > addr.filename, addr.module) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\nose\importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 234, in load_module > return load_source(name, filename, file) > File "c:\users\josef\appdata\local\programs\python\python35\lib\imp.py", > line 172, in load_source > module = _load(spec) > File "", line 693, in _load > File "", line 673, in _load_unlocked > File "", line 662, in exec_module > File "", line 222, in > _call_with_frames_removed > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\swig_ext\tests\test_example2.py", > line 4, in > from swig_ext import example2 > ImportError: cannot import name 'example2' > > ====================================================================== > ERROR: test_compile1 (test_system_info.TestSystemInfoReading) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "c:\users\josef\appdata\local\programs\python\python35\lib\site-packages\numpy\distutils\tests\test_system_info.py", > line 182, in test_compile1 > c.compile([os.path.basename(self._src1)], output_dir=self._dir1) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", > line 317, in compile > self.initialize() > File > "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", > line 210, in initialize > vc_env = _get_vc_env(plat_spec) > File > "c:\users\josef\appdata\local\programs\python\python35\lib\distutils\_msvccompiler.py", > line 85, in _get_vc_env > raise DistutilsPlatformError("Unable to find vcvarsall.bat") > distutils.errors.DistutilsPlatformError: Unable to find vcvarsall.bat > > > Running it again > > for Windows 8.1 python 3.4 > > The above is with `nosetests numpy` > Ran 5232 
tests in 134.933s
> FAILED (SKIP=40, errors=16)
>
> if I run numpy.test() in the python session, I get
> Ran 5191 tests in 22.506s
> OK (KNOWNFAIL=10, SKIP=19)
>
>
> for Windows 10 python 3.5
>
> command line nosetests numpy
> Ran 5615 tests in 423.469s
> FAILED (SKIP=33, errors=15)
>
> numpy.test()
> Ran 5574 tests in 21.546s
> FAILED (KNOWNFAIL=8, SKIP=12, errors=1)
> error is only missing vcvarsall.bat
>
>
> Why is there such a large difference between these two ways of running
> the tests?
>
>
> Related: The example from #6923 doesn't crash in my settings, with
> neither python 3.4 nor 3.5.
>

The failed tests require pyrex, fortran, and swig. The refcount error
comes and goes; probably the test isn't very good. Ralf, is there any
reason to keep the various extension building tests? They are very old.

I don't know what the recommendation is on using nosetests, but probably
it is finding too many tests.

Chuck
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
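Chuck's point about nosetests collecting too many tests is the crux of
Josef's question above: the two entry points filter the test suite
differently. A short sketch, assuming the nose-based runner numpy used at
the time:

    import numpy

    numpy.test()         # default label 'fast': tests decorated as slow
                         # are skipped, and the tester excludes the
                         # example-extension suites (f2py_ext, swig_ext,
                         # ...) that error out above
    numpy.test('full')   # runs the slow tests too, much closer to what
                         # a bare `nosetests numpy` collects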
From gfyoung17 at gmail.com  Mon Jan  4 21:15:07 2016
From: gfyoung17 at gmail.com (G Young)
Date: Mon, 4 Jan 2016 18:15:07 -0800
Subject: [Numpy-discussion] Building master issues on Windows
Message-ID: <8AD23173-C365-4F3B-BC58-0A31D06F2590@gmail.com>

Hello all,

I've recently encountered issues building numpy off master on Windows, in
which setup.py complains that it can't find the Advapi library in any
directories (the directory list it searches is empty). I scanned the DLL
under System32 and ran sfc /scannow as Administrator, and both came up
clean. Is anyone else encountering (or able to reproduce) this issue, or
is it just me? Any suggestions about how to resolve this problem?

Thanks!

Greg

From charlesr.harris at gmail.com  Mon Jan  4 21:20:23 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Mon, 4 Jan 2016 19:20:23 -0700
Subject: [Numpy-discussion] 1.10.3 release tomorrow, 1.11.x branch this month.
Message-ID:

Hi All,

I'm going to attempt a 1.10.3 release tomorrow to address the segfault
reported in #6922. I'd also like to branch 1.11.x sometime this month,
hopefully around Jan 18 (two weeks from now). There are some unresolved
issues, in particular __numpy_ufunc__, but there is plenty of accumulated
material and I'd like to get something out before the tinder buildup
constitutes a fire hazard. Releases generate plenty of sparks, and it
would be good if the 1.11.0 release were less flammable than the 1.10
release was.

I will take a look through the current bug fix PRs with the intent to
merge as many as possible before the branch, but enhancements will not be
a high priority, except for a couple that have been sitting ready in the
queue for a while. If there are enhancements that you think need to be in
the release, mention them here.

Chuck
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jaime.frio at gmail.com  Mon Jan  4 23:17:11 2016
From: jaime.frio at gmail.com (Jaime Fernández del Río)
Date: Tue, 5 Jan 2016 05:17:11 +0100
Subject: [Numpy-discussion] Building master issues on Windows
In-Reply-To: <8AD23173-C365-4F3B-BC58-0A31D06F2590@gmail.com>
References: <8AD23173-C365-4F3B-BC58-0A31D06F2590@gmail.com>
Message-ID:

On Tue, Jan 5, 2016 at 3:15 AM, G Young wrote:

> Hello all,
>
> I've recently encountered issues building numpy off master on Windows,
> in which setup.py complains that it can't find the Advapi library in any
> directories (the directory list it searches is empty). I scanned the DLL
> under System32 and ran sfc /scannow as Administrator, and both came up
> clean. Is anyone else encountering (or able to reproduce) this issue, or
> is it just me? Any suggestions about how to resolve this problem?
>

Can you give more details on your setup?

Jaime

(\__/)
( O.o)
( > <) This is Bunny. Copy Bunny into your signature and help him with
his plans for world domination.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From gfyoung17 at gmail.com  Tue Jan  5 02:45:08 2016
From: gfyoung17 at gmail.com (G Young)
Date: Mon, 4 Jan 2016 23:45:08 -0800
Subject: [Numpy-discussion] Building master issues on Windows
In-Reply-To:
References: <8AD23173-C365-4F3B-BC58-0A31D06F2590@gmail.com>
Message-ID:

Sure. I'm running Windows 7 with Python 2.7.11, gcc 4.8.1, and GNU Fortran
5.2.0. It's very similar to the Appveyor Python 2 build. From the numpy
root directory, I run "python setup.py build_ext -i". I've attached the
output I get from the terminal; perhaps it will illuminate some issues
that I'm not seeing.

Greg

On Mon, Jan 4, 2016 at 8:17 PM, Jaime Fernández del Río <
jaime.frio at gmail.com> wrote:

> On Tue, Jan 5, 2016 at 3:15 AM, G Young wrote:
>
>> Hello all,
>>
>> I've recently encountered issues building numpy off master on Windows,
>> in which setup.py complains that it can't find the Advapi library in any
>> directories (the directory list it searches is empty). I scanned the DLL
>> under System32 and ran sfc /scannow as Administrator, and both came up
>> clean. Is anyone else encountering (or able to reproduce) this issue, or
>> is it just me? Any suggestions about how to resolve this problem?
>>
>
> Can you give more details on your setup?
>
> Jaime
>
> (\__/)
> ( O.o)
> ( > <) This is Bunny. Copy Bunny into your signature and help him with
> his plans for world domination.
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
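One thing that stands out in the attached log: even though gcc and
gfortran are installed, distutils is compiling the C parts with MSVC's
cl.exe, which is its default on Windows. If the intent was to build with
the mingw toolchain, distutils has to be told so explicitly; a minimal
sketch, assuming the numpy source root (--compiler is a standard distutils
option, not anything numpy-specific):

    python setup.py build_ext -i --compiler=mingw32

The same choice can be made persistent by putting "compiler=mingw32" under
a [build] section in setup.cfg.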
-------------- next part --------------
Processing numpy/random\mtrand\mtrand.pyx
Cythonizing sources
blas_opt_info:
blas_mkl_info:
libraries mkl,vml,guide not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
openblas_info:
libraries openblas not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
atlas_3_10_blas_threads_info:
Setting PTATLAS=ATLAS
libraries tatlas not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
atlas_3_10_blas_info:
libraries satlas not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
atlas_blas_threads_info:
Setting PTATLAS=ATLAS
libraries ptf77blas,ptcblas,atlas not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
atlas_blas_info:
libraries f77blas,cblas,atlas not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
blas_info:
FOUND:
libraries = ['blas']
library_dirs = ['C:\\Python27\\libs']
language = f77
FOUND:
libraries = ['blas']
library_dirs = ['C:\\Python27\\libs']
define_macros = [('NO_ATLAS_INFO', 1)]
language = f77
non-existing path in 'numpy\\distutils': 'site.cfg'
F2PY Version 2
lapack_opt_info:
openblas_lapack_info:
libraries openblas not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
lapack_mkl_info:
mkl_info:
libraries mkl,vml,guide not found in ['C:\\Python27\\lib', 'C:\\', 'C:\\Python27\\libs']
NOT AVAILABLE
NOT AVAILABLE
atlas_3_10_threads_info:
Setting PTATLAS=ATLAS
libraries tatlas,tatlas not found in C:\Python27\lib
libraries lapack_atlas not found in C:\Python27\lib
libraries tatlas,tatlas not found in C:\
libraries lapack_atlas not found in C:\
libraries tatlas,tatlas not found in C:\Python27\libs
libraries lapack_atlas not found in C:\Python27\libs
NOT AVAILABLE
atlas_3_10_info:
libraries satlas,satlas not found in C:\Python27\lib
libraries lapack_atlas not found in C:\Python27\lib
libraries satlas,satlas not found in C:\
libraries lapack_atlas not found in C:\
libraries satlas,satlas not found in C:\Python27\libs
libraries lapack_atlas not found in C:\Python27\libs
NOT AVAILABLE
atlas_threads_info:
Setting PTATLAS=ATLAS
libraries ptf77blas,ptcblas,atlas not found in C:\Python27\lib
libraries lapack_atlas not found in C:\Python27\lib
libraries ptf77blas,ptcblas,atlas not found in C:\
libraries lapack_atlas not found in C:\
libraries ptf77blas,ptcblas,atlas not found in C:\Python27\libs
libraries lapack_atlas not found in C:\Python27\libs
NOT AVAILABLE
atlas_info:
libraries f77blas,cblas,atlas not found in C:\Python27\lib
libraries lapack_atlas not found in C:\Python27\lib
libraries f77blas,cblas,atlas not found in C:\
libraries lapack_atlas not found in C:\
libraries f77blas,cblas,atlas not found in C:\Python27\libs
libraries lapack_atlas not found in C:\Python27\libs
NOT AVAILABLE
lapack_info:
FOUND:
libraries = ['lapack']
library_dirs = ['C:\\Python27\\libs']
language = f77
FOUND:
libraries = ['lapack', 'blas']
library_dirs = ['C:\\Python27\\libs']
define_macros = [('NO_ATLAS_INFO', 1)]
language = f77
running build_ext
running build_src
build_src
building py_modules sources
building library "npymath" sources
creating build
creating build\src.win32-2.7
customize GnuFCompiler
Could not locate executable g77
Could not locate executable f77
customize IntelVisualFCompiler
Could not locate executable ifort
Could not locate executable ifl
customize AbsoftFCompiler
Could not locate executable f90
customize CompaqVisualFCompiler
Found executable
C:\MinGW\msys\1.0\bin\DF.exe Found executable C:\MinGW\msys\1.0\bin\DF.exe customize IntelItaniumVisualFCompiler Could not locate executable efl customize Gnu95FCompiler Found executable C:\MinGW\mingw64\bin\gfortran.exe Found executable C:\MinGW\mingw64\bin\gfortran.exe customize Gnu95FCompiler customize Gnu95FCompiler using config C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj Found executable C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest Found executable C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 Found executable C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\WinSDK\Bin\mt.exe success! removing: _configtest.c _configtest.obj _configtest.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 success! removing: _configtest.c _configtest.obj _configtest.exe conv_template:> numpy\core\src\npymath\npy_math.c conv_template:> numpy\core\src\npymath\ieee754.c conv_template:> numpy\core\src\npymath\npy_math_complex.c building library "npysort" sources conv_template:> numpy\core\src\npysort\quicksort.c conv_template:> numpy\core\src\npysort\mergesort.c conv_template:> numpy\core\src\npysort\heapsort.c conv_template:> numpy\core\src\private\npy_partition.h adding 'numpy\core\src\private' to include_dirs. conv_template:> numpy\core\src\npysort\selection.c conv_template:> numpy\core\src\private\npy_binsearch.h conv_template:> numpy\core\src\npysort\binsearch.c None - nothing done with h_files = ['numpy\\core\\src\\private\\npy_partition.h', 'numpy\\core\\src\\private\\npy_binsearch.h'] building extension "numpy.core._dummy" sources Generating numpy\core\include/numpy\config.h C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(1) : fatal error C1083: Cannot open include file: 'endian.h': No such file or directory failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(7) : error C2065: 'SIZEOF_LONGDOUBLE' : undeclared identifier failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(5) : error C2118: negative subscript C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(5) : error C2118: negative subscript C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj _configtest.c _configtest.obj _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(5) : error C2118: negative subscript C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(5) : error C2118: negative subscript C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj _configtest.c _configtest.obj _configtest.c 
_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(7) : error C2118: negative subscript C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(7) : error C2118: negative subscript C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 success! removing: _configtest.c _configtest.obj _configtest.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 success! 
removing: _configtest.c _configtest.obj _configtest.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(8) : error C2065: 'HAVE_EXPM1' : undeclared identifier failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(8) : error C2065: 'HAVE_LOG1P' : undeclared identifier failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(8) : error C2065: 'HAVE_ACOSH' : undeclared identifier failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(8) : error C2065: 'HAVE_ATANH' : undeclared identifier failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(8) : error C2065: 'HAVE_ASINH' : undeclared identifier failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj success! 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(8) : error C2065: 'HAVE_FTELLO' : undeclared identifier failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(8) : error C2065: 'HAVE_FSEEKO' : undeclared identifier failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _fallocate referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _strtold_l referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _cbrt referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _strtoull referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _strtoll referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _fseeko referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _ftello referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _nextafter referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _log2 referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _exp2 referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _trunc referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _rint referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _atanh referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _asinh referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _acosh referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _log1p referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _expm1 referenced in function _main _configtest.exe : fatal error LNK1120: 17 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _expm1 referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _log1p referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _acosh referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _asinh referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _atanh referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _rint referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _trunc referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _exp2 referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _log2 referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 success! removing: _configtest.c _configtest.obj _configtest.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 success! removing: _configtest.c _configtest.obj _configtest.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _nextafter referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _ftello referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _fseeko referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _strtoll referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _strtoull referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _cbrt referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _strtold_l referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _fallocate referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 success! 
removing: _configtest.c _configtest.obj _configtest.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1 success! removing: _configtest.c _configtest.obj _configtest.exe C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj _configtest.c _configtest.c(1) : fatal error C1083: Cannot open include file: 'features.h': No such file or directory failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol ___builtin_isnan referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol ___builtin_isinf referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol ___builtin_isfinite referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol ___builtin_bswap32 referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol ___builtin_bswap64 referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol ___builtin_expect referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest
_configtest.obj : error LNK2019: unresolved external symbol ___builtin_mul_overflow referenced in function _main
_configtest.exe : fatal error LNK1120: 1 unresolved externals
failure.

[The identical cl.exe / link.exe (and, on success, mt.exe manifest) invocations repeat for every probe below; they are elided and only each probe's result is kept.]

Three link probes: success!
Link probe for ___builtin_prefetch: error LNK2019: unresolved external symbol ___builtin_prefetch referenced in function _main -- failure.
Two compile-only probes: warning C4068: unknown pragma at _configtest.c(2)-(3), then syntax errors C2143 / C2091 / C2059 around 'string' at _configtest.c(5) -- failure.
Compile-only probe: the same C4068 warnings and C2143 / C2091 / C2059 errors, this time around 'constant' -- failure.
Compile-only probe: error C2061: syntax error : identifier 'foo'; error C2059: syntax error : ';' -- failure.
Compile-only probe: success!
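For orientation: each "failure." above is a feature probe, not a build error. numpy's setup writes a tiny _configtest.c that references a single compiler feature or library symbol, tries to compile (and usually link) it, and records the outcome. A minimal sketch of what the first failing probe plausibly looks like (a reconstruction, not numpy's exact generated file; __builtin_mul_overflow is a GCC/Clang builtin that MSVC 9.0 does not provide, hence the LNK2019):

    /* _configtest.c-style probe (reconstruction): if this compiles AND
     * links, the toolchain provides __builtin_mul_overflow. Under C89
     * rules cl.exe accepts the implicitly declared call, so the compile
     * step passes and only link.exe reports the missing symbol. */
    int main(void)
    {
        int a = 2, b = 3, r;
        (void)__builtin_mul_overflow(a, b, &r);  /* overflow-checked multiply */
        return 0;
    }

So a "failure." here simply means the feature is recorded as unavailable and the build moves on.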
removing: _configtest.c _configtest.obj
_configtest.obj : error LNK2019: unresolved external symbol _cbrtf referenced in function _main
[...the same LNK2019 line repeats for _nextafterf _copysignf _log2f _exp2f _ldexpf _frexpf _modff _fmodf _powf _atan2f _hypotf _atanhf _acoshf _asinhf _atanf _acosf _asinf _expm1f _expf _log1pf _logf _log10f _sqrtf _truncf _rintf _ceilf _floorf _fabsf _tanhf _coshf _sinhf _tanf _cosf _sinf...]
_configtest.exe : fatal error LNK1120: 35 unresolved externals
failure.

Each of those 35 single-precision functions is then probed individually, and every one fails the same way:
_configtest.obj : error LNK2019: unresolved external symbol _sinf referenced in function _main
_configtest.exe : fatal error LNK1120: 1 unresolved externals
failure.
[...repeated for _cosf _tanf _sinhf _coshf _tanhf _fabsf _floorf _ceilf _rintf _truncf _sqrtf _log10f _logf _log1pf _expf _expm1f _asinf _acosf _atanf _asinhf _acoshf _atanhf _hypotf _atan2f _powf _fmodf _modff _frexpf _ldexpf _exp2f _log2f _copysignf _nextafterf _cbrtf, each ending in failure.]
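All 35 single-precision C99 math functions are missing from the MSVC 9.0 C runtime, which is expected for that toolchain: the probes only decide which HAVE_<FUNC> macros the build defines, and npymath substitutes a fallback wherever the macro is absent. A rough sketch of that fallback pattern (illustrative, assuming the HAVE_SINF / npy_sinf naming; not numpy's verbatim source):

    #include <math.h>

    /* When the probe for sinf links, HAVE_SINF is defined and the native
     * function is used directly; otherwise a wrapper widens to double,
     * calls the C89 sin(), and narrows back to float. */
    #ifdef HAVE_SINF
    #define npy_sinf sinf
    #else
    static float npy_sinf(float x)
    {
        return (float)sin((double)x);  /* MSVC 9.0 ships no sinf */
    }
    #endif

    int main(void)
    {
        return (int)npy_sinf(0.0f);  /* sin(0) == 0, so exit code 0 */
    }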
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _cbrtl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _nextafterl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _copysignl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _log2l referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _exp2l referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _ldexpl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _frexpl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _modfl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _fmodl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _powl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _atan2l referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _hypotl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _atanhl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _acoshl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _asinhl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _atanl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _acosl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _asinl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _expm1l referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _expl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _log1pl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _logl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _log10l referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _sqrtl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _truncl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _rintl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _ceill referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _floorl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _fabsl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _tanhl referenced in function _main _configtest.obj : error LNK2019: unresolved 
external symbol _coshl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _sinhl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _tanl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _cosl referenced in function _main _configtest.obj : error LNK2019: unresolved external symbol _sinl referenced in function _main _configtest.exe : fatal error LNK1120: 35 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _sinl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _cosl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _tanl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _sinhl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _coshl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _tanhl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _fabsl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _floorl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _ceill referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _rintl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _truncl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _sqrtl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _log10l referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _logl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _log1pl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _expl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _expm1l referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _asinl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. removing: _configtest.c _configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest _configtest.obj : error LNK2019: unresolved external symbol _acosl referenced in function _main _configtest.exe : fatal error LNK1120: 1 unresolved externals failure. 
removing: _configtest.c _configtest.obj
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC /Tc_configtest.c /Fo_configtest.obj
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /nologo /INCREMENTAL:NO _configtest.obj /OUT:_configtest.exe /MANIFESTFILE:_configtest.exe.manifest
_configtest.obj : error LNK2019: unresolved external symbol _atanl referenced in function _main
_configtest.exe : fatal error LNK1120: 1 unresolved externals
failure.
[the identical cl.exe/link.exe probe then fails the same way, one long-double routine at a time, for _asinhl, _acoshl, _atanhl, _hypotl, _atan2l, _powl, _fmodl, _modfl, _frexpl, _ldexpl, _exp2l, _log2l, _copysignl, _nextafterl and _cbrtl, each block ending in "fatal error LNK1120: 1 unresolved externals" / "failure."]
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
success!
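Each "unresolved external symbol" block above is numpy.distutils probing whether MSVC's C runtime supplies a given long-double math routine: a throwaway one-call program is compiled and linked, and LNK2019 simply means the function is absent (the leading underscore in _atanl is the cdecl name decoration 32-bit MSVC adds, not part of the C name). A minimal sketch of that kind of link-time probe, assuming the usual autoconf-style pattern rather than the literal _configtest.c that distutils writes out:

/* Link-time symbol probe (a sketch of the assumed pattern, not the
 * generated _configtest.c). The prototype is deliberately bogus; only
 * the symbol name matters to the linker. Compiling this file alone
 * always succeeds; if the CRT lacks atanl, the link step fails with
 * LNK2019 and the probe records the function as unavailable. */
char atanl (void);

int main (void)
{
    return (int) atanl ();
}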
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
success!
removing: _configtest.c _configtest.obj
[two more compile-only probes succeed the same way]
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
_configtest.c
_configtest.c(8) : error C2065: 'HAVE_DECL_SIGNBIT' : undeclared identifier
failure.
removing: _configtest.c _configtest.obj
[the same compile-only probe then fails with C2065 at _configtest.c(8) for 'HAVE_DECL_ISFINITE', 'signbit' and 'isfinite' in turn]
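The C2065 blocks are a second kind of probe: compile-only declaration checks, which need no link step. MSVC 9.0's <math.h> predates C99's signbit/isfinite, so merely referencing the name makes cl.exe stop at "undeclared identifier" and the check is recorded as failed. A sketch of the pattern, again assuming the usual autoconf-style declaration test rather than the exact generated file:

/* Compile-only declaration probe (assumed pattern, not the generated
 * _configtest.c). If <math.h> neither declares signbit nor defines it
 * as a macro, the reference below triggers C2065 and the probe fails;
 * on a C99 libm the #ifndef skips the reference entirely. */
#include <math.h>

int main (void)
{
#ifndef signbit
    (void) signbit;   /* compile error here <=> signbit not declared */
#endif
    return 0;
}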
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
_configtest.c
_configtest.c(1) : fatal error C1083: Cannot open include file: 'complex.h': No such file or directory
failure.
removing: _configtest.c _configtest.obj
BUILD_ARCHITECTURE: 'Intel', os.name='nt', sys.platform='win32'
[same cl.exe compile command as above]
_configtest.c
_configtest.c(2) : error C2146: syntax error : missing ')' before identifier 'a'
_configtest.c(2) : error C2061: syntax error : identifier 'a'
_configtest.c(2) : error C2059: syntax error : ';'
_configtest.c(2) : error C2059: syntax error : ')'
_configtest.c(3) : error C2449: found '{' at file scope (missing function header?)
_configtest.c(5) : error C2059: syntax error : '}'
failure.
removing: _configtest.c _configtest.obj
[a second spelling of the same probe fails with the identical run of syntax errors]
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
success!
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
_configtest.c
_configtest.c(3) : error C2054: expected '(' to follow 'inline'
_configtest.c(4) : error C2085: 'static_func' : not in formal parameter list
_configtest.c(4) : error C2143: syntax error : missing ';' before '{'
_configtest.c(7) : error C2054: expected '(' to follow 'inline'
_configtest.c(8) : error C2085: 'nostatic_func' : not in formal parameter list
_configtest.c(8) : error C2143: syntax error : missing ';' before '{'
failure.
removing: _configtest.c _configtest.obj
[the same probe fails identically with '__inline__' in place of 'inline']
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
success!
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
_configtest.c
_configtest.c(7) : error C2065: 'Py_UNICODE_WIDE' : undeclared identifier
failure.
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
removing: _configtest.c _configtest.obj
File: numpy\core\include/numpy\config.h
#define SIZEOF_PY_INTPTR_T 4
#define SIZEOF_OFF_T 4
#define SIZEOF_PY_LONG_LONG 8
#define MATHLIB
#define HAVE_SIN 1
#define HAVE_COS 1
#define HAVE_TAN 1
#define HAVE_SINH 1
#define HAVE_COSH 1
#define HAVE_TANH 1
#define HAVE_FABS 1
#define HAVE_FLOOR 1
#define HAVE_CEIL 1
#define HAVE_SQRT 1
#define HAVE_LOG10 1
#define HAVE_LOG 1
#define HAVE_EXP 1
#define HAVE_ASIN 1
#define HAVE_ACOS 1
#define HAVE_ATAN 1
#define HAVE_FMOD 1
#define HAVE_MODF 1
#define HAVE_FREXP 1
#define HAVE_LDEXP 1
#define HAVE_ATAN2 1
#define HAVE_POW 1
#define HAVE_XMMINTRIN_H 1
#define HAVE_EMMINTRIN_H 1
#define HAVE__MM_LOAD_PS 1
#define HAVE__MM_PREFETCH 1
#define HAVE__MM_LOAD_PD 1
#define HAVE___DECLSPEC_THREAD_ 1
#define __NPY_PRIVATE_NO_SIGNAL
#define FORCE_NO_LONG_DOUBLE_FORMATTING
#define NPY_RESTRICT __restrict
#define NPY_RELAXED_STRIDES_CHECKING 1
#define HAVE_LDOUBLE_IEEE_DOUBLE_LE 1
#ifndef __cplusplus
#define inline __inline
#endif
#ifndef _NPY_NPY_CONFIG_H_
#error config.h should never be included directly, include npy_config.h instead
#endif
EOF
adding 'numpy\core\include/numpy\config.h' to sources.
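The C2146 and C2054 bursts were keyword probes: the same tiny file is recompiled once per candidate spelling until one parses. The config.h just written records the winners, NPY_RESTRICT ends up as __restrict and inline falls back to __inline, so the spellings tried were presumably restrict/__restrict__/__restrict and inline/__inline__/__inline (an inference from the results, not something the log states). A sketch of the restrict-style probe:

/* Keyword probe (a sketch of the assumed pattern, not the generated
 * file). With __restrict this compiles under MSVC 9.0; replace it with
 * plain C99 restrict and the compiler reproduces the C2146 "missing ')'
 * before identifier 'a'" cascade seen in the log. */
static int static_func (char * __restrict a)
{
    return a[0];
}

int main (void)
{
    char c = 0;
    return static_func (&c);
}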
Generating numpy\core\include/numpy\_numpyconfig.h
[same cl.exe compile command as above]
[same link.exe command as above]
mt.exe -nologo -manifest _configtest.exe.manifest -outputresource:_configtest.exe;1
success!
removing: _configtest.c _configtest.obj _configtest.exe
[same cl.exe compile command as above]
_configtest.c
_configtest.c(1) : fatal error C1083: Cannot open include file: 'inttypes.h': No such file or directory
failure.
removing: _configtest.c _configtest.obj
[same cl.exe compile command as above]
_configtest.c
_configtest.c(6) : fatal error C1189: #error : gcc >= 4 required
failure.
removing: _configtest.c _configtest.obj
File: numpy\core\include/numpy\_numpyconfig.h
#define NPY_SIZEOF_SHORT SIZEOF_SHORT
#define NPY_SIZEOF_INT SIZEOF_INT
#define NPY_SIZEOF_LONG SIZEOF_LONG
#define NPY_SIZEOF_FLOAT 4
#define NPY_SIZEOF_COMPLEX_FLOAT 8
#define NPY_SIZEOF_DOUBLE 8
#define NPY_SIZEOF_COMPLEX_DOUBLE 16
#define NPY_SIZEOF_LONGDOUBLE 8
#define NPY_SIZEOF_COMPLEX_LONGDOUBLE 16
#define NPY_SIZEOF_PY_INTPTR_T 4
#define NPY_SIZEOF_OFF_T 4
#define NPY_SIZEOF_PY_LONG_LONG 8
#define NPY_SIZEOF_LONGLONG 8
#define NPY_NO_SIGNAL 1
#define NPY_NO_SMP 0
#define NPY_HAVE_DECL_ISNAN
#define NPY_HAVE_DECL_ISINF
#define NPY_RELAXED_STRIDES_CHECKING 1
#define NPY_VISIBILITY_HIDDEN
#define NPY_ABI_VERSION 0x01000009
#define NPY_API_VERSION 0x0000000A
#ifndef __STDC_FORMAT_MACROS
#define __STDC_FORMAT_MACROS 1
#endif
EOF
adding 'numpy\core\include/numpy\_numpyconfig.h' to sources.
executing numpy\core\code_generators\generate_numpy_api.py
adding 'numpy\core\include/numpy\__multiarray_api.h' to sources.
numpy.core - nothing done with h_files = ['numpy\\core\\include/numpy\\config.h', 'numpy\\core\\include/numpy\\_numpyconfig.h', 'numpy\\core\\include/numpy\\__multiarray_api.h']
building extension "numpy.core.multiarray" sources
adding 'numpy\core\include/numpy\config.h' to sources.
adding 'numpy\core\include/numpy\_numpyconfig.h' to sources.
executing numpy\core\code_generators\generate_numpy_api.py
adding 'numpy\core\include/numpy\__multiarray_api.h' to sources.
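Note that _numpyconfig.h records NPY_SIZEOF_LONGDOUBLE 8, identical to NPY_SIZEOF_DOUBLE: on MSVC, long double is just an alias for the 8-byte double format (hence HAVE_LDOUBLE_IEEE_DOUBLE_LE in config.h). That is why every *l probe earlier failed, and why those failures are expected configuration results rather than build errors. A standalone check, for anyone who wants to see it directly:

/* Shows the type layout the generated headers describe: under MSVC,
 * long double and double are both 8 bytes. */
#include <stdio.h>

int main (void)
{
    printf ("sizeof(double)      = %u\n", (unsigned) sizeof (double));
    printf ("sizeof(long double) = %u\n", (unsigned) sizeof (long double));
    return 0;
}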
conv_template:> numpy\core\src\multiarray\arraytypes.c
conv_template:> numpy\core\src\multiarray\einsum.c
conv_template:> numpy\core\src\multiarray\lowlevel_strided_loops.c
conv_template:> numpy\core\src\multiarray\nditer_templ.c
conv_template:> numpy\core\src\multiarray\scalartypes.c
conv_template:> numpy\core\src\private\templ_common.h
adding 'numpy\core\src\private' to include_dirs.
numpy.core - nothing done with h_files = ['numpy\\core\\src\\private\\templ_common.h', 'numpy\\core\\include/numpy\\config.h', 'numpy\\core\\include/numpy\\_numpyconfig.h', 'numpy\\core\\include/numpy\\__multiarray_api.h']
building extension "numpy.core.umath" sources
adding 'numpy\core\include/numpy\config.h' to sources.
adding 'numpy\core\include/numpy\_numpyconfig.h' to sources.
executing numpy\core\code_generators\generate_ufunc_api.py
adding 'numpy\core\include/numpy\__ufunc_api.h' to sources.
conv_template:> numpy\core\src\umath\funcs.inc
adding 'numpy\core\src\umath' to include_dirs.
conv_template:> numpy\core\src\umath\simd.inc
conv_template:> numpy\core\src\umath\loops.h
conv_template:> numpy\core\src\umath\loops.c
conv_template:> numpy\core\src\umath\scalarmath.c
numpy.core - nothing done with h_files = ['numpy\\core\\src\\umath\\funcs.inc', 'numpy\\core\\src\\umath\\simd.inc', 'numpy\\core\\src\\umath\\loops.h', 'numpy\\core\\include/numpy\\config.h', 'numpy\\core\\include/numpy\\_numpyconfig.h', 'numpy\\core\\include/numpy\\__ufunc_api.h']
building extension "numpy.core.umath_tests" sources
conv_template:> numpy\core\src\umath\umath_tests.c
building extension "numpy.core.test_rational" sources
conv_template:> numpy\core\src\umath\test_rational.c
building extension "numpy.core.struct_ufunc_test" sources
conv_template:> numpy\core\src\umath\struct_ufunc_test.c
building extension "numpy.core.multiarray_tests" sources
conv_template:> numpy\core\src\multiarray\multiarray_tests.c
building extension "numpy.core.operand_flag_tests" sources
conv_template:> numpy\core\src\umath\operand_flag_tests.c
building extension "numpy.fft.fftpack_lite" sources
building extension "numpy.linalg.lapack_lite" sources
### Warning: python_xerbla.c is disabled ###
building extension "numpy.linalg._umath_linalg" sources
### Warning: python_xerbla.c is disabled ###
conv_template:> numpy\linalg\umath_linalg.c
building extension "numpy.random.mtrand" sources
building data_files sources
build_src: building npy-pkg config files
customize MSVCCompiler
customize MSVCCompiler using build_clib
building 'npymath' library
compiling C sources
creating build\temp.win32-2.7
creating build\temp.win32-2.7\numpy
creating build\temp.win32-2.7\numpy\core
creating build\temp.win32-2.7\numpy\core\src
creating build\temp.win32-2.7\numpy\core\src\npymath
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\npymath\npy_math.c /Fobuild\temp.win32-2.7\numpy\core\src\npymath\npy_math.obj
[cl.exe commands with the same flags then compile ieee754.c, npy_math_complex.c and halffloat.c to matching .obj files]
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\lib.exe build\temp.win32-2.7\numpy\core\src\npymath\npy_math.obj build\temp.win32-2.7\numpy\core\src\npymath\ieee754.obj build\temp.win32-2.7\numpy\core\src\npymath\npy_math_complex.obj build\temp.win32-2.7\numpy\core\src\npymath\halffloat.obj /OUT:build\temp.win32-2.7\npymath.lib
Found executable C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\lib.exe
building 'npysort' library
compiling C sources
creating build\temp.win32-2.7\numpy\core\src\npysort
[cl.exe commands with the same flags then compile quicksort.c, mergesort.c, heapsort.c, selection.c and binsearch.c]
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\lib.exe build\temp.win32-2.7\numpy\core\src\npysort\quicksort.obj build\temp.win32-2.7\numpy\core\src\npysort\mergesort.obj build\temp.win32-2.7\numpy\core\src\npysort\heapsort.obj build\temp.win32-2.7\numpy\core\src\npysort\selection.obj build\temp.win32-2.7\numpy\core\src\npysort\binsearch.obj /OUT:build\temp.win32-2.7\npysort.lib
customize MSVCCompiler
customize MSVCCompiler using build_ext
customize GnuFCompiler
customize IntelVisualFCompiler
customize AbsoftFCompiler
customize CompaqVisualFCompiler
customize IntelItaniumVisualFCompiler
customize Gnu95FCompiler
customize Gnu95FCompiler
customize Gnu95FCompiler using build_ext
building 'numpy.core._dummy' extension
compiling C sources
creating build\temp.win32-2.7\Release
creating build\temp.win32-2.7\Release\numpy
creating build\temp.win32-2.7\Release\numpy\core
creating build\temp.win32-2.7\Release\numpy\core\src
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\dummymodule.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\dummymodule.obj
copying C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0\libgfortran.a -> build\temp.win32-2.7\Release\gfortran.lib
copying C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0\libgcc.a -> build\temp.win32-2.7\Release\gcc.lib
copying C:\MinGW\mingw64\x86_64-w64-mingw32\lib\libmingw32.a -> build\temp.win32-2.7\Release\mingw32.lib
copying C:\MinGW\mingw64\x86_64-w64-mingw32\lib\libmingwex.a -> build\temp.win32-2.7\Release\mingwex.lib
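An aside on the conv_template:> lines in this step: numpy.distutils expands .c.src templates into ordinary C before compiling, stamping out one copy of each generic routine per element type. An illustrative fragment of the template syntax (made up for this example, not taken from any of the files listed above):

/**begin repeat
 * #TYPE = FLOAT, DOUBLE#
 * #type = float, double#
 */
/* conv_template expands this into FLOAT_add and DOUBLE_add,
 * each plain C by the time cl.exe sees it. */
static void
@TYPE@_add(@type@ *a, @type@ *b, @type@ *out)
{
    *out = *a + *b;
}
/**end repeat**/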
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:init_dummy build\temp.win32-2.7\Release\numpy\core\src\dummymodule.obj /OUT:numpy\core\_dummy.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\_dummy.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\_dummy.pyd.manifest
building 'numpy.core.multiarray' extension
compiling C sources
creating build\temp.win32-2.7\Release\numpy\core\src\multiarray
creating build\temp.win32-2.7\Release\numpy\core\src\private
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\src\private -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\multiarray\alloc.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\multiarray\alloc.obj
[cl.exe commands with the same flags then compile each remaining multiarray source to its .obj in turn: arrayobject.c, arraytypes.c, array_assign.c, array_assign_scalar.c, array_assign_array.c, buffer.c, calculation.c, compiled_base.c, common.c, convert.c, convert_datatype.c, conversion_utils.c, ctors.c, datetime.c, datetime_strings.c, datetime_busday.c, datetime_busdaycal.c, descriptor.c, dtype_transfer.c, einsum.c, flagsobject.c, getset.c, hashdescr.c, item_selection.c, iterators.c, lowlevel_strided_loops.c, mapping.c, methods.c, multiarraymodule.c, nditer_templ.c, nditer_api.c, nditer_constr.c, nditer_pywrap.c, number.c, numpymemoryview.c, numpyos.c, refcount.c, sequence.c, shape.c, scalarapi.c, scalartypes.c, usertypes.c, ucsnarrow.c, vdot.c, and numpy\core\src\private\mem_overlap.c]
could not find library 'npymath' in directories []
could not find library 'npysort' in directories []
C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 npymath.lib npysort.lib gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initmultiarray build\temp.win32-2.7\Release\numpy\core\src\multiarray\alloc.obj [followed by the .obj for each multiarray source just compiled, the list breaking off at] build\temp.win32-2.7\Release\numpy\core\src\multiarray\lowlevel_strided_loops.obj
build\temp.win32-2.7\Release\numpy\core\src\multiarray\mapping.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\methods.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\multiarraymodule.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\nditer_templ.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\nditer_api.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\nditer_constr.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\nditer_pywrap.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\number.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\numpymemoryview.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\numpyos.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\refcount.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\sequence.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\shape.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\scalarapi.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\scalartypes.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\usertypes.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\ucsnarrow.obj build\temp.win32-2.7\Release\numpy\core\src\multiarray\vdot.obj build\temp.win32-2.7\Release\numpy\core\src\private\mem_overlap.obj /OUT:numpy\core\multiarray.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\multiarray\multiarray.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\multiarray\multiarray.pyd.manifest building 'numpy.core.umath' extension compiling C sources creating build\temp.win32-2.7\Release\numpy\core\src\umath C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\src\umath -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\umathmodule.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\umathmodule.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\src\umath -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\reduction.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\reduction.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\src\umath -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private 
-Inumpy\core\src\private /Tcnumpy\core\src\umath\loops.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\loops.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\src\umath -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\ufunc_object.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\ufunc_object.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\src\umath -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\scalarmath.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\scalarmath.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\src\umath -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\ufunc_type_resolution.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\ufunc_type_resolution.obj could not find library 'npymath' in directories [] C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 npymath.lib gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initumath build\temp.win32-2.7\Release\numpy\core\src\umath\umathmodule.obj build\temp.win32-2.7\Release\numpy\core\src\umath\reduction.obj build\temp.win32-2.7\Release\numpy\core\src\umath\loops.obj build\temp.win32-2.7\Release\numpy\core\src\umath\ufunc_object.obj build\temp.win32-2.7\Release\numpy\core\src\umath\scalarmath.obj build\temp.win32-2.7\Release\numpy\core\src\umath\ufunc_type_resolution.obj /OUT:numpy\core\umath.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\umath\umath.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\umath\umath.pyd.manifest building 'numpy.core.umath_tests' extension compiling C sources C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo 
/Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\umath_tests.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\umath_tests.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initumath_tests build\temp.win32-2.7\Release\numpy\core\src\umath\umath_tests.obj /OUT:numpy\core\umath_tests.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\umath\umath_tests.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\umath\umath_tests.pyd.manifest building 'numpy.core.test_rational' extension compiling C sources C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\test_rational.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\test_rational.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:inittest_rational build\temp.win32-2.7\Release\numpy\core\src\umath\test_rational.obj /OUT:numpy\core\test_rational.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\umath\test_rational.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\umath\test_rational.pyd.manifest building 'numpy.core.struct_ufunc_test' extension compiling C sources C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private 
-Inumpy\core\src\private /Tcnumpy\core\src\umath\struct_ufunc_test.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\struct_ufunc_test.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initstruct_ufunc_test build\temp.win32-2.7\Release\numpy\core\src\umath\struct_ufunc_test.obj /OUT:numpy\core\struct_ufunc_test.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\umath\struct_ufunc_test.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\umath\struct_ufunc_test.pyd.manifest building 'numpy.core.multiarray_tests' extension compiling C sources C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\multiarray\multiarray_tests.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\multiarray\multiarray_tests.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\private\mem_overlap.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\private\mem_overlap.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initmultiarray_tests build\temp.win32-2.7\Release\numpy\core\src\multiarray\multiarray_tests.obj build\temp.win32-2.7\Release\numpy\core\src\private\mem_overlap.obj /OUT:numpy\core\multiarray_tests.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\multiarray\multiarray_tests.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\multiarray\multiarray_tests.pyd.manifest building 'numpy.core.operand_flag_tests' extension compiling C sources C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG 
/arch:SSE2 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\core\src\umath\operand_flag_tests.c /Fobuild\temp.win32-2.7\Release\numpy\core\src\umath\operand_flag_tests.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initoperand_flag_tests build\temp.win32-2.7\Release\numpy\core\src\umath\operand_flag_tests.obj /OUT:numpy\core\operand_flag_tests.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\core\src\umath\operand_flag_tests.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\core\src\umath\operand_flag_tests.pyd.manifest building 'numpy.fft.fftpack_lite' extension compiling C sources creating build\temp.win32-2.7\Release\numpy\fft C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\fft\fftpack_litemodule.c /Fobuild\temp.win32-2.7\Release\numpy\fft\fftpack_litemodule.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\fft\fftpack.c /Fobuild\temp.win32-2.7\Release\numpy\fft\fftpack.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initfftpack_lite build\temp.win32-2.7\Release\numpy\fft\fftpack_litemodule.obj build\temp.win32-2.7\Release\numpy\fft\fftpack.obj /OUT:numpy\fft\fftpack_lite.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\fft\fftpack_lite.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\fft\fftpack_lite.pyd.manifest building 'numpy.linalg.lapack_lite' extension compiling C sources creating 
build\temp.win32-2.7\Release\numpy\linalg C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DNO_ATLAS_INFO=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\linalg\lapack_litemodule.c /Fobuild\temp.win32-2.7\Release\numpy\linalg\lapack_litemodule.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\Python27\libs /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 lapack.lib blas.lib gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initlapack_lite build\temp.win32-2.7\Release\numpy\linalg\lapack_litemodule.obj /OUT:numpy\linalg\lapack_lite.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\linalg\lapack_lite.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\linalg\lapack_lite.pyd.manifest building 'numpy.linalg._umath_linalg' extension compiling C sources C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -DNO_ATLAS_INFO=1 -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\linalg\umath_linalg.c /Fobuild\temp.win32-2.7\Release\numpy\linalg\umath_linalg.obj could not find library 'npymath' in directories ['C:\\Python27\\libs'] C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\Python27\libs /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 npymath.lib lapack.lib blas.lib gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:init_umath_linalg build\temp.win32-2.7\Release\numpy\linalg\umath_linalg.obj /OUT:numpy\linalg\_umath_linalg.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\linalg\_umath_linalg.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\linalg\_umath_linalg.pyd.manifest building 'numpy.random.mtrand' extension compiling C sources creating build\temp.win32-2.7\Release\numpy\random creating build\temp.win32-2.7\Release\numpy\random\mtrand C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -DNPY_NEEDS_MINGW_TIME_WORKAROUND -Inumpy\core\include -Inumpy\core\include/numpy 
-Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\random\mtrand\mtrand.c /Fobuild\temp.win32-2.7\Release\numpy\random\mtrand\mtrand.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -DNPY_NEEDS_MINGW_TIME_WORKAROUND -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\random\mtrand\randomkit.c /Fobuild\temp.win32-2.7\Release\numpy\random\mtrand\randomkit.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -DNPY_NEEDS_MINGW_TIME_WORKAROUND -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\random\mtrand\initarray.c /Fobuild\temp.win32-2.7\Release\numpy\random\mtrand\initarray.obj C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\cl.exe /c /nologo /Ox /MD /W3 /GS- /DNDEBUG /arch:SSE2 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -DNPY_NEEDS_MINGW_TIME_WORKAROUND -Inumpy\core\include -Inumpy\core\include/numpy -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -IC:\Python27\include -IC:\Python27\PC -Inumpy\core\src\private -Inumpy\core\src\private -Inumpy\core\src\private /Tcnumpy\random\mtrand\distributions.c /Fobuild\temp.win32-2.7\Release\numpy\random\mtrand\distributions.obj could not find library 'Advapi32' in directories [] C:\Users\Greg\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\Bin\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:C:\MinGW\mingw64\lib\gcc\x86_64-w64-mingw32\5.2.0 /LIBPATH:C:\MinGW\mingw64\x86_64-w64-mingw32\lib /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 /LIBPATH:build\temp.win32-2.7\Release /LIBPATH:C:\Python27\libs /LIBPATH:C:\Python27\PCbuild /LIBPATH:C:\Python27\PC\VS9.0 /LIBPATH:build\temp.win32-2.7 Advapi32.lib gfortran.lib gcc.lib mingw32.lib mingwex.lib /EXPORT:initmtrand build\temp.win32-2.7\Release\numpy\random\mtrand\mtrand.obj build\temp.win32-2.7\Release\numpy\random\mtrand\randomkit.obj build\temp.win32-2.7\Release\numpy\random\mtrand\initarray.obj build\temp.win32-2.7\Release\numpy\random\mtrand\distributions.obj /OUT:numpy\random\mtrand.pyd /IMPLIB:build\temp.win32-2.7\Release\numpy\random\mtrand\mtrand.lib /MANIFESTFILE:build\temp.win32-2.7\Release\numpy\random\mtrand\mtrand.pyd.manifest From m.h.vankerkwijk at gmail.com Tue Jan 5 19:55:33 2016 From: m.h.vankerkwijk at gmail.com (Marten van Kerkwijk) Date: Tue, 
5 Jan 2016 19:55:33 -0500 Subject: [Numpy-discussion] 1.10.3 release tomorrow, 1.11.x branch this month. In-Reply-To: References: Message-ID: Hi Chuck, others, A propos __numpy_ufunc__, what is the current status? Is it still the undetermined result of the monster-thread ( https://github.com/numpy/numpy/issues/5844 -- just found it again by sorting by number of comments...)? As noted by Stephan and myself when the decision was made to remove it from 1.10, for external libraries it would be really wonderful to have *any* version of __numpy_ufunc__ in 1.11, as it provides great benefits (instant factor 2 improvement in speed for astropy quantities...). In the end, the proposals were not that different, and, really, what is in current master is quite good. All the best, Marten -------------- next part -------------- An HTML attachment was scrubbed... URL: From perimosocordiae at gmail.com Tue Jan 5 20:17:26 2016 From: perimosocordiae at gmail.com (CJ Carey) Date: Tue, 5 Jan 2016 19:17:26 -0600 Subject: [Numpy-discussion] 1.10.3 release tomorrow, 1.11.x branch this month. In-Reply-To: References: Message-ID: I'll echo Marten's sentiments. I've found __numpy_ufunc__ as it exists in the master branch to be quite useful in my experiments with sparse arrays ( https://github.com/perimosocordiae/sparray), and I think it'll be a net benefit to scipy.sparse as well (despite the unpleasantness with __mul__). -CJ On Tue, Jan 5, 2016 at 6:55 PM, Marten van Kerkwijk < m.h.vankerkwijk at gmail.com> wrote: > Hi Chuck, others, > > A propos __numpy_ufunc__, what is the current status? Is it still the > undetermined result of the monster-thread ( > https://github.com/numpy/numpy/issues/5844 -- just found it again by > sorting by number of comments...)?
>> >> As noted by Stephan and myself when the decision was made to remove it >> from 1.10, for external libraries it would be really wonderful to have >> *any* version of __numpy_ufunc__ in 1.11, as it provides great benefits >> (instant factor 2 improvement in speed for astropy quantities...). In the >> end, the proposals were not that different, and, really, what is in current >> master is quite good. >> >> All the best, >> > Well, I'm trying to gin up some action on the topic ;) The last time I brought it up it vanished into the mists. I don't think it would be a lot of work to finish things up and have contemplated doing it myself if no one else steps up. I want to make the 1.11 branch this month whatever happens. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Tue Jan 5 21:19:15 2016 From: njs at pobox.com (Nathaniel Smith) Date: Tue, 5 Jan 2016 18:19:15 -0800 Subject: [Numpy-discussion] 1.10.3 release tomorrow, 1.11.x branch this month. In-Reply-To: References: Message-ID: On Tue, Jan 5, 2016 at 4:55 PM, Marten van Kerkwijk wrote: > Hi Chuck, others, > > A propos __numpy_ufunc__, what is the current status? Is it still the > undetermined result of the monster-thread > (https://github.com/numpy/numpy/issues/5844 -- just found it again by > sorting by number of comments...)? Yeah, that's about where everyone collapsed in exhaustion last time :-). > As noted by Stephan and myself when the decision was made to remove it from > 1.10, for external libraries it would be really wonderful to have *any* > version of __numpy_ufunc__ in 1.11, as it provides great benefits (instant > factor 2 improvement in speed for astropy quantities...). In the end, the > proposals were not that different, and, really, what is in current master is > quite good. I think everyone's agreed that having __numpy_ufunc__ is a great thing; the problem is sorting out the details, which is work that takes some time and attention. At this point I think it's extremely unlikely that that time and attention will be mustered in time for 1.11, just because like Chuck says, that's 2 weeks from now, and none of us have magic wands to get the work done. We could potentially get it into 1.11 by pushing the release back, but that wouldn't really help anything: it wouldn't make the work happen any sooner by the calendar; it would just delay releasing all the other stuff that's in master. I've also been feeling frustrated and guilty about the status of __numpy_ufunc__; I very much want to get back to it, but have been too far underwater from other commitments :-(. One possible next step, if someone does have the time/energy/motivation, would be to review the outcome of that thread and write up a short summary of where we got to. Basically cutting out all the back-and-forth and dead-ends to say "here are the 2-3 main options that are still on the table, here are the details of the best version of each, here are the trade-offs". -n -- Nathaniel J. Smith -- http://vorpus.org From ralf.gommers at gmail.com Wed Jan 6 02:30:52 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 6 Jan 2016 08:30:52 +0100 Subject: [Numpy-discussion] Numpy 1.10.3 release. In-Reply-To: References: Message-ID: On Mon, Jan 4, 2016 at 9:38 PM, Charles R Harris wrote: > > > The failed tests require pyrex, fortran, and swig. The refcount error > comes and goes, probably the test isn't very good. Ralf, is there any > reason to keep the various extension building tests? They are very old.
> There are f2py tests that are useful - if they fail then building anything else with f2py fails as well. However those are all in f2py/tests/ IIRC, not in distutils/tests. > > I don't know what the recommendation on using nosetests is, > Don't use it (it definitely gives test errors due to not having the knownfailure plugin), use either runtests.py or numpy.test(). > but probably it is finding too many tests. > It's doing its job here in finding these though. It looks like these tests need either fixing (the extensions aren't built) or removing. I'll have a look tonight. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Jan 6 14:54:02 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 6 Jan 2016 20:54:02 +0100 Subject: [Numpy-discussion] Numpy 1.10.3 release. In-Reply-To: References: Message-ID: On Wed, Jan 6, 2016 at 8:30 AM, Ralf Gommers wrote: > > > > On Mon, Jan 4, 2016 at 9:38 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: >> >> >> The failed tests require pyrex, fortran, and swig. The refcount error >> comes and goes, probably the test isn't very good. Ralf, is there any >> reason to keep the various extension building tests? They are very old. >> > > There are f2py tests that are useful - if they fail then building anything > else with f2py fails as well. However those are all in f2py/tests/ IIRC, > not in distutils/tests. > >> >> I don't know what the recommendation on using nosetests is, >> > > Don't use it (it definitely gives test errors due to not having the > knownfailure plugin), use either runtests.py or numpy.test(). > > >> but probably it is finding too many tests. >> > > It's doing its job here in finding these though. It looks like these tests > need either fixing (the extensions aren't built) or removing. I'll have a > look tonight. > Should be fixed by https://github.com/numpy/numpy/pull/6955 `nosetests numpy` will still give test errors (6 right now), but only due to not understanding KnownFailureException. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From m.h.vankerkwijk at gmail.com Wed Jan 6 17:17:35 2016 From: m.h.vankerkwijk at gmail.com (Marten van Kerkwijk) Date: Wed, 6 Jan 2016 17:17:35 -0500 Subject: [Numpy-discussion] Subclass awareness of outer, inner, dot, ... Message-ID: Hi All, Having just had a look at functions where astropy's quantities are silently converted to ndarray, I came across some that, in principle, are easy to solve, yet for which, as always, there is a worry about backward compatibility. So, the question arises what to do. As a specific example, consider `np.outer` which is defined in `numeric.py` and boils down to multiply(a.ravel()[:, newaxis], b.ravel()[newaxis,:], out) Since multiply is fine with subclasses, all should be well, except that before it is called, there are `a = asarray(a)` and `b = asarray(b)` statements, which mean subclasses are lost. Obviously, in this case, a simple fix would be to use `asanyarray` instead, but ideally, similar routines behave similarly and hence one would also want to change `np.inner` and `np.dot` (and perhaps more)... Hence, before doing anything, I thought I would ask whether: 1. Such changes are or are not too risky for backward compatibility; 2. If so, whether, given that `np.dot` can be caught using `__numpy_ufunc__`, one should perhaps allow functions such as `outer` also to be passed through that?
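(To make the subclass-stripping concrete, a minimal sketch — MyArray is a hypothetical trivial subclass standing in for an astropy Quantity; the asarray/asanyarray contrast shown is standard numpy behavior:)

    import numpy as np

    class MyArray(np.ndarray):
        # Trivial ndarray subclass, standing in for e.g. a Quantity.
        pass

    a = np.arange(3.0).view(MyArray)
    print(type(np.asarray(a)))      # numpy.ndarray -- subclass stripped
    print(type(np.asanyarray(a)))   # MyArray       -- subclass preserved
    # np.outer calls asarray on its inputs before calling multiply,
    # so the result comes back as a plain ndarray:
    print(type(np.outer(a, a)))     # numpy.ndarray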
All the best, Marten -------------- next part -------------- An HTML attachment was scrubbed... URL: From gfyoung17 at gmail.com Thu Jan 7 10:15:04 2016 From: gfyoung17 at gmail.com (G Young) Date: Thu, 7 Jan 2016 15:15:04 +0000 Subject: [Numpy-discussion] Building master issues on Windows In-Reply-To: References: <8AD23173-C365-4F3B-BC58-0A31D06F2590@gmail.com> Message-ID: Hello all, Turns out I needed to do a complete re-installation of essentially everything that was involved in my Numpy setup. Now everything is working just fine! Cheers, Greg On Tue, Jan 5, 2016 at 7:45 AM, G Young wrote: > Sure. I'm running Windows 7 with Python 2.7.11, gcc 4.8.1, and GNU > Fortran 5.2.0. It's very similar to the Appveyor Python 2 build. When I > am in the numpy root directory, I run "*python setup.py build_ext -i*". > I've attached the output I get from the terminal. Perhaps that might > illuminate some issues that I'm not seeing. > > Greg > > On Mon, Jan 4, 2016 at 8:17 PM, Jaime Fernández del Río < > jaime.frio at gmail.com> wrote: > >> On Tue, Jan 5, 2016 at 3:15 AM, G Young wrote: >> >>> Hello all, >>> >>> I've recently encountered issues building numpy off master on Windows in >>> which setup.py complains that it can't find the Advapi library in any >>> directories (which is an empty list). I scanned the DLL under System32 and >>> ran sfc /scannow as Administrator, and both came up clean. Is anyone else >>> encountering (or can reproduce) this issue, or is it just me? Any >>> suggestions about how to resolve this problem? >>> >> >> Can you give more details on your setup? >> >> Jaime >> >> (\__/) >> ( O.o) >> ( > <) This is Conejo. Copy Conejo into your signature and help him in his >> plans for world domination. >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Thu Jan 7 13:39:21 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 7 Jan 2016 11:39:21 -0700 Subject: [Numpy-discussion] Numpy 1.10.4 release. Message-ID: Hi All, I'm pleased to announce the Numpy 1.10.4 (bugs stomped) release. This release was motivated by a reported segfault, but a few additional fixes were made: * gh-6922 BUG: numpy.recarray.sort segfaults on Windows, * gh-6937 BUG: busday_offset does the wrong thing with modifiedpreceding roll, * gh-6949 BUG: Type is lost when slicing a subclass of recarray, together with one minor enhancement * gh-6950 BUG: trace is not subclass aware, np.trace(ma) != ma.trace(). The sources and documentation are available on Sourceforge, but no binaries. The usual Windows binaries were omitted due to problems with the toolchain. We hope that will be remedied in the future. Mac binaries can be installed from pypi. "Where is numpy 1.10.3?", you may ask. There were glitches with the uploads to pypi for that version that necessitated a version upgrade. A release manager's life is not a happy one. Many thanks to Marten van Kerkwijk, Sebastian Berg, and Mark Wiebe for the quick bug fixes. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From erik.m.bray+numpy at gmail.com Thu Jan 7 14:01:26 2016 From: erik.m.bray+numpy at gmail.com (Erik Bray) Date: Thu, 7 Jan 2016 14:01:26 -0500 Subject: [Numpy-discussion] Numpy 1.10.4 release.
In-Reply-To: References: Message-ID: On Thu, Jan 7, 2016 at 1:39 PM, Charles R Harris wrote: > "Where is numpy 1.10.3?", you may ask. There were glitches with the uploads > to pypi for that version that necessitated a version upgrade. A release > manager's life is not a happy one. Ah, okay, I was confused about that when I pip installed Numpy just a few minutes ago :) Has the Numpy project opted to not do .postN releases? I'm curious since I had to do one of these for Astropy recently and it was not the most pleasant experience... Anyways thanks for the release! Erik From charlesr.harris at gmail.com Thu Jan 7 14:14:12 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 7 Jan 2016 12:14:12 -0700 Subject: [Numpy-discussion] Numpy 1.10.4 release. In-Reply-To: References: Message-ID: On Thu, Jan 7, 2016 at 12:01 PM, Erik Bray wrote: > On Thu, Jan 7, 2016 at 1:39 PM, Charles R Harris > wrote: > > > "Where is numpy 1.10.3?", you may ask. There were glitches with the > uploads > > to pypi for that version that necessitated a version upgrade. A release > > manager's life is not a happy one. > > Ah, okay, I was confused about that when I pip installed Numpy just a > few minutes ago :) > > Has the Numpy project opted to not do .postN releases? I'm curious > since I had to do one of these for Astropy recently and it was not the > most pleasant experience... > Yes, we opted out after an experience of trying to fix up the 1.10.0 release uploads (not signed) with a `.post1` release. The main problem for us was that NumpyVersion did not deal with that form, plus pypi went ahead and created a new release, so it was simpler to just make a real new release. > Anyways thanks for the release! > I confess to being a bit nervous about not making an rc release. Of course, no one tests those anyway ;) Let us know how things work out. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Thu Jan 7 18:24:48 2016 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Thu, 7 Jan 2016 23:24:48 +0000 Subject: [Numpy-discussion] ANN: second release candidate for scipy 0.17.0 Message-ID: Hi, I'm pleased to announce the availability of the second release candidate for Scipy 0.17.0. It's two days ahead of the original schedule: based on typical development patterns, I'd like to have two weekends and a full working week before January 17th, when this rc2 is supposed to become the final release. Please try this rc and report any issues on Github tracker or scipy-dev mailing list. Source tarballs and full release notes are available from Github Releases: https://github.com/scipy/scipy/releases/tag/v0.17.0rc2 Compared to rc1, the following PRs were merged: - - `#5624 `__: FIX: Fix interpolate - - `#5625 `__: BUG: msvc9 binaries crash when indexing std::vector of size 0 - - `#5635 `__: BUG: misspelled __dealloc__ in cKDTree. - - `#5642 `__: STY: minor fixup of formatting of 0.17.0 release notes. - - `#5643 `__: BLD: fix a build issue in special/Faddeeva.cc with isnan. - - `#5661 `__: TST: linalg tests used stdlib random instead of numpy.random. There is a bit of an unusual change in this rc. In short, in testing rc1 we found that 1) interpolate.interp1d had an undocumented feature of allowing an array-valued fill_value to be broadcast against the data being interpolated, and 2) we broke it in 0.17.0rc1. (See PR 5624 for details.) We now *think* we fixed it so that no existing code is broken. 
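(For concreteness, a sketch of the kind of code that exercises this feature — the shapes and fill values here are illustrative assumptions, not code taken from the scipy test suite:)

    import numpy as np
    from scipy.interpolate import interp1d

    x = np.arange(5.0)
    y = np.random.rand(3, 5)   # three curves, sampled at the points in x
    # Array-valued fill_value, broadcast against the non-interpolated
    # dimension of y: each curve gets its own out-of-range fill value.
    f = interp1d(x, y, axis=-1, bounds_error=False,
                 fill_value=np.array([-1.0, -2.0, -3.0]))
    print(f([-1.0, 6.0]))      # shape (3, 2); row i is filled with -(i + 1)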
Nevertheless, I would like to encourage everyone to test their code against this rc2, especially all code which uses interp1d. Cheers, Evgeni From yw5aj at virginia.edu Thu Jan 7 21:18:45 2016 From: yw5aj at virginia.edu (Yuxiang Wang) Date: Thu, 7 Jan 2016 21:18:45 -0500 Subject: [Numpy-discussion] Should I use pip install numpy in linux? Message-ID: Dear all, I know that in Windows, we should use either Christoph's package or Anaconda for MKL-optimized numpy. In Linux, the fortran compiler issue is solved, so should I directly use pip install numpy to get numpy with a reasonable BLAS library? Thanks! Shawn -- Yuxiang "Shawn" Wang Gerling Haptics Lab University of Virginia yw5aj at virginia.edu +1 (434) 284-0836 https://sites.google.com/a/virginia.edu/yw5aj/ From njs at pobox.com Thu Jan 7 22:01:53 2016 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 7 Jan 2016 19:01:53 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Thu, Jan 7, 2016 at 6:18 PM, Yuxiang Wang wrote: > Dear all, > > I know that in Windows, we should use either Christoph's package or > Anaconda for MKL-optimized numpy. In Linux, the fortran compiler issue > is solved, so should I directly use pip install numpy to get numpy > with a reasonable BLAS library? pip install numpy should work fine; whether it gives you a reasonable BLAS library will depend on whether you have the development files for a reasonable BLAS library installed, and whether numpy's build system is able to automatically locate them. Generally this means that if you're on a regular distribution and remember to install a decent BLAS -dev or -devel package, then you'll be fine. On Debian/Ubuntu, 'apt install libopenblas-dev' is probably enough to ensure something reasonable happens. Anaconda is also an option on linux if you want MKL (or openblas). -n -- Nathaniel J. Smith -- http://vorpus.org From matthew.brett at gmail.com Thu Jan 7 22:10:04 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 8 Jan 2016 03:10:04 +0000 Subject: [Numpy-discussion] ANN: second release candidate for scipy 0.17.0 In-Reply-To: References: Message-ID: Hi, On Thu, Jan 7, 2016 at 11:24 PM, Evgeni Burovski wrote: > Hi, > > I'm pleased to announce the availability of the second release > candidate for Scipy 0.17.0. It's two days ahead of the original > schedule: based on typical development patterns, I'd like to have two > weekends and a full working week before January 17th, when this rc2 is > supposed to become the final release. > > Please try this rc and report any issues on Github tracker or > scipy-dev mailing list. > Source tarballs and full release notes are available from Github > Releases: https://github.com/scipy/scipy/releases/tag/v0.17.0rc2 > > Compared to rc1, the following PRs were merged: > > - - `#5624 `__: FIX: Fix interpolate > - - `#5625 `__: BUG: msvc9 > binaries crash when indexing std::vector of size 0 > - - `#5635 `__: BUG: > misspelled __dealloc__ in cKDTree. > - - `#5642 `__: STY: minor > fixup of formatting of 0.17.0 release notes. > - - `#5643 `__: BLD: fix a > build issue in special/Faddeeva.cc with isnan. > - - `#5661 `__: TST: linalg > tests used stdlib random instead of numpy.random. > > There is a bit of an unusual change in this rc. In short, in testing > rc1 we found that 1) interpolate.interp1d had an undocumented feature > of allowing an array-valued fill_value to be broadcast against the > data being interpolated, and 2) we broke it in 0.17.0rc1.
(See PR > 5624 for details.) > We now *think* we fixed it so that no existing code is broken. > Nevertheless, I would like to encourage everyone to test their code > against this rc2, especially all code which uses interp1d. Thanks for doing this. Testing builds of OSX wheels via https://github.com/MacPython/scipy-wheels, I found this problem : https://github.com/scipy/scipy/issues/5689 Cheers, Matthew From jakirkham at gmail.com Thu Jan 7 22:48:29 2016 From: jakirkham at gmail.com (John Kirkham) Date: Thu, 7 Jan 2016 22:48:29 -0500 Subject: [Numpy-discussion] ENH: Add the function 'expand_view' Message-ID: First off, sorry for the long turnaround on responding to these questions. Below I have tried to respond to everyone's questions and comments. I have restructured the order of the messages so that my responses are a little more structured. If anybody has more thoughts or questions, please let me know. >>>> Takes an array and tacks on arbitrary dimensions on either side, which >>> is returned as a view always. Here are the relevant features: >>>> >>>> * Creates a view of the array that has the dimensions before and after >>> tacked on to it. >>>> * Takes the before and after arguments independent of each other and >>> the current shape. >>>> * Allows for read and write access to the underlying array. >>> >>> Can you expand this with some discussion of why you want this function, >>> and why you chose these specific features? (E.g. as mentioned in the PR >>> comments already, the reason broadcast_to returns a read-only array is that >>> it was decided that this was less confusing for users, not because of any >>> technical issue.) Sometimes I find that broadcasting is insufficient or gets confused with some more complex cases. Even adding the extra dimensions using `np.newaxis`/`None` doesn't always cut it. So, being able to add the dimensions with the right shape and without copying is very nice. However, I wanted to do this in a way where this would always be a view onto the original data to avoid any copying penalty. It also came in handy when I didn't know how many dimensions I would be dealing with. The alternative would be to do things like `a[..., (b.ndim-1)*(None,)]` and possibly something similar with `b`, which end up being pretty gross and unreadable, as opposed to doing `expand_view(a, reps_after=b.shape[1:])`, which is quite clear. The latter also conveys to the reader what those dimensions are doing. Finally, in cases where `None` or `np.newaxis` work fine, it is possible for you to combine them in some operation you are doing in a way which you did not intend. If broadcasting would have worked in this case, you won't get an error, but you will get the wrong answer, which you may discover later than you would like. When using `expand_view` an explicit shape can be set, which when mismatched will give you an error, allowing you to discover the problem as soon as you run the code, as opposed to afterwards, when you are trying to figure out why your answer is wrong. The write option I never actually use, so making it readonly is fine. I just never blocked that behavior. Perhaps it would be preferable, as has been suggested. Ensuring a view is nice because this remains performant for large arrays. Using a view seems reasonable here given that we are never trying to restructure the data; instead, we are trying to make it appear as if its shape were different. > How is this different from using np.newaxis and broadcasting? Or am I > misunderstanding this?
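(A rough, hypothetical sketch of the behavior being proposed — not the PR's actual implementation. It illustrates the zero-stride view idea via np.lib.stride_tricks.as_strided; the reps_after keyword is taken from the usage shown above, while reps_before and the rest of the signature are assumptions:)

    import numpy as np
    from numpy.lib.stride_tricks import as_strided

    def expand_view(a, reps_before=(), reps_after=()):
        # Sketch: tack broadcast dimensions onto either side of a.shape,
        # returned as a view. The new axes get stride 0, so no data is
        # copied; each element of `a` simply repeats along them.
        a = np.asarray(a)
        shape = tuple(reps_before) + a.shape + tuple(reps_after)
        strides = (0,) * len(reps_before) + a.strides + (0,) * len(reps_after)
        return as_strided(a, shape=shape, strides=strides)

    a = np.arange(3.0)
    v = expand_view(a, reps_after=(4,))
    print(v.shape)   # (3, 4)
    print(v[1])      # [ 1.  1.  1.  1.] -- a[1] repeated along the new axis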
I hope the description above also helps clarify this a bit. To be more succinct and explicit: * This can always be used even when broadcasting may get confused. * Works cleanly and easily with variable numbers of dimensions, unlike `np.newaxis`/`None`. * Using explicit shapes more tightly constrains behavior with the resulting view (helping catch bugs). * Provides the reader more information about how the other dimensions should work. >> Why is this a stride_trick? >> >> I thought this looks similar to expand_dims and could maybe be implemented >> with some extra options there. So, `expand_dims` adds a single dimension with a length of 1 where specified, but not multiple. Though I am going to infer that by extra options you mean changing it in some way so that it can add multiple axes. Though I suppose you could do this, and it may even be worth pursuing, it ends up being a similar idea to using `np.newaxis`/`None` a few times, and so I have basically the same points to make here as I made in the last section's response. Also, I feel like using `expand_dims` with more axes specified might make it a little harder to follow than just using `np.newaxis`/`None`, but it could have its merits. From a performance standpoint, `expand_dims` uses `reshape` to add these extra dimensions. So, it has the possibility of not returning a view, but a copy, which could take some time to build if the array is large. By using our method, we guarantee a view will always be returned; so, this allocation will never be encountered. From yw5aj at virginia.edu Fri Jan 8 11:28:04 2016 From: yw5aj at virginia.edu (Yuxiang Wang) Date: Fri, 8 Jan 2016 11:28:04 -0500 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: Dear Nathaniel, Gotcha. That's very helpful. Thank you so much!
> > Shawn > > On Thu, Jan 7, 2016 at 10:01 PM, Nathaniel Smith wrote: >> On Thu, Jan 7, 2016 at 6:18 PM, Yuxiang Wang wrote: >>> Dear all, >>> >>> I know that in Windows, we should use either Christoph's package or >>> Anaconda for MKL-optimized numpy. In Linux, the fortran compiler issue >>> is solved, so should I directly used pip install numpy to get numpy >>> with a reasonable BLAS library? >> >> pip install numpy should work fine; whether it gives you a reasonable >> BLAS library will depend on whether you have the development files for >> a reasonable BLAS library installed, and whether numpy's build system >> is able to automatically locate them. Generally this means that if >> you're on a regular distribution and remember to install a decent BLAS >> -dev or -devel package, then you'll be fine. >> >> On Debian/Ubuntu, 'apt install libopenblas-dev' is probably enough to >> ensure something reasonable happens. >> >> Anaconda is also an option on linux if you want MKL (or openblas). I wrote a page on using pip with Debian / Ubuntu here : https://matthew-brett.github.io/pydagogue/installing_on_debian.html Cheers, Matthew From rmcgibbo at gmail.com Fri Jan 8 14:07:32 2016 From: rmcgibbo at gmail.com (Robert McGibbon) Date: Fri, 8 Jan 2016 11:07:32 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: Does anyone know if there's been any movements with the PyPI folks on allowing linux wheels to be uploaded? I know you can never be certain what's provided by the distro, but it seems like if Anaconda can solve the cross-distro-binary-distribution-of-compiled-python-extensions problem, there shouldn't be much technically different for Linux wheels. -Robert On Fri, Jan 8, 2016 at 9:12 AM, Matthew Brett wrote: > Hi, > > On Fri, Jan 8, 2016 at 4:28 PM, Yuxiang Wang wrote: > > Dear Nathaniel, > > > > Gotcha. That's very helpful. Thank you so much! > > > > Shawn > > > > On Thu, Jan 7, 2016 at 10:01 PM, Nathaniel Smith wrote: > >> On Thu, Jan 7, 2016 at 6:18 PM, Yuxiang Wang > wrote: > >>> Dear all, > >>> > >>> I know that in Windows, we should use either Christoph's package or > >>> Anaconda for MKL-optimized numpy. In Linux, the fortran compiler issue > >>> is solved, so should I directly used pip install numpy to get numpy > >>> with a reasonable BLAS library? > >> > >> pip install numpy should work fine; whether it gives you a reasonable > >> BLAS library will depend on whether you have the development files for > >> a reasonable BLAS library installed, and whether numpy's build system > >> is able to automatically locate them. Generally this means that if > >> you're on a regular distribution and remember to install a decent BLAS > >> -dev or -devel package, then you'll be fine. > >> > >> On Debian/Ubuntu, 'apt install libopenblas-dev' is probably enough to > >> ensure something reasonable happens. > >> > >> Anaconda is also an option on linux if you want MKL (or openblas). > > I wrote a page on using pip with Debian / Ubuntu here : > https://matthew-brett.github.io/pydagogue/installing_on_debian.html > > Cheers, > > Matthew > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From oscar.j.benjamin at gmail.com Fri Jan 8 15:06:53 2016 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Fri, 8 Jan 2016 20:06:53 +0000 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On 8 Jan 2016 19:07, "Robert McGibbon" wrote: > > Does anyone know if there's been any movements with the PyPI folks on allowing linux wheels to be uploaded? > > I know you can never be certain what's provided by the distro, but it seems like if Anaconda can solve the cross-distro-binary-distribution-of-compiled-python-extensions problem, there shouldn't be much technically different for Linux wheels. > > -Robert > > On Fri, Jan 8, 2016 at 9:12 AM, Matthew Brett wrote: >> >> Hi, >> >> On Fri, Jan 8, 2016 at 4:28 PM, Yuxiang Wang wrote: >> > Dear Nathaniel, >> > >> > Gotcha. That's very helpful. Thank you so much! >> > >> > Shawn >> > >> > On Thu, Jan 7, 2016 at 10:01 PM, Nathaniel Smith wrote: >> >> On Thu, Jan 7, 2016 at 6:18 PM, Yuxiang Wang wrote: >> >>> Dear all, >> >>> >> >>> I know that in Windows, we should use either Christoph's package or >> >>> Anaconda for MKL-optimized numpy. In Linux, the fortran compiler issue >> >>> is solved, so should I directly used pip install numpy to get numpy >> >>> with a reasonable BLAS library? >> >> >> >> pip install numpy should work fine; whether it gives you a reasonable >> >> BLAS library will depend on whether you have the development files for >> >> a reasonable BLAS library installed, and whether numpy's build system >> >> is able to automatically locate them. Generally this means that if >> >> you're on a regular distribution and remember to install a decent BLAS >> >> -dev or -devel package, then you'll be fine. >> >> >> >> On Debian/Ubuntu, 'apt install libopenblas-dev' is probably enough to >> >> ensure something reasonable happens. >> >> >> >> Anaconda is also an option on linux if you want MKL (or openblas). >> >> I wrote a page on using pip with Debian / Ubuntu here : >> https://matthew-brett.github.io/pydagogue/installing_on_debian.html >> >> Cheers, >> >> Matthew >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From oscar.j.benjamin at gmail.com Fri Jan 8 15:12:06 2016 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Fri, 8 Jan 2016 20:12:06 +0000 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On 8 Jan 2016 19:07, "Robert McGibbon" wrote: > > Does anyone know if there's been any movements with the PyPI folks on allowing linux wheels to be uploaded? > > I know you can never be certain what's provided by the distro, but it seems like if Anaconda can solve the cross-distro-binary-distribution-of-compiled-python-extensions problem, there shouldn't be much technically different for Linux wheels. Anaconda controls all of the dependent non-Python libraries which are outside of the pip/pypi ecosystem. Pip/wheel doesn't have that option until such libraries are packaged up for PyPI (e.g. pyopenblas). -- Oscar -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From rmcgibbo at gmail.com  Fri Jan  8 16:58:18 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Fri, 8 Jan 2016 13:58:18 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID:

Well, it's always possible to copy the dependencies like libopenblas.so into the wheel and fix up the RPATHs, similar to the way the Windows wheels work. I'm not sure if this is the right path for numpy or not, but it seems like something that would be suitable for some projects with compiled extensions. But it's categorically ruled out by the PyPI policy, IIUC.

Perhaps this is OT for this thread, and I should ask on distutils-sig.

-Robert

On Fri, Jan 8, 2016 at 12:12 PM, Oscar Benjamin wrote:
>
> On 8 Jan 2016 19:07, "Robert McGibbon" wrote:
> >
> > Does anyone know if there's been any movements with the PyPI folks on
> allowing linux wheels to be uploaded?
> >
> > I know you can never be certain what's provided by the distro, but it
> seems like if Anaconda can solve the
> cross-distro-binary-distribution-of-compiled-python-extensions problem,
> there shouldn't be much technically different for Linux wheels.
>
> Anaconda controls all of the dependent non-Python libraries which are
> outside of the pip/pypi ecosystem. Pip/wheel doesn't have that option until
> such libraries are packaged up for PyPI (e.g. pyopenblas).
>
> --
> Oscar
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From chris.barker at noaa.gov  Fri Jan  8 18:27:31 2016
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 8 Jan 2016 15:27:31 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID:

On Fri, Jan 8, 2016 at 1:58 PM, Robert McGibbon wrote:

> I'm not sure if this is the right path for numpy or not,

Probably not -- AFAICT, the PyPA folks aren't interested in solving the problems we have in the scipy community -- we can tweak around the edges, but we won't get there without a commitment to really solve the issues -- and if pip did that, it would essentially be conda -- no one wants to re-implement conda.

> Perhaps this is OT for this thread, and I should ask on distutils-sig.

There has been a lot of discussion of this issue on the distutils-sig in the last couple of months (quiet lately). So yes, that's the place to go.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From matthew.brett at gmail.com  Fri Jan  8 18:50:31 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Fri, 8 Jan 2016 23:50:31 +0000
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID:

Hi,

On Fri, Jan 8, 2016 at 11:27 PM, Chris Barker wrote:
> On Fri, Jan 8, 2016 at 1:58 PM, Robert McGibbon wrote:
>>
>> I'm not sure if this is the right path for numpy or not,
>
> probably not -- AFAICT, the PyPa folks aren't interested in solving teh
> problems we have in the scipy community -- we can tweak around the edges,
> but we wont get there without a commitment to really solve the issues -- and
> if pip did that, it would essentially be conda -- non one wants to
> re-impliment conda.

Well - as the OP was implying, it really should not be too difficult.

We (here in Berkeley) have discussed how to do this for Linux, including (Nathaniel mainly) what would be sensible for pypi to do, in terms of platform labels.

Both Anaconda and Canopy build on a base default Linux system so that the built binaries will work on many Linux systems.

At the moment, Linux wheels have the platform tag of either linux_i686 (32-bit) or linux_x86_64 - example filenames:

numpy-1.9.2-cp27-none-linux_i686.whl
numpy-1.9.2-cp27-none-linux_x86_64.whl

Obviously these platform tags are rather useless, because they don't tell you very much about whether this wheel will work on your own system.

If we started building Linux wheels on a base system like that of Anaconda or Canopy, we might like another platform tag that tells you that this wheel is compatible with a wide range of systems. So the job of negotiating with distutils-sig is trying to find a good name for this base system - we thought that 'manylinux' was a good one - and then put in a pull request to pip to recognize 'manylinux' as compatible when running pip install from a range of Linux systems.

Cheers,

Matthew

From rmcgibbo at gmail.com  Fri Jan  8 19:31:12 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Fri, 8 Jan 2016 16:31:12 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID:

> Both Anaconda and Canopy build on a base default Linux system so that
> the built binaries will work on many Linux systems.

I think the base linux system is CentOS 5, and from my experience, it seems like this approach has worked very well. Those packages are compatible with essentially all Linuxes that are more recent than CentOS 5 (which is ancient). I have not heard of anyone complaining that the packages they install through conda don't work on their CentOS 4 or Ubuntu 6.06 box. I assume Python / pip is probably used on a wider diversity of linux flavors than conda is, so I'm sure that binaries built on CentOS 5 won't work for absolutely _every_ linux user, but it does seem to cover the substantial majority of linux users.

Building redistributable linux binaries that work across a large number of distros and distro versions is definitely tricky. If you run ``python setup.py bdist_wheel`` on your Fedora Rawhide box, you can't really expect the wheel to work for too many other linux users. So given that, I can see why PyPI would want to be careful about accepting Linux wheels.

But it seems like, if they make the upload something like

```
twine upload numpy-1.9.2-cp27-none-linux_x86_64.whl \
  --yes-yes-i-know-this-is-dangerous-but-i-know-what-i'm-doing
```

that this would potentially be able to let packages like numpy serve their linux users better without risking too much junk being uploaded to PyPI.
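(As an aside: it is easy to audit what a given wheel actually links against before uploading it anywhere. A wheel is just a zip file, so a few lines of Python are enough -- this is only a sketch, and it assumes `ldd` is on the PATH:)

    import os
    import subprocess
    import zipfile

    # Unpack the wheel and run ldd over every extension module to see
    # which system libraries it would pull in at import time.
    zipfile.ZipFile('numpy-1.9.2-cp27-none-linux_x86_64.whl').extractall('wheel_contents')
    for root, dirs, files in os.walk('wheel_contents'):
        for name in files:
            if name.endswith('.so'):
                path = os.path.join(root, name)
                print(path)
                print(subprocess.check_output(['ldd', path]).decode())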
-Robert On Fri, Jan 8, 2016 at 3:50 PM, Matthew Brett wrote: > Hi, > > On Fri, Jan 8, 2016 at 11:27 PM, Chris Barker > wrote: > > On Fri, Jan 8, 2016 at 1:58 PM, Robert McGibbon > wrote: > >> > >> I'm not sure if this is the right path for numpy or not, > > > > > > probably not -- AFAICT, the PyPa folks aren't interested in solving teh > > problems we have in the scipy community -- we can tweak around the edges, > > but we wont get there without a commitment to really solve the issues -- > and > > if pip did that, it would essentially be conda -- non one wants to > > re-impliment conda. > > Well - as the OP was implying, it really should not be too difficult. > > We (here in Berkeley) have discussed how to do this for Linux, > including (Nathaniel mainly) what would be sensible for pypi to do, in > terms of platform labels. > > Both Anaconda and Canopy build on a base default Linux system so that > the built binaries will work on many Linux systems. > > At the moment, Linux wheels have the platform tag of either linux_i686 > (32-bit) or linux_x86_64 - example filenames: > > numpy-1.9.2-cp27-none-linux_i686.whl > numpy-1.9.2-cp27-none-linux_x86_64.whl > > Obviously these platform tags are rather useless, because they don't > tell you very much about whether this wheel will work on your own > system. > > If we started building Linux wheels on a base system like that of > Anaconda or Canopy we might like another platform tag that tells you > that this wheel is compatible with a wide range of systems. So the > job of negotiating with distutils-sig is trying to find a good name > for this base system - we thought that 'manylinux' was a good one - > and then put in a pull request to pip to recognize 'manylinux' as > compatible when running pip install from a range of Linux systems. > > Cheers, > > Matthew > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Fri Jan 8 19:36:32 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 9 Jan 2016 00:36:32 +0000 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Sat, Jan 9, 2016 at 12:31 AM, Robert McGibbon wrote: >> Both Anaconda and Canopy build on a base default Linux system so that >> the built binaries will work on many Linux systems. > > I think the base linux system is CentOS 5, and from my experience, it seems > like this approach > has worked very well. Those packages are compatible with all essentially all > Linuxes that are > more recent than CentOS 5 (which is ancient). I have not heard of anyone > complaining that the > packages they install through conda don't work on their CentOS 4 or Ubuntu > 6.06 box. I assume > Python / pip is probably used on a wider diversity of linux flavors than > conda is, so I'm sure that > binaries built on CentOS 5 won't work for absolutely _every_ linux user, but > it does seem to > cover the substantial majority of linux users. > > Building redistributable linux binaries that work across a large number of > distros and distro > versions is definitely tricky. If you run ``python setup.py bdist_wheel`` on > your Fedora Rawhide > box, you can't really expect the wheel to work for too many other linux > users. So given that, I > can see why PyPI would want to be careful about accepting Linux wheels. 
>
> But it seems like, if they make the upload something like
>
> ```
> twine upload numpy-1.9.2-cp27-none-linux_x86_64.whl \
> --yes-yes-i-know-this-is-dangerous-but-i-know-what-i'm-doing
> ```
>
> that this would potentially be able to let packages like numpy serve their
> linux users better without risking too much junk being uploaded to PyPI.

I could well understand it if the PyPA folks thought that was a bad idea. There are so many Linux distributions, and therefore so many ways for this to go wrong, that the most likely outcome would be a relentless flood of people saying "ouch this doesn't work for me", and therefore decreased trust for pip / Linux in general.

On the other hand, having a base build system and matching platform tag seems like it is well within reach, and, if we provided proof of concept, I guess that the PyPA would agree.

Cheers,

Matthew

From njs at pobox.com  Fri Jan  8 22:13:56 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 8 Jan 2016 19:13:56 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID:

On Fri, Jan 8, 2016 at 4:31 PM, Robert McGibbon wrote:
>> Both Anaconda and Canopy build on a base default Linux system so that
>> the built binaries will work on many Linux systems.
>
> I think the base linux system is CentOS 5, and from my experience, it seems
> like this approach
> has worked very well. Those packages are compatible with all essentially all
> Linuxes that are
> more recent than CentOS 5 (which is ancient). I have not heard of anyone
> complaining that the
> packages they install through conda don't work on their CentOS 4 or Ubuntu
> 6.06 box.

Right. There's a small problem, which is that the base linux system isn't just "CentOS 5", it's "CentOS 5 and here's the list of libraries that you're allowed to link to: ...", where that list is empirically chosen to include only stuff that really is installed on ~all linux machines and for which the ABI really has been stable in practice over multiple years and distros (so e.g. no OpenSSL).

So the key next step is for someone to figure out and write down that list. Continuum and Enthought both have versions of it that we know are good...

Does anyone know who maintains Anaconda's linux build environment?

> I assume
> Python / pip is probably used on a wider diversity of linux flavors than
> conda is, so I'm sure that
> binaries built on CentOS 5 won't work for absolutely _every_ linux user, but
> it does seem to
> cover the substantial majority of linux users.
>
> Building redistributable linux binaries that work across a large number of
> distros and distro
> versions is definitely tricky. If you run ``python setup.py bdist_wheel`` on
> your Fedora Rawhide
> box, you can't really expect the wheel to work for too many other linux
> users. So given that, I
> can see why PyPI would want to be careful about accepting Linux wheels.
>
> But it seems like, if they make the upload something like
>
> ```
> twine upload numpy-1.9.2-cp27-none-linux_x86_64.whl \
> --yes-yes-i-know-this-is-dangerous-but-i-know-what-i'm-doing
> ```
>
> that this would potentially be able to let packages like numpy serve their
> linux
> users better without risking too much junk being uploaded to PyPI.

That will never fly. But like Matthew says, I think we can probably get them to accept a PEP saying "here's a new well-specified platform tag that means that this wheel works on all linux systems that meet the following list of criteria: ...", and then allow that new platform tag onto PyPI.
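To give a flavor of what the installer side could look like: a deliberately oversimplified sketch that keys only on the glibc version (the real criteria would be the whole library list, not just glibc, and the tag name is made up here):

    import ctypes

    def have_compatible_glibc(major, minimum_minor):
        # glibc is the one library everything links against, so its
        # version makes a cheap first-pass check for, e.g., a
        # hypothetical "manylinux" tag keyed to CentOS 5 (glibc 2.5).
        process_namespace = ctypes.CDLL(None)
        gnu_get_libc_version = process_namespace.gnu_get_libc_version
        gnu_get_libc_version.restype = ctypes.c_char_p
        version = gnu_get_libc_version().decode('ascii')  # e.g. '2.17'
        found_major, found_minor = [int(x) for x in version.split('.')[:2]]
        return (found_major, found_minor) >= (major, minimum_minor)

    print(have_compatible_glibc(2, 5))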
-n -- Nathaniel J. Smith -- http://vorpus.org From nathan12343 at gmail.com Fri Jan 8 22:17:51 2016 From: nathan12343 at gmail.com (Nathan Goldbaum) Date: Fri, 8 Jan 2016 21:17:51 -0600 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: Doesn't building on CentOS 5 also mean using a quite old version of gcc? I've never tested this, but I've seen claims on the anaconda mailing list of ~25% slowdowns compared to building from source or using system packages, which was attributed to building using an older gcc that doesn't optimize as well as newer versions. On Fri, Jan 8, 2016 at 9:13 PM, Nathaniel Smith wrote: > On Fri, Jan 8, 2016 at 4:31 PM, Robert McGibbon > wrote: > >> Both Anaconda and Canopy build on a base default Linux system so that > >> the built binaries will work on many Linux systems. > > > > I think the base linux system is CentOS 5, and from my experience, it > seems > > like this approach > > has worked very well. Those packages are compatible with all essentially > all > > Linuxes that are > > more recent than CentOS 5 (which is ancient). I have not heard of anyone > > complaining that the > > packages they install through conda don't work on their CentOS 4 or > Ubuntu > > 6.06 box. > > Right. There's a small problem which is that the base linux system > isn't just "CentOS 5", it's "CentOS 5 and here's the list of libraries > that you're allowed to link to: ...", where that list is empirically > chosen to include only stuff that really is installed on ~all linux > machines and for which the ABI really has been stable in practice over > multiple years and distros (so e.g. no OpenSSL). > > So the key next step is for someone to figure out and write down that > list. Continuum and Enthought both have versions of it that we know > are good... > > Does anyone know who maintains Anaconda's linux build environment? > > > I assume > > Python / pip is probably used on a wider diversity of linux flavors than > > conda is, so I'm sure that > > binaries built on CentOS 5 won't work for absolutely _every_ linux user, > but > > it does seem to > > cover the substantial majority of linux users. > > > > Building redistributable linux binaries that work across a large number > of > > distros and distro > > versions is definitely tricky. If you run ``python setup.py > bdist_wheel`` on > > your Fedora Rawhide > > box, you can't really expect the wheel to work for too many other linux > > users. So given that, I > > can see why PyPI would want to be careful about accepting Linux wheels. > > > > But it seems like, if they make the upload something like > > > > ``` > > twine upload numpy-1.9.2-cp27-none-linux_x86_64.whl \ > > --yes-yes-i-know-this-is-dangerous-but-i-know-what-i'm-doing > > ``` > > > > that this would potentially be able to let packages like numpy serve > their > > linux > > users better without risking too much junk being uploaded to PyPI. > > That will never fly. But like Matthew says, I think we can probably > get them to accept a PEP saying "here's a new well-specified platform > tag that means that this wheel works on all linux systems meet the > following list of criteria: ...", and then allow that new platform tag > onto PyPI. > > -n > > -- > Nathaniel J. 
Smith -- http://vorpus.org > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Fri Jan 8 22:38:19 2016 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 8 Jan 2016 19:38:19 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Fri, Jan 8, 2016 at 7:17 PM, Nathan Goldbaum wrote: > Doesn't building on CentOS 5 also mean using a quite old version of gcc? Yes. IIRC CentOS 5 ships with gcc 4.4, and you can bump that up to gcc 4.8 by using the Redhat Developer Toolset release (which is gcc + special backport libraries to let it generate RHEL5/CentOS5-compatible binaries). (I might have one or both of those version numbers slightly wrong.) > I've never tested this, but I've seen claims on the anaconda mailing list of > ~25% slowdowns compared to building from source or using system packages, > which was attributed to building using an older gcc that doesn't optimize as > well as newer versions. I'd be very surprised if that were a 25% slowdown in general, as opposed to a 25% slowdown on some particular inner loop that happened to neatly match some new feature in a new gcc (e.g. something where the new autovectorizer kicked in). But yeah, in general this is just an inevitable trade-off when it comes to distributing binaries: you're always going to pay some penalty for achieving broad compatibility as compared to artisanally hand-tuned binaries specialized for your machine's exact OS version, processor, etc. Not much to be done, really. At some point the baseline for compatibility will switch to "compile everything on CentOS 6", and that will be better but it will still be worse than binaries that target CentOS 7, and so on and so forth. -n -- Nathaniel J. Smith -- http://vorpus.org From rmcgibbo at gmail.com Fri Jan 8 22:41:26 2016 From: rmcgibbo at gmail.com (Robert McGibbon) Date: Fri, 8 Jan 2016 19:41:26 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: > Doesn't building on CentOS 5 also mean using a quite old version of gcc? I have had pretty good luck using the (awesomely named) Holy Build Box , which is a CentOS 5 docker image with a newer gcc version installed (but I guess the same old libc). I'm not 100% sure how it works, but it's quite nice. For example, you can use c++11 and still keep all the binary compatibility benefits of CentOS 5. -Robert On Fri, Jan 8, 2016 at 7:38 PM, Nathaniel Smith wrote: > On Fri, Jan 8, 2016 at 7:17 PM, Nathan Goldbaum > wrote: > > Doesn't building on CentOS 5 also mean using a quite old version of gcc? > > Yes. IIRC CentOS 5 ships with gcc 4.4, and you can bump that up to gcc > 4.8 by using the Redhat Developer Toolset release (which is gcc + > special backport libraries to let it generate RHEL5/CentOS5-compatible > binaries). (I might have one or both of those version numbers slightly > wrong.) > > > I've never tested this, but I've seen claims on the anaconda mailing > list of > > ~25% slowdowns compared to building from source or using system packages, > > which was attributed to building using an older gcc that doesn't > optimize as > > well as newer versions. 
> > I'd be very surprised if that were a 25% slowdown in general, as > opposed to a 25% slowdown on some particular inner loop that happened > to neatly match some new feature in a new gcc (e.g. something where > the new autovectorizer kicked in). But yeah, in general this is just > an inevitable trade-off when it comes to distributing binaries: you're > always going to pay some penalty for achieving broad compatibility as > compared to artisanally hand-tuned binaries specialized for your > machine's exact OS version, processor, etc. Not much to be done, > really. At some point the baseline for compatibility will switch to > "compile everything on CentOS 6", and that will be better but it will > still be worse than binaries that target CentOS 7, and so on and so > forth. > > -n > > -- > Nathaniel J. Smith -- http://vorpus.org > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Fri Jan 8 23:03:44 2016 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 8 Jan 2016 20:03:44 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Fri, Jan 8, 2016 at 7:41 PM, Robert McGibbon wrote: >> Doesn't building on CentOS 5 also mean using a quite old version of gcc? > > I have had pretty good luck using the (awesomely named) Holy Build Box, > which is a CentOS 5 docker image with a newer gcc version installed (but I > guess the same old libc). I'm not 100% sure how it works, but it's quite > nice. For example, you can use c++11 and still keep all the binary > compatibility benefits of CentOS 5. They say they have gcc 4.8: https://github.com/phusion/holy-build-box#isolated-build-environment-based-on-docker-and-centos-5 so I bet they're using RH's devtools gcc. This means that it works via the labor of some unsung programmers at RH who went through all the library changes between gcc 4.4 and 4.8, and put together a version of 4.8 that for every important symbol knows whether it's available in the old 4.4 libraries or not; for the ones that are, it dynamically links them; for the ones that aren't, it has a special static library that it pulls them out of. Like sewer cleaning, it's the kind of very impressive, incredibly valuable infrastructure work that I'm really glad someone does. Someone else who's not me... Continuum and Enthought both have a whole list of packages beyond glibc that are safe enough to link to, including a bunch of ones that would be big pains to statically link everywhere (libX11, etc.). That's the useful piece of information that goes beyond just CentOS5 + RH devtools + static linking -- can't tell of the "Holy Build Box" has anything like that. -n -- Nathaniel J. Smith -- http://vorpus.org From rmcgibbo at gmail.com Fri Jan 8 23:08:01 2016 From: rmcgibbo at gmail.com (Robert McGibbon) Date: Fri, 8 Jan 2016 20:08:01 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: > Continuum and Enthought both have a whole list of packages beyond glibc that are safe enough to link to, including a bunch of ones that would be big pains to statically link everywhere (libX11, etc.). That's the useful piece of information that goes beyond just CentOS5 + RH devtools + static linking -- can't tell of the "Holy Build Box" has anything like that. 
Probably-crazy Idea: One could reconstruct that list by downloading all of https://repo.continuum.io/pkgs/free/linux-64/, untarring everything, and running `ldd` on all of the binaries and .so files. Can't be that hard... right? -Robert On Fri, Jan 8, 2016 at 8:03 PM, Nathaniel Smith wrote: > On Fri, Jan 8, 2016 at 7:41 PM, Robert McGibbon > wrote: > >> Doesn't building on CentOS 5 also mean using a quite old version of gcc? > > > > I have had pretty good luck using the (awesomely named) Holy Build Box, > > which is a CentOS 5 docker image with a newer gcc version installed (but > I > > guess the same old libc). I'm not 100% sure how it works, but it's quite > > nice. For example, you can use c++11 and still keep all the binary > > compatibility benefits of CentOS 5. > > They say they have gcc 4.8: > > https://github.com/phusion/holy-build-box#isolated-build-environment-based-on-docker-and-centos-5 > so I bet they're using RH's devtools gcc. This means that it works via > the labor of some unsung programmers at RH who went through all the > library changes between gcc 4.4 and 4.8, and put together a version of > 4.8 that for every important symbol knows whether it's available in > the old 4.4 libraries or not; for the ones that are, it dynamically > links them; for the ones that aren't, it has a special static library > that it pulls them out of. Like sewer cleaning, it's the kind of very > impressive, incredibly valuable infrastructure work that I'm really > glad someone does. Someone else who's not me... > > Continuum and Enthought both have a whole list of packages beyond > glibc that are safe enough to link to, including a bunch of ones that > would be big pains to statically link everywhere (libX11, etc.). > That's the useful piece of information that goes beyond just CentOS5 + > RH devtools + static linking -- can't tell of the "Holy Build Box" has > anything like that. > > -n > > -- > Nathaniel J. Smith -- http://vorpus.org > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Fri Jan 8 23:19:29 2016 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 8 Jan 2016 20:19:29 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Jan 8, 2016 20:08, "Robert McGibbon" wrote: > > > Continuum and Enthought both have a whole list of packages beyond > glibc that are safe enough to link to, including a bunch of ones that > would be big pains to statically link everywhere (libX11, etc.). > That's the useful piece of information that goes beyond just CentOS5 + > RH devtools + static linking -- can't tell of the "Holy Build Box" has > anything like that. > > Probably-crazy Idea: One could reconstruct that list by downloading all of > https://repo.continuum.io/pkgs/free/linux-64/, untarring everything, and > running `ldd` on all of the binaries and .so files. Can't be that hard... right? You'd have to be slightly careful to not count libraries that they ship themselves, and then use the resulting list to recreate a build environment that contains those libraries (using a dockerfile, I guess). But yeah, should work. Are you feeling inspired? :-) -n -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From sebastian at sipsolutions.net  Sat Jan  9 05:34:54 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Sat, 09 Jan 2016 11:34:54 +0100
Subject: [Numpy-discussion] ENH: Add the function 'expand_view'
In-Reply-To:
References:
Message-ID: <1452335694.2827.20.camel@sipsolutions.net>

On Do, 2016-01-07 at 22:48 -0500, John Kirkham wrote:
> First, off sorry for the long turnaround on responding to these
> questions. Below I have tried to respond to everyone's questions and
> comments. I have restructured the order of the messages so that my
> responses are a little more structured. If anybody has more thoughts
> or questions, please let me know.
>
> From a performance standpoint, `expand_dims` using `reshape` to add
> these extra dimensions. So, it has the possibility of not returning a
> view, but a copy, which could take some time to build if the array is
> large. By using our method, we guarantee a view will always be
> returned; so, this allocation will never be encountered.

Actually, reshape can be used here: if stride tricks can do the trick, then reshape is guaranteed not to make a copy.

Still, I can't say I feel it is a worthy addition (but others may disagree), especially since I realized we have `expand_dims` already. I am just not sure it would actually be used reasonably often. But how about adding a `dims=1` argument to `expand_dims`, so that your function becomes at least a bit easier:

newarr = np.expand_dims(arr, -1, after.ndim)
# If manual broadcast is necessary:
newarr = np.broadcast_to(arr, before.shape + arr.shape + after.shape)

Otherwise I guess we could morph `expand_dims` partially into your function, though it would return a read-only view if broadcasting is active, so I am a bit unsure I like it.

In other words, my personal currently preferred solution to get some of this would be to add a simple `dims` argument to `expand_dims` and add `broadcast_to` to its "See Also" section (it could also be in the `squeeze` "See Also" section, I think). To further help out other people, we could maybe even mention the combination in the examples (i.e. a broadcast_to example?).
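To spell the idea out (note that the `dims` keyword is hypothetical -- today `expand_dims` inserts exactly one axis per call):

    import numpy as np

    arr = np.arange(6).reshape(2, 3)

    # today: one length-1 axis per call
    newarr = np.expand_dims(np.expand_dims(arr, -1), -1)
    assert newarr.shape == (2, 3, 1, 1)

    # proposed: np.expand_dims(arr, -1, dims=2) would do the same in one
    # call, and broadcast_to can then stretch the new axes without copying:
    stretched = np.broadcast_to(newarr, (2, 3, 5, 6))
    assert stretched.shape == (2, 3, 5, 6)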
- Sebastian

> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL:

From oscar.j.benjamin at gmail.com  Sat Jan  9 06:39:22 2016
From: oscar.j.benjamin at gmail.com (Oscar Benjamin)
Date: Sat, 9 Jan 2016 11:39:22 +0000
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID:

On 8 January 2016 at 23:27, Chris Barker wrote:
> On Fri, Jan 8, 2016 at 1:58 PM, Robert McGibbon wrote:
>>
>> I'm not sure if this is the right path for numpy or not,
>
> probably not -- AFAICT, the PyPa folks aren't interested in solving teh
> problems we have in the scipy community -- we can tweak around the edges,
> but we wont get there without a commitment to really solve the issues -- and
> if pip did that, it would essentially be conda -- non one wants to
> re-impliment conda.

I think that's a little unfair to the PyPA people. They would like to solve all of these problems; it's just a question of priority and expertise. As always in open source, you have to scratch your own itch, and those guys are working on other things, like the security, stability and scalability of the infrastructure, consistency of pip's version handling and dependency resolution, etc.

Linux wheels is a problem that has been discussed on distutils-sig. The reason it hasn't happened is that it's a lower priority than wheels for OSX/Windows because:

1) Most distros already package this stuff, i.e. apt-get numpy.
2) On Linux it's much easier to get the appropriate compilers, so that pip can build e.g. numpy.
3) The average Linux user is more capable of solving these problems.
4) Getting binary distribution to work across all Linux distributions is significantly harder than for Windows/OSX because of the myriad different distros/versions.

Considering point 2, pip install numpy etc. already works for a lot of Linux users, even if it is slow because of the time taken to compile (3 minutes on my system). Depending on your use case, that problem is partially solved by wheel caching. So if Linux wheels were allowed but didn't always work, that would be a regression for many users. On OSX binary wheels for numpy are already available and work fine AFAIK. The absence of binary numpy wheels for Windows is not down to PyPA.

Considering point 4, the idea of compiling on an old base Linux system has been discussed on distutils-sig before, and it seems likely to work. The problem is really about the external non-libc dependencies though. The reason progress there has stalled is not because the PyPA folks don't want to solve it, but rather because they have other priorities and are hoping that people with more expertise in that area will step up to address those problems. Most of the issues stem from the scientific Python community, so ideally someone from the scientific Python community would address how to solve them.

Recently Nathaniel brought some suggestions to distutils-sig to address the problem of build-requires, which is a particular pain point. I think that people there appreciated the effort from someone who understands the needs of hard-to-build packages to improve the way that pip/PyPI works in that area. There was a lot of confusion from people not understanding each other's needs, but ultimately I thought there was agreement on how to move forward. (Although what happened to that in the end?) The same can happen with other problems like Linux wheels.

If you guys here have a clear idea of how to solve the external dependency problem then I'm sure they'll be receptive. Personally I think the best approach is the pyopenblas approach: internalise the external dependency so that pip can work with it. This is precisely what Anaconda does, and there's actually no need to make substantive changes to the way pip/pypi/wheel works in order to achieve that. It just needs someone to package the external dependencies as sdist/wheel (and for PyPI to allow Linux wheels).

--
Oscar

From rmcgibbo at gmail.com  Sat Jan  9 06:52:39 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Sat, 9 Jan 2016 03:52:39 -0800
Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?]
Message-ID:

Hi all,

I went ahead and tried to collect a list of all of the libraries that could be considered to constitute the "base" system for linux-64.
The strategy I used was to leverage off the work done by the folks at Continuum by searching through their pre-compiled binaries from https://repo.continuum.io/pkgs/free/linux-64/ to find shared libraries that were dependened on (according to ldd) that were not accounted for by the declared dependencies that each package made known to the conda package manager. The full list of these system libraries, sorted in from most-commonly-depend-on to rarest, is below. There are 158 of them. ['linux-vdso.so.1', 'libc.so.6', 'libpthread.so.0', 'libm.so.6', 'libdl.so.2', 'libutil.so.1', 'libgcc_s.so.1', 'libstdc++.so.6', 'libexpat.so.1', 'librt.so.1', 'libpng12.so.0', 'libcrypt.so.1', 'libffi.so.6', 'libresolv.so.2', 'libkeyutils.so.1', 'libcom_err.so.2', 'libp11-kit.so.0', 'libkrb5.so.26', 'libheimntlm.so.0', 'libtasn1.so.6', 'libheimbase.so.1', 'libgssapi.so.3', 'libroken.so.18', 'libhcrypto.so.4', 'libhogweed.so.4', 'libnettle.so.6', 'libhx509.so.5', 'libwind.so.0', 'libgnutls-deb0.so.28', 'libasn1.so.8', 'libgmp.so.10', 'libsasl2.so.2', 'libidn.so.11', 'librtmp.so.1', 'liblber-2.4.so.2', 'libldap_r-2.4.so.2', 'libXdmcp.so.6', 'libX11.so.6', 'libXau.so.6', 'libxcb.so.1', 'libgssapi_krb5.so.2', 'libkrb5.so.3', 'libk5crypto.so.3', 'libkrb5support.so.0', 'libicudata.so.55', 'libicuuc.so.55', 'libhdf5_serial.so.10', 'libcurl-gnutls.so.4', 'libhdf5_serial_hl.so.10', 'libtinfo.so.5', 'libgcrypt.so.20', 'libgpg-error.so.0', 'libnsl.so.1', 'libXext.so.6', 'libncursesw.so.5', 'libpanelw.so.5', 'libXrender.so.1', 'libjbig.so.0', 'libpcre.so.3', 'libglib-2.0.so.0', 'libnvidia-tls.so.352.41', 'libnvidia-glcore.so.352.41', 'libGL.so.1', 'libuuid.so.1', 'libSM.so.6', 'libICE.so.6', 'libgobject-2.0.so.0', 'libgfortran.so.1', 'liblzma.so.5', 'libXt.so.6', 'libgmodule-2.0.so.0', 'libXi.so.6', 'libgstpbutils-1.0.so.0', 'liborc-0.4.so.0', 'libgstreamer-1.0.so.0', 'libgsttag-1.0.so.0', 'libgstvideo-1.0.so.0', 'libxslt.so.1', 'libaudio.so.2', 'libjpeg.so.8', 'libgstaudio-1.0.so.0', 'libgstbase-1.0.so.0', 'libgstapp-1.0.so.0', 'libz.so.1', 'libgthread-2.0.so.0', 'libfreetype.so.6', 'libfontconfig.so.1', 'libdbus-1.so.3', 'libsystemd.so.0', 'libltdl.so.7', 'libGLU.so.1', 'libsqlite3.so.0', 'libpgm-5.1.so.0', 'libgomp.so.1', 'libxcb-render.so.0', 'libxcb-shm.so.0', 'libncurses.so.5', 'libxml2.so.2', 'libXss.so.1', 'libXft.so.2', 'libtk.so', 'libtcl.so', 'libasound.so.2', 'libharfbuzz.so.0', 'libpixman-1.so.0', 'libgio-2.0.so.0', 'libXinerama.so.1', 'libselinux.so.1', 'libXcomposite.so.1', 'libthai.so.0', 'libXdamage.so.1', 'libgdk-x11-2.0.so.0', 'libpangoft2-1.0.so.0', 'libcairo.so.2', 'libpangocairo-1.0.so.0', 'libdatrie.so.1', 'libatk-1.0.so.0', 'libXcursor.so.1', 'libXfixes.so.3', 'libgraphite2.so.3', 'libgdk_pixbuf-2.0.so.0', 'libgtk-x11-2.0.so.0', 'libquadmath.so.0', 'libpango-1.0.so.0', 'libXrandr.so.2', 'libgfortran.so.3', 'libjson-c.so.2', 'libshiboken-python2.7.so.1.1', 'libogg.so.0', 'libvorbis.so.0', 'libatlas.so.3', 'libcurl.so.4', 'libhdf5.so.9', 'libodbcinst.so.1', 'libpcap.so.0.9', 'libnetcdf.so.7', 'libblas.so.3', 'libpulse.so.0', 'libcaca.so.0', 'libgstreamer-0.10.so.0', 'libXxf86vm.so.1', 'libhdf5_hl.so.9', 'libpulse-simple.so.0', 'libasyncns.so.0', 'libwrap.so.0', 'libvorbisenc.so.2', 'libmagic.so.1', 'libssl.so.1.0.0', 'libFLAC.so.8', 'libSDL-1.2.so.0', 'libsndfile.so.1', 'libslang.so.2', 'libglapi.so.0', 'libaio.so.1', 'libgstinterfaces-0.10.so.0', 'libpulsecommon-6.0.so', 'libjpeg.so.62', 'libcrypto.so.1.0.0'] This list actually contains a fair number of false positives, so it would need to be 
pruned manually. If you stare at it a little while, you might see some libraries in there that you recognize that shouldn't be part of the base system, like libatlas.so.3.

This gist https://gist.github.com/rmcgibbo/a13e7623c38ec54fcc93 contains some more detailed data -- for each of the libraries in the list above, it gives a list of names of the packages that depend on that library. For example, for libatlas.so.3, there is only a single package which depends on it, ["scikit-learn-0.11-np16py27_ce0"]. So, probably a bug.

"libgfortran.so.1" is also in the list. It's depended on by ["cvxopt-1.1.6-py27_0", "cvxopt-1.1.7-py27_0", "cvxopt-1.1.7-py34_0", "cvxopt-1.1.7-py35_0", "numpy-1.5.1-py27_1", "numpy-1.5.1-py27_3", "numpy-1.5.1-py27_4", "numpy-1.5.1-py27_ce0", "numpy-1.6.2-py27_1", "numpy-1.6.2-py27_3", "numpy-1.6.2-py27_4", "numpy-1.6.2-py27_ce0", "numpy-1.7.0-py27_0", "numpy-1.7.0b2-py27_ce0", "numpy-1.7.0rc1-py27_0", "numpy-1.7.1-py27_0", "numpy-1.7.1-py27_2", "numpy-1.8.0-py27_0", "numpy-1.8.1-py27_0", "numpy-1.8.1-py34_0", "numpy-1.8.2-py27_0", "numpy-1.8.2-py34_0", "numpy-1.9.0-py27_0", "numpy-1.9.0-py34_0", "numpy-1.9.1-py27_0", "numpy-1.9.1-py34_0", "numpy-1.9.2-py27_0", "numpy-1.9.2-py34_0"]. Note that this list of numpy versions doesn't include the latest ones -- all of the numpy-1.10 binaries made by Continuum pick up libgfortran from a conda package and don't depend on it being provided by the system. Also, the final '_0' or '_1' segment of many of these package names is the build number, which marks a new build of the same release of a package, usually because of a packaging problem. So many of these packages were probably built incorrectly and superseded by new builds with a higher build number.

So it's not perfect. But it might be a useful starting place.
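For anyone who wants to reproduce or refine the scan, the core of it is conceptually just the following (a simplified sketch, not the exact script I ran -- it assumes the .tar.bz2 packages have already been downloaded into pkgs/):

    import glob
    import os
    import subprocess
    import tarfile

    def ldd_names(path):
        # Return the library names that ldd reports for one ELF file.
        out = subprocess.check_output(['ldd', path]).decode()
        return set(line.split()[0] for line in out.splitlines() if line.strip())

    deps = {}
    for pkg in glob.glob('pkgs/*.tar.bz2'):
        dest = os.path.join('tmp', os.path.basename(pkg))
        with tarfile.open(pkg) as tf:
            tf.extractall(dest)
        for root, dirs, files in os.walk(dest):
            for name in files:
                if name.endswith('.so') or '.so.' in name:
                    try:
                        for lib in ldd_names(os.path.join(root, name)):
                            deps.setdefault(lib, set()).add(os.path.basename(pkg))
                    except subprocess.CalledProcessError:
                        pass  # not a dynamic executable

    # Anything left in deps after subtracting the libraries that some conda
    # package itself ships is a candidate "system" library.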
-Robert
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From stefan.otte at gmail.com  Sat Jan  9 06:58:58 2016
From: stefan.otte at gmail.com (Stefan Otte)
Date: Sat, 9 Jan 2016 12:58:58 +0100
Subject: [Numpy-discussion] Generalize hstack/vstack --> stack; Block matrices like in matlab
In-Reply-To:
References: <101656916431878296.890307sturla.molden-gmail.com@news.gmane.org>
Message-ID:

Hey,

one of my new year's resolutions is to get my pull requests accepted (or closed). So here we go...

Here is the updated pull request: https://github.com/numpy/numpy/pull/5057
Here is the docstring: https://github.com/sotte/numpy/commit/3d4c5d19a8f15b35df50d945b9c8853b683f7ab6#diff-2270128d50ff15badd1aba4021c50a8cR358

The new `block` function is very similar to matlab's `[A, B; C, D]`.

Pros:
- it's very useful (in my experience)
- less friction for people coming from matlab
- it's conceptually simple
- the implementation is simple
- it's documented
- it's tested

Cons:
- the implementation is not super efficient. Temporary copies are created. However, bmat also does that.

Feedback is very welcome!

Best,
Stefan

On Sun, May 10, 2015 at 12:33 PM, Stefan Otte wrote:
> Hey,
>
> Just a quick update. I updated the pull request and renamed `stack` into
> `block`. Have a look: https://github.com/numpy/numpy/pull/5057
>
> I'm sticking with simple initial implementation because it's simple and
> does what you think it does.
>
> Cheers,
> Stefan
>
>
> On Fri, Oct 31, 2014 at 2:13 PM Stefan Otte wrote:
>> To make the last point more concrete the implementation could look
>> something like this (note that I didn't test it and that it still
>> takes some work):
>>
>> def bmat(obj, ldict=None, gdict=None):
>>     return matrix(stack(obj, ldict, gdict))
>>
>> def stack(obj, ldict=None, gdict=None):
>>     # the old bmat code minus the matrix calls
>>     if isinstance(obj, str):
>>         if gdict is None:
>>             # get previous frame
>>             frame = sys._getframe().f_back
>>             glob_dict = frame.f_globals
>>             loc_dict = frame.f_locals
>>         else:
>>             glob_dict = gdict
>>             loc_dict = ldict
>>         return _from_string(obj, glob_dict, loc_dict)
>>
>>     if isinstance(obj, (tuple, list)):
>>         # [[A,B],[C,D]]
>>         arr_rows = []
>>         for row in obj:
>>             if isinstance(row, N.ndarray):  # not 2-d
>>                 return concatenate(obj, axis=-1)
>>             else:
>>                 arr_rows.append(concatenate(row, axis=-1))
>>         return concatenate(arr_rows, axis=0)
>>
>>     if isinstance(obj, N.ndarray):
>>         return obj
>>
>> I basically turned the old `bmat` into `stack` and removed the matrix
>> calls.
>>
>> Best,
>> Stefan
>>
>>
>> On Wed, Oct 29, 2014 at 3:59 PM, Stefan Otte wrote:
>> > Hey,
>> >
>> > there are several ways how to proceed.
>> >
>> > - My proposed solution covers the 80% case quite well (at least I use
>> > it all the time). I'd convert the doctests into unittests and we're
>> > done.
>> >
>> > - We could slightly change the interface to leave out the surrounding
>> > square brackets, i.e. turning `stack([[a, b], [c, d]])` into
>> > `stack([a, b], [c, d])`
>> >
>> > - We could extend it even further allowing a "filler value" for non
>> > set values and a "shape" argument. This could be done later as well.
>> >
>> > - `bmat` is not really matrix specific. We could refactor `bmat` a bit
>> > to use the same logic in `stack`. Except the `matrix` calls `bmat` and
>> > `_from_string` are pretty agnostic to the input.
>> >
>> > I'm in favor of the first or last approach. The first: because it
>> > already works and is quite simple. The last: because the logic and
>> > tests of both `bmat` and `stack` would be the same and the feature to
>> > specify a string representation of the block matrix is nice.
>> >
>> > Best,
>> > Stefan
>> >
>> >
>> > On Tue, Oct 28, 2014 at 7:46 PM, Nathaniel Smith wrote:
>> >> On 28 Oct 2014 18:34, "Stefan Otte" wrote:
>> >>>
>> >>> Hey,
>> >>>
>> >>> In the last weeks I tested `np.asarray(np.bmat(....))` as `stack`
>> >>> function and it works quite well. So the question persits: If `bmat`
>> >>> already offers something like `stack` should we even bother
>> >>> implementing `stack`? More code leads to more
>> >>> bugs and maintenance work. (However, the current implementation is
>> >>> only 5 lines and by using `bmat` which would reduce that even more.)
>> >>
>> >> In the long run we're trying to reduce usage of np.matrix and ideally
>> >> deprecate it entirely. So yes, providing ndarray equivalents of matrix
>> >> functionality (like bmat) is valuable.
>> >>
>> >> -n
>> >>
>> >> _______________________________________________
>> >> NumPy-Discussion mailing list
>> >> NumPy-Discussion at scipy.org
>> >> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>> >>
>> >
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jtaylor.debian at googlemail.com  Sat Jan  9 07:12:11 2016
From: jtaylor.debian at googlemail.com (Julian Taylor)
Date: Sat, 9 Jan 2016 13:12:11 +0100
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID: <5690F91B.8080108@googlemail.com>

On 09.01.2016 04:38, Nathaniel Smith wrote:
> On Fri, Jan 8, 2016 at 7:17 PM, Nathan Goldbaum wrote:
>> Doesn't building on CentOS 5 also mean using a quite old version of gcc?
>
> Yes. IIRC CentOS 5 ships with gcc 4.4, and you can bump that up to gcc
> 4.8 by using the Redhat Developer Toolset release (which is gcc +
> special backport libraries to let it generate RHEL5/CentOS5-compatible
> binaries). (I might have one or both of those version numbers slightly
> wrong.)
>
>> I've never tested this, but I've seen claims on the anaconda mailing list of
>> ~25% slowdowns compared to building from source or using system packages,
>> which was attributed to building using an older gcc that doesn't optimize as
>> well as newer versions.
>
> I'd be very surprised if that were a 25% slowdown in general, as
> opposed to a 25% slowdown on some particular inner loop that happened
> to neatly match some new feature in a new gcc (e.g. something where
> the new autovectorizer kicked in). But yeah, in general this is just
> an inevitable trade-off when it comes to distributing binaries: you're
> always going to pay some penalty for achieving broad compatibility as
> compared to artisanally hand-tuned binaries specialized for your
> machine's exact OS version, processor, etc. Not much to be done,
> really. At some point the baseline for compatibility will switch to
> "compile everything on CentOS 6", and that will be better but it will
> still be worse than binaries that target CentOS 7, and so on and so
> forth.

I have, over the years, put in one gcc-specific optimization after another, so yes, using an ancient version will make many parts significantly slower. That is not really a problem, though: updating a compiler is easy, even without Red Hat's devtoolset.

At least as far as numpy is concerned, linux binaries should not be a very big problem. The only dependency where the version matters is glibc, which has updated the interfaces we use (in a backward-compatible way) many times. If we use an old enough baseline glibc (e.g. CentOS 5 or Ubuntu 10.04), we are fine, at reasonable performance cost -- basically only a slower memcpy.

Scipy, on the other hand, is a larger problem, as it contains C++ code. Linux systems are now transitioning to C++11, which is binary incompatible in parts with the old standard. There, a lot of testing is necessary to check whether we are affected. How does Anaconda deal with C++11?

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: OpenPGP digital signature
URL:

From jtaylor.debian at googlemail.com  Sat Jan  9 07:20:38 2016
From: jtaylor.debian at googlemail.com (Julian Taylor)
Date: Sat, 9 Jan 2016 13:20:38 +0100
Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?]
In-Reply-To:
References:
Message-ID: <5690FB16.7010605@googlemail.com>

On 09.01.2016 12:52, Robert McGibbon wrote:
> Hi all,
>
> I went ahead and tried to collect a list of all of the libraries that
> could be considered to constitute the "base" system for linux-64. The
> strategy I used was to leverage off the work done by the folks at
> Continuum by searching through their pre-compiled binaries
> from https://repo.continuum.io/pkgs/free/linux-64/ to find shared
> libraries that were dependened on (according to ldd) that were not
> accounted for by the declared dependencies that each package made known
> to the conda package manager.

Do those packages use ld --as-needed for linking?
There are a lot of libraries in that list that I highly doubt are directly used by the packages.

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: OpenPGP digital signature
URL:

From rmcgibbo at gmail.com  Sat Jan  9 07:29:13 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Sat, 9 Jan 2016 04:29:13 -0800
Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?]
In-Reply-To: <5690FB16.7010605@googlemail.com>
References: <5690FB16.7010605@googlemail.com>
Message-ID:

> do those packages use ld --as-needed for linking?

Is it possible to check this? I mean, there are over 7000 packages that I checked. I don't know how they were all built. It's totally possible for many of them to be unused. A reasonably common thing might be that packages use ctypes or dlopen to dynamically load shared libraries that are actually just optional (and catch the error and recover gracefully if the library can't be loaded).

-Robert

On Sat, Jan 9, 2016 at 4:20 AM, Julian Taylor wrote:
> On 09.01.2016 12:52, Robert McGibbon wrote:
> > Hi all,
> >
> > I went ahead and tried to collect a list of all of the libraries that
> > could be considered to constitute the "base" system for linux-64. The
> > strategy I used was to leverage off the work done by the folks at
> > Continuum by searching through their pre-compiled binaries
> > from https://repo.continuum.io/pkgs/free/linux-64/ to find shared
> > libraries that were dependened on (according to ldd) that were not
> > accounted for by the declared dependencies that each package made known
> > to the conda package manager.
>
> do those packages use ld --as-needed for linking?
> there are a lot libraries in that list that I highly doubt are directly
> used by the packages.
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From morph at debian.org  Sat Jan  9 07:44:30 2016
From: morph at debian.org (Sandro Tosi)
Date: Sat, 9 Jan 2016 12:44:30 +0000
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID:

> I wrote a page on using pip with Debian / Ubuntu here :
> https://matthew-brett.github.io/pydagogue/installing_on_debian.html

Speaking with my numpy Debian maintainer hat on, I would really appreciate it if you didn't suggest using pip to install packages in Debian, or at least not as the only solution.

--
Sandro "morph" Tosi
My website: http://matrixhasu.altervista.org/
Me at Debian: http://wiki.debian.org/SandroTosi
G+: https://plus.google.com/u/0/+SandroTosi

From cournape at gmail.com  Sat Jan  9 08:23:42 2016
From: cournape at gmail.com (David Cournapeau)
Date: Sat, 9 Jan 2016 13:23:42 +0000
Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?]
In-Reply-To: <5690FB16.7010605@googlemail.com> References: <5690FB16.7010605@googlemail.com> Message-ID: On Sat, Jan 9, 2016 at 12:20 PM, Julian Taylor < jtaylor.debian at googlemail.com> wrote: > On 09.01.2016 12:52, Robert McGibbon wrote: > > Hi all, > > > > I went ahead and tried to collect a list of all of the libraries that > > could be considered to constitute the "base" system for linux-64. The > > strategy I used was to leverage off the work done by the folks at > > Continuum by searching through their pre-compiled binaries > > from https://repo.continuum.io/pkgs/free/linux-64/ to find shared > > libraries that were dependened on (according to ldd) that were not > > accounted for by the declared dependencies that each package made known > > to the conda package manager. > > > > do those packages use ld --as-needed for linking? > there are a lot libraries in that list that I highly doubt are directly > used by the packages. > It is also a common problem when building packages without using a "clean" build environment, as it is too easy to pick up dependencies accidentally, especially for autotools-based packages (unless one uses pbuilder or similar tools). David -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Sat Jan 9 08:48:40 2016 From: cournape at gmail.com (David Cournapeau) Date: Sat, 9 Jan 2016 13:48:40 +0000 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: <5690F91B.8080108@googlemail.com> References: <5690F91B.8080108@googlemail.com> Message-ID: On Sat, Jan 9, 2016 at 12:12 PM, Julian Taylor < jtaylor.debian at googlemail.com> wrote: > On 09.01.2016 04:38, Nathaniel Smith wrote: > > On Fri, Jan 8, 2016 at 7:17 PM, Nathan Goldbaum > wrote: > >> Doesn't building on CentOS 5 also mean using a quite old version of gcc? > > > > Yes. IIRC CentOS 5 ships with gcc 4.4, and you can bump that up to gcc > > 4.8 by using the Redhat Developer Toolset release (which is gcc + > > special backport libraries to let it generate RHEL5/CentOS5-compatible > > binaries). (I might have one or both of those version numbers slightly > > wrong.) > > > >> I've never tested this, but I've seen claims on the anaconda mailing > list of > >> ~25% slowdowns compared to building from source or using system > packages, > >> which was attributed to building using an older gcc that doesn't > optimize as > >> well as newer versions. > > > > I'd be very surprised if that were a 25% slowdown in general, as > > opposed to a 25% slowdown on some particular inner loop that happened > > to neatly match some new feature in a new gcc (e.g. something where > > the new autovectorizer kicked in). But yeah, in general this is just > > an inevitable trade-off when it comes to distributing binaries: you're > > always going to pay some penalty for achieving broad compatibility as > > compared to artisanally hand-tuned binaries specialized for your > > machine's exact OS version, processor, etc. Not much to be done, > > really. At some point the baseline for compatibility will switch to > > "compile everything on CentOS 6", and that will be better but it will > > still be worse than binaries that target CentOS 7, and so on and so > > forth. > > > > I have over the years put in one gcc specific optimization after the > other so yes using an ancient version will make many parts significantly > slower. Though that is not really a problem, updating a compiler is easy > even without redhats devtoolset. 
> At least as far as numpy is concerned, linux binaries should not be a very big problem. The only dependency where the version matters is glibc, which has updated the interfaces we use (in a backward-compatible way) many times. If we use an old enough baseline glibc (e.g. centos5 or ubuntu 10.04) we are fine, at reasonable performance cost -- basically only a slower memcpy.
>
> Scipy, on the other hand, is a larger problem, as it contains C++ code. Linux systems are now transitioning to C++11, which is binary incompatible in parts with the old standard, so a lot of testing is necessary to check whether we are affected. How does Anaconda deal with C++11?

For Canopy packages, we use the RH devtoolset w/ gcc 4.8.X, and statically link the C++ stdlib. It has worked so far for the few packages requiring C++11 and gcc > 4.4 (llvm/llvmlite/dynd), but that's not a solution I am a fan of myself, as the implications are not always very clear.

David

From gyromagnetic at gmail.com  Sat Jan  9 12:11:24 2016
From: gyromagnetic at gmail.com (Gyro Funch)
Date: Sat, 9 Jan 2016 10:11:24 -0700
Subject: [Numpy-discussion] Generalize hstack/vstack --> stack; Block matrices like in matlab
In-Reply-To: References: <101656916431878296.890307sturla.molden-gmail.com@news.gmane.org>
Message-ID: 

On 1/9/2016 4:58 AM, Stefan Otte wrote:
> Hey,
>
> one of my new year's resolutions is to get my pull requests accepted (or closed). So here we go...
>
> Here is the updated pull request: https://github.com/numpy/numpy/pull/5057
> Here is the docstring: https://github.com/sotte/numpy/commit/3d4c5d19a8f15b35df50d945b9c8853b683f7ab6#diff-2270128d50ff15badd1aba4021c50a8cR358
>
> The new `block` function is very similar to matlab's `[A, B; C, D]`.
>
> Pros:
> - it's very useful (in my experience)
> - less friction for people coming from matlab
> - it's conceptually simple
> - the implementation is simple
> - it's documented
> - it's tested
>
> Cons:
> - the implementation is not super efficient. Temporary copies are created. However, bmat also does that.
>
> Feedback is very welcome!
>
> Best,
> Stefan

Without commenting on the implementation, I would find this function *very* useful and convenient in my own work.
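To make it concrete for anyone who hasn't clicked through to the PR -- as I read the docstring, usage would look something like this (my own sketch, not code taken from the PR):

    import numpy as np

    A = np.ones((2, 2))
    B = np.zeros((2, 2))

    # assemble the 2x2 block matrix [A, B; B, A], matlab-style
    M = np.block([[A, B], [B, A]])
    # M is 4x4, with ones in the upper-left and lower-right blocks

That is exactly the kind of thing I currently do with chains of vstack/hstack calls.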
-gyro

From matthew.brett at gmail.com  Sat Jan  9 13:08:17 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Sat, 9 Jan 2016 10:08:17 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

Hi Sandro,

On Sat, Jan 9, 2016 at 4:44 AM, Sandro Tosi wrote:
>> I wrote a page on using pip with Debian / Ubuntu here :
>> https://matthew-brett.github.io/pydagogue/installing_on_debian.html
>
> Speaking with my numpy debian maintainer hat on, I would really appreciate it if you didn't suggest using pip to install packages in Debian, or at least not as the only solution.

I'm very happy to accept alternative suggestions or PRs.

I know what you mean, but I can't yet see how to write a page that would be good for explaining the benefits / tradeoffs of using deb packages vs mainly or only pip packages vs a mix of the two. Do you have any thoughts?

Cheers,

Matthew

From ralf.gommers at gmail.com  Sat Jan  9 16:11:12 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Sat, 9 Jan 2016 22:11:12 +0100
Subject: [Numpy-discussion] Windows build/distribute plan & MingwPy funding
In-Reply-To: References: Message-ID: 

On Mon, Jan 4, 2016 at 7:38 PM, Matthew Brett wrote:
> Hi,
>
> On Mon, Jan 4, 2016 at 5:35 PM, Erik Bray wrote:
> > On Sat, Jan 2, 2016 at 3:20 AM, Ralf Gommers wrote:
> >> [...]
> >> Funding for phases 1 and 2 is already confirmed; the phase 3 part has been submitted to the PSF.

More good news: the PSF has approved phase 3!

> >> [...]
> >
> > Hi Ralph,
> >
> > I've seen you drop hints about this recently, and am interested to follow this work. I've been hired as part of the OpenDreamKit project to work, in large part, on developing a sensible toolchain for building and distributing Sage on Windows. I know you've been following the thread on that too. Although the primary goal there is "whatever works", I'm personally inclined to focus on the mingwpy / mingw-w64 approach, due in large part to my past success with the MinGW 32-bit toolchain.

That's good to hear Erik.

> > (I personally have a desire to improve support for building with MSVC as well, but that's a less important goal as far as the funding is concerned.)
> >
> > So anyways, please keep me in the loop about this, as I will also be putting effort into this over the next year as well. Has there been any discussion about setting up a mailing list specifically for this project?

We'll definitely keep you in the loop. We try to do as much of the discussion as possible on the mingwpy mailing list and on https://github.com/mingwpy. If there's significant progress then I guess that'll be announced on this list as well.

Cheers,
Ralf

> Yes, it exists already, but not well advertised : https://groups.google.com/forum/#!forum/mingwpy
>
> It would be great to share work.
>
> Cheers,
>
> Matthew

From njs at pobox.com  Sat Jan  9 16:49:29 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Sat, 9 Jan 2016 13:49:29 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: <5690F91B.8080108@googlemail.com> References: <5690F91B.8080108@googlemail.com> Message-ID: On Jan 9, 2016 04:12, "Julian Taylor" wrote: > > On 09.01.2016 04:38, Nathaniel Smith wrote: > > On Fri, Jan 8, 2016 at 7:17 PM, Nathan Goldbaum wrote: > >> Doesn't building on CentOS 5 also mean using a quite old version of gcc? > > > > Yes. IIRC CentOS 5 ships with gcc 4.4, and you can bump that up to gcc > > 4.8 by using the Redhat Developer Toolset release (which is gcc + > > special backport libraries to let it generate RHEL5/CentOS5-compatible > > binaries). (I might have one or both of those version numbers slightly > > wrong.) > > > >> I've never tested this, but I've seen claims on the anaconda mailing list of > >> ~25% slowdowns compared to building from source or using system packages, > >> which was attributed to building using an older gcc that doesn't optimize as > >> well as newer versions. > > > > I'd be very surprised if that were a 25% slowdown in general, as > > opposed to a 25% slowdown on some particular inner loop that happened > > to neatly match some new feature in a new gcc (e.g. something where > > the new autovectorizer kicked in). But yeah, in general this is just > > an inevitable trade-off when it comes to distributing binaries: you're > > always going to pay some penalty for achieving broad compatibility as > > compared to artisanally hand-tuned binaries specialized for your > > machine's exact OS version, processor, etc. Not much to be done, > > really. At some point the baseline for compatibility will switch to > > "compile everything on CentOS 6", and that will be better but it will > > still be worse than binaries that target CentOS 7, and so on and so > > forth. > > > > I have over the years put in one gcc specific optimization after the > other so yes using an ancient version will make many parts significantly > slower. Though that is not really a problem, updating a compiler is easy > even without redhats devtoolset. > > At least as far as numpy is concerned linux binaries should not be a > very big problem. The only dependency where the version matters is glibc > which has updated its interfaces we use (in a backward compatible way) > many times. > But here if we use a old enough baseline glibc (e.g. centos5 or ubuntu > 10.04) we are fine at reasonable performance costs, basically only > slower memcpy. Are you saying that it's easy to use, say, gcc 5.3's C compiler to produce binaries that will run on an out-of-the-box centos 5 install? I assumed that there'd be issues with things like new symbol versions in libgcc, not just glibc, but if not then that would be great... > Scipy on the other hand is a larger problem as it contains C++ code. > Linux systems are now transitioning to C++11 which is binary > incompatible in parts to the old standard. There a lot of testing is > necessary to check if we are affected. > How does Anaconda deal with C++11? IIUC the situation with the C++ stdlib changes in gcc 5 is that old binaries will continue to work on new systems. The only thing that breaks is that if two libraries want to pass objects of the affected types back and forth (e.g. std::string), then either they both need to be compiled with the old abi or they both need to be compiled with the new abi. (And when using a new compiler it's still possible to choose the old abi with a #define; old compilers of course only support the old abi.) 
See: http://developerblog.redhat.com/2015/02/05/gcc5-and-the-c11-abi/

So the answer is that most python packages don't care, because even the ones written in C++ don't generally talk C++ across package boundaries, and for the ones that do care, the people making the binary packages will have to coordinate to use the same abi. And for local builds on modern systems that link against binary packages built using the old abi, people might have to use -D_GLIBCXX_USE_CXX11_ABI=0.

-n

From njs at pobox.com  Sat Jan  9 18:04:44 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Sat, 9 Jan 2016 15:04:44 -0800
Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?]
In-Reply-To: References: Message-ID: 

On Sat, Jan 9, 2016 at 3:52 AM, Robert McGibbon wrote:
> Hi all,
>
> I went ahead and tried to collect a list of all of the libraries that could be considered to constitute the "base" system for linux-64. The strategy I used was to leverage off the work done by the folks at Continuum by searching through their pre-compiled binaries from https://repo.continuum.io/pkgs/free/linux-64/ to find shared libraries that were depended on (according to ldd) but were not accounted for by the declared dependencies that each package made known to the conda package manager.
>
> The full list of these system libraries, sorted from most-commonly-depended-on to rarest, is below. There are 158 of them.
[...]
> So it's not perfect. But it might be a useful starting place.

Unfortunately, yeah, it looks like there's a lot of false positives in here :-(. For example your list contains liblzma and libsqlite, but both of these are shipped as dependencies of python itself. So probably someone just forgot to declare the dependency explicitly, but got away with it because the libraries were pulled in anyway.

Maybe a better approach would be to look at what libraries are used by an up-to-date default Anaconda install (on the assumption that this is the best tested configuration), and then erase from the list all libraries that are shipped by this configuration (ignoring declared dependencies since those seem to be unreliable)? It's better to be conservative here, since the end goal is to come up with a list of external libraries that we're confident have actually been tested for compatibility by lots and lots of different users.
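In code, the filtering step is just a set difference -- something along these lines (a sketch; the input file names here are invented):

    # sonames that turned up in the ldd scan of all the packages
    with open('ldd_scan_results.txt') as f:
        candidates = {line.strip() for line in f if line.strip()}

    # sonames of libraries shipped inside a default Anaconda install
    with open('anaconda_shipped.txt') as f:
        shipped = {line.strip() for line in f if line.strip()}

    # whatever remains must be coming from the base system
    for soname in sorted(candidates - shipped):
        print(soname)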
-n

-- 
Nathaniel J. Smith -- http://vorpus.org

From rmcgibbo at gmail.com  Sat Jan  9 19:42:48 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Sat, 9 Jan 2016 16:42:48 -0800
Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?]
In-Reply-To: References: Message-ID: 

> Maybe a better approach would be to look at what libraries are used by an up-to-date default Anaconda install (on the assumption that this is the best tested configuration)

That's not a bad idea. I also have a couple other ideas about how to filter this based on using debian popularity-contests and the package graph. I will report back when I have more info.

-Robert

On Sat, Jan 9, 2016 at 3:04 PM, Nathaniel Smith wrote:
> [...]

From morph at debian.org  Sat Jan  9 21:57:41 2016
From: morph at debian.org (Sandro Tosi)
Date: Sun, 10 Jan 2016 02:57:41 +0000
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

On Sat, Jan 9, 2016 at 6:08 PM, Matthew Brett wrote:
> [...]
> I know what you mean, but I can't yet see how to write a page that would be good for explaining the benefits / tradeoffs of using deb packages vs mainly or only pip packages vs a mix of the two. Do you have any thoughts?

You can start by making it extremely clear that this is not the Debian-supported way to install python modules on a Debian system; that if a user uses pip to do it, it's very likely that other applications or modules will fail; and that if they then have any problem with anything python-related, they are on their own, as they "broke" their system on purpose. Thanks for considering.

-- 
Sandro "morph" Tosi
My website: http://matrixhasu.altervista.org/
Me at Debian: http://wiki.debian.org/SandroTosi
G+: https://plus.google.com/u/0/+SandroTosi

From matthew.brett at gmail.com  Sat Jan  9 22:55:28 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Sat, 9 Jan 2016 19:55:28 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

On Sat, Jan 9, 2016 at 6:57 PM, Sandro Tosi wrote:
> [...]
> You can start by making it extremely clear that this is not the Debian-supported way to install python modules on a Debian system [...]

I updated the page with more on reasons to prefer Debian packages over installing with pip:

https://matthew-brett.github.io/pydagogue/installing_on_debian.html

Is that enough to get the message across?

Cheers,

Matthew

From njs at pobox.com  Sat Jan  9 23:49:48 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Sat, 9 Jan 2016 20:49:48 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

On Jan 9, 2016 10:09, "Matthew Brett" wrote:
> [...]
> I know what you mean, but I can't yet see how to write a page that would be good for explaining the benefits / tradeoffs of using deb packages vs mainly or only pip packages vs a mix of the two. Do you have any thoughts?

Why not replace all the "sudo pip" calls with "pip --user"? The trade-offs between Debian-installed packages versus pip --user installed packages are subtle, and both are good options. Personally, I'd generally recommend anyone actively developing python code to skip straight to pip for most things, since you'll eventually end up there anyway, but this is definitely debatable and situation dependent. On the other hand, "sudo pip" specifically is something I'd never recommend, and indeed has the potential to totally break your system.

-n

From matthew.brett at gmail.com  Sat Jan  9 23:58:36 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Sat, 9 Jan 2016 20:58:36 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: On Sat, Jan 9, 2016 at 8:49 PM, Nathaniel Smith wrote: > On Jan 9, 2016 10:09, "Matthew Brett" wrote: >> >> Hi Sandro, >> >> On Sat, Jan 9, 2016 at 4:44 AM, Sandro Tosi wrote: >> >> I wrote a page on using pip with Debian / Ubuntu here : >> >> https://matthew-brett.github.io/pydagogue/installing_on_debian.html >> > >> > Speaking with my numpy debian maintainer hat on, I would really >> > appreciate if you dont suggest to use pip to install packages in >> > Debian, or at least not as the only solution. >> >> I'm very happy to accept alternative suggestions or PRs. >> >> I know what you mean, but I can't yet see how to write a page that >> would be good for explaining the benefits / tradeoffs of using deb >> packages vs mainly or only pip packages vs a mix of the two. Do you >> have any thoughts? > > Why not replace all the "sudo pip" calls with "pip --user"? The trade offs > between Debian-installed packages versus pip --user installed packages are > subtle, and both are good options. Personal I'd generally recommend anyone > actively developing python code to skip straight to pip for most things, > since you'll eventually end up there anyway, but this is definitely > debatable and situation dependent. On the other hand, "sudo pip" > specifically is something I'd never recommend, and indeed has the potential > to totally break your system. Sure, but I don't think the page is suggesting doing ``sudo pip`` for anything other than upgrading pip and virtualenv(wrapper) - and I don't think that is likely to break the system. Matthew From njs at pobox.com Sun Jan 10 01:22:42 2016 From: njs at pobox.com (Nathaniel Smith) Date: Sat, 9 Jan 2016 22:22:42 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Sat, Jan 9, 2016 at 8:58 PM, Matthew Brett wrote: > On Sat, Jan 9, 2016 at 8:49 PM, Nathaniel Smith wrote: >> On Jan 9, 2016 10:09, "Matthew Brett" wrote: >>> >>> Hi Sandro, >>> >>> On Sat, Jan 9, 2016 at 4:44 AM, Sandro Tosi wrote: >>> >> I wrote a page on using pip with Debian / Ubuntu here : >>> >> https://matthew-brett.github.io/pydagogue/installing_on_debian.html >>> > >>> > Speaking with my numpy debian maintainer hat on, I would really >>> > appreciate if you dont suggest to use pip to install packages in >>> > Debian, or at least not as the only solution. >>> >>> I'm very happy to accept alternative suggestions or PRs. >>> >>> I know what you mean, but I can't yet see how to write a page that >>> would be good for explaining the benefits / tradeoffs of using deb >>> packages vs mainly or only pip packages vs a mix of the two. Do you >>> have any thoughts? >> >> Why not replace all the "sudo pip" calls with "pip --user"? The trade offs >> between Debian-installed packages versus pip --user installed packages are >> subtle, and both are good options. Personal I'd generally recommend anyone >> actively developing python code to skip straight to pip for most things, >> since you'll eventually end up there anyway, but this is definitely >> debatable and situation dependent. On the other hand, "sudo pip" >> specifically is something I'd never recommend, and indeed has the potential >> to totally break your system. > > Sure, but I don't think the page is suggesting doing ``sudo pip`` for > anything other than upgrading pip and virtualenv(wrapper) - and I > don't think that is likely to break the system. It could... 
a quick glance suggests that currently installing virtualenvwrapper like that will also pull in some random pypi snapshot of stevedore, which will shadow the built-in package version. And then stevedore is used by tons of different debian packages, including large parts of openstack... But more to the point, the target audience for your page is hardly equipped to perform that kind of analysis, never mind in the general case of using 'sudo pip' for arbitrary Python packages, and your very first example is one that demonstrates bad habits... So personally I'd avoid mentioning the possibility of 'sudo pip', or better yet explicitly warn against it. -n -- Nathaniel J. Smith -- http://vorpus.org From rmcgibbo at gmail.com Sun Jan 10 04:19:07 2016 From: rmcgibbo at gmail.com (Robert McGibbon) Date: Sun, 10 Jan 2016 01:19:07 -0800 Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?] In-Reply-To: References: Message-ID: Hi all, I followed Nathaniel's advice and restricted the search down to the packages included in the Anaconda release (as opposed to all of the packages in their repositories), and fixed some technical issues with the way I was doing the analysis. The new list is much smaller. Here are the shared libraries that the components of Anaconda require that the system provides on Linux 64: libpanelw.so.5, libncursesw.so.5, libgcc_s.so.1, libstdc++.so.6, libm.so.6, libdl.so.2, librt.so.1, libcrypt.so.1, libc.so.6, libnsl.so.1, libutil.so.1, libpthread.so.0, libX11.so.6, libXext.so.6, libgobject-2.0.so.0, libgthread-2.0.so.0, libglib-2.0.so.0, libXrender.so.1, libICE.so.6, libSM.so.6, libGL.so.1. Many of these libraries are required simply for the interpreter. The remaining ones that aren't required by the interpreter are, but are required by some other package in Anaconda are: libgcc_s.so.1, libstdc++.so.6, libXext.so.6, libSM.so.6, libgthread-2.0.so.0, libgobject-2.0.so.0, libglib-2.0.so.0, libICE.so.6, libXrender.so.1, and libGL.so.1. Most of these are parts of X11 required by Qt ( http://doc.qt.io/qt-5/linux-requirements.html). -Robert On Sat, Jan 9, 2016 at 4:42 PM, Robert McGibbon wrote: > > Maybe a better approach would be to look at what libraries are used on > by an up-to-date default Anaconda install (on the assumption that this > is the best tested configuration) > > That's not a bad idea. I also have a couple other ideas about how to filter > this based on using debian popularity-contests and the package graph. I > will report back when I have more info. > > -Robert > > On Sat, Jan 9, 2016 at 3:04 PM, Nathaniel Smith wrote: > >> On Sat, Jan 9, 2016 at 3:52 AM, Robert McGibbon >> wrote: >> > Hi all, >> > >> > I went ahead and tried to collect a list of all of the libraries that >> could >> > be considered to constitute the "base" system for linux-64. The >> strategy I >> > used was to leverage off the work done by the folks at Continuum by >> > searching through their pre-compiled binaries from >> > https://repo.continuum.io/pkgs/free/linux-64/ to find shared libraries >> that >> > were dependened on (according to ldd) that were not accounted for by >> the >> > declared dependencies that each package made known to the conda package >> > manager. >> > >> > The full list of these system libraries, sorted in from >> > most-commonly-depend-on to rarest, is below. There are 158 of them. >> [...] >> > So it's not perfect. But it might be a useful starting place. 
>> >> Unfortunately, yeah, it looks like there's a lot of false positives in >> here :-(. For example your list contains liblzma and libsqlite, but >> both of these are shipped as dependencies of python itself. So >> probably someone just forgot to declare the dependency explicitly, but >> got away with it because the libraries were pulled in anyway. >> >> Maybe a better approach would be to look at what libraries are used on >> by an up-to-date default Anaconda install (on the assumption that this >> is the best tested configuration), and then erase from the list all >> libraries that are shipped by this configuration (ignoring declared >> dependencies since those seem to be unreliable)? It's better to be >> conservative here, since the end goal is to come up with a list of >> external libraries that we're confident have actually been tested for >> compatibility by lots and lots of different users. >> >> -n >> >> -- >> Nathaniel J. Smith -- http://vorpus.org >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From morph at debian.org Sun Jan 10 06:40:49 2016 From: morph at debian.org (Sandro Tosi) Date: Sun, 10 Jan 2016 11:40:49 +0000 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Sun, Jan 10, 2016 at 3:55 AM, Matthew Brett wrote: > I updated the page with more on reasons to prefer Debian packages over > installing with pip: > > https://matthew-brett.github.io/pydagogue/installing_on_debian.html > > Is that enough to get the message across? That looks a lot better, thanks! I also kinda agree with all Nathaniel said on the matter -- Sandro "morph" Tosi My website: http://matrixhasu.altervista.org/ Me at Debian: http://wiki.debian.org/SandroTosi G+: https://plus.google.com/u/0/+SandroTosi From matthew.brett at gmail.com Sun Jan 10 14:25:48 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Sun, 10 Jan 2016 11:25:48 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On Sun, Jan 10, 2016 at 3:40 AM, Sandro Tosi wrote: > On Sun, Jan 10, 2016 at 3:55 AM, Matthew Brett wrote: >> I updated the page with more on reasons to prefer Debian packages over >> installing with pip: >> >> https://matthew-brett.github.io/pydagogue/installing_on_debian.html >> >> Is that enough to get the message across? > > That looks a lot better, thanks! I also kinda agree with all Nathaniel > said on the matter Good so. You probably saw, I already removed use of sudo on the page as well. Matthew From tony at kelman.net Sun Jan 10 21:36:57 2016 From: tony at kelman.net (Tony Kelman) Date: Mon, 11 Jan 2016 02:36:57 +0000 (UTC) Subject: [Numpy-discussion] Should I use pip install numpy in linux? References: Message-ID: > Right. There's a small problem which is that the base linux system > isn't just "CentOS 5", it's "CentOS 5 and here's the list of libraries > that you're allowed to link to: ...", where that list is empirically > chosen to include only stuff that really is installed on ~all linux > machines and for which the ABI really has been stable in practice over > multiple years and distros (so e.g. no OpenSSL). > > So the key next step is for someone to figure out and write down that > list. Continuum and Enthought both have versions of it that we know > are good... 
> Does anyone know who maintains Anaconda's linux build environment?

I strongly suspect it was originally set up by Aaron Meurer. Who maintains it now that he is no longer at Continuum is a good question.

We build "generic Linux binaries" for Julia and I co-maintain that environment. It's using CentOS 5 (for now, until we hit an issue that is easiest to fix by upgrading the builders to CentOS 6 -- I don't think we have any real users on CentOS 5 any more, so no one would notice the bump), but we avoid using the Red Hat devtoolset. We tried it initially, but had issues due to the way it attempts to statically link libstdc++ and libgfortran. The -static-libgfortran GCC flag has been broken since GCC 4.5, because libgfortran now depends on libquadmath, and really getting libraries like BLAS and LAPACK statically linked requires rather messy modification of every link line to change -lgfortran to an explicit absolute path to libgfortran.a (and possibly also libquadmath.a). And if you want to statically link the GCC runtime libraries into a .so, your distribution's copies of libstdc++.a, libgfortran.a, etc. must have been built with -fPIC, which many are not.

Some of the issues where this was discussed and worked out were:
https://github.com/JuliaLang/julia/issues/8397
https://github.com/JuliaLang/julia/issues/8433
https://github.com/JuliaLang/julia/pull/8442
https://github.com/JuliaLang/julia/pull/10043

So for Julia we wound up building our own copy of the latest GCC from source on CentOS 5 buildbots, and shipping private shared-library copies of libgfortran, libstdc++, libgcc_s, and a few others. This works on pretty much any glibc-using Linux distribution as new as or newer than the buildbot. It sadly doesn't work on musl distributions, ah well. Rust has been experimenting with truly static libraries/executables that statically link musl libc in addition to everything else; I'm not sure how practical that would be for numpy, scipy, etc.

If you go this route of building gcc from source and depending on private copies of the shared runtime libraries, it ends up important that you pick a newer version of gcc than any of your users. The reason is packages that require compilation on the user's machine. If you build and distribute a set of shared libraries using GCC 4.8, but then someone on Ubuntu 15.10 tries to build something else (say, as an example, pyipopt, which needs a library that uses both C++ and Fortran) from source using GCC 5.2, the GCC 4.8 runtime libraries that you built against will get loaded first, and they won't contain the newer-ABI symbols needed by the GCC 5.2-built user library. Sometimes this can be fixed by just deleting the older bundled copies of the runtime libraries and relying on the users' system copies, but libgfortran, for example, typically needs to be manually installed first.

From rmcgibbo at gmail.com  Sun Jan 10 22:14:49 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Sun, 10 Jan 2016 19:14:49 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

> > Right. There's a small problem which is that the base linux system isn't just "CentOS 5", it's "CentOS 5 and here's the list of libraries that you're allowed to link to: ...", where that list is empirically chosen to include only stuff that really is installed on ~all linux machines and for which the ABI really has been stable in practice over multiple years and distros (so e.g. no OpenSSL).
> > Does anyone know who maintains Anaconda's linux build environment?
>
> I strongly suspect it was originally set up by Aaron Meurer. Who maintains it now that he is no longer at Continuum is a good question.

From looking at all of the external libraries referenced by binaries included in Anaconda and the conda repos, I am not confident that they have a totally strict policy here, or at least not one that is enforced by tooling. The sonames I listed here cover all of the external dependencies used by the latest Anaconda release, but earlier releases and other conda-installable packages from the default channel are not so strict.

-Robert

From rmcgibbo at gmail.com  Mon Jan 11 09:06:31 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Mon, 11 Jan 2016 06:06:31 -0800
Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?]
In-Reply-To: References: Message-ID: 

I started working on a tool for checking linux wheels for "manylinux" compatibility, and fixing them up if possible, based on the same ideas as Matthew Brett's delocate for OS X. Current WIP code, if anyone wants to help / throw peanuts, is here: https://github.com/rmcgibbo/deloc8.

It's currently fairly modest and can only list non-whitelisted external shared library dependencies, and verify that sufficiently old versioned symbols for glibc and its ilk are used.
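The whitelist part is conceptually very simple -- the heart of it is something like this (a simplified sketch, with a truncated whitelist; the real list is longer):

    import subprocess

    # a few entries from the candidate base-system whitelist
    WHITELIST = {
        'linux-vdso.so.1', 'libc.so.6', 'libm.so.6', 'libdl.so.2',
        'librt.so.1', 'libpthread.so.0', 'libgcc_s.so.1', 'libstdc++.so.6',
    }

    def non_whitelisted_deps(path):
        # each dependency line of `ldd` output looks like
        #   libm.so.6 => /lib64/libm.so.6 (0x00007f...)
        out = subprocess.check_output(['ldd', path]).decode()
        needed = {line.split()[0] for line in out.splitlines() if '=>' in line}
        return sorted(needed - WHITELIST)

(Using ldd like this is a bit of a shortcut -- reading the DT_NEEDED entries out of the ELF headers directly would be more precise, since ldd reports the whole transitive closure of dependencies.)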
-Robert

On Sun, Jan 10, 2016 at 1:19 AM, Robert McGibbon wrote:
> [...]

From travis at continuum.io  Mon Jan 11 09:41:46 2016
From: travis at continuum.io (Travis Oliphant)
Date: Mon, 11 Jan 2016 09:41:46 -0500
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

Anaconda's "build environment" was set up by Ilan and me. Aaron helped to build packages while he was at Continuum, but spent most of his time on the open-source conda project.

It is important to understand the difference between Anaconda and conda in this respect. Anaconda is a particular dependency foundation that Continuum supports and releases -- it will have a particular set of expected libraries on each platform (we work to keep this fairly limited, and on Linux we currently use CentOS 5 as the base). conda is a general package manager that is open-source and that anyone can use to produce a set of consistent binaries (there can be many conda-based distributions). It solves the problem of multiple binary dependency chains generally, using the concept of "features". This concept of "features" allows you to create environments with different base dependencies. What packages you install when you "conda install" depends on which channels you are pointing to and which features you have "turned on" in the environment. It's a general system that extends the notions that were started by the PyPA.

-Travis

On Sun, Jan 10, 2016 at 10:14 PM, Robert McGibbon wrote:
> [...]
-- 
*Travis Oliphant*
*Co-founder and CEO*

@teoliphant
512-222-5440
http://www.continuum.io

From chris.barker at noaa.gov  Mon Jan 11 13:25:05 2016
From: chris.barker at noaa.gov (Chris Barker)
Date: Mon, 11 Jan 2016 10:25:05 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

On Fri, Jan 8, 2016 at 7:13 PM, Nathaniel Smith wrote:
> > that this would potentially be able to let packages like numpy serve their linux users better without risking too much junk being uploaded to PyPI.
>
> That will never fly. But like Matthew says, I think we can probably get them to accept a PEP saying "here's a new well-specified platform tag that means that this wheel works on all linux systems that meet the following list of criteria: ...", and then allow that new platform tag onto PyPI.

The second step is the tricky one, though -- how does pip know, when being run on a client, that the system meets those requirements? Do we put a bunch of code in that checks for those libs, etc.?

If we get all that worked out, we still haven't made any progress toward the non-standard libs that aren't python. This is the big "scipy problem" -- fortran, BLAS, hdf, ad infinitum.

I argued for years that we could build binary wheels that hold each of these, and other python packages could depend on them, but pypa never seemed to like that idea. In the end, if you did all this right, you'd have something like conda -- so why not just use conda?

All that being said, if you folks can get the core scipy stack set up to pip install on OS X, Windows, and Linux, that would be pretty nice -- so keep at it!

-CHB

-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959 voice
7600 Sand Point Way NE   (206) 526-6329 fax
Seattle, WA 98115        (206) 526-6317 main reception

Chris.Barker at noaa.gov

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ben.v.root at gmail.com  Mon Jan 11 13:35:29 2016
From: ben.v.root at gmail.com (Benjamin Root)
Date: Mon, 11 Jan 2016 13:35:29 -0500
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

The other half of the fun is how to deal with weird binary issues with libraries like libgeos_c, libhdf5 and such. You have to get all of the compile options right for your build of those libraries to get your build of GDAL and pyhdf working right. You also have packages like gdal and netcdf4 that have diamond dependencies -- not only are they built and linked against numpy binaries, some of their binary dependencies are built against numpy binaries as well. Joys!

I don't envy anybody that tries to take on the packaging problem in any language.

Ben Root

On Mon, Jan 11, 2016 at 1:25 PM, Chris Barker wrote:
> [...]
From cournape at gmail.com  Mon Jan 11 14:02:46 2016
From: cournape at gmail.com (David Cournapeau)
Date: Mon, 11 Jan 2016 19:02:46 +0000
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

On Mon, Jan 11, 2016 at 6:25 PM, Chris Barker wrote:
> [...]
> The second step is the tricky one, though -- how does pip know, when being run on a client, that the system meets those requirements? Do we put a bunch of code in that checks for those libs, etc.?

You could make that option an opt-in at first, and gradually autodetect it for the main distros.

> If we get all that worked out, we still haven't made any progress toward the non-standard libs that aren't python. This is the big "scipy problem" -- fortran, BLAS, hdf, ad infinitum.
>
> I argued for years that we could build binary wheels that hold each of these, and other python packages could depend on them, but pypa never seemed to like that idea.

I don't think that's an accurate statement. There are issues to solve around this, but I did not encounter push back, either on the ML or face to face w/ various pypa members at Pycon, etc. There may be push back for a particular detail, but making "pip install scipy" or "pip install matplotlib" a reality on every platform is something everybody agrees on.

David

From chris.barker at noaa.gov  Mon Jan 11 18:53:39 2016
From: chris.barker at noaa.gov (Chris Barker)
Date: Mon, 11 Jan 2016 15:53:39 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

On Mon, Jan 11, 2016 at 11:02 AM, David Cournapeau wrote:
> [...]
> I don't think that's an accurate statement. There are issues to solve around this, but I did not encounter push back, either on the ML or face to face w/ various pypa members at Pycon, etc. There may be push back for a particular detail, but making "pip install scipy" or "pip install matplotlib" a reality on every platform is something everybody agrees on.

Sure, everyone wants that. But when it gets deeper, they don't want to have a bunch of pip-installable binary wheels that are simply C libs re-packaged as a dependency. And then you have the problem of those being "binary wheel" dependencies, rather than "package" dependencies.

e.g.: this particular build of Pillow depends on the libpng and libjpeg wheels, but the Pillow package, in general, does not. And you would have different dependencies on Windows, and OS-X, and Linux. pip/wheel simply was not designed for that, and I didn't get any warm and fuzzy feelings from the distutils-sig that it ever would be. And again, then you are re-designing conda.

So the only way to do all that is to statically link all the dependent libs in with each binary wheel (or ship a dll). Somehow it always bothered me to ship the same lib with multiple packages -- isn't that why shared libs exist? In practice, maybe it doesn't matter; memory is cheap. But I also got sick of building the damn things! I like that Anaconda comes with libs someone else has built, and I can use them with my packages, too.
And when it comes to ugly stuff like HDF and GDAL, I'm really happy someone else has built them!

Anyway -- carry on -- being able to pip install the scipy stack would be very nice.

-CHB

-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959 voice
7600 Sand Point Way NE   (206) 526-6329 fax
Seattle, WA 98115        (206) 526-6317 main reception

Chris.Barker at noaa.gov

From njs at pobox.com  Mon Jan 11 20:29:30 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Mon, 11 Jan 2016 17:29:30 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: Message-ID: 

On Jan 11, 2016 3:54 PM, "Chris Barker" wrote:
> [...]
> pip/wheel simply was not designed for that, and I didn't get any warm and fuzzy feelings from the distutils-sig that it ever would be. And again, then you are re-designing conda.

I agree that talking about such things on distutils-sig tends to elicit a certain amount of puzzled incomprehension, but I don't think it matters -- wheels already have everything you need to support this. E.g. wheels for different platforms can trivially have different dependencies. (They even go to some lengths to make sure this is possible for pure python packages where the same wheel can be used on multiple platforms.) When distributing a library-in-a-wheel then you need a little bit of hackishness to make sure the runtime loader can find the library, which conda would otherwise handle for you, but AFAICT it's like 10 lines of code or something.
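(Roughly this, as a sketch -- the package layout and library name here are invented:

    import ctypes
    import os

    # in the wrapper package's __init__.py: load the wheel's bundled copy
    # of the C library by absolute path, with RTLD_GLOBAL so that extension
    # modules imported afterwards resolve their symbols against it
    _libs_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), '.libs')
    _libfoo = ctypes.CDLL(os.path.join(_libs_dir, 'libfoo.so.1'),
                          mode=ctypes.RTLD_GLOBAL)

plus actually shipping libfoo.so.1 in that .libs/ directory inside the wheel. Setting an rpath with $ORIGIN on the extension modules is the other standard trick.)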
In-Reply-To: References: Message-ID: > And in any case we have lots of users who don't use conda and are thus doomed to support both ecosystems regardless, so we might as well make the best of it :-). Yes, this is the key. Conda is a great tool for a lot of users / use cases, but it's not for everyone. Anyways, I think I've made a pretty good start on the tooling for a wheel ABI tag for a LSB-style base system that represents a common set of shared libraries and symbols versions provided by "many" linuxes (previously discussed by Nathaniel here: https://code.activestate.com/lists/python-distutils-sig/26272/) -Robert On Mon, Jan 11, 2016 at 5:29 PM, Nathaniel Smith wrote: > On Jan 11, 2016 3:54 PM, "Chris Barker" wrote: > > > > On Mon, Jan 11, 2016 at 11:02 AM, David Cournapeau > wrote: > >>> > >>> If we get all that worked out, we still haven't made any progress > toward the non-standard libs that aren't python. This is the big "scipy > problem" -- fortran, BLAS, hdf, ad infinitum. > >>> > >>> I argued for years that we could build binary wheels that hold each of > these, and other python packages could depend on them, but pypa never > seemed to like that idea. > >> > >> > >> I don't think that's an accurate statement. There are issues to solve > around this, but I did not encounter push back, either on the ML or face to > face w/ various pypa members at Pycon, etc... There may be push backs for a > particular detail, but making "pip install scipy" or "pip install > matplotlib" a reality on every platform is something everybody agrees o > > > > > > sure, everyone wants that. But when it gets deeper, they don't want to > have a bunc hof pip-installable binary wheels that are simply clibs > re-packaged as a dependency. And, then you have the problelm of those being > "binary wheel" dependencies, rather than "package" dependencies. > > > > e.g.: > > > > this particular build of pillow depends on the libpng and libjpeg > wheels, but the Pillow package, in general, does not. And you would have > different dependencies on Windows, and OS-X, and Linux. > > > > pip/wheel simply was not designed for that, and I didn't get any warm > and fuzzy feelings from dist-utils sig that the it ever would. And again, > then you are re-designing conda. > > I agree that talking about such things on distutils-sig tends to elicit a > certain amount of puzzled incomprehension, but I don't think it matters -- > wheels already have everything you need to support this. E.g. wheels for > different platforms can trivially have different dependencies. (They even > go to some lengths to make sure this is possible for pure python packages > where the same wheel can be used on multiple platforms.) When distributing > a library-in-a-wheel then you need a little bit of hackishness to make sure > the runtime loader can find the library, which conda would otherwise handle > for you, but AFAICT it's like 10 lines of code or something. > > And in any case we have lots of users who don't use conda and are thus > doomed to support both ecosystems regardless, so we might as well make the > best of it :-). > > -n > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From njs at pobox.com Mon Jan 11 22:24:43 2016 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 11 Jan 2016 19:24:43 -0800 Subject: [Numpy-discussion] Defining a base linux-64 environment [was: Should I use pip install numpy in linux?] In-Reply-To: References: Message-ID: On Mon, Jan 11, 2016 at 6:06 AM, Robert McGibbon wrote: > I started working on a tool for checking linux wheels for "manylinux" > compatibility, and fixing them up if possible, based on the same ideas as > Matthew Brett's delocate for OS X. Current WIP code, if anyone wants to help > / throw penuts, is here: https://github.com/rmcgibbo/deloc8. > > It's currently fairly modest and can only list non-whitelisted external > shared library dependencies, and verify that sufficiently old versioned > symbols for glibc and its ilk are used. That is super cool! and also this week David C. @ Enthought contributed the docker image that they use to actually make compatible builds, so I guess we have some momentum; let's make this happen :-). I just made a repo and a mailing list to continue the discussion... https://github.com/manylinux/manylinux https://groups.google.com/forum/#!forum/manylinux-discuss -n -- Nathaniel J. Smith -- http://vorpus.org From charlesr.harris at gmail.com Tue Jan 12 15:17:40 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 12 Jan 2016 13:17:40 -0700 Subject: [Numpy-discussion] 1.11 branch this weekend Message-ID: Hi All, Just a heads up. Note that we are trying to move to a faster release cycle and that changes the flavor of the releases. Instead of a list of goals to meet before each release, releases will more resemble snapshots of master at branch time. The main thing is that they be regression and bug free to the best of our ability when the branch is made. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Wed Jan 13 00:18:43 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 12 Jan 2016 22:18:43 -0700 Subject: [Numpy-discussion] Get rid of special scalar arithmetic Message-ID: Hi All, I've opened issue #7002 , reproduced below, for discussion. > Numpy umath has a file scalarmath.c.src that implements scalar arithmetic > using special functions that are about 10x faster than the equivalent > ufuncs. > > In [1]: a = np.float64(1) > > In [2]: timeit a*a > 10000000 loops, best of 3: 69.5 ns per loop > > In [3]: timeit np.multiply(a, a) > 1000000 loops, best of 3: 722 ns per loop > > I contend that in large programs this improvement in execution time is not > worth the complexity and maintenance overhead; it is unlikely that > scalar-scalar arithmetic is a significant part of their execution time. > Therefore I propose to use ufuncs for all of the scalar-scalar arithmetic. > This would also bring the benefits of __numpy_ufunc__ to scalars with > minimal effort. > Thoughts? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Wed Jan 13 03:52:26 2016 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 13 Jan 2016 00:52:26 -0800 Subject: [Numpy-discussion] Get rid of special scalar arithmetic In-Reply-To: References: Message-ID: On Tue, Jan 12, 2016 at 9:18 PM, Charles R Harris wrote: > > Hi All, > > I've opened issue #7002, reproduced below, for discussion. >> >> Numpy umath has a file scalarmath.c.src that implements scalar arithmetic using special functions that are about 10x faster than the equivalent ufuncs. 
>> >> In [1]: a = np.float64(1)
>> >>
>> >> In [2]: timeit a*a
>> >> 10000000 loops, best of 3: 69.5 ns per loop
>> >>
>> >> In [3]: timeit np.multiply(a, a)
>> >> 1000000 loops, best of 3: 722 ns per loop
>> >>
>> >> I contend that in large programs this improvement in execution time
is not worth the complexity and maintenance overhead; it is unlikely that
scalar-scalar arithmetic is a significant part of their execution time.
Therefore I propose to use ufuncs for all of the scalar-scalar arithmetic.
This would also bring the benefits of __numpy_ufunc__ to scalars with
minimal effort.
>
> Thoughts?

+1e6, scalars are a maintenance disaster in so many ways. But can we
actually pull it off?

IIRC there were complaints about scalars getting slower at some point (and
not 10x slower), because it's not actually too hard to have code that is
heavy on scalar arithmetic. (Indexing an array returns a numpy scalar
rather than a python object, even if these look similar, so any code that,
say, does a Python loop over the elements of an array may well be
bottlenecked by scalar arithmetic. Obviously it's better not to do such
loops, but...)

It still seems to me that we can surely speed up ufuncs on scalars / small
arrays?

Also I am somewhat encouraged that like you I get ~700 ns for
multiply(scalar, scalar) versus ~70 ns for scalar * scalar, but I also get
~380 ns for 0d-array * 0d-array. (I guess probably for multiply(scalar,
scalar) we're first calling asarray on both scalar objects, which is
certainly avoidable.)

Here's a profile of zerod * zerod [0]:
  http://vorpus.org/~njs/tmp/zerod.svg
(Click on PyNumber_Multiply to zoom in on the relevant part)

And here's multiply(scalar, scalar) [1]:
  http://vorpus.org/~njs/tmp/scalar.svg

In principle it feels like tons of this stuff is fat that can be trimmed --
even in the first, faster, profile, we're allocating a 0d array and then
converting it to a scalar, and the latter conversion in PyArray_Return
takes 12% of time on its own; like 14% of the time is spent trying to
figure out from scratch the complicated type resolution and casting
procedure needed to multiply two float64s, ...

[0]
a = np.array(1, dtype=float)
for i in range(...):
    a * a

[1]
s = np.float64(1)
m = np.multiply
for i in range(...):
    m(s, s)

-n

-- 
Nathaniel J. Smith -- http://vorpus.org

From robert.kern at gmail.com  Wed Jan 13 04:12:08 2016
From: robert.kern at gmail.com (Robert Kern)
Date: Wed, 13 Jan 2016 09:12:08 +0000
Subject: [Numpy-discussion] Get rid of special scalar arithmetic
In-Reply-To: 
References: 
Message-ID: 

On Wed, Jan 13, 2016 at 5:18 AM, Charles R Harris wrote:
>
> Hi All,
>
> I've opened issue #7002, reproduced below, for discussion.
>>
>> Numpy umath has a file scalarmath.c.src that implements scalar
arithmetic using special functions that are about 10x faster than the
equivalent ufuncs.
>>
>> In [1]: a = np.float64(1)
>>
>> In [2]: timeit a*a
>> 10000000 loops, best of 3: 69.5 ns per loop
>>
>> In [3]: timeit np.multiply(a, a)
>> 1000000 loops, best of 3: 722 ns per loop
>>
>> I contend that in large programs this improvement in execution time is
not worth the complexity and maintenance overhead; it is unlikely that
scalar-scalar arithmetic is a significant part of their execution time.
Therefore I propose to use ufuncs for all of the scalar-scalar arithmetic.
This would also bring the benefits of __numpy_ufunc__ to scalars with
minimal effort.
>
> Thoughts?

Not all important-to-optimize programs are large in our field; interactive
use is rampant.
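(For anyone who wants to reproduce the three numbers under discussion --
scalar * scalar, 0d * 0d, and multiply(scalar, scalar) -- a script along
these lines does it; absolute timings will of course vary by machine and
numpy version:

import timeit
import numpy as np

s = np.float64(1)   # numpy scalar
z = np.array(1.0)   # 0-d array
m = np.multiply

n = 1000000
for label, stmt in [("scalar * scalar  ", lambda: s * s),
                    ("0d * 0d          ", lambda: z * z),
                    ("multiply(s, s)   ", lambda: m(s, s))]:
    t = timeit.timeit(stmt, number=n)
    print("%s %7.1f ns per call" % (label, t / n * 1e9))

)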
The scalar optimizations weren't added speculatively: people noticed that
their Numeric code ran much slower under numpy and were reluctant to
migrate. I was forever responding on comp.lang.python, "It's because scalar
arithmetic hasn't been optimized yet. We know how to do it, we just need a
volunteer to do the work. Contributions gratefully accepted!" The most
critical areas tended to be optimization, where you are often working with
implicit scalars that pop out in the optimization loop.

--
Robert Kern
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From m.h.vankerkwijk at gmail.com  Wed Jan 13 10:33:58 2016
From: m.h.vankerkwijk at gmail.com (Marten van Kerkwijk)
Date: Wed, 13 Jan 2016 10:33:58 -0500
Subject: [Numpy-discussion] Get rid of special scalar arithmetic
In-Reply-To: 
References: 
Message-ID: 

Just thought I would add here a general comment I made in the thread:
replacing scalars everywhere with array scalars (i.e., ndim=0) would be
great also from the perspective of ndarray subclasses; as is, it is quite
annoying to have to special-case, e.g., getting a single subclass element,
and rewrapping the scalar in the subclass.
-- Marten
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From sebastian at sipsolutions.net  Wed Jan 13 12:59:35 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Wed, 13 Jan 2016 18:59:35 +0100
Subject: [Numpy-discussion] Get rid of special scalar arithmetic
In-Reply-To: 
References: 
Message-ID: <1452707975.2512.18.camel@sipsolutions.net>

On Wed, 2016-01-13 at 10:33 -0500, Marten van Kerkwijk wrote:
> Just thought I would add here a general comment I made in the thread:
> replacing scalars everywhere with array scalars (i.e., ndim=0) would
> be great also from the perspective of ndarray subclasses; as is, it
> is quite annoying to have to special-case, e.g., getting a single
> subclass element, and rewrapping the scalar in the subclass.
> -- Marten

I understand the sentiment, and right now I think we usually give the
subclass the chance to rewrap itself around 0-d arrays. But ideally I
think this is incorrect. Either you want the scalar to be a scalar, or
the array actually holds information which is associated with the dtype
(i.e. units) and thus should survive conversion to scalar.

To me personally, I don't think that we can really remove scalars, due
to things such as mutability, sequence ABC registration and with that
also hashability. My gut feeling is that there is actually an advantage
in having a scalar object, even if internally this scalar object could
reuse a lot. Note that, e.g., a 0-d read-only array would raise an error
on `a += 1`....

Now practicality beating purity and all that, but to me it is not
obvious that it would be the best thing to get rid of scalars completely
(getting rid of the code duplication is a different issue).

- Sebastian

> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL: 

From chris.barker at noaa.gov  Wed Jan 13 17:23:51 2016
From: chris.barker at noaa.gov (Chris Barker)
Date: Wed, 13 Jan 2016 14:23:51 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: 
References: 
Message-ID: 

On Mon, Jan 11, 2016 at 5:29 PM, Nathaniel Smith wrote:

> I agree that talking about such things on distutils-sig tends to elicit a
> certain amount of puzzled incomprehension, but I don't think it matters --
> wheels already have everything you need to support this.
>

well, that's what I figured -- and I started down that path a while back
and got no support whatsoever (OK, some from Matthew Brett -- thanks!). But
I know myself well enough to know I wasn't going to get the critical mass
required to make it useful by myself, so I've moved on to an ecosystem that
is doing most of the work already.

Also, you have the problem that there is one PyPi -- so where do you put
your nifty wheels that depend on other binary wheels? you may need to fork
every package you want to build :-(

But sure, let's get the core scipy stack pip-installable as much as
possible -- and maybe some folks with more energy than me can move this all
forward.

Sorry to be a downer -- keep up the good work and energy!

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ryan at bytemining.com  Thu Jan 14 04:15:00 2016
From: ryan at bytemining.com (Ryan R. Rosario)
Date: Thu, 14 Jan 2016 01:15:00 -0800
Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk?
Message-ID: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>

Hi,

I have a very large dictionary that must be shared across processes and
does not fit in RAM. I need access to this object to be fast. The key is an
integer ID and the value is a list containing two elements, both of them
numpy arrays (one has ints, the other has floats). The key is sequential,
starts at 0, and there are no gaps, so the "outer" layer of this data
structure could really just be a list with the key actually being the
index. The lengths of each pair of arrays may differ across keys.

For a visual:

{
   key=0:
      [
         numpy.array([1,8,15,..., 16000]),
         numpy.array([0.1,0.1,0.1,...,0.1])
      ],
   key=1:
      [
         numpy.array([5,6]),
         numpy.array([0.5,0.5])
      ],
   ...
}

I've tried:
- manager proxy objects, but the object was so big that low-level code
threw an exception due to format and monkey-patching wasn't successful.
- Redis, which was far too slow due to setting up connections and data
conversion etc.
- Numpy rec arrays + memory mapping, but there is a restriction that the
numpy arrays in each "column" must be of fixed and same size.
- I looked at PyTables, which may be a solution, but seems to have a very
steep learning curve.
- I haven't tried SQLite3, but I am worried about the time it takes to
query the DB for a sequential ID, and then translate byte arrays.

Any ideas? I greatly appreciate any guidance you can provide.

Thanks,
Ryan

From njs at pobox.com  Thu Jan 14 05:19:00 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 14 Jan 2016 02:19:00 -0800
Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk?
In-Reply-To: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
Message-ID: 

I'd try storing the data in hdf5 (probably via h5py, which is a more basic
interface without all the bells-and-whistles that pytables adds), though
any method you use is going to be limited by the need to do a seek before
each read.
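To make that concrete, a minimal h5py layout for exactly this structure
could look like the following -- a sketch; the group and dataset names are
illustrative, not a convention:

import h5py
import numpy as np

# One group per key, two datasets per group; ragged lengths are fine
# because every dataset carries its own shape.
with h5py.File("store.h5", "w") as f:
    grp = f.create_group("0")
    grp.create_dataset("ints", data=np.array([1, 8, 15, 16000]))
    grp.create_dataset("floats", data=np.array([0.1, 0.1, 0.1, 0.1]))
    grp = f.create_group("1")
    grp.create_dataset("ints", data=np.array([5, 6]))
    grp.create_dataset("floats", data=np.array([0.5, 0.5]))

# Read-only access (from any number of processes):
with h5py.File("store.h5", "r") as f:
    ints = f["1/ints"][:]     # one seek + read per array
    floats = f["1/floats"][:]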
Storing the data on SSD will probably help a lot if you can afford it for your data size. On Thu, Jan 14, 2016 at 1:15 AM, Ryan R. Rosario wrote: > Hi, > > I have a very large dictionary that must be shared across processes and does not fit in RAM. I need access to this object to be fast. The key is an integer ID and the value is a list containing two elements, both of them numpy arrays (one has ints, the other has floats). The key is sequential, starts at 0, and there are no gaps, so the ?outer? layer of this data structure could really just be a list with the key actually being the index. The lengths of each pair of arrays may differ across keys. > > For a visual: > > { > key=0: > [ > numpy.array([1,8,15,?, 16000]), > numpy.array([0.1,0.1,0.1,?,0.1]) > ], > key=1: > [ > numpy.array([5,6]), > numpy.array([0.5,0.5]) > ], > ? > } > > I?ve tried: > - manager proxy objects, but the object was so big that low-level code threw an exception due to format and monkey-patching wasn?t successful. > - Redis, which was far too slow due to setting up connections and data conversion etc. > - Numpy rec arrays + memory mapping, but there is a restriction that the numpy arrays in each ?column? must be of fixed and same size. > - I looked at PyTables, which may be a solution, but seems to have a very steep learning curve. > - I haven?t tried SQLite3, but I am worried about the time it takes to query the DB for a sequential ID, and then translate byte arrays. > > Any ideas? I greatly appreciate any guidance you can provide. > > Thanks, > Ryan > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion -- Nathaniel J. Smith -- http://vorpus.org From oscar.j.benjamin at gmail.com Thu Jan 14 06:39:21 2016 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Thu, 14 Jan 2016 11:39:21 +0000 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: On 13 January 2016 at 22:23, Chris Barker wrote: > On Mon, Jan 11, 2016 at 5:29 PM, Nathaniel Smith wrote: >> >> I agree that talking about such things on distutils-sig tends to elicit a >> certain amount of puzzled incomprehension, but I don't think it matters -- >> wheels already have everything you need to support this. > > well, that's what I figured -- and I started down that path a while back and > got no support whatsoever (OK, some from Matthew Brett -- thanks!). But I > know myself well enough to know I wasn't going to get the critical mass > required to make it useful by myself, so I've moved on to an ecosystem that > is doing most of the work already. I think the problem with discussing these things on distutils-sig is that the discussions are often very theoretical. In reality PyPA are waiting for people to adopt the infrastructure that they have created so far by uploading sets of binary wheels. Once that process really kicks off then as issues emerge there will be real specific problems to solve and a more concrete discussion of what changes are needed to wheel/pip/PyPI can emerge. The main exceptions to this are wheels for Linux and non-setuptools build dependencies for sdists so it's definitely good to pursue those problems and try to complete the basic infrastructure. > Also, you have the problem that there is one PyPi -- so where do you put > your nifty wheels that depend on other binary wheels? 
you may need to fork
> every package you want to build :-(

Is this a real problem or a theoretical one? Do you know of some situation
where this wheel-to-wheel dependency will occur that won't just be solved
in some other way?

--
Oscar

From faltet at gmail.com  Thu Jan 14 08:48:45 2016
From: faltet at gmail.com (Francesc Alted)
Date: Thu, 14 Jan 2016 14:48:45 +0100
Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk?
In-Reply-To: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
Message-ID: 

Well, maybe something like a simple class emulating a dictionary that
stores key-value pairs on disk would be more than enough. Then you can use
whatever persistence layer that you want (even HDF5, but not necessarily).

As a demonstration I did a quick and dirty implementation for such a
persistent key-store thing
(https://gist.github.com/FrancescAlted/8e87c8762a49cf5fc897). In it, the
KeyStore class (less than 40 lines long) is responsible for storing the
value (2 arrays) into a key (a directory). As I am quite a big fan of
compression, I implemented a couple of serialization flavors: one using the
.npz format (so no other dependencies than NumPy are needed) and the other
using the ctable object from the bcolz package (bcolz.blosc.org). Here are
some performance numbers:

python key-store.py -f numpy -d __test -l 0
########## Checking method: numpy (via .npz files) ############
Building database. Wait please...
Time ( creation) --> 1.906
Retrieving 100 keys in arbitrary order...
Time ( query) --> 0.191
Number of elements out of getitem: 10518976
faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test
75M __test

So, with the NPZ format we can deal with the 75 MB quite easily. But NPZ
can compress data as well, so let's see how it goes:

$ python key-store.py -f numpy -d __test -l 9
########## Checking method: numpy (via .npz files) ############
Building database. Wait please...
Time ( creation) --> 6.636
Retrieving 100 keys in arbitrary order...
Time ( query) --> 0.384
Number of elements out of getitem: 10518976
faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test
28M __test

Ok, in this case we have got almost a 3x compression ratio, which is not
bad. However, the performance has degraded a lot. Let's now use bcolz,
first in non-compressed mode:

$ python key-store.py -f bcolz -d __test -l 0
########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') ############
Building database. Wait please...
Time ( creation) --> 0.479
Retrieving 100 keys in arbitrary order...
Time ( query) --> 0.103
Number of elements out of getitem: 10518976
faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test
82M __test

Without compression, bcolz takes a bit more (~10%) space than NPZ. However,
bcolz is actually meant to be used with compression on by default:

$ python key-store.py -f bcolz -d __test -l 9
########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') ############
Building database. Wait please...
Time ( creation) --> 0.487
Retrieving 100 keys in arbitrary order...
Time ( query) --> 0.98
Number of elements out of getitem: 10518976
faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test
29M __test

So, the final disk usage is quite similar to NPZ, but it can store and
retrieve much faster. Also, the data decompression speed is on a par with
using no compression. This is because bcolz uses Blosc behind the scenes,
which is much faster than zlib (used by NPZ) --and sometimes faster than a
memcpy().
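(For readers who don't want to click through to the gist, the .npz flavor
of such a key-store boils down to roughly this -- a paraphrase under names
of my own choosing, not the gist's exact code:

import os
import numpy as np

class KeyStore(object):
    """Persist {key: [int_array, float_array]} as one .npz file per key."""

    def __init__(self, root, compress=False):
        self.root = root
        self.save = np.savez_compressed if compress else np.savez
        if not os.path.isdir(root):
            os.makedirs(root)

    def __setitem__(self, key, value):
        ints, floats = value
        self.save(os.path.join(self.root, "%d.npz" % key),
                  ints=ints, floats=floats)

    def __getitem__(self, key):
        npz = np.load(os.path.join(self.root, "%d.npz" % key))
        return [npz["ints"], npz["floats"]]

The bcolz flavor swaps the np.savez/np.load pair for ctable writes and
reads, which is where the compression-speed numbers below come from.)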
However, even though we are doing I/O against the disk, this dataset is so
small that it fits in the OS filesystem cache, so the benchmark is actually
checking I/O at memory speeds, not disk speeds.

In order to do a more real-life comparison, let's use a dataset that is
much larger than the amount of memory in my laptop (8 GB):

$ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d
/media/faltet/docker/__test -l 0
########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') ############
Building database. Wait please...
Time ( creation) --> 133.650
Retrieving 100 keys in arbitrary order...
Time ( query) --> 2.881
Number of elements out of getitem: 91907396
faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh /media/faltet/docker/__test
39G /media/faltet/docker/__test

and now, with compression on:

$ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d
/media/faltet/docker/__test -l 9
########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') ############
Building database. Wait please...
Time ( creation) --> 145.633
Retrieving 100 keys in arbitrary order...
Time ( query) --> 1.339
Number of elements out of getitem: 91907396
faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh /media/faltet/docker/__test
12G /media/faltet/docker/__test

So, we are still seeing the 3x compression ratio. But the interesting thing
here is that the compressed version works more than twice as fast as the
uncompressed one (13 ms/query vs 29 ms/query). In this case I was using an
SSD (hence the low query times), so the compression advantage is even more
noticeable than when using memory as above (as expected).

But anyway, this is just a demonstration that you don't need heavy tools to
achieve what you want. And as a corollary, (fast) compressors can save you
not only storage, but processing time too.

Francesc

2016-01-14 11:19 GMT+01:00 Nathaniel Smith :

> I'd try storing the data in hdf5 (probably via h5py, which is a more
> basic interface without all the bells-and-whistles that pytables
> adds), though any method you use is going to be limited by the need to
> do a seek before each read. Storing the data on SSD will probably help
> a lot if you can afford it for your data size.
>
> On Thu, Jan 14, 2016 at 1:15 AM, Ryan R. Rosario
> wrote:
> > Hi,
> >
> > I have a very large dictionary that must be shared across processes and
> does not fit in RAM. I need access to this object to be fast. The key is an
> integer ID and the value is a list containing two elements, both of them
> numpy arrays (one has ints, the other has floats). The key is sequential,
> starts at 0, and there are no gaps, so the ?outer? layer of this data
> structure could really just be a list with the key actually being the
> index. The lengths of each pair of arrays may differ across keys.
> >
> > For a visual:
> >
> > {
> > key=0:
> > [
> > numpy.array([1,8,15,?, 16000]),
> > numpy.array([0.1,0.1,0.1,?,0.1])
> > ],
> > key=1:
> > [
> > numpy.array([5,6]),
> > numpy.array([0.5,0.5])
> > ],
> > ?
> > }
> >
> > I?ve tried:
> > - manager proxy objects, but the object was so big that low-level
> code threw an exception due to format and monkey-patching wasn?t successful.
> > - Redis, which was far too slow due to setting up connections and
> data conversion etc.
> > - Numpy rec arrays + memory mapping, but there is a restriction
> that the numpy arrays in each ?column? must be of fixed and same size.
> > - I looked at PyTables, which may be a solution, but seems to have
> a very steep learning curve.
> > - I haven?t tried SQLite3, but I am worried about the time it > takes to query the DB for a sequential ID, and then translate byte arrays. > > > > Any ideas? I greatly appreciate any guidance you can provide. > > > > Thanks, > > Ryan > > _______________________________________________ > > NumPy-Discussion mailing list > > NumPy-Discussion at scipy.org > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > -- > Nathaniel J. Smith -- http://vorpus.org > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -- Francesc Alted -------------- next part -------------- An HTML attachment was scrubbed... URL: From jehturner at gmail.com Thu Jan 14 09:08:32 2016 From: jehturner at gmail.com (James E.H. Turner) Date: Thu, 14 Jan 2016 11:08:32 -0300 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: <5697ABE0.1020604@gmail.com> On 09/01/16 00:13, Nathaniel Smith wrote: > Right. There's a small problem which is that the base linux system isn't just > "CentOS 5", it's "CentOS 5 and here's the list of libraries that you're > allowed to link to: ...", where that list is empirically chosen to include > only stuff that really is installed on ~all linux machines and for which the > ABI really has been stable in practice over multiple years and distros (so > e.g. no OpenSSL). > > So the key next step is for someone to figure out and write down that list. > Continuum and Enthought both have versions of it that we know are good... You mean something more empirical than http://refspecs.linuxfoundation.org/lsb.shtml ? I tend to cross-reference with that when adding stuff to Ureka and just err on the side of including things where feasible, then of course test it on the main target platforms. We have also been building on CentOS 5-6 BTW (I believe the former is about to be unsupported). Just skimming the thread... Cheers, James. From edisongustavo at gmail.com Thu Jan 14 09:16:14 2016 From: edisongustavo at gmail.com (Edison Gustavo Muenz) Date: Thu, 14 Jan 2016 12:16:14 -0200 Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk? In-Reply-To: References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com> Message-ID: >From what I know this would be the use case that Dask seems to solve. I think this blog post can help: https://www.continuum.io/content/xray-dask-out-core-labeled-arrays-python Notice that I haven't used any of these projects myself. On Thu, Jan 14, 2016 at 11:48 AM, Francesc Alted wrote: > Well, maybe something like a simple class emulating a dictionary that > stores a key-value on disk would be more than enough. Then you can use > whatever persistence layer that you want (even HDF5, but not necessarily). > > As a demonstration I did a quick and dirty implementation for such a > persistent key-store thing ( > https://gist.github.com/FrancescAlted/8e87c8762a49cf5fc897). On it, the > KeyStore class (less than 40 lines long) is responsible for storing the > value (2 arrays) into a key (a directory). As I am quite a big fan of > compression, I implemented a couple of serialization flavors: one using the > .npz format (so no other dependencies than NumPy are needed) and the other > using the ctable object from the bcolz package (bcolz.blosc.org). 
Here > are some performance numbers: > > python key-store.py -f numpy -d __test -l 0 > ########## Checking method: numpy (via .npz files) ############ > Building database. Wait please... > Time ( creation) --> 1.906 > Retrieving 100 keys in arbitrary order... > Time ( query) --> 0.191 > Number of elements out of getitem: 10518976 > faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test > > 75M __test > > So, with the NPZ format we can deal with the 75 MB quite easily. But NPZ > can compress data as well, so let's see how it goes: > > $ python key-store.py -f numpy -d __test -l 9 > ########## Checking method: numpy (via .npz files) ############ > Building database. Wait please... > Time ( creation) --> 6.636 > Retrieving 100 keys in arbitrary order... > Time ( query) --> 0.384 > Number of elements out of getitem: 10518976 > faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test > 28M __test > > Ok, in this case we have got almost a 3x compression ratio, which is not > bad. However, the performance has degraded a lot. Let's use now bcolz. > First in non-compressed mode: > > $ python key-store.py -f bcolz -d __test -l 0 > ########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') > ############ > Building database. Wait please... > Time ( creation) --> 0.479 > Retrieving 100 keys in arbitrary order... > Time ( query) --> 0.103 > Number of elements out of getitem: 10518976 > faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test > 82M __test > > Without compression, bcolz takes a bit more (~10%) space than NPZ. > However, bcolz is actually meant to be used with compression on by default: > > $ python key-store.py -f bcolz -d __test -l 9 > ########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') > ############ > Building database. Wait please... > Time ( creation) --> 0.487 > Retrieving 100 keys in arbitrary order... > Time ( query) --> 0.98 > Number of elements out of getitem: 10518976 > faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test > 29M __test > > So, the final disk usage is quite similar to NPZ, but it can store and > retrieve lots faster. Also, the data decompression speed is on par to > using non-compression. This is because bcolz uses Blosc behind the scenes, > which is much faster than zlib (used by NPZ) --and sometimes faster than a > memcpy(). However, even we are doing I/O against the disk, this dataset is > so small that fits in the OS filesystem cache, so the benchmark is actually > checking I/O at memory speeds, not disk speeds. > > In order to do a more real-life comparison, let's use a dataset that is > much larger than the amount of memory in my laptop (8 GB): > > $ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d > /media/faltet/docker/__test -l 0 > ########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') > ############ > Building database. Wait please... > Time ( creation) --> 133.650 > Retrieving 100 keys in arbitrary order... > Time ( query) --> 2.881 > Number of elements out of getitem: 91907396 > faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh > /media/faltet/docker/__test > > 39G /media/faltet/docker/__test > > and now, with compression on: > > $ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d > /media/faltet/docker/__test -l 9 > ########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') > ############ > Building database. Wait please... > Time ( creation) --> 145.633 > Retrieving 100 keys in arbitrary order... 
> Time ( query) --> 1.339 > Number of elements out of getitem: 91907396 > faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh > /media/faltet/docker/__test > > 12G /media/faltet/docker/__test > > So, we are still seeing the 3x compression ratio. But the interesting > thing here is that the compressed version works a 50% faster than the > uncompressed one (13 ms/query vs 29 ms/query). In this case I was using a > SSD (hence the low query times), so the compression advantage is even more > noticeable than when using memory as above (as expected). > > But anyway, this is just a demonstration that you don't need heavy tools > to achieve what you want. And as a corollary, (fast) compressors can save > you not only storage, but processing time too. > > Francesc > > > 2016-01-14 11:19 GMT+01:00 Nathaniel Smith : > >> I'd try storing the data in hdf5 (probably via h5py, which is a more >> basic interface without all the bells-and-whistles that pytables >> adds), though any method you use is going to be limited by the need to >> do a seek before each read. Storing the data on SSD will probably help >> a lot if you can afford it for your data size. >> >> On Thu, Jan 14, 2016 at 1:15 AM, Ryan R. Rosario >> wrote: >> > Hi, >> > >> > I have a very large dictionary that must be shared across processes and >> does not fit in RAM. I need access to this object to be fast. The key is an >> integer ID and the value is a list containing two elements, both of them >> numpy arrays (one has ints, the other has floats). The key is sequential, >> starts at 0, and there are no gaps, so the ?outer? layer of this data >> structure could really just be a list with the key actually being the >> index. The lengths of each pair of arrays may differ across keys. >> > >> > For a visual: >> > >> > { >> > key=0: >> > [ >> > numpy.array([1,8,15,?, 16000]), >> > numpy.array([0.1,0.1,0.1,?,0.1]) >> > ], >> > key=1: >> > [ >> > numpy.array([5,6]), >> > numpy.array([0.5,0.5]) >> > ], >> > ? >> > } >> > >> > I?ve tried: >> > - manager proxy objects, but the object was so big that low-level >> code threw an exception due to format and monkey-patching wasn?t successful. >> > - Redis, which was far too slow due to setting up connections and >> data conversion etc. >> > - Numpy rec arrays + memory mapping, but there is a restriction >> that the numpy arrays in each ?column? must be of fixed and same size. >> > - I looked at PyTables, which may be a solution, but seems to >> have a very steep learning curve. >> > - I haven?t tried SQLite3, but I am worried about the time it >> takes to query the DB for a sequential ID, and then translate byte arrays. >> > >> > Any ideas? I greatly appreciate any guidance you can provide. >> > >> > Thanks, >> > Ryan >> > _______________________________________________ >> > NumPy-Discussion mailing list >> > NumPy-Discussion at scipy.org >> > https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> >> >> -- >> Nathaniel J. Smith -- http://vorpus.org >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> > > > > -- > Francesc Alted > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ben.v.root at gmail.com Thu Jan 14 09:51:20 2016 From: ben.v.root at gmail.com (Benjamin Root) Date: Thu, 14 Jan 2016 09:51:20 -0500 Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk? In-Reply-To: References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com> Message-ID: A warning about HDF5. It is not a database format, so you have to be extremely careful if the data is getting updated while it is open for reading by anybody else. If it is strictly read-only, and no body else is updating it, then have at it! Cheers! Ben Root On Thu, Jan 14, 2016 at 9:16 AM, Edison Gustavo Muenz < edisongustavo at gmail.com> wrote: > From what I know this would be the use case that Dask seems to solve. > > I think this blog post can help: > https://www.continuum.io/content/xray-dask-out-core-labeled-arrays-python > > Notice that I haven't used any of these projects myself. > > On Thu, Jan 14, 2016 at 11:48 AM, Francesc Alted wrote: > >> Well, maybe something like a simple class emulating a dictionary that >> stores a key-value on disk would be more than enough. Then you can use >> whatever persistence layer that you want (even HDF5, but not necessarily). >> >> As a demonstration I did a quick and dirty implementation for such a >> persistent key-store thing ( >> https://gist.github.com/FrancescAlted/8e87c8762a49cf5fc897). On it, the >> KeyStore class (less than 40 lines long) is responsible for storing the >> value (2 arrays) into a key (a directory). As I am quite a big fan of >> compression, I implemented a couple of serialization flavors: one using the >> .npz format (so no other dependencies than NumPy are needed) and the other >> using the ctable object from the bcolz package (bcolz.blosc.org). Here >> are some performance numbers: >> >> python key-store.py -f numpy -d __test -l 0 >> ########## Checking method: numpy (via .npz files) ############ >> Building database. Wait please... >> Time ( creation) --> 1.906 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 0.191 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> >> 75M __test >> >> So, with the NPZ format we can deal with the 75 MB quite easily. But NPZ >> can compress data as well, so let's see how it goes: >> >> $ python key-store.py -f numpy -d __test -l 9 >> ########## Checking method: numpy (via .npz files) ############ >> Building database. Wait please... >> Time ( creation) --> 6.636 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 0.384 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> 28M __test >> >> Ok, in this case we have got almost a 3x compression ratio, which is not >> bad. However, the performance has degraded a lot. Let's use now bcolz. >> First in non-compressed mode: >> >> $ python key-store.py -f bcolz -d __test -l 0 >> ########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 0.479 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 0.103 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> 82M __test >> >> Without compression, bcolz takes a bit more (~10%) space than NPZ. 
>> However, bcolz is actually meant to be used with compression on by default: >> >> $ python key-store.py -f bcolz -d __test -l 9 >> ########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 0.487 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 0.98 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> 29M __test >> >> So, the final disk usage is quite similar to NPZ, but it can store and >> retrieve lots faster. Also, the data decompression speed is on par to >> using non-compression. This is because bcolz uses Blosc behind the scenes, >> which is much faster than zlib (used by NPZ) --and sometimes faster than a >> memcpy(). However, even we are doing I/O against the disk, this dataset is >> so small that fits in the OS filesystem cache, so the benchmark is actually >> checking I/O at memory speeds, not disk speeds. >> >> In order to do a more real-life comparison, let's use a dataset that is >> much larger than the amount of memory in my laptop (8 GB): >> >> $ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d >> /media/faltet/docker/__test -l 0 >> ########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 133.650 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 2.881 >> Number of elements out of getitem: 91907396 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh >> /media/faltet/docker/__test >> >> 39G /media/faltet/docker/__test >> >> and now, with compression on: >> >> $ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d >> /media/faltet/docker/__test -l 9 >> ########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 145.633 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 1.339 >> Number of elements out of getitem: 91907396 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh >> /media/faltet/docker/__test >> >> 12G /media/faltet/docker/__test >> >> So, we are still seeing the 3x compression ratio. But the interesting >> thing here is that the compressed version works a 50% faster than the >> uncompressed one (13 ms/query vs 29 ms/query). In this case I was using a >> SSD (hence the low query times), so the compression advantage is even more >> noticeable than when using memory as above (as expected). >> >> But anyway, this is just a demonstration that you don't need heavy tools >> to achieve what you want. And as a corollary, (fast) compressors can save >> you not only storage, but processing time too. >> >> Francesc >> >> >> 2016-01-14 11:19 GMT+01:00 Nathaniel Smith : >> >>> I'd try storing the data in hdf5 (probably via h5py, which is a more >>> basic interface without all the bells-and-whistles that pytables >>> adds), though any method you use is going to be limited by the need to >>> do a seek before each read. Storing the data on SSD will probably help >>> a lot if you can afford it for your data size. >>> >>> On Thu, Jan 14, 2016 at 1:15 AM, Ryan R. Rosario >>> wrote: >>> > Hi, >>> > >>> > I have a very large dictionary that must be shared across processes >>> and does not fit in RAM. I need access to this object to be fast. 
The key >>> is an integer ID and the value is a list containing two elements, both of >>> them numpy arrays (one has ints, the other has floats). The key is >>> sequential, starts at 0, and there are no gaps, so the ?outer? layer of >>> this data structure could really just be a list with the key actually being >>> the index. The lengths of each pair of arrays may differ across keys. >>> > >>> > For a visual: >>> > >>> > { >>> > key=0: >>> > [ >>> > numpy.array([1,8,15,?, 16000]), >>> > numpy.array([0.1,0.1,0.1,?,0.1]) >>> > ], >>> > key=1: >>> > [ >>> > numpy.array([5,6]), >>> > numpy.array([0.5,0.5]) >>> > ], >>> > ? >>> > } >>> > >>> > I?ve tried: >>> > - manager proxy objects, but the object was so big that >>> low-level code threw an exception due to format and monkey-patching wasn?t >>> successful. >>> > - Redis, which was far too slow due to setting up connections >>> and data conversion etc. >>> > - Numpy rec arrays + memory mapping, but there is a restriction >>> that the numpy arrays in each ?column? must be of fixed and same size. >>> > - I looked at PyTables, which may be a solution, but seems to >>> have a very steep learning curve. >>> > - I haven?t tried SQLite3, but I am worried about the time it >>> takes to query the DB for a sequential ID, and then translate byte arrays. >>> > >>> > Any ideas? I greatly appreciate any guidance you can provide. >>> > >>> > Thanks, >>> > Ryan >>> > _______________________________________________ >>> > NumPy-Discussion mailing list >>> > NumPy-Discussion at scipy.org >>> > https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >>> >>> >>> -- >>> Nathaniel J. Smith -- http://vorpus.org >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >> >> >> >> -- >> Francesc Alted >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From travis at continuum.io Thu Jan 14 11:26:48 2016 From: travis at continuum.io (Travis Oliphant) Date: Thu, 14 Jan 2016 10:26:48 -0600 Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk? In-Reply-To: References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com> Message-ID: On Thu, Jan 14, 2016 at 8:16 AM, Edison Gustavo Muenz < edisongustavo at gmail.com> wrote: > From what I know this would be the use case that Dask seems to solve. > > I think this blog post can help: > https://www.continuum.io/content/xray-dask-out-core-labeled-arrays-python > > Notice that I haven't used any of these projects myself. > I don't know enough about xray to know whether it supports this kind of general labeling to be able to build your entire data-structure as an x-ray object. Dask could definitely be used to process your data in an easy to describe manner (creating a dask.bag of dask.arrays would work though I'm not sure there are any methods that would buy you from just having a standard dictionary of dask.arrays). You can definitely use dask imperative to parallelize your data-manipulation algorithms. 
But, dask doesn't take a strong opinion as to how you store your data --- it can use anything python can read. I believe your question was "how do I store this?" If you think of the file-system as a simple key-value store, then you could easily construct this kind of scenario on disk with directory names for your keys and two files in each directory for your arrays. Then, you could mmap the individual arrays directly for processing. Those individual arrays could be stored as bcolz, npy files, or anything else. Will your multiple processes need to write to these files or will they be read-only? -Travis > > On Thu, Jan 14, 2016 at 11:48 AM, Francesc Alted wrote: > >> Well, maybe something like a simple class emulating a dictionary that >> stores a key-value on disk would be more than enough. Then you can use >> whatever persistence layer that you want (even HDF5, but not necessarily). >> >> As a demonstration I did a quick and dirty implementation for such a >> persistent key-store thing ( >> https://gist.github.com/FrancescAlted/8e87c8762a49cf5fc897). On it, the >> KeyStore class (less than 40 lines long) is responsible for storing the >> value (2 arrays) into a key (a directory). As I am quite a big fan of >> compression, I implemented a couple of serialization flavors: one using the >> .npz format (so no other dependencies than NumPy are needed) and the other >> using the ctable object from the bcolz package (bcolz.blosc.org). Here >> are some performance numbers: >> >> python key-store.py -f numpy -d __test -l 0 >> ########## Checking method: numpy (via .npz files) ############ >> Building database. Wait please... >> Time ( creation) --> 1.906 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 0.191 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> >> 75M __test >> >> So, with the NPZ format we can deal with the 75 MB quite easily. But NPZ >> can compress data as well, so let's see how it goes: >> >> $ python key-store.py -f numpy -d __test -l 9 >> ########## Checking method: numpy (via .npz files) ############ >> Building database. Wait please... >> Time ( creation) --> 6.636 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 0.384 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> 28M __test >> >> Ok, in this case we have got almost a 3x compression ratio, which is not >> bad. However, the performance has degraded a lot. Let's use now bcolz. >> First in non-compressed mode: >> >> $ python key-store.py -f bcolz -d __test -l 0 >> ########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 0.479 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 0.103 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> 82M __test >> >> Without compression, bcolz takes a bit more (~10%) space than NPZ. >> However, bcolz is actually meant to be used with compression on by default: >> >> $ python key-store.py -f bcolz -d __test -l 9 >> ########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 0.487 >> Retrieving 100 keys in arbitrary order... 
>> Time ( query) --> 0.98 >> Number of elements out of getitem: 10518976 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh __test >> 29M __test >> >> So, the final disk usage is quite similar to NPZ, but it can store and >> retrieve lots faster. Also, the data decompression speed is on par to >> using non-compression. This is because bcolz uses Blosc behind the scenes, >> which is much faster than zlib (used by NPZ) --and sometimes faster than a >> memcpy(). However, even we are doing I/O against the disk, this dataset is >> so small that fits in the OS filesystem cache, so the benchmark is actually >> checking I/O at memory speeds, not disk speeds. >> >> In order to do a more real-life comparison, let's use a dataset that is >> much larger than the amount of memory in my laptop (8 GB): >> >> $ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d >> /media/faltet/docker/__test -l 0 >> ########## Checking method: bcolz (via ctable(clevel=0, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 133.650 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 2.881 >> Number of elements out of getitem: 91907396 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh >> /media/faltet/docker/__test >> >> 39G /media/faltet/docker/__test >> >> and now, with compression on: >> >> $ PYTHONPATH=. python key-store.py -f bcolz -m 1000000 -k 5000 -d >> /media/faltet/docker/__test -l 9 >> ########## Checking method: bcolz (via ctable(clevel=9, cname='blosclz') >> ############ >> Building database. Wait please... >> Time ( creation) --> 145.633 >> Retrieving 100 keys in arbitrary order... >> Time ( query) --> 1.339 >> Number of elements out of getitem: 91907396 >> faltet at faltet-Latitude-E6430:~/blosc/bcolz$ du -sh >> /media/faltet/docker/__test >> >> 12G /media/faltet/docker/__test >> >> So, we are still seeing the 3x compression ratio. But the interesting >> thing here is that the compressed version works a 50% faster than the >> uncompressed one (13 ms/query vs 29 ms/query). In this case I was using a >> SSD (hence the low query times), so the compression advantage is even more >> noticeable than when using memory as above (as expected). >> >> But anyway, this is just a demonstration that you don't need heavy tools >> to achieve what you want. And as a corollary, (fast) compressors can save >> you not only storage, but processing time too. >> >> Francesc >> >> >> 2016-01-14 11:19 GMT+01:00 Nathaniel Smith : >> >>> I'd try storing the data in hdf5 (probably via h5py, which is a more >>> basic interface without all the bells-and-whistles that pytables >>> adds), though any method you use is going to be limited by the need to >>> do a seek before each read. Storing the data on SSD will probably help >>> a lot if you can afford it for your data size. >>> >>> On Thu, Jan 14, 2016 at 1:15 AM, Ryan R. Rosario >>> wrote: >>> > Hi, >>> > >>> > I have a very large dictionary that must be shared across processes >>> and does not fit in RAM. I need access to this object to be fast. The key >>> is an integer ID and the value is a list containing two elements, both of >>> them numpy arrays (one has ints, the other has floats). The key is >>> sequential, starts at 0, and there are no gaps, so the ?outer? layer of >>> this data structure could really just be a list with the key actually being >>> the index. The lengths of each pair of arrays may differ across keys. 
>>> > >>> > For a visual: >>> > >>> > { >>> > key=0: >>> > [ >>> > numpy.array([1,8,15,?, 16000]), >>> > numpy.array([0.1,0.1,0.1,?,0.1]) >>> > ], >>> > key=1: >>> > [ >>> > numpy.array([5,6]), >>> > numpy.array([0.5,0.5]) >>> > ], >>> > ? >>> > } >>> > >>> > I?ve tried: >>> > - manager proxy objects, but the object was so big that >>> low-level code threw an exception due to format and monkey-patching wasn?t >>> successful. >>> > - Redis, which was far too slow due to setting up connections >>> and data conversion etc. >>> > - Numpy rec arrays + memory mapping, but there is a restriction >>> that the numpy arrays in each ?column? must be of fixed and same size. >>> > - I looked at PyTables, which may be a solution, but seems to >>> have a very steep learning curve. >>> > - I haven?t tried SQLite3, but I am worried about the time it >>> takes to query the DB for a sequential ID, and then translate byte arrays. >>> > >>> > Any ideas? I greatly appreciate any guidance you can provide. >>> > >>> > Thanks, >>> > Ryan >>> > _______________________________________________ >>> > NumPy-Discussion mailing list >>> > NumPy-Discussion at scipy.org >>> > https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >>> >>> >>> -- >>> Nathaniel J. Smith -- http://vorpus.org >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >> >> >> >> -- >> Francesc Alted >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -- *Travis Oliphant* *Co-founder and CEO* @teoliphant 512-222-5440 http://www.continuum.io -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Thu Jan 14 12:14:19 2016 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Thu, 14 Jan 2016 09:14:19 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: Message-ID: <-5868265598930469412@unknownmsgid> >> Also, you have the problem that there is one PyPi -- so where do you put >> your nifty wheels that depend on other binary wheels? you may need to fork >> every package you want to build :-( > > Is this a real problem or a theoretical one? Do you know of some > situation where this wheel to wheel dependency will occur that won't > just be solved in some other way? It's real -- at least during the whole bootstrapping period. Say I build a nifty hdf5 binary wheel -- I could probably just grab the name "libhdf5" on PyPI. So far so good. But the goal here would be to have netcdf and pytables and GDAL and who knows what else then link against that wheel. But those projects are all supported be different people, that all have their own distribution strategy. So where do I put binary wheels of each of those projects that depend on my libhdf5 wheel? _maybe_ I would put it out there, and it would all grow organically, but neither the culture nor the tooling support that approach now, so I'm not very confident you could gather adoption. Even beyond the adoption period, sometimes you need to do stuff in more than one way -- look at the proliferation of channels on Anaconda.org. 
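(For reference, the "little bit of hackishness" Nathaniel described
upthread is roughly the following. Every name here -- the libhdf5_wheel
package, the .so filename, mypkg -- is hypothetical, purely to show the
shape of the trick:

import ctypes
import os

import libhdf5_wheel  # hypothetical wheel that only ships the shared lib

# Load the library into the process before importing any extension
# module that links against it; RTLD_GLOBAL makes its symbols visible
# to extensions imported afterwards.
_libdir = os.path.dirname(libhdf5_wheel.__file__)
_hdf5 = ctypes.CDLL(os.path.join(_libdir, "libhdf5.so.10"),
                    mode=ctypes.RTLD_GLOBAL)

from mypkg import _ext  # hypothetical extension linked against libhdf5

The per-platform dependency lists then live in each platform's wheel
metadata, which the wheel format already allows.)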
From chris.barker at noaa.gov  Thu Jan 14 12:14:19 2016
From: chris.barker at noaa.gov (Chris Barker - NOAA Federal)
Date: Thu, 14 Jan 2016 09:14:19 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References:
Message-ID: <-5868265598930469412@unknownmsgid>

>> Also, you have the problem that there is one PyPi -- so where do you put your nifty wheels that depend on other binary wheels? you may need to fork every package you want to build :-(
>
> Is this a real problem or a theoretical one? Do you know of some situation where this wheel to wheel dependency will occur that won't just be solved in some other way?

It's real -- at least during the whole bootstrapping period. Say I build a nifty hdf5 binary wheel -- I could probably just grab the name "libhdf5" on PyPI. So far so good. But the goal here would be to have netcdf and pytables and GDAL and who knows what else then link against that wheel. But those projects are all supported by different people, that all have their own distribution strategy. So where do I put binary wheels of each of those projects that depend on my libhdf5 wheel? _maybe_ I would put it out there, and it would all grow organically, but neither the culture nor the tooling support that approach now, so I'm not very confident you could gather adoption.

Even beyond the adoption period, sometimes you need to do stuff in more than one way -- look at the proliferation of channels on Anaconda.org.

This is more likely to work if there is a good infrastructure for third parties to build and distribute the binaries -- e.g. Anaconda.org. Or the Linux distro model -- for the most part, the people developing a given library are not packaging it.

-CHB

From rainwoodman at gmail.com  Thu Jan 14 13:51:48 2016
From: rainwoodman at gmail.com (Feng Yu)
Date: Thu, 14 Jan 2016 10:51:48 -0800
Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk?
In-Reply-To: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
Message-ID:

Hi Ryan,

Did you consider packing the arrays into one (two) giant array stored with mmap? That way you only need to store the start & end offsets, and there is no need to use a dictionary. It may allow you to simplify some numerical operations as well.

To be more specific:

    start : numpy.intp
    end   : numpy.intp
    data1 : numpy.int32
    data2 : numpy.float64

Then your original access to the dictionary can be rewritten as

    data1[start[key]:end[key]]
    data2[start[key]:end[key]]

Whether to wrap this as a dictionary-like object is just a matter of taste -- depending on how you like it, raw or refined.

If you need to apply some global transformation to the data, then something like data2[...] *= 10 would work. ufunc.reduceat(data1, ....) can be very useful as well (with some tricks on start/end).

I was facing a similar issue a few years ago, and you may want to look at this code (it wasn't very well written, I have to admit):

https://github.com/rainwoodman/gaepsi/blob/master/gaepsi/tools/__init__.py#L362

Best,

- Yu

On Thu, Jan 14, 2016 at 1:15 AM, Ryan R. Rosario wrote:
> Hi,
>
> I have a very large dictionary that must be shared across processes and does not fit in RAM. I need access to this object to be fast. The key is an integer ID and the value is a list containing two elements, both of them numpy arrays (one has ints, the other has floats). [...]
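A runnable version of the packing scheme Yu describes, using np.memmap so the two flat arrays live on disk (the file names and sample data are illustrative):

    import numpy as np

    # Offsets built once from the ragged lengths (illustrative sample data).
    arrays = [(np.array([1, 8, 15], np.int32), np.array([0.1, 0.1, 0.1])),
              (np.array([5, 6], np.int32), np.array([0.5, 0.5]))]
    lengths = np.array([len(ints) for ints, _ in arrays], dtype=np.intp)
    end = np.cumsum(lengths)
    start = end - lengths

    # Two giant flat arrays, memory-mapped from disk.
    n = int(end[-1])
    data1 = np.memmap('data1.bin', dtype=np.int32, mode='w+', shape=n)
    data2 = np.memmap('data2.bin', dtype=np.float64, mode='w+', shape=n)
    for key, (ints, floats) in enumerate(arrays):
        data1[start[key]:end[key]] = ints
        data2[start[key]:end[key]] = floats
    data1.flush()
    data2.flush()

    # Random access by key is just a pair of slices...
    ids, vals = data1[start[1]:end[1]], data2[start[1]:end[1]]

    # ...and global operations work on the packed data directly, e.g.
    # per-key sums via reduceat (assumes every segment is non-empty).
    sums = np.add.reduceat(data2, start)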
From matthew.brett at gmail.com  Thu Jan 14 13:58:17 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Thu, 14 Jan 2016 10:58:17 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: <-5868265598930469412@unknownmsgid>
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Thu, Jan 14, 2016 at 9:14 AM, Chris Barker - NOAA Federal wrote:
>>> Also, you have the problem that there is one PyPi -- so where do you put your nifty wheels that depend on other binary wheels? you may need to fork every package you want to build :-(
>>
>> Is this a real problem or a theoretical one? Do you know of some situation where this wheel to wheel dependency will occur that won't just be solved in some other way?
>
> It's real -- at least during the whole bootstrapping period. Say I build a nifty hdf5 binary wheel -- I could probably just grab the name "libhdf5" on PyPI. [...] but neither the culture nor the tooling support that approach now, so I'm not very confident you could gather adoption.

I don't think there's a very large amount of cultural work - but some to be sure.

We already have the following on OSX:

pip install numpy scipy matplotlib scikit-learn scikit-image pandas h5py

where all the wheels come from pypi. So, I don't think this is really outside our range, even if the problem is a little more difficult for Linux.

> Even beyond the adoption period, sometimes you need to do stuff in more than one way -- look at the proliferation of channels on Anaconda.org.
>
> This is more likely to work if there is a good infrastructure for third parties to build and distribute the binaries -- e.g. Anaconda.org.

I thought that Anaconda.org allows pypi channels as well?

Matthew

From shoyer at gmail.com  Thu Jan 14 17:13:38 2016
From: shoyer at gmail.com (Stephan Hoyer)
Date: Thu, 14 Jan 2016 14:13:38 -0800
Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk?
In-Reply-To:
References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
Message-ID:

On Thu, Jan 14, 2016 at 8:26 AM, Travis Oliphant wrote:
> I don't know enough about xray to know whether it supports this kind of general labeling to be able to build your entire data-structure as an xray object. Dask could definitely be used to process your data in an easy to describe manner (creating a dask.bag of dask.arrays would work, though I'm not sure there are any methods that would buy you much over just having a standard dictionary of dask.arrays). You can definitely use dask imperative to parallelize your data-manipulation algorithms.

Indeed, xray's data model is not flexible enough to represent this sort of data -- it's designed around cases where multiple arrays use shared axes.

However, I would indeed recommend dask.array (coupled with some sort of on-disk storage) as a possible solution for this problem, if you need to be able to manipulate these arrays with an API that looks like NumPy. That said, the fact that your data consists of ragged arrays suggests that the dask.array API may be less useful for you.

Tools like dask.imperative, coupled with HDF5 for storage, could still be very useful, though.
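For completeness, a minimal sketch of the HDF5-via-h5py layout that Nathaniel and Stephan mention (one group per key; the file name and data are illustrative):

    import numpy as np
    import h5py

    # Write once: h5py creates the intermediate groups from the path names.
    with h5py.File('store.h5', 'w') as f:
        f.create_dataset('0/ids', data=np.array([1, 8, 15], dtype=np.int32))
        f.create_dataset('0/vals', data=np.array([0.1, 0.1, 0.1]))
        f.create_dataset('1/ids', data=np.array([5, 6], dtype=np.int32))
        f.create_dataset('1/vals', data=np.array([0.5, 0.5]))

    # Random access by key: only the requested datasets are read from disk,
    # at the cost of a seek per read.
    with h5py.File('store.h5', 'r') as f:
        ids = f['1/ids'][:]
        vals = f['1/vals'][:]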
From njs at pobox.com  Thu Jan 14 17:30:56 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 14 Jan 2016 14:30:56 -0800
Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk?
In-Reply-To:
References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
Message-ID:

On Thu, Jan 14, 2016 at 2:13 PM, Stephan Hoyer wrote:
> On Thu, Jan 14, 2016 at 8:26 AM, Travis Oliphant wrote:
>> [...]
>
> Indeed, xray's data model is not flexible enough to represent this sort of data -- it's designed around cases where multiple arrays use shared axes.
>
> However, I would indeed recommend dask.array (coupled with some sort of on-disk storage) as a possible solution for this problem, if you need to be able to manipulate these arrays with an API that looks like NumPy. That said, the fact that your data consists of ragged arrays suggests that the dask.array API may be less useful for you.
>
> Tools like dask.imperative, coupled with HDF5 for storage, could still be very useful, though.

The reason I didn't suggest dask is that I had the impression that dask's model is better suited to bulk/streaming computations with vectorized semantics ("do the same thing to lots of data" kinds of problems, basically), whereas it sounded like the OP's algorithm needed lots of one-off unpredictable random access.

Obviously even if this is true then it's useful to point out both because the OP's problem might turn out to be a better fit for dask's model than they indicated -- the post is somewhat vague :-).

But, I just wanted to check, is the above a good characterization of dask's strengths/applicability?

-n

--
Nathaniel J. Smith -- http://vorpus.org

From shoyer at gmail.com  Thu Jan 14 18:37:07 2016
From: shoyer at gmail.com (Stephan Hoyer)
Date: Thu, 14 Jan 2016 15:37:07 -0800
Subject: [Numpy-discussion] Fast Access to Container of Numpy Arrays on Disk?
In-Reply-To:
References: <0E01808A-5646-48AC-BC62-1CF1ED89A1D2@bytemining.com>
Message-ID:

On Thu, Jan 14, 2016 at 2:30 PM, Nathaniel Smith wrote:
> The reason I didn't suggest dask is that I had the impression that dask's model is better suited to bulk/streaming computations with vectorized semantics ("do the same thing to lots of data" kinds of problems, basically), whereas it sounded like the OP's algorithm needed lots of one-off unpredictable random access.
> [...]
> But, I just wanted to check, is the above a good characterization of dask's strengths/applicability?

Yes, dask is definitely designed around setting up a large streaming computation and then executing it all at once. But it is pretty flexible in terms of what those specific computations are, and can also work for non-vectorized computation (especially via dask imperative). It's worth taking a look at dask's collections for a sense of what it can do here.
The recently refreshed docs provide a nice overview: http://dask.pydata.org/

Cheers,
Stephan

From andyfaff at gmail.com  Thu Jan 14 19:48:31 2016
From: andyfaff at gmail.com (Andrew Nelson)
Date: Fri, 15 Jan 2016 11:48:31 +1100
Subject: [Numpy-discussion] inconsistency in np.isclose
Message-ID:

Hi all,
I think there is an inconsistency with np.isclose when I compare two numbers:

>>> np.isclose(0, np.inf)
array([False], dtype=bool)

>>> np.isclose(0, 1)
False

The first comparison returns a bool array, the second returns a bool. Shouldn't they both return the same result?

--
_____________________________________
Dr. Andrew Nelson
_____________________________________

From njs at pobox.com  Thu Jan 14 20:09:26 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 14 Jan 2016 17:09:26 -0800
Subject: [Numpy-discussion] inconsistency in np.isclose
In-Reply-To:
References:
Message-ID:

Yeah, that does look like a bug.

On Thu, Jan 14, 2016 at 4:48 PM, Andrew Nelson wrote:
> Hi all,
> I think there is an inconsistency with np.isclose when I compare two numbers:
>
>>>> np.isclose(0, np.inf)
> array([False], dtype=bool)
>
>>>> np.isclose(0, 1)
> False
>
> The first comparison returns a bool array, the second returns a bool. Shouldn't they both return the same result?

--
Nathaniel J. Smith -- http://vorpus.org
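Until the bug is fixed, one defensive workaround is to normalize the return value yourself (a sketch only; the array result in the comment is the one shown in the report above):

    import numpy as np

    def isclose_scalar(a, b, **kwds):
        # np.isclose can return a one-element array instead of a bool for
        # non-finite inputs (e.g. inf); collapsing to a 0-d array and calling
        # .item() yields a plain Python bool either way.
        return np.asarray(np.isclose(a, b, **kwds)).reshape(()).item()

    isclose_scalar(0, np.inf)   # False
    isclose_scalar(0, 1)        # False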
From jiajiali at gatech.edu  Fri Jan 15 11:36:19 2016
From: jiajiali at gatech.edu (Li Jiajia)
Date: Fri, 15 Jan 2016 11:36:19 -0500
Subject: [Numpy-discussion] Software Capabilities of NumPy in Our Tensor Survey Paper
Message-ID: <9BAFA354-4603-4193-8B20-510017335150@gatech.edu>

Hi all,
I'm a PhD student in Georgia Tech. Recently, we're working on a survey paper about tensor algorithms: basic tensor operations, tensor decomposition and some tensor applications. We are making a table to compare the capabilities of different software and planning to include NumPy. We'd like to make sure these parameters are correct to make a fair comparison. Although we have looked into the related documents, please help us to confirm these. Besides, if you think there are more features of your software and a more preferred citation, please let us know. We'll consider updating them. We want to show NumPy supports tensors, and we also include "scikit-tensor" in our survey, which is based on NumPy.
Please let me know any confusion or any advice!
Thanks a lot! :-)

Notice:
1. "YES/NO" to show whether or not the software supports the operation or has the feature.
2. "?" means we're not sure of the feature, and please help us out.
3. "Tensor order" means the maximum number of tensor dimensions that users can work with in this software.
4. For computational cores,
1) "Element-wise Tensor Operation (A * B)" includes element-wise add/minus/multiply/divide, also Kronecker, outer and Khatri-Rao products. If the software contains one of them, we mark "YES".
2) "TTM" means tensor-times-matrix multiplication. We distinguish TTM from tensor contraction. If the software includes tensor contraction, it can also support TTM.
3) For "MTTKRP", we know most software can realize it through the above two operations. We mark it "YES" only if there is a specific optimization for the whole operation.

Software Name: NumPy

Computational Cores
  Element-wise Tensor Operation (A * B): YES
  Tensor Contraction (A Xmn B): NO
  TTM (A Xn B): NO
  Matricized Tensor Times Khatri-Rao Product (MTTKRP): NO

Tensor Decomposition
  CP: NO
  Tucker: NO
  Hierarchical Tucker (HT): NO
  Tensor Train (TT): NO

Tensor Features
  Tensor Order: Arbitrary
  Dense Tensors: YES
  Sparse Tensors: NO ?
  Parallelized: NO ?

Software Information
  Application Domain: General
  Programming Environment: Python
  Latest Version: 1.10.4
  Release Date: 2016

Citation:
1. AN DER WALT, S., COLBERT, S., AND VAROQUAUX, G. The NumPy array: A structure for efficient numerical computation. Computing in Science Engineering 13, 2 (March 2011), 22-30.
2. OLIPHANT, T. E. Python for scientific computing. Computing in Science Engineering 9, 3 (May 2007), 10-20.
3. NumPy (Version 1.10.4). Available from http://www.numpy.org, Jan 2016.

Best regards!
Jiajia Li

------------------------------------------
E-mail: jiajiali at gatech.edu
Tel: +1 (404)9404603
Computational Science & Engineering
Georgia Institute of Technology

From bryanv at continuum.io  Fri Jan 15 11:51:56 2016
From: bryanv at continuum.io (Bryan Van de Ven)
Date: Fri, 15 Jan 2016 10:51:56 -0600
Subject: [Numpy-discussion] Software Capabilities of NumPy in Our Tensor Survey Paper
In-Reply-To: <9BAFA354-4603-4193-8B20-510017335150@gatech.edu>
References: <9BAFA354-4603-4193-8B20-510017335150@gatech.edu>
Message-ID: <6CFEDFCC-072E-4989-A619-367BD95407DC@continuum.io>

Your first citation is incorrect. It is "VAN DER WALT" (missing V in yours)

Bryan

> On Jan 15, 2016, at 10:36 AM, Li Jiajia wrote:
>
> Hi all,
> I'm a PhD student in Georgia Tech. Recently, we're working on a survey paper about tensor algorithms: basic tensor operations, tensor decomposition and some tensor applications. We are making a table to compare the capabilities of different software and planning to include NumPy.
> [...]
>
> Citation:
> 1. AN DER WALT, S., COLBERT, S., AND VAROQUAUX, G. The NumPy array: A structure for efficient numerical computation. Computing in Science Engineering 13, 2 (March 2011), 22-30.
> 2. OLIPHANT, T. E. Python for scientific computing. Computing in Science Engineering 9, 3 (May 2007), 10-20.
> 3. NumPy (Version 1.10.4). Available from http://www.numpy.org, Jan 2016.
>
> Best regards!
> Jiajia Li

From njs at pobox.com  Fri Jan 15 12:30:13 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 15 Jan 2016 09:30:13 -0800
Subject: [Numpy-discussion] Software Capabilities of NumPy in Our Tensor Survey Paper
In-Reply-To: <9BAFA354-4603-4193-8B20-510017335150@gatech.edu>
References: <9BAFA354-4603-4193-8B20-510017335150@gatech.edu>
Message-ID:

On Jan 15, 2016 8:36 AM, "Li Jiajia" wrote:
>
> Hi all,
> I'm a PhD student in Georgia Tech. Recently, we're working on a survey paper about tensor algorithms: basic tensor operations, tensor decomposition and some tensor applications. We are making a table to compare the capabilities of different software and planning to include NumPy.
> [...]
> 3) For "MTTKRP", we know most software can realize it through the above two operations. We mark it "YES" only if there is a specific optimization for the whole operation.
NumPy has support for working with multidimensional tensors, if you like, but it doesn't really use the tensor language and notation (preferring instead to think in terms of "arrays" as a somewhat more computationally focused and less mathematically focused conceptual framework).

Which is to say that I actually have no idea what all those jargon terms you're asking about mean :-) I am suspicious that NumPy supports more of those operations than you have marked, just under different names/notation, but really can't tell either way for sure without knowing what exactly they are.

(It is definitely correct though that NumPy includes no support for sparse tensors, and NumPy itself is not multi-threaded beyond what we get for free through the BLAS, though there are external libraries that can perform multi-threaded computations on top of data stored in numpy arrays.)

-n

From robert.kern at gmail.com  Fri Jan 15 12:32:35 2016
From: robert.kern at gmail.com (Robert Kern)
Date: Fri, 15 Jan 2016 17:32:35 +0000
Subject: [Numpy-discussion] Software Capabilities of NumPy in Our Tensor Survey Paper
In-Reply-To:
References: <9BAFA354-4603-4193-8B20-510017335150@gatech.edu>
Message-ID:

On Fri, Jan 15, 2016 at 5:30 PM, Nathaniel Smith wrote:
> On Jan 15, 2016 8:36 AM, "Li Jiajia" wrote:
>> [...]
>
> NumPy has support for working with multidimensional tensors, if you like, but it doesn't really use the tensor language and notation (preferring instead to think in terms of "arrays" as a somewhat more computationally focused and less mathematically focused conceptual framework).
>
> Which is to say that I actually have no idea what all those jargon terms you're asking about mean :-) I am suspicious that NumPy supports more of those operations than you have marked, just under different names/notation, but really can't tell either way for sure without knowing what exactly they are.

In particular check if your operations can be expressed with einsum():

http://docs.scipy.org/doc/numpy-1.10.1/reference/generated/numpy.einsum.html

--
Robert Kern

From yw5aj at virginia.edu  Fri Jan 15 12:34:34 2016
From: yw5aj at virginia.edu (Yuxiang Wang)
Date: Fri, 15 Jan 2016 12:34:34 -0500
Subject: [Numpy-discussion] Software Capabilities of NumPy in Our Tensor Survey Paper
In-Reply-To:
References: <9BAFA354-4603-4193-8B20-510017335150@gatech.edu>
Message-ID:

I echo with Robert that the contraction can be done with np.einsum(). Also, check out np.tensordot() as well - it can also be used to perform contraction.

Shawn

On Fri, Jan 15, 2016 at 12:32 PM, Robert Kern wrote:
> On Fri, Jan 15, 2016 at 5:30 PM, Nathaniel Smith wrote:
>> [...]
>
> In particular check if your operations can be expressed with einsum():
>
> http://docs.scipy.org/doc/numpy-1.10.1/reference/generated/numpy.einsum.html

--
Yuxiang "Shawn" Wang
Gerling Haptics Lab
University of Virginia
yw5aj at virginia.edu
+1 (434) 284-0836
https://sites.google.com/a/virginia.edu/yw5aj/

From shoyer at gmail.com  Fri Jan 15 12:35:34 2016
From: shoyer at gmail.com (Stephan Hoyer)
Date: Fri, 15 Jan 2016 09:35:34 -0800 (PST)
Subject: [Numpy-discussion] Software Capabilities of NumPy in Our Tensor Survey Paper
In-Reply-To:
References:
Message-ID: <1452879334215.b6596335@Nodemailer>

Robert beat me to it on einsum, but also check tensordot for general tensor contraction.

On Fri, Jan 15, 2016 at 9:30 AM, Nathaniel Smith wrote:
> On Jan 15, 2016 8:36 AM, "Li Jiajia" wrote:
>> [...]
>
> NumPy has support for working with multidimensional tensors, if you like, but it doesn't really use the tensor language and notation [...]
> (It is definitely correct though that NumPy includes no support for sparse tensors, and NumPy itself is not multi-threaded beyond what we get for free through the BLAS, though there are external libraries that can perform multi-threaded computations on top of data stored in numpy arrays.)
>
> -n
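To make Robert's and Shawn's suggestions concrete, a sketch of the surveyed operations in NumPy terms (the shapes are arbitrary examples): a TTM is a single einsum or tensordot call, and even MTTKRP is one einsum expression, though not a specially optimized one:

    import numpy as np

    A = np.random.rand(3, 4, 5)            # an order-3 "tensor"
    M = np.random.rand(6, 4)               # a matrix

    # TTM (A x_n M along mode 1): contract A's axis 1 with M's axis 1.
    B = np.einsum('ijk,lj->ilk', A, M)             # shape (3, 6, 5)
    C = np.tensordot(A, M, axes=(1, 1))            # shape (3, 5, 6)
    assert np.allclose(B, C.transpose(0, 2, 1))    # same result, axes reordered

    # General contraction over several modes at once (A x_{mn} T).
    T = np.random.rand(4, 5, 7)
    D = np.tensordot(A, T, axes=([1, 2], [0, 1]))  # shape (3, 7)

    # Mode-1 MTTKRP with factor matrices F2, F3 as one einsum call.
    F2, F3 = np.random.rand(4, 2), np.random.rand(5, 2)
    M1 = np.einsum('ijk,jr,kr->ir', A, F2, F3)     # shape (3, 2)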
From chris.barker at noaa.gov  Fri Jan 15 13:22:25 2016
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 15 Jan 2016 10:22:25 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Thu, Jan 14, 2016 at 10:58 AM, Matthew Brett wrote:

>> but neither the culture nor the tooling support that approach now, so I'm not very confident you could gather adoption.
>
> I don't think there's a very large amount of cultural work - but some to be sure.
>
> We already have the following on OSX:
>
> pip install numpy scipy matplotlib scikit-learn scikit-image pandas h5py
>
> where all the wheels come from pypi. So, I don't think this is really outside our range, even if the problem is a little more difficult for Linux.

I'm actually less concerned about the Linux issue, I think that can be solved reasonably with "manylinux" -- which would put us in a very similar position to OS-X, and pretty similar to Windows -- i.e. a common platform with the basic libs (libc, etc), but not a whole lot else.

I'm concerned about all the other libs various packages depend on. It's not too bad if you want the core scipy stack -- a decent BLAS being the real challenge there, but there is enough coordination between numpy and scipy that at least efforts to solve that will be shared.

But we're still stuck with delivering dependencies on libs along with each package -- usually statically linked.

> I thought that Anaconda.org allows pypi channels as well?

I think you can host pip-compatible wheels, etc on anaconda.org -- though that may be deprecated... but anyway, I thought the goal here was a simple "pip install", which will point only to PyPi -- I don't think there is a way, a la conda, to add "channels" that will then get automatically searched by pip. But I may be wrong there.

So here is the problem I want to solve:

> pip install numpy scipy matplotlib scikit-learn scikit-image pandas h5py

last I checked, each of those is self-contained, except for python-level dependencies, most notably on numpy. So it doesn't help me solve my problem. For instance, I have my own C/C++ code that I'm wrapping that requires netcdf (https://github.com/NOAA-ORR-ERD/PyGnome), and another that requires image libs like libpng, libjpeg, etc. (https://github.com/NOAA-ORR-ERD/py_gd)

netcdf is not too ugly itself, but it depends on hdf5, libcurl, zlib (others?). So I have all these libs I need. As it happens, matplotlib probably has the image libs I need, and h5py has hdf5 (and libcurl? and zlib?). But even then, as far as I can tell, I need to build and provide these libs myself for my code. Which is a pain in the @%$ and then I'm shipping (and running) multiple copies of the same libs all over the place -- will there be compatibility issues? apparently not, but it still wastes the advantage of shared libs, and makes things a pain for all of us.

With conda, on the other hand, I get netcdf libs, hdf5 libs, libpng, libjpeg, libtiff, and I can build my stuff against those and depend on them -- saves me a lot of pain, and my users get a better system.

Oh, and add on the GIS stuff: GDAL, etc. (seriously a pain to build), and I get a lot of value. And, many of these libs (GDAL, netcdf) come with nifty command line utilities -- I get those too.

So, pip+wheel _may_ be able to support all that, but AFAICT, no one is doing it. And it's really not going to support shipping cmake, and perl, and who knows what else I might need in my toolchain that's not python or "just" a library.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From travis at continuum.io  Fri Jan 15 13:50:44 2016
From: travis at continuum.io (Travis Oliphant)
Date: Fri, 15 Jan 2016 12:50:44 -0600
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Thu, Jan 14, 2016 at 12:58 PM, Matthew Brett wrote:
> [...]
>
> I thought that Anaconda.org allows pypi channels as well?

It does: http://pypi.anaconda.org/

-Travis

--
*Travis Oliphant*
*Co-founder and CEO*

@teoliphant
512-222-5440
http://www.continuum.io
From njs at pobox.com  Fri Jan 15 14:21:26 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 15 Jan 2016 11:21:26 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Jan 15, 2016 10:23 AM, "Chris Barker" wrote:
> [...]
> So here is the problem I want to solve:
>
> > pip install numpy scipy matplotlib scikit-learn scikit-image pandas h5py
>
> last I checked, each of those is self-contained, except for python-level dependencies, most notably on numpy. So it doesn't help me solve my problem. [...]
>
> With conda, on the other hand, I get netcdf libs, hdf5 libs, libpng, libjpeg, libtiff, and I can build my stuff against those and depend on them -- saves me a lot of pain, and my users get a better system.

Sure. Someone's already packaged those for conda, and no one has packaged them for pypi, so it makes sense that conda is more convenient for you. If someone does the work of packaging them for pypi, then that difference goes away. I'm not planning to do that work myself :-). My focus in these discussions around pip/pypi is selfishly focused on the needs of numpy. pip/pypi is clearly the weakest link in the space of packaging and distribution systems that our users care about, so improvements there raise the minimum baseline we can assume. But if/when we sort out the hard problems blocking numpy wheels (Linux issues, windows issues, etc.) then I suspect that we'll start seeing people packaging up those dependencies that you're worrying about and putting them on pypi, just because there won't be any big road blocks anymore to doing so.

-n

From travis at continuum.io  Fri Jan 15 14:56:30 2016
From: travis at continuum.io (Travis Oliphant)
Date: Fri, 15 Jan 2016 13:56:30 -0600
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Fri, Jan 15, 2016 at 1:21 PM, Nathaniel Smith wrote:
> On Jan 15, 2016 10:23 AM, "Chris Barker" wrote:
> > [...]
> Sure. Someone's already packaged those for conda, and no one has packaged them for pypi, so it makes sense that conda is more convenient for you. If someone does the work of packaging them for pypi, then that difference goes away. [...] But if/when we sort out the hard problems blocking numpy wheels (Linux issues, windows issues, etc.) then I suspect that we'll start seeing people packaging up those dependencies that you're worrying about and putting them on pypi, just because there won't be any big road blocks anymore to doing so.

I still submit that this is not the best use of time. Conda *already* solves the problem. My sadness is that people keep working to create an ultimately inferior solution rather than just help make a better solution more accessible. People mistakenly believe that wheels and conda packages are equivalent. They are not. If they were, we would not have created conda. We could not do what was necessary with wheels, and contorting wheels to become conda packages was and still is a lot more work. Now, obviously, it's just code and you can certainly spend effort and time to migrate wheels so that they function equivalently to conda packages --- but what is the point, really?

Why don't we work together to make the open-source conda project and open-source conda packages more universally accessible?

The other very real downside is that these efforts to promote numpy as wheels further encourage people to not use the better solution that already exists in conda. I have to deal with people who *think* pip will solve all their problems all the time. It causes a lot of difficulty when they end up with work-around after work-around that is actually all solved already with conda. It's a weird situation to be in. I'm really baffled by the resistance of this community to just help make conda *the* solution for the scientific python community.

I think it would be better to spend time:

1) helping smooth out the pip/conda divide.
There are many ways to do this that have my full support:

* making sure pip install conda works well
* creating "shadow packages" in conda for things that pip has installed
* making it possible for pip to install conda packages directly (and ignore the extra features that conda makes possible that pip does not support). A pull-request to pip that did that would be far more useful than trying to cram things into the wheel concept.

2) creating community conda packages to alleviate whatever concerns might exist about the "control" of the packages that people can install with conda.

* Continuum has volunteered resources to NumFOCUS so that it can be the governing body of what goes into a "pycommunity" channel for conda that will be as easy to get as a "conda install pycommunity"

I have not yet heard any really valid reason why this community should not just adopt conda packages and encourage others to do the same. The only thing I have heard are "chicken-and-egg" stories that come down to "we want people to be able to use pip." So, good, then let's make it so that pip can install conda packages and that conda packages with certain restrictions can be hosted on pypi or anywhere else that you have an "index".

At least if there were valid reasons they could be addressed. But, this head-in-the-sand attitude towards a viable technology that is freely available is really puzzling to me. There are millions of downloads of Anaconda and many millions of downloads of conda packages each year. That is just with one company doing it. There could be many millions more with other companies and organizations hosting conda packages and indexes. The conda user-base is already very large.

A great benefit to the Python ecosystem would be to allow pip users and conda users to share each other's work --- rather than to spend time reproducing work that is already done and freely available.

-Travis

--
*Travis Oliphant*
*Co-founder and CEO*

@teoliphant
512-222-5440
http://www.continuum.io

From chris.barker at noaa.gov  Fri Jan 15 15:09:19 2016
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 15 Jan 2016 12:09:19 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Fri, Jan 15, 2016 at 11:21 AM, Nathaniel Smith wrote:
> Sure. Someone's already packaged those for conda, and no one has packaged them for pypi, so it makes sense that conda is more convenient for you. If someone does the work of packaging them for pypi, then that difference goes away.

This is what I meant by "cultural" issues :-)

but pypi has been an option for Windows and OS-X for ages and those platforms are the bigger problem anyway -- and no one has done it for those platforms -- Linux is not the issue here.

I really did try to get that effort started a while back -- I got zero support, nothing, nada. Which doesn't mean I couldn't have gone ahead and done more myself, but I only have so much time, and I wanted to work on my actual problems, and if it hadn't gained any support, it would have been a big waste. Maybe the world is ready for it now...
But besides the cultural/community/critical mass issues, pip+wheel isn't very well set up to support these use cases anyway -- so we have a 50% solution now, and if we did this nifty binary-wheels-of-libs thing we'd have a 90% solution -- and still struggle with the other 10%, until we re-invented conda.

Anyway -- not trying to be a drag here -- just telling my story.

As for the manylinux idea -- that's a great one -- hope we can get that working.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From njs at pobox.com  Fri Jan 15 15:20:23 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 15 Jan 2016 12:20:23 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Fri, Jan 15, 2016 at 12:09 PM, Chris Barker wrote:
> On Fri, Jan 15, 2016 at 11:21 AM, Nathaniel Smith wrote:
>> Sure. Someone's already packaged those for conda, and no one has packaged them for pypi, so it makes sense that conda is more convenient for you. If someone does the work of packaging them for pypi, then that difference goes away.
>
> This is what I meant by "cultural" issues :-)
>
> but pypi has been an option for Windows and OS-X for ages and those platforms are the bigger problem anyway -- and no one has done it for those platforms

I think what's going on here is that Windows *hasn't* been an option for numpy/scipy due to the toolchain issues, and on OS-X we've just been using the platform BLAS (Accelerate), so the core packages haven't had any motivation to sort out the whole library dependency issue, and no-one else is motivated enough to do it. My prediction is that after the core packages sort it out in order to solve their own problems then we might see others picking it up.

-n

--
Nathaniel J. Smith -- http://vorpus.org

From ben.v.root at gmail.com  Fri Jan 15 16:08:06 2016
From: ben.v.root at gmail.com (Benjamin Root)
Date: Fri, 15 Jan 2016 16:08:06 -0500
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

Travis - I will preface the following by pointing out how valuable miniconda and anaconda have been for our workplace because we were running into issues with ensuring that everyone in our mixed platform office had access to all the same tools, particularly GDAL, NetCDF4 and such. For the longest time, we were all stuck on an ancient "Corporate Python" that our IT staff managed to put together, but never had the time to update. So, I do absolutely love conda for the problems that it solved for us. That being said...

I take exception to your assertion that anaconda is *the* solution to the packaging problem. I still have a number of issues, particularly with the interactions of GDAL, shapely, and Basemap (they all seek out libgeos_c differently), and I have to use my own build of GDAL to enable many of the features that we use (the vanilla GDAL put out by Continuum just has the default options, and is quite limited). If I don't set up my environment *just right*, one of those packages will fail to import in some way due to being unable to find their particular version of libgeos_c.
I haven't figured out exactly why this happens, but it is very easy to break such an environment this way after an update. But that problem is a solvable problem within the framework of conda, so I am not too concerned about that.

The bigger problem is Apache. In particular, mod_wsgi. About a year ago, one of our developers was happily developing a webtool that utilized numpy, NetCDF4 and libxml via conda environments. All of the testing was done in flask and everything was peachy. We figured that final deployment out to the Apache server would be a cinch, right? Wrong. Because our mod_wsgi was built against the system's python because it came in through RPMs, it was completely incompatible with the compiled numpy because of the differences in the python compile options. In a clutch, we had our IT staff manually build mod_wsgi against anaconda's python, but they weren't too happy about that, due to mod_wsgi no longer getting updated via yum.

If anaconda was the end-all, be-all solution, then it should just be a simple matter to do "conda install mod_wsgi". But the design of conda is that it is intended to be a user-space package-manager. mod_wsgi is installed via the root/apache user, which is siloed off from the user. I would have to (in theory) go and install conda for the apache user and likely have to install a conda "apache" package and mod_wsgi package. I seriously doubt this would be an acceptable solution for many IT administrators who would rather depend upon the upstream distributions, which are going to be very quick about getting updates out the door and are updated automatically through yum or apt.

So, again, I love conda for what it can do when it works well. I only take exception to the notion that it can address *all* problems, because there are some problems that it just simply isn't properly situated for.

Cheers!
Ben Root

On Fri, Jan 15, 2016 at 3:20 PM, Nathaniel Smith wrote:
> [...]

From rmcgibbo at gmail.com  Fri Jan 15 16:22:49 2016
From: rmcgibbo at gmail.com (Robert McGibbon)
Date: Fri, 15 Jan 2016 13:22:49 -0800
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID:

On Fri, Jan 15, 2016 at 11:56 AM, Travis Oliphant wrote:
> [...]
> Why don't we work together to make the open-source conda project and open-source conda packages more universally accessible?

The factors that motivate my interest in making wheels for Linux (i.e. the proposed manylinux tag) work on PyPI are

- All (new) Python installations come with pip. As a package author writing documentation, I count on users having pip installed, but I can't count on conda.
- I would like to see Linux have feature parity with OS X and Windows with respect to pip and PyPI.
- I want the PyPA tools like pip to be as good as possible.
- I'm confident that the manylinux proposal will work, and it's very straightforward.

-Robert

From waterbug at pangalactic.us  Fri Jan 15 16:56:19 2016
From: waterbug at pangalactic.us (Steve Waterbury)
Date: Fri, 15 Jan 2016 16:56:19 -0500
Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To:
References: <-5868265598930469412@unknownmsgid>
Message-ID: <56996B03.3080403@pangalactic.us>

On 01/15/2016 04:08 PM, Benjamin Root wrote:
> So, again, I love conda for what it can do when it works well. I only
> take exception to the notion that it can address *all* problems, because
> there are some problems that it just simply isn't properly situated for.

Actually, I would say you didn't mention any ... ;) The issue is
not that it "isn't properly situated for" (whatever that means)
the problems you describe, but that -- in the case you mention,
for example -- no one has conda-packaged those solutions yet.

FWIW, our sysadmins and I use conda for django / apache / mod_wsgi
sites and we are very happy with it. IMO, compiling mod_wsgi in
the conda environment and keeping it up is trivial compared to the
awkwardnesses introduced by using pip/virtualenv in those cases.

We also use conda for sites with nginx and the conda-packaged
uwsgi, which works great and even permits the use of a separate
env (with, if necessary, different versions of django, etc.)
for each application. No need to set up an entire VM for each app!
*My* sysadmins love conda -- as soon as they saw how much better
than pip/virtualenv it was, they have never looked back.

IMO, conda is by *far* the best packaging solution the python
community has ever seen (and I have been using python for more
than 20 years).
I attribute some of the conda-ignoring to "NIH" and, to some extent, possibly defensiveness (I would be defensive too if I had been working on pip as long as they had when conda came along ;). Cheers, Steve Waterbury NASA/GSFC From matthew.brett at gmail.com Fri Jan 15 17:07:22 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 15 Jan 2016 14:07:22 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: <56996B03.3080403@pangalactic.us> References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> Message-ID: Hi, On Fri, Jan 15, 2016 at 1:56 PM, Steve Waterbury wrote: > On 01/15/2016 04:08 PM, Benjamin Root wrote: >> >> So, again, I love conda for what it can do when it works well. I only >> take exception to the notion that it can address *all* problems, because >> there are some problems that it just simply isn't properly situated for. > > > Actually, I would say you didn't mention any ... ;) The issue is > not that it "isn't properly situated for" (whatever that means) > the problems you describe, but that -- in the case you mention, > for example -- no one has conda-packaged those solutions yet. > > FWIW, our sysadmins and I use conda for django / apache / mod_wsgi > sites and we are very happy with it. IMO, compiling mod_wsgi in > the conda environment and keeping it up is trivial compared to the > awkwardnesses introduced by using pip/virtualenv in those cases. > > We also use conda for sites with nginx and the conda-packaged > uwsgi, which works great and even permits the use of a separate > env (with, if necessary, different versions of django, etc.) > for each application. No need to set up an entire VM for each app! > *My* sysadmins love conda -- as soon as they saw how much better > than pip/virtualenv it was, they have never looked back. > > IMO, conda is by *far* the best packaging solution the python > community has ever seen (and I have been using python for more > than 20 years). Yes, I think everyone would agree that, until recently, Python packaging was in a mess. > I too have been stunned by some of the resistance > to conda that one sometimes sees in the python packaging world. You can correct me if you see evidence to the contrary, but I think all of the argument is that it is desirable that pip should work as well. > I've had a systems package maintainer tell me "it solves a > different problem [than pip]" ... hmmm ... I would say it > solves the same problem *and more*, *better*. I know what you mean, but I suppose the person you were talking to may have been thinking that many of us already have Python distributions that we are using, and for those, we want to use pip. > I attribute > some of the conda-ignoring to "NIH" and, to some extent, > possibly defensiveness (I would be defensive too if I had been > working on pip as long as they had when conda came along ;). I must say, I don't personally recognize those reasons. For example, I hadn't worked on pip at all before conda came along. Best, Matthew From waterbug at pangalactic.us Fri Jan 15 17:15:23 2016 From: waterbug at pangalactic.us (Steve Waterbury) Date: Fri, 15 Jan 2016 17:15:23 -0500 Subject: [Numpy-discussion] Should I use pip install numpy in linux? 
In-Reply-To: References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> Message-ID: <56996F7B.50808@pangalactic.us> On 01/15/2016 05:07 PM, Matthew Brett wrote: >> I attribute >> some of the conda-ignoring to "NIH" and, to some extent, >> possibly defensiveness (I would be defensive too if I had been >> working on pip as long as they had when conda came along ;). > > I must say, I don't personally recognize those reasons. For example, > I hadn't worked on pip at all before conda came along. By "working on pip", I was referring to *developers* of pip, not those who *use* pip for packaging things. Are you contributing to the development of pip, or merely using it for creating packages? Steve From matthew.brett at gmail.com Fri Jan 15 17:19:42 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 15 Jan 2016 14:19:42 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: <56996F7B.50808@pangalactic.us> References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> <56996F7B.50808@pangalactic.us> Message-ID: On Fri, Jan 15, 2016 at 2:15 PM, Steve Waterbury wrote: > On 01/15/2016 05:07 PM, Matthew Brett wrote: >>> >>> I attribute >>> some of the conda-ignoring to "NIH" and, to some extent, >>> possibly defensiveness (I would be defensive too if I had been >>> working on pip as long as they had when conda came along ;). >> >> >> I must say, I don't personally recognize those reasons. For example, >> I hadn't worked on pip at all before conda came along. > > > By "working on pip", I was referring to *developers* of pip, > not those who *use* pip for packaging things. Are you > contributing to the development of pip, or merely using it > for creating packages? Sorry - I assumed you were taking about us here on the list planning to build Linux and Windows wheels. It certainly doesn't seem surprising to me that the pip developers would continue to develop pip rather than switch to conda. Has there been any attempt to persuade the pip developers to do this? Best, Matthew From waterbug at pangalactic.us Fri Jan 15 17:33:20 2016 From: waterbug at pangalactic.us (Steve Waterbury) Date: Fri, 15 Jan 2016 17:33:20 -0500 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> <56996F7B.50808@pangalactic.us> Message-ID: <569973B0.7070303@pangalactic.us> On 01/15/2016 05:19 PM, Matthew Brett wrote: > On Fri, Jan 15, 2016 at 2:15 PM, Steve Waterbury > wrote: >> On 01/15/2016 05:07 PM, Matthew Brett wrote: >>>> >>>> I attribute >>>> some of the conda-ignoring to "NIH" and, to some extent, >>>> possibly defensiveness (I would be defensive too if I had been >>>> working on pip as long as they had when conda came along ;). >>> >>> >>> I must say, I don't personally recognize those reasons. For example, >>> I hadn't worked on pip at all before conda came along. >> >> >> By "working on pip", I was referring to *developers* of pip, >> not those who *use* pip for packaging things. Are you >> contributing to the development of pip, or merely using it >> for creating packages? > > Sorry - I assumed you were taking about us here on the list planning > to build Linux and Windows wheels. No, I was definitely *not* talking about those on the list planning to build Linux and Windows wheels when I referred to the folks "working on pip". 
However, that said, I *completely* agree with Travis's remark: "The other very real downside is that these efforts to promote numpy as wheels further encourage people to not use the better solution that already exists in conda." > It certainly doesn't seem surprising to me that the pip developers > would continue to develop pip rather than switch to conda. Has there > been any attempt to persuade the pip developers to do this? Not that I know of, but as I said, I have asked pip developers and core python developers for their opinions on conda, and my impression has always been one of, shall we say, a circling of the wagons. Yes, even the nice guys in the python community (and I've known some of them a *long* time) sometimes do that ... ;) Cheers, Steve From cournape at gmail.com Fri Jan 15 17:37:34 2016 From: cournape at gmail.com (David Cournapeau) Date: Fri, 15 Jan 2016 22:37:34 +0000 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: <56996B03.3080403@pangalactic.us> References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> Message-ID: On Fri, Jan 15, 2016 at 9:56 PM, Steve Waterbury wrote: > On 01/15/2016 04:08 PM, Benjamin Root wrote: > >> So, again, I love conda for what it can do when it works well. I only >> take exception to the notion that it can address *all* problems, because >> there are some problems that it just simply isn't properly situated for. >> > > Actually, I would say you didn't mention any ... ;) The issue is > not that it "isn't properly situated for" (whatever that means) > the problems you describe, but that -- in the case you mention, > for example -- no one has conda-packaged those solutions yet. > > FWIW, our sysadmins and I use conda for django / apache / mod_wsgi > sites and we are very happy with it. IMO, compiling mod_wsgi in > the conda environment and keeping it up is trivial compared to the > awkwardnesses introduced by using pip/virtualenv in those cases. > > We also use conda for sites with nginx and the conda-packaged > uwsgi, which works great and even permits the use of a separate > env (with, if necessary, different versions of django, etc.) > for each application. No need to set up an entire VM for each app! > *My* sysadmins love conda -- as soon as they saw how much better > than pip/virtualenv it was, they have never looked back. > > IMO, conda is by *far* the best packaging solution the python > community has ever seen (and I have been using python for more > than 20 years). I too have been stunned by some of the resistance > to conda that one sometimes sees in the python packaging world. > I've had a systems package maintainer tell me "it solves a > different problem [than pip]" ... hmmm ... I would say it > solves the same problem *and more*, *better*. I attribute > some of the conda-ignoring to "NIH" and, to some extent, > possibly defensiveness (I would be defensive too if I had been > working on pip as long as they had when conda came along ;). > Conda and pip solve some of the same problems, but pip also does quite a bit more than conda (and vice versa, as conda also acts akin to rvm-for-python). Conda works so well because it supports a subset of what pip does: install things from binaries. This is the logical thing to do when you want to distribute binaries, because in the python world, for historical reasons, metadata are dynamic by the very nature of setup.py.
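To illustrate what "dynamic" means here -- a contrived sketch, not taken from any real project -- nothing stops a setup.py from computing its own metadata while it runs:

    # setup.py
    import sys
    from setuptools import setup

    deps = ["numpy"]
    if sys.platform == "win32":
        # the dependency list is only known at build/install time
        deps.append("pywin32")

    setup(name="example", version="1.0", install_requires=deps)

A static metadata file can record the *result* of running this on one machine, but not the rule itself.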
For a long time, pip worked from sources instead of binaries (this is actually the reason why it was started, following easy_install), and thus had to cope w/ those dynamic metadata. It also has to deal w/ building packages and the whole distutils/setuptools interoperability mess. Conda, being solely a packaging solution, can sidestep all this complexity (which is again the logical and smart thing to do if what you care about is deployment). Having pip understand conda packages is a non-trivial endeavour: since conda packages are relocatable, they are not compatible with the usual python interpreters, and setuptools metadata is neither a subset nor a superset of conda metadata. Regarding conda/pip interoperability, there are things that conda could (and IMO should) do, such as writing the expected metadata in site-packages (PEP 376). Currently, conda does not recognize packages installed by pip (because it does not implement PEP 376 and co), so if you do a "pip install ." of a package, it will likely break the existing package if present. David -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Fri Jan 15 17:38:35 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 15 Jan 2016 14:38:35 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: <569973B0.7070303@pangalactic.us> References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> <56996F7B.50808@pangalactic.us> <569973B0.7070303@pangalactic.us> Message-ID: On Fri, Jan 15, 2016 at 2:33 PM, Steve Waterbury wrote: > On 01/15/2016 05:19 PM, Matthew Brett wrote: >> >> On Fri, Jan 15, 2016 at 2:15 PM, Steve Waterbury >> wrote: >>> >>> On 01/15/2016 05:07 PM, Matthew Brett wrote: >>>>> >>>>> >>>>> I attribute >>>>> some of the conda-ignoring to "NIH" and, to some extent, >>>>> possibly defensiveness (I would be defensive too if I had been >>>>> working on pip as long as they had when conda came along ;). >>>> >>>> >>>> >>>> I must say, I don't personally recognize those reasons. For example, >>>> I hadn't worked on pip at all before conda came along. >>> >>> >>> >>> By "working on pip", I was referring to *developers* of pip, >>> not those who *use* pip for packaging things. Are you >>> contributing to the development of pip, or merely using it >>> for creating packages? >> >> >> Sorry - I assumed you were talking about us here on the list planning >> to build Linux and Windows wheels. > > > No, I was definitely *not* talking about those on the list > planning to build Linux and Windows wheels when I referred to > the folks "working on pip". However, that said, I *completely* > agree with Travis's remark: > > "The other very real downside is that these efforts to promote > numpy as wheels further encourage people to not use the > better solution that already exists in conda." I think there's a distinction between 'promote numpy as wheels' and 'make numpy available as a wheel'. I don't think you'll see much evidence of "promotion" here - it's not really the open-source way. I'm not quite sure what you mean about 'circling the wagons', but the general approach of staying on course and seeing how things shake out seems to me entirely sensible. Cheers too, Matthew From chris.barker at noaa.gov Fri Jan 15 22:00:29 2016 From: chris.barker at noaa.gov (Chris Barker) Date: Fri, 15 Jan 2016 19:00:29 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux?
In-Reply-To: References: <-5868265598930469412@unknownmsgid> Message-ID: hmm -- didn't mean to rev this up quite so much -- sorry! But it's a good conversation to have, so... On Fri, Jan 15, 2016 at 1:08 PM, Benjamin Root wrote: > That being said... I take exception to your assertion that anaconda is > *the* solution to the packaging problem. > I think we need to keep some things straight here: "conda" is a binary package management system. "Anaconda" is a python (and other stuff) distribution, built with conda. In practice, everyone (I know of) uses the Anaconda distribution (or at least the default conda channel) when using conda, but in theory, you could maintain an entirely distinct distribution with conda as the tool. Also in practice, conda is so easy because continuum has done the hard work of building a lot of the packages we all need -- there are still a lot being maintained by the community in various ways, but frankly, we do depend on continuum for all the hard work. But working on/with conda does not lock you into that if you think it's not serving your needs. And this discussion (for me anyway) is about tools and the way forward, not existing packages. So onward! > I still have a number of issues, particularly with the interactions of > GDAL, shapely, and Basemap (they all seek out libgeos_c differently), and I > have to use my own build of GDAL to enable many of the features that we use > (the vanilla GDAL put out by Continuum just has the default options, and is > quite limited). > Yeah, GDAL/OGR is a F%$#ing nightmare -- and I do wish that Anaconda had a better build, but frankly, there is no system that's going to make that any easier -- do any of the Linux distros ship a really good, compatible, up-to-date set of these libs -- and OS-X and Windows? yow! (Though Chris Gohlke is a wonder!) > If I don't set up my environment *just right* one of those packages will > fail to import in some way due to being unable to find their particular > version of libgeos_c. I haven't figured out exactly why this happens, but > it is very easy to break such an environment this way after an update. > Maybe conda could be improved to make this easier, I don't know (though do check out the IOOS channel on anaconda.org -- Filipe has done some nice work on this) > In a pinch, we had our IT staff manually build mod_wsgi against > anaconda's python, but they weren't too happy about that, due to mod_wsgi > no longer getting updated via yum. > I'm not sure how pip helps you out here, either. Sure, for easy-to-compile-from-source packages you can just pip install, and you'll get a package compatible with your (system) python. But binary wheels will give you the same headaches -- so you're back to expecting your linux distro to provide everything, which they don't :-( I understand that the IT folks want everything to come from their OS vendor -- they like that -- but it simply isn't practical for scipy-based web services. And once you've got most of your stack coming from another source, is it really a big deal for python to come from somewhere else also (and apache, and ???) -- conda at least is a technology that _can_ provide an integrated system that includes all this -- I don't think you're going to be pip-installing apache anytime soon! > But the design of conda is that it is intended to be a user-space > package-manager. mod_wsgi is installed via root/apache user, which is > siloed off from the user.
I would have to (in theory) go and install conda > for the apache user and likely have to install a conda "apache" package and > mod_wsgi package. > This seems quite reasonable to me, frankly. Plus, you can install conda centrally as well, and then the apache user can get access to it (but not modify it) > I seriously doubt this would be an acceptable solution for many IT > administrators who would rather depend upon the upstream distributions who > are going to be very quick about getting updates out the door and are > updated automatically through yum or apt. > This is true -- but nothing to do with the technology -- it's a social problem. And the auto updates? Ha! The real problem with admins that want to use the system package managers is that they still insist you run python 2.6, for god's sake :-) > So, again, I love conda for what it can do when it works well. I only take > exception to the notion that it can address *all* problems, because there > are some problems that it just simply isn't properly situated for. > still true, of course, but it can address many more than pip. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Fri Jan 15 22:22:51 2016 From: chris.barker at noaa.gov (Chris Barker) Date: Fri, 15 Jan 2016 19:22:51 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> <56996F7B.50808@pangalactic.us> <569973B0.7070303@pangalactic.us> Message-ID: On Fri, Jan 15, 2016 at 2:38 PM, Matthew Brett wrote: > > I think there's a distinction between 'promote numpy as wheels' and > 'make numpy available as a wheel'. I don't think you'll see much > evidence of "promotion" here - it's not really the open-source way. > Depends on how you define "promotion" I suppose. But I think that supporting something is indeed promoting it. I've been a fan of getting the scipy stack available from pip for a long time -- I think it could be really useful for lots of folks not doing heavy scipy-style work, but folks are very wary of introducing a new, hard-to-install dependency. But now I'm not so sure -- the trick is what you tell newbies. When I teach Python (not scipy), I start folks off with the python.org python. Then they can pip install ipython, which is the only part of the "scipy stack" I want them to have if they are not doing real numerical work. But what if they are? Now it's pretty clear to me that anyone interested in getting into data analysis, etc. with python should just start off with Anaconda (or Canopy) -- or maybe Gohlke's binaries for Windows users. But what if we have wheels on all platforms for the scipy stack (and a few others)? Now they can learn python, pip install numpy, scipy, etc., learn some more, get excited -- delve into some domain-specific work, and WHAM -- hit the wall of installation nightmares. NOW, they need to switch to Anaconda or Canopy... I think it's too bad to get that far into it and then have to switch and learn something new. -- Again, is there really a point to a 90% solution?
So this is the point -- heading down this road takes us to a place where people can get much farther before hitting the wall -- but hit the wall they will, so we'll still need other solutions, so maybe it would be better for us to put our energies into those other solutions. By the way, I'm seeing more and more folks that are not scipy-focused starting off with conda, even for web apps. > I'm not quite sure what you mean about 'circling the wagons', but the > general approach of staying on course and seeing how things shake out > seems to me entirely sensible. > Well, what I've observed in the PyPA community may not be circling the wagons, but it has been a "we're not ever going to solve the scipy problems" attitude. I'm pretty convinced that pip/wheel will never be even a 90% solution without some modifications -- so why go there? Indeed, it's not uncommon for folks on the distutils list to say "go use conda" in response to issues that pip does not address well. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From bryanv at continuum.io Fri Jan 15 22:48:17 2016 From: bryanv at continuum.io (Bryan Van de Ven) Date: Fri, 15 Jan 2016 21:48:17 -0600 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: <-5868265598930469412@unknownmsgid> <56996B03.3080403@pangalactic.us> <56996F7B.50808@pangalactic.us> <569973B0.7070303@pangalactic.us> Message-ID: > On Jan 15, 2016, at 21:22, Chris Barker wrote: > > Indeed, it's not uncommon for folks on the distutils list to say "go use conda" in response to issues that pip does not address well. I was in the room at the very first proto-PyData conference when Guido told the assembled crowd "if pip doesn't satisfy the needs of the SciPy community, they should build something that does, on their own." Well, here we are. Bryan From njs at pobox.com Sat Jan 16 02:11:34 2016 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 15 Jan 2016 23:11:34 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: <-5868265598930469412@unknownmsgid> Message-ID: On Fri, Jan 15, 2016 at 11:56 AM, Travis Oliphant wrote: > [...] > I still submit that this is not the best use of time. Conda *already* > solves the problem. My sadness is that people keep working to create an > ultimately inferior solution rather than just help make a better solution > more accessible. People mistakenly believe that wheels and conda > packages are equivalent. They are not. If they were we would not have > created conda. We could not do what was necessary with wheels and > contorting wheels to become conda packages was and still is a lot more work. > Now, obviously, it's just code and you can certainly spend effort and time > to migrate wheels so that they are functionally equivalent to conda packages > --- but what is the point, really? Sure, conda definitely solves problems that wheels don't. And I recommend Anaconda to people all the time :-) But let me comment specifically on the topic of numpy-and-pip. (I don't want to be all "this is off-topic!"
because there isn't really a better list of general scientific-python-ecosystem discussion, but the part of the conda-and-pip discussion that actually impacts on numpy development, or where the numpy maintainers can affect anything, is very small, and this is numpy-discussion, so...) Last month, numpy had ~740,000 downloads from PyPI, and there are probably hundreds of third-party projects that distribute via PyPI and depend on numpy. So concretely our options as a project are: 1) drop support for pip/PyPI and abandon those users 2) continue to support those users fairly poorly, and at substantial ongoing cost 3) spend some resources now to make pip/pypi work better, so we can support them better and at lower ongoing cost Option 1 would require overwhelming consensus of the community, which for better or worse is presumably not going to happen while substantial portions of that community are still using pip/PyPI. If folks interested in pushing things forward can convince them to switch to conda instead then the calculation changes, but that's just not the kind of thing that numpy-the-project can do or not do. So between the possible options, I've been spending some time trying to drag pip/PyPI into working better for us because I like (3) better than (2). It's not a referendum on anything else. I assume that others have similar motives, though I won't try speaking for them. I think beyond that there are also (currently) some unique benefits to supporting pip/PyPI, like the social/political benefits of not forking away from the broader Python community, and the fact that pip is currently the only functioning way we have of distributing numpy prereleases to users for testing. (The fine-grained numpy ABI tracking in conda is neat, but for this particular use case it's actually a bit of a problem :-).) But it doesn't much matter either way, because we can't/won't just abandon all those users regardless. -n -- Nathaniel J. Smith -- http://vorpus.org From ralf.gommers at gmail.com Mon Jan 18 16:53:07 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 18 Jan 2016 22:53:07 +0100 Subject: [Numpy-discussion] building with setuptools Message-ID: Hi all, We've been able to hold out for a long time, but this is to let you all know that we have now had to switch the numpy build to rely on setuptools: https://github.com/numpy/numpy/pull/6895. Main reason: plain distutils doesn't support PEP 440 naming. See https://github.com/numpy/numpy/issues/6257 and https://github.com/numpy/numpy/issues/6551 for details. So if you see different build weirdness on master, it's likely due to this change. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From jaime.frio at gmail.com Tue Jan 19 05:10:51 2016 From: jaime.frio at gmail.com (=?UTF-8?Q?Jaime_Fern=C3=A1ndez_del_R=C3=ADo?=) Date: Tue, 19 Jan 2016 11:10:51 +0100 Subject: [Numpy-discussion] Behavior of np.random.uniform Message-ID: Hi all, There is a PR (#7026 ) that documents the current behavior of np.random.uniform when the low and high parameters it takes do not conform to the expected low < high. Basically: - if low < high, random numbers are drawn from [low, high), - if low = high, all random numbers will be equal to low, and - if low > high, random numbers are drawn from (high, low] (notice the change in the open side of the interval.) My only worry is that, once we document this, we can no longer claim that it is a bug. So I would like to hear from others what do they think. 
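For concreteness, the three cases above look like this (a quick sketch -- the random draws are described by the interval they fall in rather than shown as actual output):

    >>> import numpy as np
    >>> np.random.uniform(0, 10, 3)   # low < high: three draws from [0, 10)
    >>> np.random.uniform(5, 5, 3)    # low == high: array([ 5.,  5.,  5.])
    >>> np.random.uniform(10, 0, 3)   # low > high: three draws from (0, 10]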
The other more or less obvious options would be to: - Raise an error, but this would require a deprecation cycle, as people may be relying on the current undocumented behavior. - Check the inputs and draw numbers from [min(low, high), max(low, high)), which is minimally different from current behavior. I will be merging the current documentation changes in the next few days, so it would be good if any concerns were voiced before that. Thanks, Jaime -- (\__/) ( O.o) ( > <) Este es Conejo. Copia a Conejo en tu firma y ayúdale en sus planes de dominación mundial. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.v.root at gmail.com Tue Jan 19 08:36:46 2016 From: ben.v.root at gmail.com (Benjamin Root) Date: Tue, 19 Jan 2016 08:36:46 -0500 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: Are there other functions where this behavior may or may not be happening? If it isn't consistent across all np.random functions, it probably should be, one way or the other. Ben Root On Tue, Jan 19, 2016 at 5:10 AM, Jaime Fernández del Río < jaime.frio at gmail.com> wrote: > Hi all, > > There is a PR (#7026 ) that > documents the current behavior of np.random.uniform when the low and high > parameters it takes do not conform to the expected low < high. Basically: > > - if low < high, random numbers are drawn from [low, high), > - if low = high, all random numbers will be equal to low, and > - if low > high, random numbers are drawn from (high, low] (notice the > change in the open side of the interval.) > > My only worry is that, once we document this, we can no longer claim that > it is a bug. So I would like to hear from others what do they think. The > other more or less obvious options would be to: > > - Raise an error, but this would require a deprecation cycle, as > people may be relying on the current undocumented behavior. > - Check the inputs and draw numbers from [min(low, high), max(low, > high)), which is minimally different from current behavior. > > I will be merging the current documentation changes in the next few days, > so it would be good if any concerns were voiced before that. > > Thanks, > > Jaime > > -- > (\__/) > ( O.o) > ( > <) Este es Conejo. Copia a Conejo en tu firma y ayúdale en sus planes > de dominación mundial. > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gfyoung17 at gmail.com Tue Jan 19 09:21:44 2016 From: gfyoung17 at gmail.com (G Young) Date: Tue, 19 Jan 2016 14:21:44 +0000 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: Of the methods defined in *numpy/mtrand.pyx* (excluding helper functions and *random_integers*, as they are all related to *randint*), *randint* is the only other function with *low* and *high* parameters. However, it enforces *high* > *low*. Greg On Tue, Jan 19, 2016 at 1:36 PM, Benjamin Root wrote: > Are there other functions where this behavior may or may not be happening? > If it isn't consistent across all np.random functions, it probably should > be, one way or the other.
> > Ben Root > > On Tue, Jan 19, 2016 at 5:10 AM, Jaime Fernández del Río < > jaime.frio at gmail.com> wrote: > >> Hi all, >> >> There is a PR (#7026 ) that >> documents the current behavior of np.random.uniform when the low and high >> parameters it takes do not conform to the expected low < high. Basically: >> >> - if low < high, random numbers are drawn from [low, high), >> - if low = high, all random numbers will be equal to low, and >> - if low > high, random numbers are drawn from (high, low] (notice the >> change in the open side of the interval.) >> >> My only worry is that, once we document this, we can no longer claim that >> it is a bug. So I would like to hear from others what do they think. The >> other more or less obvious options would be to: >> >> - Raise an error, but this would require a deprecation cycle, as >> people may be relying on the current undocumented behavior. >> - Check the inputs and draw numbers from [min(low, high), max(low, >> high)), which is minimally different from current behavior. >> >> I will be merging the current documentation changes in the next few days, >> so it would be good if any concerns were voiced before that. >> >> Thanks, >> >> Jaime >> >> -- >> (\__/) >> ( O.o) >> ( > <) Este es Conejo. Copia a Conejo en tu firma y ayúdale en sus planes >> de dominación mundial. >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From m.h.vankerkwijk at gmail.com Tue Jan 19 09:39:09 2016 From: m.h.vankerkwijk at gmail.com (Marten van Kerkwijk) Date: Tue, 19 Jan 2016 09:39:09 -0500 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: For what it is worth, the current behaviour seems the most logical to me, i.e., that the first limit is always the one that is included in the interval, and the second is not. -- Marten -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue Jan 19 11:23:55 2016 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Tue, 19 Jan 2016 08:23:55 -0800 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: <8424306488928554487@unknownmsgid> What does the standard lib do for randrange? I see that randint is closed on both ends, so order doesn't matter, though if it raises for b<a then that's a precedent we could follow. (Sorry, on a phone, can't check) CHB On Jan 19, 2016, at 6:21 AM, G Young wrote: Of the methods defined in *numpy/mtrand.pyx* (excluding helper functions and *random_integers*, as they are all related to *randint*), *randint* is the only other function with *low* and *high* parameters. However, it enforces *high* > *low*. Greg On Tue, Jan 19, 2016 at 1:36 PM, Benjamin Root wrote: > Are there other functions where this behavior may or may not be happening? > If it isn't consistent across all np.random functions, it probably should > be, one way or the other. > > Ben Root > > On Tue, Jan 19, 2016 at 5:10 AM, Jaime Fernández del Río < > jaime.frio at gmail.com> wrote: > >> Hi all, >> >> There is a PR (#7026 ) that >> documents the current behavior of np.random.uniform when the low and high >> parameters it takes do not conform to the expected low < high.
Basically: >> >> - if low < high, random numbers are drawn from [low, high), >> - if low = high, all random numbers will be equal to low, and >> - if low > high, random numbers are drawn from (high, low] (notice >> the change in the open side of the interval.) >> >> My only worry is that, once we document this, we can no longer claim that >> it is a bug. So I would like to hear from others what do they think. The >> other more or less obvious options would be to: >> >> - Raise an error, but this would require a deprecation cycle, as >> people may be relying on the current undocumented behavior. >> - Check the inputs and draw numbers from [min(low, high), max(low, >> high)), which is minimally different from current behavior. >> >> I will be merging the current documentation changes in the next few days, >> so it would be good if any concerns were voiced before that. >> >> Thanks, >> >> Jaime >> >> -- >> (\__/) >> ( O.o) >> ( > <) Este es Conejo. Copia a Conejo en tu firma y ay?dale en sus planes >> de dominaci?n mundial. >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion at scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion -------------- next part -------------- An HTML attachment was scrubbed... URL: From gfyoung17 at gmail.com Tue Jan 19 11:28:59 2016 From: gfyoung17 at gmail.com (G Young) Date: Tue, 19 Jan 2016 16:28:59 +0000 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: <8424306488928554487@unknownmsgid> References: <8424306488928554487@unknownmsgid> Message-ID: In rand range, it raises an exception if low >= high. I should also add that AFAIK enforcing low >= high with floats is a lot trickier than it is for integers. I have been knee-deep in corner cases for some time with *randint* where numbers that are visually different are cast as the same number by *numpy* due to rounding and representation issues. That situation only gets worse with floats. Greg On Tue, Jan 19, 2016 at 4:23 PM, Chris Barker - NOAA Federal < chris.barker at noaa.gov> wrote: > What does the standard lib do for rand range? I see that randint Is closed > on both ends, so order doesn't matter, though if it raises for b that's a precedent we could follow. > > (Sorry, on a phone, can't check) > > CHB > > > > On Jan 19, 2016, at 6:21 AM, G Young wrote: > > Of the methods defined in *numpy/mtrand.pyx* (excluding helper functions > and *random_integers*, as they are all related to *randint*), *randint* is > the only other function with *low* and *high* parameters. However, it > enforces *high* > *low*. > > Greg > > On Tue, Jan 19, 2016 at 1:36 PM, Benjamin Root > wrote: > >> Are there other functions where this behavior may or may not be >> happening? If it isn't consistent across all np.random functions, it >> probably should be, one way or the other. >> >> Ben Root >> >> On Tue, Jan 19, 2016 at 5:10 AM, Jaime Fern?ndez del R?o < >> jaime.frio at gmail.com> wrote: >> >>> Hi all, >>> >>> There is a PR (#7026 ) that >>> documents the current behavior of np.random.uniform when the low and >>> high parameters it takes do not conform to the expected low < high. 
>>> Basically: >>> >>> - if low < high, random numbers are drawn from [low, high), >>> - if low = high, all random numbers will be equal to low, and >>> - if low > high, random numbers are drawn from (high, low] (notice >>> the change in the open side of the interval.) >>> >>> My only worry is that, once we document this, we can no longer claim >>> that it is a bug. So I would like to hear from others what do they think. >>> The other more or less obvious options would be to: >>> >>> - Raise an error, but this would require a deprecation cycle, as >>> people may be relying on the current undocumented behavior. >>> - Check the inputs and draw numbers from [min(low, high), max(low, >>> high)), which is minimally different from current behavior. >>> >>> I will be merging the current documentation changes in the next few >>> days, so it would be good if any concerns were voiced before that. >>> >>> Thanks, >>> >>> Jaime >>> >>> -- >>> (\__/) >>> ( O.o) >>> ( > <) Este es Conejo. Copia a Conejo en tu firma y ay?dale en sus >>> planes de dominaci?n mundial. >>> >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >>> >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue Jan 19 11:57:56 2016 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Tue, 19 Jan 2016 08:57:56 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: <-5868265598930469412@unknownmsgid> Message-ID: <-3897079791178811395@unknownmsgid> > Last month, numpy had ~740,000 downloads from PyPI, Hm, given that Windows and Linux wheels have not been available, then that's mostly source installs anyway. Or failed installs -- no shortage of folks trying to pip install numpy on Windows and then having questions about why it doesn't work. Unfortunately, there is no way to know if pip downloads are successful, or if people pip install Numpy, then find out they need some other non-pip-installable packages, and go find another system. > and there are > probably hundreds of third-party projects that distribute via PyPI and > depend on numpy. I'm not so sure -- see above--as pip install has not been reliable for Numpy for ages, I doubt it. Not that they aren't there, but I doubt it's the primary distribution mechanism. There's been an explosion in the use of conda, and there have been multiple other options for ages: Canopy, python(x,y), Gohlke's builds, etc. So at this point, I think the only people using pip are folks that are set up to build -- mostly Linux. (though Mathew's efforts with the Mac wheels may have created a different story on the Mac). > So concretely our options as a project are: > 1) drop support for pip/PyPI and abandon those users There is no one to abandon -- except the Mac users -- we haven't supported them yet. 
> 2) continue to support those users fairly poorly, and at substantial > ongoing cost I'm curious what the cost is for this poor support -- throw the source up on PyPI, and we're done. The cost comes in when trying to build binaries... > Option 1 would require overwhelming consensus of the community, which > for better or worse is presumably not going to happen while > substantial portions of that community are still using pip/PyPI. Are they? Which community are we talking about? The community I'd like to target is web developers that aren't doing what they think of as "scientific" applications, but could use a little of the SciPy stack. These folks are committed to pip, and are very reluctant to introduce a difficult dependency. Binary wheels would help these folks, but that is not a community that exists yet (or it's small, anyway). All that being said, I'd be happy to see binary wheels for the core SciPy stack on PyPI. It would be nice for people to be able to do a bit with Numpy or pandas, or MPL, without having to jump ship to a whole new way of doing things. But we should be realistic about how far it can go. > If > folks interested in pushing things forward can convince them to switch > to conda instead then the calculation changes, but that's just not the > kind of thing that numpy-the-project can do or not do. We can't convince anybody, but we can decide where to expend our efforts. -CHB From chris.barker at noaa.gov Tue Jan 19 12:03:23 2016 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 Jan 2016 09:03:23 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: <-5868265598930469412@unknownmsgid> Message-ID: hmm -- didn't mean to rev this up quite so much -- sorry! But it's a good conversation to have, so... On Fri, Jan 15, 2016 at 1:08 PM, Benjamin Root wrote: > That being said... I take exception to your assertion that anaconda is > *the* solution to the packaging problem. > I think we need to keep some things straight here: "conda" is a binary package management system. "Anaconda" is a python (and other stuff) distribution, built with conda. In practice, everyone (I know of) uses the Anaconda distribution (or at least the default conda channel) when using conda, but in theory, you could maintain an entirely distinct distribution with conda as the tool. Also in practice, conda is so easy because continuum has done the hard work of building a lot of the packages we all need -- there are still a lot being maintained by the community in various ways, but frankly, we do depend on continuum for all the hard work. But working on/with conda does not lock you into that if you think it's not serving your needs. And this discussion (for me anyway) is about tools and the way forward, not existing packages. So onward! > I still have a number of issues, particularly with the interactions of > GDAL, shapely, and Basemap (they all seek out libgeos_c differently), and I > have to use my own build of GDAL to enable many of the features that we use > (the vanilla GDAL put out by Continuum just has the default options, and is > quite limited). > Yeah, GDAL/OGR is a F%$#ing nightmare -- and I do wish that Anaconda had a better build, but frankly, there is no system that's going to make that any easier -- do any of the Linux distros ship a really good, compatible, up-to-date set of these libs -- and OS-X and Windows? yow! (Though Chris Gohlke is a wonder!)
> If I don't set up my environment *just right* one of those packages will > fail to import in some way due to being unable to find their particular > version of libgeos_c. I haven't figured out exactly why this happens, but > it is very easy to break such an environment this way after an update. > Maybe conda could be improved to make this easier, I don't know (though do check out the IOOS channel on anaconda.org -- Filipe has done some nice work on this) > In a pinch, we had our IT staff manually build mod_wsgi against > anaconda's python, but they weren't too happy about that, due to mod_wsgi > no longer getting updated via yum. > I'm not sure how pip helps you out here, either. Sure, for easy-to-compile-from-source packages you can just pip install, and you'll get a package compatible with your (system) python. But binary wheels will give you the same headaches -- so you're back to expecting your linux distro to provide everything, which they don't :-( I understand that the IT folks want everything to come from their OS vendor -- they like that -- but it simply isn't practical for scipy-based web services. And once you've got most of your stack coming from another source, is it really a big deal for python to come from somewhere else also (and apache, and ???) -- conda at least is a technology that _can_ provide an integrated system that includes all this -- I don't think you're going to be pip-installing apache anytime soon! (or node, or ???) > If anaconda was the end-all, be-all solution, then it should just be a > simple matter to do "conda install mod_wsgi". But the design of conda is > that it is intended to be a user-space package-manager. > Then you can either install it as the web user (or apache user), or install it system-wide. I haven't done this, but I don't think it's all that hard -- you're then going to need to sudo to install/upgrade anything new, but that's expected. > So, again, I love conda for what it can do when it works well. I only take > exception to the notion that it can address *all* problems, because there > are some problems that it just simply isn't properly situated for. > But pip isn't situated for any of these either -- I'm still confused as to the point here. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From alan.isaac at gmail.com Tue Jan 19 12:23:27 2016 From: alan.isaac at gmail.com (Alan G Isaac) Date: Tue, 19 Jan 2016 12:23:27 -0500 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: <569E710F.1050101@gmail.com> In principle, if we are describing an *interval*, that is the right thing to do: https://en.wikipedia.org/wiki/Interval_(mathematics)#Including_or_excluding_endpoints Alan Isaac On 1/19/2016 9:21 AM, G Young wrote: > Of the methods defined in *numpy/mtrand.pyx* (excluding helper functions and *random_integers*, as they are all related to *randint*), *randint* is > the only other function with /low/ and /high/ parameters. However, it enforces /high/ > /low/.
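For what it's worth, a quick illustration of that half-open convention elsewhere in numpy (the first result is deterministic; the second is random, but 5 can never appear):

    >>> import numpy as np
    >>> np.arange(0, 5)              # array([0, 1, 2, 3, 4])
    >>> np.random.randint(0, 5, 10)  # ten draws from [0, 5)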
From charlesr.harris at gmail.com Tue Jan 19 12:27:03 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 19 Jan 2016 10:27:03 -0700 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: <8424306488928554487@unknownmsgid> References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < chris.barker at noaa.gov> wrote: > What does the standard lib do for rand range? I see that randint Is closed > on both ends, so order doesn't matter, though if it raises for b that's a precedent we could follow. > randint is not closed on the high end. The now deprecated random_integers is the function that does that. For floats, it's good to have various interval options. For instance, in generating numbers that will be inverted or have their log taken it is good to avoid zero. However, the names 'low' and 'high' are misleading... Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastian at sipsolutions.net Tue Jan 19 12:35:28 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Tue, 19 Jan 2016 18:35:28 +0100 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: <1453224928.15519.16.camel@sipsolutions.net> On Di, 2016-01-19 at 16:28 +0000, G Young wrote: > In rand range, it raises an exception if low >= high. > > I should also add that AFAIK enforcing low >= high with floats is a > lot trickier than it is for integers. I have been knee-deep in > corner cases for some time with randint where numbers that are > visually different are cast as the same number by numpy due to > rounding and representation issues. That situation only gets worse > with floats. > Well, actually random.uniform docstring says: Get a random number in the range [a, b) or [a, b] depending on rounding. and is true to the word, it does not care about the relative value of a vs. b. So my guess it is identical to your version (though one could check a bit more careful with corner cases) Quick check would suggests it is the same (though I guess if there was a slight rounding issue somewhere, it could be different): >>> np.random.set_state(('MT19937', random.getstate()[1][:-1], random.getstate()[1][-1])) Will enable you to draw the same numbers with random.uniform and np.random.uniform. - Sebastian > Greg > > On Tue, Jan 19, 2016 at 4:23 PM, Chris Barker - NOAA Federal < > chris.barker at noaa.gov> wrote: > > What does the standard lib do for rand range? I see that randint Is > > closed on both ends, so order doesn't matter, though if it raises > > for b > > > (Sorry, on a phone, can't check) > > > > CHB > > > > > > > > On Jan 19, 2016, at 6:21 AM, G Young wrote: > > > > > Of the methods defined in numpy/mtrand.pyx (excluding helper > > > functions and random_integers, as they are all related to > > > randint), randint is the only other function with low and high > > > parameters. However, it enforces high > low. > > > > > > Greg > > > > > > On Tue, Jan 19, 2016 at 1:36 PM, Benjamin Root < > > > ben.v.root at gmail.com> wrote: > > > > Are there other functions where this behavior may or may not be > > > > happening? If it isn't consistent across all np.random > > > > functions, it probably should be, one way or the other. 
> > > > > > > > Ben Root > > > > > > > > On Tue, Jan 19, 2016 at 5:10 AM, Jaime Fern?ndez del R?o < > > > > jaime.frio at gmail.com> wrote: > > > > > Hi all, > > > > > > > > > > There is a PR (#7026) that documents the current behavior of > > > > > np.random.uniform when the low and high parameters it takes > > > > > do not conform to the expected low < high. Basically: > > > > > if low < high, random numbers are drawn from [low, high), > > > > > if low = high, all random numbers will be equal to low, and > > > > > if low > high, random numbers are drawn from (high, low] > > > > > (notice the change in the open side of the interval.) > > > > > My only worry is that, once we document this, we can no > > > > > longer claim that it is a bug. So I would like to hear from > > > > > others what do they think. The other more or less obvious > > > > > options would be to: > > > > > Raise an error, but this would require a deprecation cycle, > > > > > as people may be relying on the current undocumented > > > > > behavior. > > > > > Check the inputs and draw numbers from [min(low, high), > > > > > max(low, high)), which is minimally different from current > > > > > behavior. > > > > > I will be merging the current documentation changes in the > > > > > next few days, so it would be good if any concerns were > > > > > voiced before that. > > > > > > > > > > Thanks, > > > > > > > > > > Jaime > > > > > > > > > > -- > > > > > (\__/) > > > > > ( O.o) > > > > > ( > <) Este es Conejo. Copia a Conejo en tu firma y ay?dale > > > > > en sus planes de dominaci?n mundial. > > > > > > > > > > _______________________________________________ > > > > > NumPy-Discussion mailing list > > > > > NumPy-Discussion at scipy.org > > > > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > > > > > > > > > > _______________________________________________ > > > > NumPy-Discussion mailing list > > > > NumPy-Discussion at scipy.org > > > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > > > > _______________________________________________ > > > NumPy-Discussion mailing list > > > NumPy-Discussion at scipy.org > > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > _______________________________________________ > > NumPy-Discussion mailing list > > NumPy-Discussion at scipy.org > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From charlesr.harris at gmail.com Tue Jan 19 12:35:38 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 19 Jan 2016 10:35:38 -0700 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 10:27 AM, Charles R Harris < charlesr.harris at gmail.com> wrote: > > > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < > chris.barker at noaa.gov> wrote: > >> What does the standard lib do for rand range? I see that randint Is >> closed on both ends, so order doesn't matter, though if it raises for b> then that's a precedent we could follow. >> > > randint is not closed on the high end. The now deprecated random_integers > is the function that does that. 
> > For floats, it's good to have various interval options. For instance, in > generating numbers that will be inverted or have their log taken it is good > to avoid zero. However, the names 'low' and 'high' are misleading... > > Note that we also have both arange and linspace for floats. What works well for integers is not always the best for floats. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Tue Jan 19 12:36:08 2016 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 19 Jan 2016 17:36:08 +0000 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 5:27 PM, Charles R Harris wrote: > > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < chris.barker at noaa.gov> wrote: >> >> What does the standard lib do for rand range? I see that randint Is closed on both ends, so order doesn't matter, though if it raises for b > randint is not closed on the high end. The now deprecated random_integers is the function that does that. > > For floats, it's good to have various interval options. For instance, in generating numbers that will be inverted or have their log taken it is good to avoid zero. However, the names 'low' and 'high' are misleading... They are correctly leading the users to the manner in which the author intended the function to be used. The *implementation* is misleading by allowing users to do things contrary to the documented intent. ;-) With floating point and general intervals, there is not really a good way to guarantee that the generated results avoid the "open" end of the specified interval or even stay *within* that interval. This function is definitely not intended to be used as `uniform(closed_end, open_end)`. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Tue Jan 19 12:40:33 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 19 Jan 2016 10:40:33 -0700 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 10:36 AM, Robert Kern wrote: > On Tue, Jan 19, 2016 at 5:27 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > > > > > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < > chris.barker at noaa.gov> wrote: > >> > >> What does the standard lib do for rand range? I see that randint Is > closed on both ends, so order doesn't matter, though if it raises for b then that's a precedent we could follow. > > > > randint is not closed on the high end. The now deprecated > random_integers is the function that does that. > > > > For floats, it's good to have various interval options. For instance, in > generating numbers that will be inverted or have their log taken it is good > to avoid zero. However, the names 'low' and 'high' are misleading... > > They are correctly leading the users to the manner in which the author > intended the function to be used. The *implementation* is misleading by > allowing users to do things contrary to the documented intent. ;-) > > With floating point and general intervals, there is not really a good way > to guarantee that the generated results avoid the "open" end of the > specified interval or even stay *within* that interval. This function is > definitely not intended to be used as `uniform(closed_end, open_end)`. 
> Well, it is possible to make that happen if one is careful or directly sets the bits in ieee types... Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Tue Jan 19 12:41:00 2016 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 19 Jan 2016 17:41:00 +0000 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 5:36 PM, Robert Kern wrote: > > On Tue, Jan 19, 2016 at 5:27 PM, Charles R Harris < charlesr.harris at gmail.com> wrote: > > > > > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < chris.barker at noaa.gov> wrote: > >> > >> What does the standard lib do for rand range? I see that randint Is closed on both ends, so order doesn't matter, though if it raises for b > > > randint is not closed on the high end. The now deprecated random_integers is the function that does that. > > > > For floats, it's good to have various interval options. For instance, in generating numbers that will be inverted or have their log taken it is good to avoid zero. However, the names 'low' and 'high' are misleading... > > They are correctly leading the users to the manner in which the author intended the function to be used. The *implementation* is misleading by allowing users to do things contrary to the documented intent. ;-) > > With floating point and general intervals, there is not really a good way to guarantee that the generated results avoid the "open" end of the specified interval or even stay *within* that interval. This function is definitely not intended to be used as `uniform(closed_end, open_end)`. There are special cases that *can* be implemented and are worth doing so as they are building blocks for other distributions that do need to avoid 0 or 1 as you say. Full-featured RNG suites do offer these: [0, 1] [0, 1) (0, 1] (0, 1) -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Tue Jan 19 12:42:24 2016 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 19 Jan 2016 17:42:24 +0000 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 5:40 PM, Charles R Harris wrote: > > On Tue, Jan 19, 2016 at 10:36 AM, Robert Kern wrote: >> >> On Tue, Jan 19, 2016 at 5:27 PM, Charles R Harris < charlesr.harris at gmail.com> wrote: >> > >> >> > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < chris.barker at noaa.gov> wrote: >> >> >> >> What does the standard lib do for rand range? I see that randint Is closed on both ends, so order doesn't matter, though if it raises for b> > >> > randint is not closed on the high end. The now deprecated random_integers is the function that does that. >> > >> > For floats, it's good to have various interval options. For instance, in generating numbers that will be inverted or have their log taken it is good to avoid zero. However, the names 'low' and 'high' are misleading... >> >> They are correctly leading the users to the manner in which the author intended the function to be used. The *implementation* is misleading by allowing users to do things contrary to the documented intent. ;-) >> >> With floating point and general intervals, there is not really a good way to guarantee that the generated results avoid the "open" end of the specified interval or even stay *within* that interval. 
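(As a rough sketch of how the four unit-interval variants listed above can be built on top of a [0, 1) primitive -- illustrative only, not numpy API:

import numpy as np

u = np.random.random_sample(8)   # [0, 1): numpy's basic primitive
co = u                           # [0, 1) as-is
oc = 1.0 - u                     # (0, 1]
cc = np.random.randint(0, 2**53 + 1, size=8) / 2.0**53  # [0, 1]; needs 64-bit ints
oo = u[u > 0.0]                  # (0, 1): reject exact zeros (resample in practice)

)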
This function is definitely not intended to be used as `uniform(closed_end, open_end)`. > > Well, it is possible to make that happen if one is careful or directly sets the bits in ieee types... For the unit interval, certainly. For general bounds, I am not so sure. -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Tue Jan 19 12:43:55 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 19 Jan 2016 10:43:55 -0700 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 10:42 AM, Robert Kern wrote: > On Tue, Jan 19, 2016 at 5:40 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > > > > On Tue, Jan 19, 2016 at 10:36 AM, Robert Kern > wrote: > >> > >> On Tue, Jan 19, 2016 at 5:27 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > >> > > >> > >> > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < > chris.barker at noaa.gov> wrote: > >> >> > >> >> What does the standard lib do for rand range? I see that randint Is > closed on both ends, so order doesn't matter, though if it raises for b then that's a precedent we could follow. > >> > > >> > randint is not closed on the high end. The now deprecated > random_integers is the function that does that. > >> > > >> > For floats, it's good to have various interval options. For instance, > in generating numbers that will be inverted or have their log taken it is > good to avoid zero. However, the names 'low' and 'high' are misleading... > >> > >> They are correctly leading the users to the manner in which the author > intended the function to be used. The *implementation* is misleading by > allowing users to do things contrary to the documented intent. ;-) > >> > >> With floating point and general intervals, there is not really a good > way to guarantee that the generated results avoid the "open" end of the > specified interval or even stay *within* that interval. This function is > definitely not intended to be used as `uniform(closed_end, open_end)`. > > > > Well, it is possible to make that happen if one is careful or directly > sets the bits in ieee types... > > For the unit interval, certainly. For general bounds, I am not so sure. > Point taken. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Tue Jan 19 12:49:28 2016 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 19 Jan 2016 12:49:28 -0500 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 12:43 PM, Charles R Harris < charlesr.harris at gmail.com> wrote: > > > On Tue, Jan 19, 2016 at 10:42 AM, Robert Kern > wrote: > >> On Tue, Jan 19, 2016 at 5:40 PM, Charles R Harris < >> charlesr.harris at gmail.com> wrote: >> > >> > On Tue, Jan 19, 2016 at 10:36 AM, Robert Kern >> wrote: >> >> >> >> On Tue, Jan 19, 2016 at 5:27 PM, Charles R Harris < >> charlesr.harris at gmail.com> wrote: >> >> > >> >> >> >> > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < >> chris.barker at noaa.gov> wrote: >> >> >> >> >> >> What does the standard lib do for rand range? I see that randint Is >> closed on both ends, so order doesn't matter, though if it raises for b> then that's a precedent we could follow. >> >> > >> >> > randint is not closed on the high end. 
The now deprecated >> random_integers is the function that does that. >> >> > >> >> > For floats, it's good to have various interval options. For >> instance, in generating numbers that will be inverted or have their log >> taken it is good to avoid zero. However, the names 'low' and 'high' are >> misleading... >> >> >> >> They are correctly leading the users to the manner in which the author >> intended the function to be used. The *implementation* is misleading by >> allowing users to do things contrary to the documented intent. ;-) >> >> >> >> With floating point and general intervals, there is not really a good >> way to guarantee that the generated results avoid the "open" end of the >> specified interval or even stay *within* that interval. This function is >> definitely not intended to be used as `uniform(closed_end, open_end)`. >> > >> > Well, it is possible to make that happen if one is careful or directly >> sets the bits in ieee types... >> >> For the unit interval, certainly. For general bounds, I am not so sure. >> > > Point taken. > What's the practical importance of this. The boundary points have probability zero, theoretically. What happens if low and high are only a few nulps apart? If you clip the distribution to obey boundary rules you create mass points :) Josef > > Chuck > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue Jan 19 12:51:54 2016 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 19 Jan 2016 09:51:54 -0800 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 9:27 AM, Charles R Harris wrote: > On Tue, Jan 19, 2016 at 9:23 AM, Chris Barker - NOAA Federal < > chris.barker at noaa.gov> wrote: > >> What does the standard lib do for rand range? I see that randint Is >> closed on both ends, so order doesn't matter, though if it raises for b> then that's a precedent we could follow. >> > > randint is not closed on the high end. The now deprecated random_integers > is the function that does that. > I was referring to the stdlib randint: In [*7*]: [random.randint(2,5) for i in range(10)] Out[*7*]: [5, 5, 2, 4, 5, 5, 3, 5, 2, 2] thinking that compatibility with that would be good -- but that ship has sailed. randrange is open on the high end, as range is, (duh!) [random.randrange(2,5) for i in range(10)] Out[*9*]: [3, 3, 4, 3, 3, 2, 4, 2, 3, 3] but you get an exception is low >=high: In [*10*]: random.randrange(5,2) ValueError: empty range for randrange() (5,2, -3) I like the exception idea best -- but backward compatibility and all that. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Jan 19 13:05:52 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 19 Jan 2016 19:05:52 +0100 Subject: [Numpy-discussion] Should I use pip install numpy in linux? 
In-Reply-To: <-3897079791178811395@unknownmsgid> References: <-5868265598930469412@unknownmsgid> <-3897079791178811395@unknownmsgid> Message-ID: On Tue, Jan 19, 2016 at 5:57 PM, Chris Barker - NOAA Federal < chris.barker at noaa.gov> wrote: > > > 2) continue to support those users fairly poorly, and at substantial > > ongoing cost > > I'm curious what the cost is for this poor support -- throw the source > up on PyPi, and we're done. The cost comes in when trying to build > binaries... > I'm sure Nathaniel means the cost to users of failed installs and of numpy losing users because of that, not the cost of building binaries. > Option 1 would require overwhelming consensus of the community, which > > for better or worse is presumably not going to happen while > > substantial portions of that community are still using pip/PyPI. > > Are they? Which community are we talking about? The community I'd like > to target are web developers that aren't doing what they think of as > "scientific" applications, but could use a little of the SciPy stack. > These folks are committed to pip, and are very reluctant to introduce > a difficult dependency. Binary wheels would help these folks, but > that is not a community that exists yet ( or it's small, anyway) > > All that being said, I'd be happy to see binary wheels for the core > SciPy stack on PyPi. It would be nice for people to be able to do a > bit with Numpy or pandas, it MPL, without having to jump ship to a > whole new way of doing things. > This is indeed exactly why we need binary wheels. Efforts to provide those will not change our strong recommendation to our users that they're better off using a scientific Python distribution. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From Stephan.Sahm at gmx.de Tue Jan 19 14:33:27 2016 From: Stephan.Sahm at gmx.de (Stephan Sahm) Date: Tue, 19 Jan 2016 20:33:27 +0100 Subject: [Numpy-discussion] FeatureRequest: support for array construction from iterators In-Reply-To: References: <56585843.80103@gmail.com> Message-ID: just to not prevent it from the black hole - what about integrating fromiter into array? (see the post by Benjamin Root) for me personally, taking the first element for deducing the dtype would be a perfect default way to read generators. If one wants a specific other dtype, one could specify it like in the current fromiter method. On 15 December 2015 at 08:08, Stephan Sahm wrote: > I would like to further push Benjamin Root's suggestion: > > "Therefore, I think it is not out of the realm of reason that passing a > generator object and a dtype could then delegate the work under the hood to > np.fromiter()? I would even go so far as to raise an error if one passes a > generator without specifying dtype to np.array(). The point is to reduce > the number of entry points for creating numpy arrays." > > would this be ok? > > On Mon, Dec 14, 2015 at 6:50 PM Robert Kern wrote: > >> On Mon, Dec 14, 2015 at 5:41 PM, Benjamin Root >> wrote: >> > >> > Heh, never noticed that. Was it implemented more like a >> generator/iterator in older versions of Python? >> >> No, it predates generators and iterators so it has always had to be >> implemented like that. >> >> -- >> Robert Kern >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> > -------------- next part -------------- An HTML attachment was scrubbed... 
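To make the np.array/np.fromiter contrast above concrete, a quick sketch of the current behavior:

import numpy as np

gen = (i * i for i in range(10))
np.fromiter(gen, dtype=np.int64)      # consumes the generator; dtype is required
np.array((i * i for i in range(10)))  # does not: yields a 0-d object array wrapping the generator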
URL: From charlesr.harris at gmail.com Tue Jan 19 22:13:21 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 19 Jan 2016 20:13:21 -0700 Subject: [Numpy-discussion] maintenance/1.11.x branched Message-ID: Hi All, I've branched 1.11.x. There is still work to be done, but the branch draws the line for PRs to be included in 1.11. Experience teaches that fixups will needed, but new work/enhancements will be in 1.12.0-dev0. Please run tests on this branch if you have the time. Travis CI has begun failing for Python 3.2 due to a bug in pip. I don't know if this is on our account or something on the Travis end. Thoughts on that are welcome. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From arnaldorusso at gmail.com Wed Jan 20 11:25:42 2016 From: arnaldorusso at gmail.com (Arnaldo Russo) Date: Wed, 20 Jan 2016 14:25:42 -0200 Subject: [Numpy-discussion] What does -0.0 means? Message-ID: Hi all, After fitting a curve, the output returns the initial value as *-0.0*. Why not a simple 0.0? What is this difference? Sorry for the stupid question, but why exist a negative zero? Cheers, Arnaldo. --- Arnaldo D'Amaral Pereira Granja Russo Instituto Ambiental Boto Flipper institutobotoflipper .org -------------- next part -------------- An HTML attachment was scrubbed... URL: From deshpande.jaidev at gmail.com Wed Jan 20 11:45:03 2016 From: deshpande.jaidev at gmail.com (Jaidev Deshpande) Date: Wed, 20 Jan 2016 22:15:03 +0530 Subject: [Numpy-discussion] What does -0.0 means? In-Reply-To: References: Message-ID: Hi Arnaldo, On 20 Jan 2016 21:56, "Arnaldo Russo" wrote: > > Hi all, > > After fitting a curve, the output returns the initial value as -0.0. Why not a simple 0.0? > What is this difference? > Sorry for the stupid question, but why exist a negative zero? Haha, that used to stump me too, before I found out that there's such a thing as a signed zero. The IEEE 754 standard accommodates it. You can read more at https://en.m.wikipedia.org/wiki/Signed_zero > > Cheers, > Arnaldo. > --- > Arnaldo D'Amaral Pereira Granja Russo > Instituto Ambiental Boto Flipper > institutobotoflipper .org > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arnaldorusso at gmail.com Wed Jan 20 13:27:15 2016 From: arnaldorusso at gmail.com (Arnaldo Russo) Date: Wed, 20 Jan 2016 16:27:15 -0200 Subject: [Numpy-discussion] What does -0.0 means? In-Reply-To: References: Message-ID: hahaha! Thanks --- Arnaldo D'Amaral Pereira Granja Russo Instituto Ambiental Boto Flipper institutobotoflipper .org 2016-01-20 14:45 GMT-02:00 Jaidev Deshpande : > Hi Arnaldo, > On 20 Jan 2016 21:56, "Arnaldo Russo" wrote: > > > > Hi all, > > > > After fitting a curve, the output returns the initial value as -0.0. Why > not a simple 0.0? > > What is this difference? > > Sorry for the stupid question, but why exist a negative zero? > > Haha, that used to stump me too, before I found out that there's such a > thing as a signed zero. The IEEE 754 standard accommodates it. > > You can read more at https://en.m.wikipedia.org/wiki/Signed_zero > > > > > Cheers, > > Arnaldo. 
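A few lines make the distinction concrete (same behavior for plain Python floats and numpy scalars):

import numpy as np

x = np.float64(-0.0)
x == 0.0             # True: -0.0 and +0.0 compare equal
np.signbit(x)        # True: but the sign bit is set
np.copysign(1.0, x)  # -1.0: the sign propagates through copysign
# and 1.0 / x is -inf (with a divide-by-zero warning),
# while 1.0 / np.float64(0.0) is +inf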
> > --- > > Arnaldo D'Amaral Pereira Granja Russo > > Instituto Ambiental Boto Flipper > > institutobotoflipper .org > > > > _______________________________________________ > > NumPy-Discussion mailing list > > NumPy-Discussion at scipy.org > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL:

From charlesr.harris at gmail.com Wed Jan 20 13:32:50 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 20 Jan 2016 11:32:50 -0700 Subject: [Numpy-discussion] 1.11.0 release notes. Message-ID: Hi All, I've put up a PR with revised 1.11.0 release notes at https://github.com/numpy/numpy/pull/7073. I would appreciate it if anyone involved in the 1.11 release would take a look and note anything missing that they think should be included or things that are misrepresented. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL:

From p.e.creasey.00 at googlemail.com Wed Jan 20 13:57:35 2016 From: p.e.creasey.00 at googlemail.com (Peter Creasey) Date: Wed, 20 Jan 2016 10:57:35 -0800 Subject: [Numpy-discussion] Behavior of np.random.uniform Message-ID: +1 for the deprecation warning for low > high; I think the cases where that is called are more likely to be unintentional than someone deliberately using uniform(closed_end, open_end), and you might help users find bugs - i.e. the idioms of "explicit is better than implicit" and "fail early and fail loudly" apply. I would also point out that requiring open vs closed intervals (in doubles) is already an extremely specialised use case. In terms of *sampling the reals*, there is no difference between the intervals (a,b) and [a,b], because the endpoints have measure 0, and even with double-precision arithmetic you are going to have to make several petabytes of random data before you hit an endpoint... Peter

From travis at continuum.io Wed Jan 20 14:03:43 2016 From: travis at continuum.io (Travis Oliphant) Date: Wed, 20 Jan 2016 13:03:43 -0600 Subject: [Numpy-discussion] 1.11.0 release notes. In-Reply-To: References: Message-ID: Impressive work! Thank you for all the hard work that went into these improvements and releases. -Travis On Wed, Jan 20, 2016 at 12:32 PM, Charles R Harris < charlesr.harris at gmail.com> wrote: > Hi All, > > I've put up a PR with revised 1.11.0 release notes at > https://github.com/numpy/numpy/pull/7073. I would appreciate it if anyone > involved in the 1.11 release would take a look and note anything missing > that they think should be included or things that are misrepresented. > > Chuck > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -- *Travis Oliphant* *Co-founder and CEO* @teoliphant 512-222-5440 http://www.continuum.io -------------- next part -------------- An HTML attachment was scrubbed...
URL: From charlesr.harris at gmail.com Wed Jan 20 14:07:55 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 20 Jan 2016 12:07:55 -0700 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: On Wed, Jan 20, 2016 at 11:57 AM, Peter Creasey < p.e.creasey.00 at googlemail.com> wrote: > +1 for the deprecation warning for low>high, I think the cases where > that is called are more likely to be unintentional rather than someone > trying to use uniform(closed_end, open_end) and you might help users > find bugs - i.e. the idioms of ?explicit is better than implicit? and > ?fail early and fail loudly? apply. > > I would also point out that requiring open vs closed intervals (in > doubles) is already an extremely specialised use case. In terms of > *sampling the reals*, there is no difference between the intervals > (a,b) and [a,b], because the endpoints have measure 0, and even with > double-precision arithmetic, you are going to have to make several > petabytes of random data before you hit an endpoint... > Petabytes ain't what they used to be ;) I remember testing some hardware which, due to grounding/timing issues would occasionally goof up a readable register. The hardware designers never saw it because they didn't test for hours and days at high data rates. But it was there, and it would show up in the data. Measure zero is about as real as real numbers... Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.e.creasey.00 at googlemail.com Wed Jan 20 20:55:25 2016 From: p.e.creasey.00 at googlemail.com (Peter Creasey) Date: Wed, 20 Jan 2016 17:55:25 -0800 Subject: [Numpy-discussion] Behavior of np.random.uniform Message-ID: >> I would also point out that requiring open vs closed intervals (in >> doubles) is already an extremely specialised use case. In terms of >> *sampling the reals*, there is no difference between the intervals >> (a,b) and [a,b], because the endpoints have measure 0, and even with >> double-precision arithmetic, you are going to have to make several >> petabytes of random data before you hit an endpoint... >> > Petabytes ain't what they used to be ;) I remember testing some hardware > which, due to grounding/timing issues would occasionally goof up a readable > register. The hardware designers never saw it because they didn't test for > hours and days at high data rates. But it was there, and it would show up > in the data. Measure zero is about as real as real numbers... > > Chuck Actually, your point is well taken and I am quite mistaken. If you pick some values like uniform(low, low * (1+2**-52)) then you can hit your endpoints pretty easily. I am out of practice making pathological tests for double precision arithmetic. I guess my suggestion would be to add the deprecation warning and change the docstring to warn that the interval is not guaranteed to be right-open. Peter From sk09idm at gmail.com Wed Jan 20 22:14:19 2016 From: sk09idm at gmail.com (Sam Radhakrishan) Date: Thu, 21 Jan 2016 08:44:19 +0530 Subject: [Numpy-discussion] Behavior of Printoptions with scalar types Message-ID: Hi all, This is related to #6908 . While working on it I found that the printoptions are set only for array printing. The documentation however tells that printoptions can be set to pretty print single numpy values. I tried working on it and I am kinda stuck. I really don't know where else to post this. 
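The mismatch is easy to reproduce (current behavior, cf. the issue referenced above):

import numpy as np

np.set_printoptions(precision=3)
print(np.array([1.0 / 3.0]))  # [ 0.333] -- array printing honors the options
print(np.float64(1.0 / 3.0))  # full-precision digits -- the scalar ignores them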
Regards, Sam Radhakrishnan -------------- next part -------------- An HTML attachment was scrubbed... URL: From jfoxrabinovitz at gmail.com Thu Jan 21 01:51:17 2016 From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz) Date: Thu, 21 Jan 2016 01:51:17 -0500 Subject: [Numpy-discussion] PR #7083: ENH: Added 'doane' and 'sqrt' estimators to np.histogram Message-ID: Please let me know if there is anything wrong or missing. I have added a couple of estimators that I find useful sometimes. -Joe From jaime.frio at gmail.com Thu Jan 21 02:06:10 2016 From: jaime.frio at gmail.com (=?UTF-8?Q?Jaime_Fern=C3=A1ndez_del_R=C3=ADo?=) Date: Thu, 21 Jan 2016 08:06:10 +0100 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: There doesn't seem to be much of a consensus on the way to go, so leaving things as they are and have been seems the wisest choice for now, thanks for all the feedback. I will work with Greg on documenting the status quo properly. We probably want to follow the lead of the stdlib's random.uniform on how the openness of the interval actually results in practice: https://docs.python.org/3.6/library/random.html#random.uniform On Thu, Jan 21, 2016 at 2:55 AM, Peter Creasey < p.e.creasey.00 at googlemail.com> wrote: > >> I would also point out that requiring open vs closed intervals (in > >> doubles) is already an extremely specialised use case. In terms of > >> *sampling the reals*, there is no difference between the intervals > >> (a,b) and [a,b], because the endpoints have measure 0, and even with > >> double-precision arithmetic, you are going to have to make several > >> petabytes of random data before you hit an endpoint... > >> > > Petabytes ain't what they used to be ;) I remember testing some hardware > > which, due to grounding/timing issues would occasionally goof up a > readable > > register. The hardware designers never saw it because they didn't test > for > > hours and days at high data rates. But it was there, and it would show up > > in the data. Measure zero is about as real as real numbers... > > > > Chuck > > Actually, your point is well taken and I am quite mistaken. If you > pick some values like uniform(low, low * (1+2**-52)) then you can hit > your endpoints pretty easily. I am out of practice making > pathological tests for double precision arithmetic. > > I guess my suggestion would be to add the deprecation warning and > change the docstring to warn that the interval is not guaranteed to be > right-open. > > Peter > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -- (\__/) ( O.o) ( > <) Este es Conejo. Copia a Conejo en tu firma y ay?dale en sus planes de dominaci?n mundial. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jaime.frio at gmail.com Thu Jan 21 02:17:33 2016 From: jaime.frio at gmail.com (=?UTF-8?Q?Jaime_Fern=C3=A1ndez_del_R=C3=ADo?=) Date: Thu, 21 Jan 2016 08:17:33 +0100 Subject: [Numpy-discussion] PR #7083: ENH: Added 'doane' and 'sqrt' estimators to np.histogram In-Reply-To: References: Message-ID: The tests are not passing, seems like you are taking the sqrt of a negative number, may want to check the inputs and raise a more informative error (and add a test for it). Jaime On Thu, Jan 21, 2016 at 7:51 AM, Joseph Fox-Rabinovitz < jfoxrabinovitz at gmail.com> wrote: > Please let me know if there is anything wrong or missing. 
I have added > a couple of estimators that I find useful sometimes. > > -Joe > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -- (\__/) ( O.o) ( > <) Este es Conejo. Copia a Conejo en tu firma y ay?dale en sus planes de dominaci?n mundial. -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Thu Jan 21 04:35:16 2016 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 21 Jan 2016 09:35:16 +0000 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: Message-ID: On Thu, Jan 21, 2016 at 7:06 AM, Jaime Fern?ndez del R?o < jaime.frio at gmail.com> wrote: > > There doesn't seem to be much of a consensus on the way to go, so leaving things as they are and have been seems the wisest choice for now, thanks for all the feedback. I will work with Greg on documenting the status quo properly. Ugh. Be careful in documenting the way things currently work. No one intended for it to work that way! No one should rely on high From robert.kern at gmail.com Thu Jan 21 04:38:45 2016 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 21 Jan 2016 09:38:45 +0000 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: <1453224928.15519.16.camel@sipsolutions.net> References: <8424306488928554487@unknownmsgid> <1453224928.15519.16.camel@sipsolutions.net> Message-ID: On Tue, Jan 19, 2016 at 5:35 PM, Sebastian Berg wrote: > > On Di, 2016-01-19 at 16:28 +0000, G Young wrote: > > In rand range, it raises an exception if low >= high. > > > > I should also add that AFAIK enforcing low >= high with floats is a > > lot trickier than it is for integers. I have been knee-deep in > > corner cases for some time with randint where numbers that are > > visually different are cast as the same number by numpy due to > > rounding and representation issues. That situation only gets worse > > with floats. > > > > Well, actually random.uniform docstring says: > > Get a random number in the range [a, b) or [a, b] depending on > rounding. Which docstring are you looking at? The current one says [low, high) http://docs.scipy.org/doc/numpy/reference/generated/numpy.random.uniform.html#numpy.random.uniform -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From rmcgibbo at gmail.com Thu Jan 21 04:42:44 2016 From: rmcgibbo at gmail.com (Robert McGibbon) Date: Thu, 21 Jan 2016 01:42:44 -0800 Subject: [Numpy-discussion] Should I use pip install numpy in linux? In-Reply-To: References: <-5868265598930469412@unknownmsgid> <-3897079791178811395@unknownmsgid> Message-ID: Hi all, Just as a heads up: Nathaniel and I wrote a draft PEP on binary linux wheels that is now being discussed on distutils-sig, so you can check that out and participate in the conversation if you're interested. 
- PEP on python.org: https://www.python.org/dev/peps/pep-0513/ - PEP on github with some typos fixed: https://github.com/manylinux/manylinux/blob/master/pep-513.rst - Email archive: https://mail.python.org/pipermail/distutils-sig/2016-January/027997.html -Robert On Tue, Jan 19, 2016 at 10:05 AM, Ralf Gommers wrote: > > > On Tue, Jan 19, 2016 at 5:57 PM, Chris Barker - NOAA Federal < > chris.barker at noaa.gov> wrote: > >> >> > 2) continue to support those users fairly poorly, and at substantial >> > ongoing cost >> >> I'm curious what the cost is for this poor support -- throw the source >> up on PyPi, and we're done. The cost comes in when trying to build >> binaries... >> > > I'm sure Nathaniel means the cost to users of failed installs and of numpy > losing users because of that, not the cost of building binaries. > > > Option 1 would require overwhelming consensus of the community, which >> > for better or worse is presumably not going to happen while >> > substantial portions of that community are still using pip/PyPI. >> >> Are they? Which community are we talking about? The community I'd like >> to target are web developers that aren't doing what they think of as >> "scientific" applications, but could use a little of the SciPy stack. >> These folks are committed to pip, and are very reluctant to introduce >> a difficult dependency. Binary wheels would help these folks, but >> that is not a community that exists yet ( or it's small, anyway) >> >> All that being said, I'd be happy to see binary wheels for the core >> SciPy stack on PyPi. It would be nice for people to be able to do a >> bit with Numpy or pandas, it MPL, without having to jump ship to a >> whole new way of doing things. >> > > This is indeed exactly why we need binary wheels. Efforts to provide those > will not change our strong recommendation to our users that they're better > off using a scientific Python distribution. > > Ralf > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastian at sipsolutions.net Thu Jan 21 05:07:52 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Thu, 21 Jan 2016 11:07:52 +0100 Subject: [Numpy-discussion] Behavior of np.random.uniform In-Reply-To: References: <8424306488928554487@unknownmsgid> <1453224928.15519.16.camel@sipsolutions.net> Message-ID: <1453370872.29916.1.camel@sipsolutions.net> On Do, 2016-01-21 at 09:38 +0000, Robert Kern wrote: > On Tue, Jan 19, 2016 at 5:35 PM, Sebastian Berg < > sebastian at sipsolutions.net> wrote: > > > > On Di, 2016-01-19 at 16:28 +0000, G Young wrote: > > > In rand range, it raises an exception if low >= high. > > > > > > I should also add that AFAIK enforcing low >= high with floats is > a > > > lot trickier than it is for integers. I have been knee-deep in > > > corner cases for some time with randint where numbers that are > > > visually different are cast as the same number by numpy due to > > > rounding and representation issues. That situation only gets > worse > > > with floats. > > > > > > > Well, actually random.uniform docstring says: > > > > Get a random number in the range [a, b) or [a, b] depending on > > rounding. > > Which docstring are you looking at? The current one says [low, high) > Sorry, I was referring to the python random.uniform function. 
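The "depending on rounding" caveat is easy to check numerically; with a one-ulp interval the nominally open end is hit about half the time (illustrative):

import numpy as np

low = 1.0
high = low * (1 + 2**-52)  # the next representable double above 1.0
u = np.random.uniform(low, high, size=1000)
(u == high).any()          # True here with overwhelming probability:
                           # low + (high - low)*x rounds up for x >= 0.5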
And as far as I now understand the current numpy equivalent (intentionally or not) seems to do the same thing, which suits fine with me. - Sebastian > http://docs.scipy.org/doc/numpy/reference/generated/numpy.random.unif > orm.html#numpy.random.uniform > > -- > Robert Kern > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From jfoxrabinovitz at gmail.com Thu Jan 21 06:44:01 2016 From: jfoxrabinovitz at gmail.com (=?utf-8?B?Sm9zZXBoIEZveC1SYWJpbm92aXR6?=) Date: Thu, 21 Jan 2016 03:44:01 -0800 (PST) Subject: [Numpy-discussion] PR #7083: ENH: Added 'doane' and 'sqrt' estimators to np.histogram Message-ID: <000f4242.6f09d37c75b63b1c@gmail.com> Where can I find the output of the tests on GitHub? I seem to be having trouble locating them. -Joe Sent from my LG Optimus L70? ------ Original message------From: Jaime Fern?ndez del R?oDate: Thu, Jan 21, 2016 02:17To: Discussion of Numerical Python;Subject:Re: [Numpy-discussion] PR #7083: ENH: Added 'doane' and 'sqrt' estimators to np.histogramThe tests are not passing, seems like you are taking the sqrt of a negative number, may want to check the inputs and raise a more informative error (and add a test for it). Jaime On Thu, Jan 21, 2016 at 7:51 AM, Joseph Fox-Rabinovitz wrote: Please let me know if there is anything wrong or missing. I have added a couple of estimators that I find useful sometimes. ? ? -Joe _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion at scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion -- (__/) ( O.o) ( > <) Este es Conejo. Copia a Conejo en tu firma y ay?dale en sus planes de dominaci?n mundial. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jfoxrabinovitz at gmail.com Thu Jan 21 13:35:03 2016 From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz) Date: Thu, 21 Jan 2016 13:35:03 -0500 Subject: [Numpy-discussion] PR #7083: ENH: Added 'doane' and 'sqrt' estimators to np.histogram In-Reply-To: References: Message-ID: I fixed the issue by checking if x.size > 2 before the calculation. An error should never need to be raised at that point. The build passes now and the results appear to be correct. On Thu, Jan 21, 2016 at 2:17 AM, Jaime Fern?ndez del R?o wrote: > The tests are not passing, seems like you are taking the sqrt of a negative > number, may want to check the inputs and raise a more informative error (and > add a test for it). > > Jaime > > On Thu, Jan 21, 2016 at 7:51 AM, Joseph Fox-Rabinovitz > wrote: >> >> Please let me know if there is anything wrong or missing. I have added >> a couple of estimators that I find useful sometimes. >> >> -Joe >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > > -- > (\__/) > ( O.o) > ( > <) Este es Conejo. Copia a Conejo en tu firma y ay?dale en sus planes de > dominaci?n mundial. 
> > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > From jfoxrabinovitz at gmail.com Thu Jan 21 15:37:03 2016 From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz) Date: Thu, 21 Jan 2016 15:37:03 -0500 Subject: [Numpy-discussion] PR #7083: ENH: Added 'doane' and 'sqrt' estimators to np.histogram In-Reply-To: References: Message-ID: Due to a mistake I made in my branch structure, I have replaced this PR with #7090: https://github.com/numpy/numpy/pull/7090. All of the changes and fixes so far are squashed into the new request. -Joe On Thu, Jan 21, 2016 at 1:51 AM, Joseph Fox-Rabinovitz wrote: > Please let me know if there is anything wrong or missing. I have added > a couple of estimators that I find useful sometimes. > > -Joe From sebastian at sipsolutions.net Thu Jan 21 19:05:56 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Fri, 22 Jan 2016 01:05:56 +0100 Subject: [Numpy-discussion] Set FutureWarnings to error in (dev) tests? Message-ID: <1453421156.31387.14.camel@sipsolutions.net> Hi all, should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over for two reasons. First, it is hard to impossible to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any Futurewarnings from someone else. For numpy that may not be a big issue, but still. So, I was starting to try it, and it is annoying obviously. The only serious issue, though, seems actually MA [1]. Sometimes one would add a filter for a whole test file (for my start I did this for the NaT stuff) [2]. Anyway, should we attempt to do this? I admit that trying to make it work, even *with* the change FutureWarnings suddenly pop up when you make the warnings being given less often (I will guess that warning was already issued at import time somewhere). - Sebastian [1] And at that a brand new future warning there, which seems too agressive in any case, though that won't change much. [2] One annoying thing about this, the filter might never be removed. One could add a canary maybe to error out when the filter is not needed anymore. -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From njs at pobox.com Thu Jan 21 19:15:02 2016 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 21 Jan 2016 16:15:02 -0800 Subject: [Numpy-discussion] Set FutureWarnings to error in (dev) tests? In-Reply-To: <1453421156.31387.14.camel@sipsolutions.net> References: <1453421156.31387.14.camel@sipsolutions.net> Message-ID: On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg wrote: > Hi all, > > should we try to set FutureWarnings to errors in dev tests? I am > seriously annoyed by FutureWarnings getting lost all over for two > reasons. First, it is hard to impossible to find even our own errors > for our own FutureWarning changes. Secondly, we currently would not > even see any Futurewarnings from someone else. For numpy that may not > be a big issue, but still. Yeah, I noticed this recently too :-(. Definitely it is the right thing to do, I think. And this is actually more true the more annoying it is, because if we're triggering lots of FutureWarnings then we should fix that :-). -n -- Nathaniel J. 
Smith -- https://vorpus.org From njs at pobox.com Thu Jan 21 19:21:48 2016 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 21 Jan 2016 16:21:48 -0800 Subject: [Numpy-discussion] deprecating assignment to ndarray.data Message-ID: So it turns out that ndarray.data supports assignment at the Python level, and what it does is just assign to the ->data field of the ndarray object: https://github.com/numpy/numpy/blob/master/numpy/core/src/multiarray/getset.c#L325 This kind of assignment been deprecated at the C level since 1.7, and is totally unsafe -- if there are any views pointing to the array when this happens, then they'll be left pointing off into unallocated memory. E.g.: a = np.arange(10) b = np.linspace(0, 1, 10) c = a.view() a.data = b.data # Now c points into free'd memory Can we deprecate or just remove this? (Also filed issue: https://github.com/numpy/numpy/issues/7093) -n -- Nathaniel J. Smith -- https://vorpus.org From sebastian at sipsolutions.net Thu Jan 21 19:44:08 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Fri, 22 Jan 2016 01:44:08 +0100 Subject: [Numpy-discussion] Set FutureWarnings to error in (dev) tests? In-Reply-To: References: <1453421156.31387.14.camel@sipsolutions.net> Message-ID: <1453423448.31387.20.camel@sipsolutions.net> On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote: > On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg > wrote: > > Hi all, > > > > should we try to set FutureWarnings to errors in dev tests? I am > > seriously annoyed by FutureWarnings getting lost all over for two > > reasons. First, it is hard to impossible to find even our own > > errors > > for our own FutureWarning changes. Secondly, we currently would not > > even see any Futurewarnings from someone else. For numpy that may > > not > > be a big issue, but still. > > Yeah, I noticed this recently too :-(. Definitely it is the right > thing to do, I think. And this is actually more true the more > annoying > it is, because if we're triggering lots of FutureWarnings then we > should fix that :-). > Yeah, the problem is that some FutureWarnings that are given in the dozens. Injecting the filter on the module level is possible, but not quite correct. Maybe one could do evil things similar to a "module decorator" to add the warning context + filter to every single function in a module starting with "test_". Another method could be to abuse `__warningregistry__`, but there are at least two reasons why this is probably not viable (nevermind that it would be ugly as well). Doesn't nose maybe provide *something*? I mean seriously, testing warnings tends to be hell broke lose? Change one thing, suddenly dozens appear from nowhere, never sure you found all cases, etc. - Sebastian > -n > -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From njs at pobox.com Thu Jan 21 19:51:13 2016 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 21 Jan 2016 16:51:13 -0800 Subject: [Numpy-discussion] Set FutureWarnings to error in (dev) tests? 
In-Reply-To: <1453423448.31387.20.camel@sipsolutions.net> References: <1453421156.31387.14.camel@sipsolutions.net> <1453423448.31387.20.camel@sipsolutions.net> Message-ID: On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg wrote: > On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote: >> On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg >> wrote: >> > Hi all, >> > >> > should we try to set FutureWarnings to errors in dev tests? I am >> > seriously annoyed by FutureWarnings getting lost all over for two >> > reasons. First, it is hard to impossible to find even our own >> > errors >> > for our own FutureWarning changes. Secondly, we currently would not >> > even see any Futurewarnings from someone else. For numpy that may >> > not >> > be a big issue, but still. >> >> Yeah, I noticed this recently too :-(. Definitely it is the right >> thing to do, I think. And this is actually more true the more >> annoying >> it is, because if we're triggering lots of FutureWarnings then we >> should fix that :-). >> > > Yeah, the problem is that some FutureWarnings that are given in the > dozens. Injecting the filter on the module level is possible, but not > quite correct. Maybe one could do evil things similar to a "module > decorator" to add the warning context + filter to every single function > in a module starting with "test_". Can we remove the FutureWarnings by making whatever change they're warning about? :-) > Another method could be to abuse `__warningregistry__`, but there are > at least two reasons why this is probably not viable (nevermind that it > would be ugly as well). > > Doesn't nose maybe provide *something*? I mean seriously, testing > warnings tends to be hell broke lose? Change one thing, suddenly dozens > appear from nowhere, never sure you found all cases, etc. AFAICT nose doesn't provide much of anything for working with warnings. :-/ -n -- Nathaniel J. Smith -- https://vorpus.org From sebastian at sipsolutions.net Thu Jan 21 20:00:43 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Fri, 22 Jan 2016 02:00:43 +0100 Subject: [Numpy-discussion] Set FutureWarnings to error in (dev) tests? In-Reply-To: References: <1453421156.31387.14.camel@sipsolutions.net> <1453423448.31387.20.camel@sipsolutions.net> Message-ID: <1453424443.31387.23.camel@sipsolutions.net> On Do, 2016-01-21 at 16:51 -0800, Nathaniel Smith wrote: > On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg > wrote: > > On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote: > > > On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg > > > wrote: > > > > Hi all, > > > > > > > > should we try to set FutureWarnings to errors in dev tests? I > > > > am > > > > seriously annoyed by FutureWarnings getting lost all over for > > > > two > > > > reasons. First, it is hard to impossible to find even our own > > > > errors > > > > for our own FutureWarning changes. Secondly, we currently would > > > > not > > > > even see any Futurewarnings from someone else. For numpy that > > > > may > > > > not > > > > be a big issue, but still. > > > > > > Yeah, I noticed this recently too :-(. Definitely it is the right > > > thing to do, I think. And this is actually more true the more > > > annoying > > > it is, because if we're triggering lots of FutureWarnings then we > > > should fix that :-). > > > > > > > Yeah, the problem is that some FutureWarnings that are given in the > > dozens. Injecting the filter on the module level is possible, but > > not > > quite correct. 
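For reference, a sketch of the stdlib machinery under discussion (run_tests is a hypothetical entry point, not an existing function):

import warnings

with warnings.catch_warnings():
    # escalate every FutureWarning to an error for the enclosed code...
    warnings.simplefilter("error", FutureWarning)
    # ...or escalate only warnings whose message matches a regex,
    # leaving unrelated FutureWarnings untouched:
    warnings.filterwarnings("error", message=".*comparison.*",
                            category=FutureWarning)
    run_tests()  # hypothetical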
Maybe one could do evil things similar to a "module > > decorator" to add the warning context + filter to every single > > function > > in a module starting with "test_". > > Can we remove the FutureWarnings by making whatever change they're > warning about? :-) > That would be equivalent of not issueing the futurewarning at all ;). Well, you can write a context manager and put this stuff into the if __name__ == '__main__': block actually. It is still annoying, since ideally we want to filter a single warning, which means the necessaty to install our own warning printer. > > Another method could be to abuse `__warningregistry__`, but there > > are > > at least two reasons why this is probably not viable (nevermind > > that it > > would be ugly as well). > > > > Doesn't nose maybe provide *something*? I mean seriously, testing > > warnings tends to be hell broke lose? Change one thing, suddenly > > dozens > > appear from nowhere, never sure you found all cases, etc. > > AFAICT nose doesn't provide much of anything for working with > warnings. :-/ > > -n > -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From njs at pobox.com Thu Jan 21 20:07:48 2016 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 21 Jan 2016 17:07:48 -0800 Subject: [Numpy-discussion] Set FutureWarnings to error in (dev) tests? In-Reply-To: <1453424443.31387.23.camel@sipsolutions.net> References: <1453421156.31387.14.camel@sipsolutions.net> <1453423448.31387.20.camel@sipsolutions.net> <1453424443.31387.23.camel@sipsolutions.net> Message-ID: Warnings filters can be given a regex matching the warning text, I think? On Jan 21, 2016 5:00 PM, "Sebastian Berg" wrote: > On Do, 2016-01-21 at 16:51 -0800, Nathaniel Smith wrote: > > On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg > > wrote: > > > On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote: > > > > On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg > > > > wrote: > > > > > Hi all, > > > > > > > > > > should we try to set FutureWarnings to errors in dev tests? I > > > > > am > > > > > seriously annoyed by FutureWarnings getting lost all over for > > > > > two > > > > > reasons. First, it is hard to impossible to find even our own > > > > > errors > > > > > for our own FutureWarning changes. Secondly, we currently would > > > > > not > > > > > even see any Futurewarnings from someone else. For numpy that > > > > > may > > > > > not > > > > > be a big issue, but still. > > > > > > > > Yeah, I noticed this recently too :-(. Definitely it is the right > > > > thing to do, I think. And this is actually more true the more > > > > annoying > > > > it is, because if we're triggering lots of FutureWarnings then we > > > > should fix that :-). > > > > > > > > > > Yeah, the problem is that some FutureWarnings that are given in the > > > dozens. Injecting the filter on the module level is possible, but > > > not > > > quite correct. Maybe one could do evil things similar to a "module > > > decorator" to add the warning context + filter to every single > > > function > > > in a module starting with "test_". > > > > Can we remove the FutureWarnings by making whatever change they're > > warning about? :-) > > > > That would be equivalent of not issueing the futurewarning at all ;). > Well, you can write a context manager and put this stuff into the > > if __name__ == '__main__': > > block actually. 
It is still annoying, since ideally we want to filter a > single warning, which means the necessaty to install our own warning > printer. > > > > > Another method could be to abuse `__warningregistry__`, but there > > > are > > > at least two reasons why this is probably not viable (nevermind > > > that it > > > would be ugly as well). > > > > > > Doesn't nose maybe provide *something*? I mean seriously, testing > > > warnings tends to be hell broke lose? Change one thing, suddenly > > > dozens > > > appear from nowhere, never sure you found all cases, etc. > > > > AFAICT nose doesn't provide much of anything for working with > > warnings. :-/ > > > > -n > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jni.soma at gmail.com Thu Jan 21 21:16:50 2016 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Fri, 22 Jan 2016 13:16:50 +1100 Subject: [Numpy-discussion] deprecating assignment to ndarray.data In-Reply-To: References: Message-ID: Does this apply in any way to the .data attribute in scipy.sparse matrices? I fiddle with that quite often! On Fri, Jan 22, 2016 at 11:21 AM, Nathaniel Smith wrote: > So it turns out that ndarray.data supports assignment at the Python > level, and what it does is just assign to the ->data field of the > ndarray object: > > https://github.com/numpy/numpy/blob/master/numpy/core/src/multiarray/getset.c#L325 > > This kind of assignment been deprecated at the C level since 1.7, and > is totally unsafe -- if there are any views pointing to the array when > this happens, then they'll be left pointing off into unallocated > memory. > > E.g.: > > a = np.arange(10) > b = np.linspace(0, 1, 10) > c = a.view() > a.data = b.data > # Now c points into free'd memory > > Can we deprecate or just remove this? > > (Also filed issue: https://github.com/numpy/numpy/issues/7093) > > -n > > -- > Nathaniel J. Smith -- https://vorpus.org > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Thu Jan 21 21:32:55 2016 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 21 Jan 2016 18:32:55 -0800 Subject: [Numpy-discussion] deprecating assignment to ndarray.data In-Reply-To: References: Message-ID: On Jan 21, 2016 6:17 PM, "Juan Nunez-Iglesias" wrote: > > Does this apply in any way to the .data attribute in scipy.sparse matrices? Nope! -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From shoyer at gmail.com Thu Jan 21 22:05:55 2016 From: shoyer at gmail.com (Stephan Hoyer) Date: Thu, 21 Jan 2016 19:05:55 -0800 Subject: [Numpy-discussion] ANN: xarray (formerly xray) v0.7.0 released Message-ID: I am pleased to announce version v0.7.0 of xarray, the project formerly known as xray. xarray is an open source project and Python package that aims to bring the labeled data power of pandas to the physical sciences, by providing N-dimensional variants of the core pandas data structures. These data structures are based on the data model of the netCDF file format. In this latest release, we have renamed the project from "xray" to "xarray". This avoids a namespace conflict with the entire field of X-ray science. 
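A minimal sketch of that data model (API as described in the docs linked below; values are made up):

import numpy as np
import xarray as xr

da = xr.DataArray(np.random.randn(2, 3),
                  dims=("x", "y"),
                  coords={"x": [10, 20]})
da.mean(dim="x")  # reduce over a named dimension rather than an axis number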
We have new URLs for our documentation, source code and mailing list: http://xarray.pydata.org/ http://github.com/pydata/xarray/ https://groups.google.com/forum/#!forum/xarray Highlights of this release: * An internal refactor of DataArray internals * New methods for reshaping, rolling and shifting data * Preliminary support for pandas.MultiIndex * Support for reading GRIB, HDF4 and other file formats via PyNIO For more details, read the full release notes: http://xarray.pydata.org/en/stable/whats-new.html Contributors to this release: Antony Lee Fabien Maussion Joe Hamman Maximilian Roos Stephan Hoyer Takeshi Kanmae femtotrader I'd also like to highlight the contributions of Clark Fitzgerald, who added a plotting module to xray in v0.6, and Dave Brown, for his assistance adding PyNIO support. Best, Stephan -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastian at sipsolutions.net Thu Jan 21 22:40:39 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Fri, 22 Jan 2016 04:40:39 +0100 Subject: [Numpy-discussion] Set FutureWarnings to error in (dev) tests? In-Reply-To: References: <1453421156.31387.14.camel@sipsolutions.net> <1453423448.31387.20.camel@sipsolutions.net> <1453424443.31387.23.camel@sipsolutions.net> Message-ID: <1453434039.31387.30.camel@sipsolutions.net> On Do, 2016-01-21 at 17:07 -0800, Nathaniel Smith wrote: > Warnings filters can be given a regex matching the warning text, I > think? Doesn't cut it, because you need to set the warning to "always", so then if you don't want to print it, you are stuck.... I wrote a context manager + func decorator + class decorator, which can do it (it overrides how the warning gets printed). Still got to decorate every weird test or at least test class, or escalate it to the global level. Needs some more, but that suppression functionality (in what exact form it could be in the end), is much more reasonable then our typical warning context, since that drops even printing (though since we would use Error during development it should not matter much). The stuff is basically a safe version of: with warnings.catch_warnings(): warnings.filterwarnings("ignore", ....) # oh noe not ignore! which also still prints other warnings. - sebastian > On Jan 21, 2016 5:00 PM, "Sebastian Berg" > wrote: > > On Do, 2016-01-21 at 16:51 -0800, Nathaniel Smith wrote: > > > On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg > > > wrote: > > > > On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote: > > > > > On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg > > > > > wrote: > > > > > > Hi all, > > > > > > > > > > > > should we try to set FutureWarnings to errors in dev tests? > > I > > > > > > am > > > > > > seriously annoyed by FutureWarnings getting lost all over > > for > > > > > > two > > > > > > reasons. First, it is hard to impossible to find even our > > own > > > > > > errors > > > > > > for our own FutureWarning changes. Secondly, we currently > > would > > > > > > not > > > > > > even see any Futurewarnings from someone else. For numpy > > that > > > > > > may > > > > > > not > > > > > > be a big issue, but still. > > > > > > > > > > Yeah, I noticed this recently too :-(. Definitely it is the > > right > > > > > thing to do, I think. And this is actually more true the more > > > > > annoying > > > > > it is, because if we're triggering lots of FutureWarnings > > then we > > > > > should fix that :-). 
> > > > Yeah, the problem is that some FutureWarnings are given in the
> > > > dozens. Injecting the filter on the module level is possible, but
> > > > not quite correct. Maybe one could do evil things similar to a
> > > > "module decorator" to add the warning context + filter to every
> > > > single function in a module starting with "test_".
> > >
> > > Can we remove the FutureWarnings by making whatever change they're
> > > warning about? :-)
> >
> > That would be equivalent to not issuing the FutureWarning at all ;).
> > Well, you can write a context manager and put this stuff into the
> >
> > if __name__ == '__main__':
> >
> > block actually. It is still annoying, since ideally we want to filter
> > a single warning, which means we need to install our own warning
> > printer.
> > > > Another method could be to abuse `__warningregistry__`, but there
> > > > are at least two reasons why this is probably not viable (nevermind
> > > > that it would be ugly as well).
> > > >
> > > > Doesn't nose maybe provide *something*? I mean seriously, testing
> > > > warnings tends to be hell breaking loose. Change one thing, suddenly
> > > > dozens appear from nowhere, never sure you found all cases, etc.
> > >
> > > AFAICT nose doesn't provide much of anything for working with
> > > warnings. :-/
> > >
> > > -n
> >
> > _______________________________________________
> > NumPy-Discussion mailing list
> > NumPy-Discussion at scipy.org
> > https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL:

From jjhelmus at gmail.com Fri Jan 22 01:24:12 2016
From: jjhelmus at gmail.com (Jonathan J. Helmus)
Date: Fri, 22 Jan 2016 00:24:12 -0600
Subject: [Numpy-discussion] deprecating assignment to ndarray.data
In-Reply-To: References: Message-ID: <56A1CB0C.9050607@gmail.com>

On 1/21/2016 8:32 PM, Nathaniel Smith wrote:
> > Does this apply in any way to the .data attribute in scipy.sparse
> > matrices?
>
> Nope!
>
> -n

How about the .data attribute of masked arrays? I'm guessing there may
be a decent amount of code that uses array.data to try to duck-type
ndarrays and MaskedArrays even though there are better ways to do this,
for example np.ma.getdata.

Cheers,

- Jonathan Helmus

From njs at pobox.com Fri Jan 22 03:17:53 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 22 Jan 2016 00:17:53 -0800
Subject: [Numpy-discussion] deprecating assignment to ndarray.data
In-Reply-To: <56A1CB0C.9050607@gmail.com> References: <56A1CB0C.9050607@gmail.com>
Message-ID:

On Thu, Jan 21, 2016 at 10:24 PM, Jonathan J. Helmus wrote:
> On 1/21/2016 8:32 PM, Nathaniel Smith wrote:
> >
> > > Does this apply in any way to the .data attribute in scipy.sparse
> > > matrices?
> >
> > Nope!
> >
> > -n
>
> How about the .data attribute of masked arrays? I'm guessing there may be a
> decent amount of code that uses array.data to try to duck-type ndarrays and
> MaskedArrays even though there are better ways to do this, for example
> np.ma.getdata.
It turns out the .data attribute on MaskedArrays has nothing whatsoever
to do with the .data attribute on ndarrays, so yeah, this is also
unaffected.

-n

--
Nathaniel J. Smith -- https://vorpus.org

From evgeny.burovskiy at gmail.com Sat Jan 23 07:51:58 2016
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Sat, 23 Jan 2016 12:51:58 +0000
Subject: [Numpy-discussion] ANN: scipy 0.17.0 release
Message-ID:

Hi,

On behalf of the Scipy development team I am pleased to announce the
availability of Scipy 0.17.0. This release contains several new features,
detailed in the release notes below. 101 people contributed to this
release over the course of six months.

This release requires Python 2.6, 2.7 or 3.2-3.4 and NumPy 1.6.2 or greater.

Source tarballs and release notes can be found at
https://github.com/scipy/scipy/releases/tag/v0.17.0.

Thanks to everyone who contributed to this release.

Cheers,

Evgeni

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

==========================
SciPy 0.17.0 Release Notes
==========================

.. contents::

SciPy 0.17.0 is the culmination of 6 months of hard work. It contains
many new features, numerous bug-fixes, improved test coverage and better
documentation. There have been a number of deprecations and API changes
in this release, which are documented below. All users are encouraged to
upgrade to this release, as there are a large number of bug-fixes and
optimizations. Moreover, our development attention will now shift to
bug-fix releases on the 0.17.x branch, and on adding new features on the
master branch.

This release requires Python 2.6, 2.7 or 3.2-3.5 and NumPy 1.6.2 or greater.

Release highlights:

- New functions for linear and nonlinear least squares optimization with
  constraints: `scipy.optimize.lsq_linear` and `scipy.optimize.least_squares`
- Support for fitting with bounds in `scipy.optimize.curve_fit`.
- Significant improvements to `scipy.stats`, providing many functions with
  better handling of inputs which have NaNs or are empty, improved
  documentation, and consistent behavior between `scipy.stats` and
  `scipy.stats.mstats`.
- Significant performance improvements and new functionality in
  `scipy.spatial.cKDTree`.

New features
============

`scipy.cluster` improvements
- ----------------------------

A new function `scipy.cluster.hierarchy.cut_tree`, which determines a cut
tree from a linkage matrix, was added.

`scipy.io` improvements
- -----------------------

`scipy.io.mmwrite` gained support for symmetric sparse matrices.

`scipy.io.netcdf` gained support for masking and scaling data based on
data attributes.

`scipy.optimize` improvements
- -----------------------------

Linear assignment problem solver
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

`scipy.optimize.linear_sum_assignment` is a new function for solving the
linear sum assignment problem. It uses the Hungarian algorithm
(Kuhn-Munkres).

Least squares optimization
~~~~~~~~~~~~~~~~~~~~~~~~~~

A new function for *nonlinear* least squares optimization with constraints
was added: `scipy.optimize.least_squares`. It provides several methods:
Levenberg-Marquardt for unconstrained problems, and two trust-region
methods for constrained ones. Furthermore it provides different loss
functions. New trust-region methods also handle sparse Jacobians.

A new function for *linear* least squares optimization with constraints
was added: `scipy.optimize.lsq_linear`. It provides a trust-region method
as well as an implementation of the Bounded-Variable Least-Squares (BVLS)
algorithm.
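As a rough sketch of the new nonlinear interface (the toy model, data and
starting point below are invented purely for illustration; only the
`least_squares` call itself shows the new API):

    import numpy as np
    from scipy.optimize import least_squares

    # Toy model y = a * exp(-b * t); fit with both parameters
    # constrained to be non-negative via the new `bounds` argument.
    def residuals(params, t, y):
        a, b = params
        return a * np.exp(-b * t) - y

    t = np.linspace(0.0, 10.0, 50)
    y = 2.0 * np.exp(-0.5 * t)

    result = least_squares(residuals, x0=[1.0, 1.0],
                           bounds=([0.0, 0.0], [np.inf, np.inf]),
                           args=(t, y))
    print(result.x)  # should come out close to [2.0, 0.5]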
`scipy.optimize.curve_fit` now supports fitting with bounds.

`scipy.signal` improvements
- ---------------------------

A ``mode`` keyword was added to `scipy.signal.spectrogram`, to let it
return other spectrograms than power spectral density.

`scipy.stats` improvements
- --------------------------

Many functions in `scipy.stats` have gained a ``nan_policy`` keyword, which
allows specifying how to treat input with NaNs in them: propagate the NaNs,
raise an error, or omit the NaNs.

Many functions in `scipy.stats` have been improved to correctly handle input
arrays that are empty or contain infs/nans.

A number of functions with the same name in `scipy.stats` and
`scipy.stats.mstats` were changed to have matching signature and behavior.
See `gh-5474 `__ for details.

`scipy.stats.binom_test` and `scipy.stats.mannwhitneyu` gained a keyword
``alternative``, which allows specifying the hypothesis to test for.
Eventually all hypothesis testing functions will get this keyword.

For methods of many continuous distributions, complex input is now accepted.

Matrix normal distribution has been implemented as `scipy.stats.matrix_normal`.

`scipy.sparse` improvements
- ---------------------------

The `axis` keyword was added to sparse norms, `scipy.sparse.linalg.norm`.

`scipy.spatial` improvements
- ----------------------------

`scipy.spatial.cKDTree` was partly rewritten for improved performance and
several new features were added to it:

- - the ``query_ball_point`` method became significantly faster
- - ``query`` and ``query_ball_point`` gained an ``n_jobs`` keyword for
  parallel execution
- - build and query methods now release the GIL
- - full pickling support
- - support for periodic spaces
- - the ``sparse_distance_matrix`` method can now return a sparse matrix type

`scipy.interpolate` improvements
- --------------------------------

Out-of-bounds behavior of `scipy.interpolate.interp1d` has been improved.
Use a two-element tuple for the ``fill_value`` argument to specify separate
fill values for input below and above the interpolation range. Linear and
nearest interpolation kinds of `scipy.interpolate.interp1d` support
extrapolation via the ``fill_value="extrapolate"`` keyword.

``fill_value`` can also be set to an array-like (or a two-element tuple of
array-likes for separate below and above values) so long as it broadcasts
properly to the non-interpolated dimensions of an array. This was
implicitly supported by previous versions of scipy, but support has now
been formalized and gets compatibility-checked before use. For example, a
set of ``y`` values to interpolate with shape ``(2, 3, 5)`` interpolated
along the last axis (2) could accept a ``fill_value`` array with shape
``()`` (singleton), ``(1,)``, ``(2, 1)``, ``(1, 3)``, ``(3,)``, or
``(2, 3)``; or it can be a 2-element tuple to specify separate below and
above bounds, where each of the two tuple elements obeys proper
broadcasting rules.

`scipy.linalg` improvements
- ---------------------------

The default algorithm for `scipy.linalg.lstsq` has been changed to use
LAPACK's function ``*gelsd``. Users wanting to get the previous behavior
can use a new keyword ``lapack_driver="gelss"`` (allowed values are
"gelss", "gelsd" and "gelsy").

``scipy.sparse`` matrices and linear operators now support the matmul
(``@``) operator when available (Python 3.5+). See
`PEP 465 <http://legacy.python.org/dev/peps/pep-0465/>`__.

A new function `scipy.linalg.ordqz`, for QZ decomposition with reordering,
has been added.
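To give a feel for the new ``nan_policy`` keyword, here is a minimal sketch
using `scipy.stats.kendalltau`, one of the functions that gained the
keyword (see gh-5167 below); the data are invented for the example:

    import numpy as np
    from scipy import stats

    x = np.array([1.0, 2.0, np.nan, 4.0])
    y = np.array([1.0, 3.0, 2.0, 4.0])

    # 'omit' ignores the NaN observation instead of propagating NaN
    # through the statistic or raising an error.
    tau, p_value = stats.kendalltau(x, y, nan_policy='omit')
    print(tau, p_value)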
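Similarly, a small sketch of the new ``fill_value`` behavior in
`scipy.interpolate.interp1d` (data invented for the example):

    import numpy as np
    from scipy.interpolate import interp1d

    x = np.arange(5)
    y = x ** 2

    # A two-element tuple gives separate fill values below and above
    # the interpolation range.
    f = interp1d(x, y, bounds_error=False, fill_value=(-1.0, 99.0))
    print(f([-1.0, 2.0, 10.0]))  # [-1.  4. 99.]

    # Linear extrapolation beyond the data range.
    g = interp1d(x, y, kind='linear', bounds_error=False,
                 fill_value="extrapolate")
    print(g(6.0))  # 30.0, continuing the slope of the last segment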
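And a short sketch of the new matmul support for sparse matrices (requires
Python 3.5+; the matrices are invented for the example):

    import numpy as np
    from scipy import sparse

    A = sparse.csr_matrix(np.array([[1.0, 0.0], [0.0, 2.0]]))
    x = np.array([3.0, 4.0])
    print(A @ x)  # equivalent to A.dot(x) -> [ 3.  8.]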
Deprecated features
===================

``scipy.stats.histogram`` is deprecated in favor of ``np.histogram``, which
is faster and provides the same functionality.

``scipy.stats.threshold`` and ``scipy.mstats.threshold`` are deprecated in
favor of ``np.clip``. See issue #617 for details.

``scipy.stats.ss`` is deprecated. This is a support function, not meant to
be exposed to the user. Also, the name is unclear. See issue #663 for
details.

``scipy.stats.square_of_sums`` is deprecated. This too is a support
function not meant to be exposed to the user. See issues #665 and #663 for
details.

``scipy.stats.f_value``, ``scipy.stats.f_value_multivariate``,
``scipy.stats.f_value_wilks_lambda``, and
``scipy.mstats.f_value_wilks_lambda`` are deprecated. These are related to
ANOVA, for which ``scipy.stats`` provides quite limited functionality and
these functions are not very useful standalone. See issues #660 and #650
for details.

``scipy.stats.chisqprob`` is deprecated. This is an alias;
``stats.chi2.sf`` should be used instead.

``scipy.stats.betai`` is deprecated. This is an alias for
``special.betainc``, which should be used instead.

Backwards incompatible changes
==============================

The functions ``stats.trim1`` and ``stats.trimboth`` now make sure the
elements trimmed are the lowest and/or highest, depending on the case.
Slicing without at least partial sorting was previously done, but didn't
make sense for unsorted input.

When ``variable_names`` is set to an empty list, ``scipy.io.loadmat`` now
correctly returns no values instead of all the contents of the MAT file.

Element-wise multiplication of sparse matrices now returns a sparse result
in all cases. Previously, multiplying a sparse matrix with a dense matrix
or array would return a dense matrix.

The function ``misc.lena`` has been removed due to license incompatibility.

The constructor for ``sparse.coo_matrix`` no longer accepts ``(None, (m,n))``
to construct an all-zero matrix of shape ``(m,n)``. This functionality was
deprecated since at least 2007 and was already broken in the previous SciPy
release. Use ``coo_matrix((m,n))`` instead.

The Cython wrappers in ``linalg.cython_lapack`` for the LAPACK routines
``*gegs``, ``*gegv``, ``*gelsx``, ``*geqpf``, ``*ggsvd``, ``*ggsvp``,
``*lahrd``, ``*latzm``, ``*tzrqf`` have been removed since these routines
are not present in the new LAPACK 3.6.0 release. With the exception of the
routines ``*ggsvd`` and ``*ggsvp``, these were all deprecated in favor of
routines that are currently present in our Cython LAPACK wrappers.

Because the LAPACK ``*gegv`` routines were removed in LAPACK 3.6.0, the
corresponding Python wrappers in ``scipy.linalg.lapack`` are now deprecated
and will be removed in a future release. The source files for these
routines have been temporarily included as a part of ``scipy.linalg`` so
that SciPy can be built against LAPACK versions that do not provide these
deprecated routines.

Other changes
=============

HTML and PDF documentation of development versions of Scipy is now
automatically rebuilt after every merged pull request.

`scipy.constants` is updated to the CODATA 2014 recommended values.

Usage of `scipy.fftpack` functions within Scipy has been changed in such a
way that `PyFFTW `__ can easily replace `scipy.fftpack` functions (with
improved performance). See `gh-5295 `__ for details.

The ``imread`` functions in `scipy.misc` and `scipy.ndimage` were unified,
for which a ``mode`` argument was added to `scipy.misc.imread`.
Also, bugs for 1-bit and indexed RGB image formats were fixed.

``runtests.py``, the development script to build and test Scipy, now
allows building in parallel with ``--parallel``.

Authors
=======

* @cel4 +
* @chemelnucfin +
* @endolith
* @mamrehn +
* @tosh1ki +
* Joshua L. Adelman +
* Anne Archibald
* Hervé Audren +
* Vincent Barrielle +
* Bruno Beltran +
* Sumit Binnani +
* Joseph Jon Booker
* Olga Botvinnik +
* Michael Boyle +
* Matthew Brett
* Zaz Brown +
* Lars Buitinck
* Pete Bunch +
* Evgeni Burovski
* CJ Carey
* Ien Cheng +
* Cody +
* Jaime Fernandez del Rio
* Ales Erjavec +
* Abraham Escalante
* Yves-Rémi Van Eycke +
* Yu Feng +
* Eric Firing
* Francis T. O'Donovan +
* André Gaul
* Christoph Gohlke
* Ralf Gommers
* Alex Griffing
* Alexander Grigorievskiy
* Charles Harris
* Jörn Hees +
* Ian Henriksen
* Derek Homeier +
* David Menéndez Hurtado
* Gert-Ludwig Ingold
* Aakash Jain +
* Rohit Jamuar +
* Jan Schlüter
* Johannes Ballé
* Luke Zoltan Kelley +
* Jason King +
* Andreas Kopecky +
* Eric Larson
* Denis Laxalde
* Antony Lee
* Gregory R. Lee
* Josh Levy-Kramer +
* Sam Lewis +
* François Magimel +
* Martín Gaitán +
* Sam Mason +
* Andreas Mayer
* Nikolay Mayorov
* Damon McDougall +
* Robert McGibbon
* Sturla Molden
* Will Monroe +
* Eric Moore
* Maniteja Nandana
* Vikram Natarajan +
* Andrew Nelson
* Marti Nito +
* Behzad Nouri +
* Daisuke Oyama +
* Giorgio Patrini +
* Fabian Paul +
* Christoph Paulik +
* Mad Physicist +
* Irvin Probst
* Sebastian Pucilowski +
* Ted Pudlik +
* Eric Quintero
* Yoav Ram +
* Joscha Reimer +
* Juha Remes
* Frederik Rietdijk +
* Rémy Léone +
* Christian Sachs +
* Skipper Seabold
* Sebastian Skoupý +
* Alex Seewald +
* Andreas Sorge +
* Bernardo Sulzbach +
* Julian Taylor
* Louis Tiao +
* Utkarsh Upadhyay +
* Jacob Vanderplas
* Gael Varoquaux +
* Pauli Virtanen
* Fredrik Wallner +
* Stefan van der Walt
* James Webber +
* Warren Weckesser
* Raphael Wettinger +
* Josh Wilson +
* Nat Wilson +
* Peter Yin +

A total of 101 people contributed to this release.
People with a "+" by their names contributed a patch for the first time.
This list of names is automatically generated, and may not be fully complete.

Issues closed for 0.17.0
- ------------------------

- - `#1923 `__: problem with numpy 0's in stats.poisson.rvs (Trac #1398)
- - `#2138 `__: scipy.misc.imread segfaults on 1 bit png (Trac #1613)
- - `#2237 `__: distributions do not accept complex arguments (Trac #1718)
- - `#2282 `__: scipy.special.hyp1f1(0.5, 1.5, -1000) fails (Trac #1763)
- - `#2618 `__: poisson.pmf returns NaN if mu is 0
- - `#2957 `__: hyp1f1 precision issue
- - `#2997 `__: FAIL: test_qhull.TestUtilities.test_more_barycentric_transforms
- - `#3129 `__: No way to set ranges for fitting parameters in Optimize functions
- - `#3191 `__: interp1d should contain a fill_value_below and a fill_value_above...
- - `#3453 `__: PchipInterpolator sets slopes at edges differently than Matlab's...
- - `#4106 `__: ndimage._ni_support._normalize_sequence() fails with numpy.int64
- - `#4118 `__: `scipy.integrate.ode.set_solout` called after `scipy.integrate.ode.set_initial_value` fails silently
- - `#4233 `__: 1D scipy.interpolate.griddata using method=nearest produces nans...
- - `#4375 `__: All tests fail due to bad file permissions
- - `#4580 `__: scipy.ndimage.filters.convolve documenation is incorrect
- - `#4627 `__: logsumexp with sign indicator - enable calculation with negative...
- - `#4702 `__: logsumexp with zero scaling factor - - `#4834 `__: gammainc should return 1.0 instead of NaN for infinite x - - `#4838 `__: enh: exprel special function - - `#4862 `__: the scipy.special.boxcox function is inaccurate for denormal... - - `#4887 `__: Spherical harmonic incongruences - - `#4895 `__: some scipy ufuncs have inconsistent output dtypes? - - `#4923 `__: logm does not aggressively convert complex outputs to float - - `#4932 `__: BUG: stats: The `fit` method of the distributions silently ignores... - - `#4956 `__: Documentation error in `scipy.special.bi_zeros` - - `#4957 `__: Docstring for `pbvv_seq` is wrong - - `#4967 `__: block_diag should look at dtypes of all arguments, not only the... - - `#5037 `__: scipy.optimize.minimize error messages are printed to stdout... - - `#5039 `__: Cubic interpolation: On entry to DGESDD parameter number 12 had... - - `#5163 `__: Base case example of Hierarchical Clustering (offer) - - `#5181 `__: BUG: stats.genextreme.entropy should use the explicit formula - - `#5184 `__: Some? wheels don't express a numpy dependency - - `#5197 `__: mstats: test_kurtosis fails (ULP max is 2) - - `#5260 `__: Typo causing an error in splrep - - `#5263 `__: Default epsilon in rbf.py fails for colinear points - - `#5276 `__: Reading empty (no data) arff file fails - - `#5280 `__: 1d scipy.signal.convolve much slower than numpy.convolve - - `#5326 `__: Implementation error in scipy.interpolate.PchipInterpolator - - `#5370 `__: Test issue with test_quadpack and libm.so as a linker script - - `#5426 `__: ERROR: test_stats.test_chisquare_masked_arrays - - `#5427 `__: Automate installing correct numpy versions in numpy-vendor image - - `#5430 `__: Python3 : Numpy scalar types "not iterable"; specific instance... - - `#5450 `__: BUG: spatial.ConvexHull triggers a seg. fault when given nans. - - `#5478 `__: clarify the relation between matrix normal distribution and `multivariate_normal` - - `#5539 `__: lstsq related test failures on windows binaries from numpy-vendor - - `#5560 `__: doc: scipy.stats.burr pdf issue - - `#5571 `__: lstsq test failure after lapack_driver change - - `#5577 `__: ordqz segfault on Python 3.4 in Wine - - `#5578 `__: scipy.linalg test failures on python 3 in Wine - - `#5607 `__: Overloaded ?isnan(double&)? is ambiguous when compiling with... - - `#5629 `__: Test for lstsq randomly failed - - `#5630 `__: memory leak with scipy 0.16 spatial cKDEtree - - `#5689 `__: isnan errors compiling scipy/special/Faddeeva.cc with clang++ - - `#5694 `__: fftpack test failure in test_import - - `#5719 `__: curve_fit(method!="lm") ignores initial guess Pull requests for 0.17.0 - ------------------------ - - `#3022 `__: hyp1f1: better handling of large negative arguments - - `#3107 `__: ENH: Add ordered QZ decomposition - - `#4390 `__: ENH: Allow axis and keepdims arguments to be passed to scipy.linalg.norm. 
- - `#4671 `__: ENH: add axis to sparse norms - - `#4796 `__: ENH: Add cut tree function to scipy.cluster.hierarchy - - `#4809 `__: MAINT: cauchy moments are undefined - - `#4821 `__: ENH: stats: make distribution instances picklable - - `#4839 `__: ENH: Add scipy.special.exprel relative error exponential ufunc - - `#4859 `__: Logsumexp fixes - allows sign flags and b==0 - - `#4865 `__: BUG: scipy.io.mmio.write: error with big indices and low precision - - `#4869 `__: add as_inexact option to _lib._util._asarray_validated - - `#4884 `__: ENH: Finite difference approximation of Jacobian matrix - - `#4890 `__: ENH: Port cKDTree query methods to C++, allow pickling on Python... - - `#4892 `__: how much doctesting is too much? - - `#4896 `__: MAINT: work around a possible numpy ufunc loop selection bug - - `#4898 `__: MAINT: A bit of pyflakes-driven cleanup. - - `#4899 `__: ENH: add 'alternative' keyword to hypothesis tests in stats - - `#4903 `__: BENCH: Benchmarks for interpolate module - - `#4905 `__: MAINT: prepend underscore to mask_to_limits; delete masked_var. - - `#4906 `__: MAINT: Benchmarks for optimize.leastsq - - `#4910 `__: WIP: Trimmed statistics functions have inconsistent API. - - `#4912 `__: MAINT: fix typo in stats tutorial. Closes gh-4911. - - `#4914 `__: DEP: deprecate `scipy.stats.ss` and `scipy.stats.square_of_sums`. - - `#4924 `__: MAINT: if the imaginary part of logm of a real matrix is small,... - - `#4930 `__: BENCH: Benchmarks for signal module - - `#4941 `__: ENH: update `find_repeats`. - - `#4942 `__: MAINT: use np.float64_t instead of np.float_t in cKDTree - - `#4944 `__: BUG: integer overflow in correlate_nd - - `#4951 `__: do not ignore invalid kwargs in distributions fit method - - `#4958 `__: Add some detail to docstrings for special functions - - `#4961 `__: ENH: stats.describe: add bias kw and empty array handling - - `#4963 `__: ENH: scipy.sparse.coo.coo_matrix.__init__: less memory needed - - `#4968 `__: DEP: deprecate ``stats.f_value*`` and ``mstats.f_value*`` functions. - - `#4969 `__: ENH: review `stats.relfreq` and `stats.cumfreq`; fixes to `stats.histogram` - - `#4971 `__: Extend github source links to line ranges - - `#4972 `__: MAINT: impove the error message in validate_runtests_log - - `#4976 `__: DEP: deprecate `scipy.stats.threshold` - - `#4977 `__: MAINT: more careful dtype treatment in block diagonal matrix... - - `#4979 `__: ENH: distributions, complex arguments - - `#4984 `__: clarify dirichlet distribution error handling - - `#4992 `__: ENH: `stats.fligner` and `stats.bartlett` empty input handling. 
- - `#4996 `__: DOC: fix stats.spearmanr docs - - `#4997 `__: Fix up boxcox for underflow / loss of precision - - `#4998 `__: DOC: improved documentation for `stats.ppcc_max` - - `#5000 `__: ENH: added empty input handling `scipy.moment`; doc enhancements - - `#5003 `__: ENH: improves rankdata algorithm - - `#5005 `__: scipy.stats: numerical stability improvement - - `#5007 `__: ENH: nan handling in functions that use `stats._chk_asarray` - - `#5009 `__: remove coveralls.io - - `#5010 `__: Hypergeometric distribution log survival function - - `#5014 `__: Patch to compute the volume and area of convex hulls - - `#5015 `__: DOC: Fix mistaken variable name in sawtooth - - `#5016 `__: DOC: resample example - - `#5017 `__: DEP: deprecate `stats.betai` and `stats.chisqprob` - - `#5018 `__: ENH: Add test on random inpu to volume computations - - `#5026 `__: BUG: Fix return dtype of lil_matrix.getnnz(axis=0) - - `#5030 `__: DOC: resample slow for prime output too - - `#5033 `__: MAINT: integrate, special: remove unused R1MACH and Makefile - - `#5034 `__: MAINT: signal: lift max_len_seq validation out of Cython - - `#5035 `__: DOC/MAINT: refguide / doctest drudgery - - `#5041 `__: BUG: fixing some small memory leaks detected by cppcheck - - `#5044 `__: [GSoC] ENH: New least-squares algorithms - - `#5050 `__: MAINT: C fixes, trimmed a lot of dead code from Cephes - - `#5057 `__: ENH: sparse: avoid densifying on sparse/dense elementwise mult - - `#5058 `__: TST: stats: add a sample distribution to the test loop - - `#5061 `__: ENH: spatial: faster 2D Voronoi and Convex Hull plotting - - `#5065 `__: TST: improve test coverage for `stats.mvsdist` and `stats.bayes_mvs` - - `#5066 `__: MAINT: fitpack: remove a noop - - `#5067 `__: ENH: empty and nan input handling for `stats.kstat` and `stats.kstatvar` - - `#5071 `__: DOC: optimize: Correct paper reference, add doi - - `#5072 `__: MAINT: scipy.sparse cleanup - - `#5073 `__: DOC: special: Add an example showing the relation of diric to... - - `#5075 `__: DOC: clarified parameterization of stats.lognorm - - `#5076 `__: use int, float, bool instead of np.int, np.float, np.bool - - `#5078 `__: DOC: Rename fftpack docs to README - - `#5081 `__: BUG: Correct handling of scalar 'b' in lsmr and lsqr - - `#5082 `__: loadmat variable_names: don't confuse [] and None. - - `#5083 `__: Fix integrate.fixed_quad docstring to indicate None return value - - `#5086 `__: Use solve() instead of inv() for gaussian_kde - - `#5090 `__: MAINT: stats: add explicit _sf, _isf to gengamma distribution - - `#5094 `__: ENH: scipy.interpolate.NearestNDInterpolator: cKDTree configurable - - `#5098 `__: DOC: special: fix typesetting in ``*_roots quadrature`` functions - - `#5099 `__: DOC: make the docstring of stats.moment raw - - `#5104 `__: DOC/ENH fixes and micro-optimizations for scipy.linalg - - `#5105 `__: enh: made l-bfgs-b parameter for the maximum number of line search... - - `#5106 `__: TST: add NIST test cases to `stats.f_oneway` - - `#5110 `__: [GSoC]: Bounded linear least squares - - `#5111 `__: MAINT: special: Cephes cleanup - - `#5118 `__: BUG: FIR path failed if len(x) < len(b) in lfilter. - - `#5124 `__: ENH: move the filliben approximation to a publicly visible function - - `#5126 `__: StatisticsCleanup: `stats.kruskal` review - - `#5130 `__: DOC: update PyPi trove classifiers. Beta -> Stable. Add license. - - `#5131 `__: DOC: differential_evolution, improve docstring for mutation and... 
- - `#5132 `__: MAINT: differential_evolution improve init_population_lhs comments... - - `#5133 `__: MRG: rebased mmio refactoring - - `#5135 `__: MAINT: `stats.mstats` consistency with `stats.stats` - - `#5139 `__: TST: linalg: add a smoke test for gh-5039 - - `#5140 `__: EHN: Update constants.codata to CODATA 2014 - - `#5145 `__: added ValueError to docstring as possible error raised - - `#5146 `__: MAINT: Improve implementation details and doc in `stats.shapiro` - - `#5147 `__: [GSoC] ENH: Upgrades to curve_fit - - `#5150 `__: Fix misleading wavelets/cwt example - - `#5152 `__: BUG: cluster.hierarchy.dendrogram: missing font size doesn't... - - `#5153 `__: add keywords to control the summation in discrete distributions... - - `#5156 `__: DOC: added comments on algorithms used in Legendre function - - `#5158 `__: ENH: optimize: add the Hungarian algorithm - - `#5162 `__: FIX: Remove lena - - `#5164 `__: MAINT: fix cluster.hierarchy.dendrogram issues and docs - - `#5166 `__: MAINT: changed `stats.pointbiserialr` to delegate to `stats.pearsonr` - - `#5167 `__: ENH: add nan_policy to `stats.kendalltau`. - - `#5168 `__: TST: added nist test case (Norris) to `stats.linregress`. - - `#5169 `__: update lpmv docstring - - `#5171 `__: Clarify metric parameter in linkage docstring - - `#5172 `__: ENH: add mode keyword to signal.spectrogram - - `#5177 `__: DOC: graphical example for KDTree.query_ball_point - - `#5179 `__: MAINT: stats: tweak the formula for ncx2.pdf - - `#5188 `__: MAINT: linalg: A bit of clean up. - - `#5189 `__: BUG: stats: Use the explicit formula in stats.genextreme.entropy - - `#5193 `__: BUG: fix uninitialized use in lartg - - `#5194 `__: BUG: properly return error to fortran from ode_jacobian_function - - `#5198 `__: TST: Fix TestCtypesQuad failure on Python 3.5 for Windows - - `#5201 `__: allow extrapolation in interp1d - - `#5209 `__: MAINT: Change complex parameter to boolean in Y_() - - `#5213 `__: BUG: sparse: fix logical comparison dtype conflicts - - `#5216 `__: BUG: sparse: fixing unbound local error - - `#5218 `__: DOC and BUG: Bessel function docstring improvements, fix array_like,... - - `#5222 `__: MAINT: sparse: fix COO ctor - - `#5224 `__: DOC: optimize: type of OptimizeResult.hess_inv varies - - `#5228 `__: ENH: Add maskandscale support to netcdf; based on pupynere and... - - `#5229 `__: DOC: sparse.linalg.svds doc typo fixed - - `#5234 `__: MAINT: sparse: simplify COO ctor - - `#5235 `__: MAINT: sparse: warn on todia() with many diagonals - - `#5236 `__: MAINT: ndimage: simplify thread handling/recursion + constness - - `#5239 `__: BUG: integrate: Fixed issue 4118 - - `#5241 `__: qr_insert fixes, closes #5149 - - `#5246 `__: Doctest tutorial files - - `#5247 `__: DOC: optimize: typo/import fix in linear_sum_assignment - - `#5248 `__: remove inspect.getargspec and test python 3.5 on Travis CI - - `#5250 `__: BUG: Fix sparse multiply by single-element zero - - `#5261 `__: Fix bug causing a TypeError in splrep when a runtime warning... - - `#5262 `__: Follow up to 4489 (Addition LAPACK routines in linalg.lstsq) - - `#5264 `__: ignore zero-length edges for default epsilon - - `#5269 `__: DOC: Typos and spell-checking - - `#5272 `__: MAINT: signal: Convert array syntax to memoryviews - - `#5273 `__: DOC: raw strings for docstrings with math - - `#5274 `__: MAINT: sparse: update cython code for MST - - `#5278 `__: BUG: io: Stop guessing the data delimiter in ARFF files. - - `#5289 `__: BUG: misc: Fix the Pillow work-around for 1-bit images. 
- - `#5291 `__: ENH: call np.correlate for 1d in scipy.signal.correlate - - `#5294 `__: DOC: special: Remove a potentially misleading example from the... - - `#5295 `__: Simplify replacement of fftpack by pyfftw - - `#5296 `__: ENH: Add matrix normal distribution to stats - - `#5297 `__: Fixed leaf_rotation and leaf_font_size in Python 3 - - `#5303 `__: MAINT: stats: rewrite find_repeats - - `#5307 `__: MAINT: stats: remove unused Fortran routine - - `#5313 `__: BUG: sparse: fix diags for nonsquare matrices - - `#5315 `__: MAINT: special: Cephes cleanup - - `#5316 `__: fix input check for sparse.linalg.svds - - `#5319 `__: MAINT: Cython code maintenance - - `#5328 `__: BUG: Fix place_poles return values - - `#5329 `__: avoid a spurious divide-by-zero in Student t stats - - `#5334 `__: MAINT: integrate: miscellaneous cleanup - - `#5340 `__: MAINT: Printing Error Msg to STDERR and Removing iterate.dat - - `#5347 `__: ENH: add Py3.5-style matmul operator (e.g. A @ B) to sparse linear... - - `#5350 `__: FIX error, when reading 32-bit float wav files - - `#5351 `__: refactor the PCHIP interpolant's algorithm - - `#5354 `__: MAINT: construct csr and csc matrices from integer lists - - `#5359 `__: add a fast path to interp1d - - `#5364 `__: Add two fill_values to interp1d. - - `#5365 `__: ABCD docstrings - - `#5366 `__: Fixed typo in the documentation for scipy.signal.cwt() per #5290. - - `#5367 `__: DOC updated scipy.spatial.Delaunay example - - `#5368 `__: ENH: Do not create a throwaway class at every function call - - `#5372 `__: DOC: spectral: fix reference formatting - - `#5375 `__: PEP8 amendments to ffpack_basic.py - - `#5377 `__: BUG: integrate: builtin name no longer shadowed - - `#5381 `__: PEP8ified fftpack_pseudo_diffs.py - - `#5385 `__: BLD: fix Bento build for changes to optimize and spatial - - `#5386 `__: STY: PEP8 amendments to interpolate.py - - `#5387 `__: DEP: deprecate stats.histogram - - `#5388 `__: REL: add "make upload" command to doc/Makefile. - - `#5389 `__: DOC: updated origin param of scipy.ndimage.filters.convolve - - `#5395 `__: BUG: special: fix a number of edge cases related to `x = np.inf`. - - `#5398 `__: MAINT: stats: avoid spurious warnings in lognorm.pdf(0, s) - - `#5407 `__: ENH: stats: Handle mu=0 in stats.poisson - - `#5409 `__: Fix the behavior of discrete distributions at the right-hand... - - `#5412 `__: TST: stats: skip a test to avoid a spurious log(0) warning - - `#5413 `__: BUG: linalg: work around LAPACK single-precision lwork computation... - - `#5414 `__: MAINT: stats: move creation of namedtuples outside of function... - - `#5415 `__: DOC: fix up sections in ToC in the pdf reference guide - - `#5416 `__: TST: fix issue with a ctypes test for integrate on Fedora. - - `#5418 `__: DOC: fix bugs in signal.TransferFunction docstring. Closes gh-5287. - - `#5419 `__: MAINT: sparse: fix usage of NotImplementedError - - `#5420 `__: Raise proper error if maxiter < 1 - - `#5422 `__: DOC: changed documentation of brent to be consistent with bracket - - `#5444 `__: BUG: gaussian_filter, BPoly.from_derivatives fail on numpy int... - - `#5445 `__: MAINT: stats: fix incorrect deprecation warnings and test noise - - `#5446 `__: DOC: add note about PyFFTW in fftpack tutorial. - - `#5459 `__: DOC: integrate: Some improvements to the differential equation... - - `#5465 `__: BUG: Relax mstats kurtosis test tolerance by a few ulp - - `#5471 `__: ConvexHull should raise ValueError for NaNs. 
- - `#5473 `__: MAINT: update decorators.py module to version 4.0.5 - - `#5476 `__: BUG: imsave searches for wrong channel axis if image has 3 or... - - `#5477 `__: BLD: add numpy to setup/install_requires for OS X wheels - - `#5479 `__: ENH: return Jacobian/Hessian from BasinHopping - - `#5484 `__: BUG: fix ttest zero division handling - - `#5486 `__: Fix crash on kmeans2 - - `#5491 `__: MAINT: Expose parallel build option to runtests.py - - `#5494 `__: Sort OptimizeResult.__repr__ by key - - `#5496 `__: DOC: update the author name mapping - - `#5497 `__: Enhancement to binned_statistic: option to unraveled returned... - - `#5498 `__: BUG: sparse: fix a bug in sparsetools input dtype resolution - - `#5500 `__: DOC: detect unprintable characters in docstrings - - `#5505 `__: BUG: misc: Ensure fromimage converts mode 'P' to 'RGB' or 'RGBA'. - - `#5514 `__: DOC: further update the release notes - - `#5515 `__: ENH: optionally disable fixed-point acceleration - - `#5517 `__: DOC: Improvements and additions to the matrix_normal doc - - `#5518 `__: Remove wrappers for LAPACK deprecated routines - - `#5521 `__: TST: skip a linalg.orth memory test on 32-bit platforms. - - `#5523 `__: DOC: change a few floats to integers in docstring examples - - `#5524 `__: DOC: more updates to 0.17.0 release notes. - - `#5525 `__: Fix to minor typo in documentation for scipy.integrate.ode - - `#5527 `__: TST: bump arccosh tolerance to allow for inaccurate numpy or... - - `#5535 `__: DOC: signal: minor clarification to docstring of TransferFunction. - - `#5538 `__: DOC: signal: fix find_peaks_cwt documentation - - `#5545 `__: MAINT: Fix typo in linalg/basic.py - - `#5547 `__: TST: mark TestEig.test_singular as knownfail in master. - - `#5550 `__: MAINT: work around lstsq driver selection issue - - `#5556 `__: BUG: Fixed broken dogbox trust-region radius update - - `#5561 `__: BUG: eliminate warnings, exception (on Win) in test_maskandscale;... - - `#5567 `__: TST: a few cleanups in the test suite; run_module_suite and clearer... - - `#5568 `__: MAINT: simplify poisson's _argcheck - - `#5569 `__: TST: bump GMean test tolerance to make it pass on Wine - - `#5572 `__: TST: lstsq: bump test tolerance for TravisCI - - `#5573 `__: TST: remove use of np.fromfile from cluster.vq tests - - `#5576 `__: Lapack deprecations - - `#5579 `__: TST: skip tests of linalg.norm axis keyword on numpy <= 1.7.x - - `#5582 `__: Clarify language of survival function documentation - - `#5583 `__: MAINT: stats/tests: A bit of clean up. - - `#5588 `__: DOC: stats: Add a note that stats.burr is the Type III Burr distribution. - - `#5595 `__: TST: fix test_lamch failures on Python 3 - - `#5600 `__: MAINT: Ignore spatial/ckdtree.cxx and .h - - `#5602 `__: Explicitly numbered replacement fields for maintainability - - `#5605 `__: MAINT: collection of small fixes to test suite - - `#5614 `__: Minor doc change. - - `#5624 `__: FIX: Fix interpolate - - `#5625 `__: BUG: msvc9 binaries crash when indexing std::vector of size 0 - - `#5635 `__: BUG: misspelled __dealloc__ in cKDTree. - - `#5642 `__: STY: minor fixup of formatting of 0.17.0 release notes. - - `#5643 `__: BLD: fix a build issue in special/Faddeeva.cc with isnan. - - `#5661 `__: TST: linalg tests used stdlib random instead of numpy.random. - - `#5682 `__: backports for 0.17.0 - - `#5696 `__: Minor improvements to least_squares' docstring. 
- - `#5697 `__: BLD: fix for isnan/isinf issues in special/Faddeeva.cc
- - `#5720 `__: TST: fix for file opening error in fftpack test_import.py
- - `#5722 `__: BUG: Make curve_fit respect an initial guess with bounds
- - `#5726 `__: Backports for v0.17.0rc2
- - `#5727 `__: API: Changes to least_squares API

Checksums
=========

MD5
~~~

5ff2971e1ce90e762c59d2cd84837224 scipy-0.17.0.tar.gz
ef0949640ee73f845dde6dd7f84d7691 scipy-0.17.0.tar.xz
28a4fe29e980804db162524f10873211 scipy-0.17.0.zip

SHA256
~~~~~~

f600b755fb69437d0f70361f9e560ab4d304b1b66987ed5a28bdd9dd7793e089 scipy-0.17.0.tar.gz
2bc03ea36cd55bfd80869d87f690334b4cad240373e05857ddfa7a4d1e2f9a7a scipy-0.17.0.tar.xz
ede6820030b2e5796126aa1571d86738b14bbd670d68c83378877b1d9eb9894d scipy-0.17.0.zip

-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.11 (GNU/Linux)

iQEcBAEBAgAGBQJWokWbAAoJEIp0pQ0zQcu+JUgH/16Qqurptl3jUU11+4pl+8ji
3AFZN+dgzGLyiMz9s+V+OyL4hcdZY4++WM2QizF5uD3hDDFNi+aJRHAYySMAZeIZ
O8y//v+DXDVOLZpKwcPFq5E5ZebTDh1jO4zZzpuyi1PnKgTAEKZeSyBSMd0RMx/p
kZ0URTcAa/tPgkNZz3+i8By9b/zBWOlbI+v6fCVwV8E20YWfGBSp2s0KAtQk787D
Q88ylnc3Zfv4IR1hgCFZz6oJA0RgmpH4USKi7guyg+fIKjf1nFe64zIuV3C2r/Uj
y9Qbs/8x/HTteDp5owkRiSwnpTZHOtx/jqeh2z/w9aQe0i8Nag5Ere6JZ2kixG4=
=jZUn
-----END PGP SIGNATURE-----

From charlesr.harris at gmail.com Sat Jan 23 13:12:49 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Sat, 23 Jan 2016 11:12:49 -0700
Subject: [Numpy-discussion] ANN: scipy 0.17.0 release
In-Reply-To: References: Message-ID:

Congratulations.

Chuck

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From kikocorreoso at gmail.com Sat Jan 23 14:12:48 2016
From: kikocorreoso at gmail.com (Kiko)
Date: Sat, 23 Jan 2016 20:12:48 +0100
Subject: [Numpy-discussion] ANN: scipy 0.17.0 release
In-Reply-To: References: Message-ID:

Is it Python 3.5 compatible? Your message and GitHub don't say the same.

2016-01-23 19:12 GMT+01:00, Charles R Harris :
>
> Congratulations.
>
> Chuck
>

From kikocorreoso at gmail.com Sat Jan 23 14:13:23 2016
From: kikocorreoso at gmail.com (Kiko)
Date: Sat, 23 Jan 2016 20:13:23 +0100
Subject: [Numpy-discussion] ANN: scipy 0.17.0 release
In-Reply-To: References: Message-ID:

BTW, congratulations and thanks for the hard work.

2016-01-23 20:12 GMT+01:00, Kiko :
> Is it Python 3.5 compatible? Your message and GitHub don't say the same.
>
> 2016-01-23 19:12 GMT+01:00, Charles R Harris :
>>
>> Congratulations.
>>
>> Chuck
>

From evgeny.burovskiy at gmail.com Sat Jan 23 14:57:17 2016
From: evgeny.burovskiy at gmail.com (Evgeni Burovski)
Date: Sat, 23 Jan 2016 19:57:17 +0000
Subject: [Numpy-discussion] ANN: scipy 0.17.0 release
In-Reply-To: References: Message-ID:

On 23.01.2016 at 22:12, "Kiko" wrote:
>
> Is it Python 3.5 compatible? Your message and GitHub don't say the same.

It is indeed --- thanks for catching it. My typo, my bad.

Evgeni

> 2016-01-23 19:12 GMT+01:00, Charles R Harris :
> >
> > Congratulations.
> >
> > Chuck
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From sebastian at sipsolutions.net Sat Jan 23 16:25:00 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Sat, 23 Jan 2016 22:25:00 +0100
Subject: [Numpy-discussion] Make as_strided result writeonly
Message-ID: <1453584300.6435.6.camel@sipsolutions.net>

Hi all,

I have just opened a PR, to make as_strided writeonly (as default). The
reasoning for this change is that an `as_strided` array often has
self-overlapping memory. However, writing to an array where multiple
elements have the identical memory address can be confusing, and the
results are typically unpredictable.

Considering the danger, the proposal is to add a `readonly=True` default.
A power user (who that function is designed for anyway) could thus still
get a writeable array.

For the moment, writing to the result would raise a FutureWarning with
`readonly="warn"`.

Do you agree with this, or would it be a major inconvenience?

- Sebastian

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL:

From njs at pobox.com Sat Jan 23 17:20:26 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Sat, 23 Jan 2016 14:20:26 -0800
Subject: [Numpy-discussion] Make as_strided result writeonly
In-Reply-To: <1453584300.6435.6.camel@sipsolutions.net> References: <1453584300.6435.6.camel@sipsolutions.net>
Message-ID:

On Sat, Jan 23, 2016 at 1:25 PM, Sebastian Berg wrote:
>
> Hi all,
>
> I have just opened a PR, to make as_strided writeonly (as default).

I think you meant readonly :-)

> The reasoning for this change is that an `as_strided` array often has
> self-overlapping memory.
> However, writing to an array where multiple elements have the identical
> memory address can be confusing, and the results are typically
> unpredictable.
>
> Considering the danger, the proposal is to add a `readonly=True` default.
> A power user (who that function is designed for anyway) could thus still
> get a writeable array.
>
> For the moment, writing to the result would raise a FutureWarning with
> `readonly="warn"`.

This should just be a deprecation warning, right? (Because switching
an array from writeable->readonly might cause previously correct code
to error out, but not to silently start returning different results.)

> Do you agree with this, or would it be a major inconvenience?

AFAIK the only use cases for as_strided involve self-overlap (for
non-self-overlap you can generally use reshape / indexing / etc. and
it's much simpler). So +1 from me.

-n

--
Nathaniel J. Smith -- https://vorpus.org

From jni.soma at gmail.com Sat Jan 23 21:00:00 2016
From: jni.soma at gmail.com (Juan Nunez-Iglesias)
Date: Sun, 24 Jan 2016 13:00:00 +1100
Subject: [Numpy-discussion] Make as_strided result writeonly
In-Reply-To: References: <1453584300.6435.6.camel@sipsolutions.net>
Message-ID:

I've used as_strided before to create an "endless" output array when I
didn't care about the result of an operation, just the side effect. See
e.g. here. So I would certainly like the option to remain to get a
writeable array. In general, I'm sceptical about whether the benefits
outweigh the costs.

On Sun, Jan 24, 2016 at 9:20 AM, Nathaniel Smith wrote:
> On Sat, Jan 23, 2016 at 1:25 PM, Sebastian Berg wrote:
> >
> > Hi all,
> >
> > I have just opened a PR, to make as_strided writeonly (as default).
>
> I think you meant readonly :-)
>
> > The reasoning for this change is that an `as_strided` array often has
> > self-overlapping memory. However, writing to an array where multiple
> > elements have the identical memory address can be confusing, and the
> > results are typically unpredictable.
> >
> > Considering the danger, the proposal is to add a `readonly=True`
> > default. A power user (who that function is designed for anyway)
> > could thus still get a writeable array.
> >
> > For the moment, writing to the result would raise a FutureWarning
> > with `readonly="warn"`.
>
> This should just be a deprecation warning, right? (Because switching
> an array from writeable->readonly might cause previously correct code
> to error out, but not to silently start returning different results.)
>
> > Do you agree with this, or would it be a major inconvenience?
>
> AFAIK the only use cases for as_strided involve self-overlap (for
> non-self-overlap you can generally use reshape / indexing / etc. and
> it's much simpler). So +1 from me.
>
> -n
>
> --
> Nathaniel J. Smith -- https://vorpus.org
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jfoxrabinovitz at gmail.com Sat Jan 23 22:30:03 2016
From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz)
Date: Sat, 23 Jan 2016 22:30:03 -0500
Subject: [Numpy-discussion] PR #7090: ENH: Added 'doane' and 'sqrt' estimators to np.histogram
Message-ID:

As someone very new to relatively large projects such as numpy, I was
wondering how the process works. I have announced my PR with some
small enhancements to the histogram function and fixed up all of the
comments that were made (originally to PR 7083). What happens next?
All I have to go on is
http://docs.scipy.org/doc/numpy-1.10.1/dev/gitwash/development_workflow.html#asking-for-your-changes-to-be-merged-with-the-main-repo,
but that does not go into detail about what happens after the point I
have already reached.

Regards,

-Joe

On Thu, Jan 21, 2016 at 3:37 PM, Joseph Fox-Rabinovitz wrote:
> Due to a mistake I made in my branch structure, I have replaced this
> PR with #7090: https://github.com/numpy/numpy/pull/7090. All of the
> changes and fixes so far are squashed into the new request.
>
> -Joe
>
> On Thu, Jan 21, 2016 at 1:51 AM, Joseph Fox-Rabinovitz wrote:
>> Please let me know if there is anything wrong or missing. I have added
>> a couple of estimators that I find useful sometimes.
>>
>> -Joe

From sebastian at sipsolutions.net Sun Jan 24 04:17:29 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Sun, 24 Jan 2016 10:17:29 +0100
Subject: [Numpy-discussion] Make as_strided result writeonly
In-Reply-To: References: <1453584300.6435.6.camel@sipsolutions.net>
Message-ID: <1453627049.11352.14.camel@sipsolutions.net>

On So, 2016-01-24 at 13:00 +1100, Juan Nunez-Iglesias wrote:
> I've used as_strided before to create an "endless" output array when
> I didn't care about the result of an operation, just the side effect.
> See e.g. here. So I would certainly like the option to remain to get a
> writeable array. In general, I'm sceptical about whether the benefits
> outweigh the costs.

Yeah, that is a real use case. I am not planning to remove the option,
but it would be as a `readonly` keyword argument, which means you would
need to make the code depend on the numpy version if you require a
writable array [1].
This actually somewhat defeats the purpose of all of this, but
`np.ndarray` can do this dummy thing for you I think, so you could get
around that, but....

The purpose is that if you actually use an as_strided array in your
operation, the result is unpredictable (not just complicated). And while
as_strided is IMO designed to be used by people who know what they are
doing, I have a feeling it is being used quite a lot in general.

We did a similar thing for the new `broadcast_to`, though I think there
we decided to skip the readonly until complaints happen.

Actually there is one more thing I might do, and that is issue a
UserWarning when the new array quite likely points to invalid memory.

- Sebastian

[1] as_strided does not currently support arr.flags.writable = True for
its result array.

> On Sun, Jan 24, 2016 at 9:20 AM, Nathaniel Smith wrote:
> > On Sat, Jan 23, 2016 at 1:25 PM, Sebastian Berg wrote:
> > >
> > > Hi all,
> > >
> > > I have just opened a PR, to make as_strided writeonly (as default).
> >
> > I think you meant readonly :-)
> >
> > > The reasoning for this change is that an `as_strided` array often has
> > > self-overlapping memory. However, writing to an array where multiple
> > > elements have the identical memory address can be confusing, and the
> > > results are typically unpredictable.
> > >
> > > Considering the danger, the proposal is to add a `readonly=True`
> > > default. A power user (who that function is designed for anyway)
> > > could thus still get a writeable array.
> > >
> > > For the moment, writing to the result would raise a FutureWarning
> > > with `readonly="warn"`.
> >
> > This should just be a deprecation warning, right? (Because switching
> > an array from writeable->readonly might cause previously correct code
> > to error out, but not to silently start returning different results.)
> >
> > > Do you agree with this, or would it be a major inconvenience?
> >
> > AFAIK the only use cases for as_strided involve self-overlap (for
> > non-self-overlap you can generally use reshape / indexing / etc. and
> > it's much simpler). So +1 from me.
> >
> > -n
> >
> > --
> > Nathaniel J. Smith -- https://vorpus.org
> > _______________________________________________
> > NumPy-Discussion mailing list
> > NumPy-Discussion at scipy.org
> > https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL:

From charlesr.harris at gmail.com Sun Jan 24 11:36:00 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Sun, 24 Jan 2016 09:36:00 -0700
Subject: [Numpy-discussion] PR #7090: ENH: Added 'doane' and 'sqrt' estimators to np.histogram
In-Reply-To: References: Message-ID:

On Sat, Jan 23, 2016 at 8:30 PM, Joseph Fox-Rabinovitz <
jfoxrabinovitz at gmail.com> wrote:

> As someone very new to relatively large projects such as numpy, I was
> wondering how the process works. I have announced my PR with some
> small enhancements to the histogram function and fixed up all of the
> comments that were made (originally to PR 7083). What happens next?
> All I have to go on is > > http://docs.scipy.org/doc/numpy-1.10.1/dev/gitwash/development_workflow.html#asking-for-your-changes-to-be-merged-with-the-main-repo > , > but that does not go into detail about what happens after the point I > have already reached. > > Basically, wait. Maybe squeak now and then. It's frustrating, but we have difficulty keeping up with the flow of pull requests. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From jfoxrabinovitz at gmail.com Sun Jan 24 17:08:54 2016 From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz) Date: Sun, 24 Jan 2016 17:08:54 -0500 Subject: [Numpy-discussion] PR #7090: ENH: Added 'doane' and 'sqrt' estimators to np.histogram In-Reply-To: References: Message-ID: Sounds good. Waiting is not really frustrating as long as I know that that is what I am doing. I thought I was missing a step somewhere along the way. Regards, -Joe On Sun, Jan 24, 2016 at 11:36 AM, Charles R Harris wrote: > > > On Sat, Jan 23, 2016 at 8:30 PM, Joseph Fox-Rabinovitz > wrote: >> >> As someone very new to relatively large projects such as numpy, I was >> wondering how the process works. I have announced my PR with some >> small enhancements to the histogram function and fixed up all of the >> comments that were made (originally to PR 7083). What happens next? >> All I have to go on is >> >> http://docs.scipy.org/doc/numpy-1.10.1/dev/gitwash/development_workflow.html#asking-for-your-changes-to-be-merged-with-the-main-repo, >> but that does not go into detail about what happens after the point I >> have already reached. >> > > Basically, wait. Maybe squeak now and then. It's frustrating, but we have > difficulty keeping up with the flow of pull requests. > > Chuck > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > From jni.soma at gmail.com Sun Jan 24 19:46:55 2016 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Mon, 25 Jan 2016 11:46:55 +1100 Subject: [Numpy-discussion] Make as_strided result writeonly In-Reply-To: <1453627049.11352.14.camel@sipsolutions.net> References: <1453584300.6435.6.camel@sipsolutions.net> <1453627049.11352.14.camel@sipsolutions.net> Message-ID: > Yeah, that is a real use case. I am not planning to remove the option, > but it would be as a `readonly` keyword argument, which means you would > need to make the code depend on the numpy version if you require a > writable array [1]. > > [1] as_strided does not currently support arr.flags.writable = True for its result array. Can you explain this in more detail? I'm writing to the result without problem. Anyway, it only occurred to me after your response that the deprecation path would be quite disruptive to downstream libraries, including scikit-image. My feeling is that unless someone has been bitten by this (?), the benefit does not outweigh the cost of deprecation. Perhaps something to push to 2.0? On Sun, Jan 24, 2016 at 8:17 PM, Sebastian Berg wrote: > On So, 2016-01-24 at 13:00 +1100, Juan Nunez-Iglesias wrote: > > I've used as_strided before to create an "endless" output array when > > I didn't care about the result of an operation, just the side effect. > > See eg here. So I would certainly like option to remain to get a > > writeable array. In general, I'm sceptical about whether the benefits > > outweigh the costs. > > Yeah, that is a real use case. 
I am not planning to remove the option, > but it would be as a `readonly` keyword argument, which means you would > need to make the code depend on the numpy version if you require a > writable array [1]. > This actually somewhat defeats the purpose of all of this, but > `np.ndarray` can do this dummy thing for you I think, so you could get > around that, but.... > > The purpose is that if you actually would use an as_strided array in > your operation, the result is unpredictable (not just complicated). And > while as_strided is IMO designed to be used by people who know what > they are doing, I have a feeling it is being used quite a lot in > general. > > We did a similar thing for the new `broadcast_to`, though I think there > we decided to skip the readonly until complains happen. > > Actually there is one more thing I might do. And that is issue a > UserWarning when new array quite likely points to invalid memory. > > - Sebastian > > > [1] as_strided does not currently support arr.flags.writable = True for > its result array. > > > > On Sun, Jan 24, 2016 at 9:20 AM, Nathaniel Smith > > wrote: > > > On Sat, Jan 23, 2016 at 1:25 PM, Sebastian Berg > > > wrote: > > > > > > > > Hi all, > > > > > > > > I have just opened a PR, to make as_strided writeonly (as > > > default). The > > > > > > I think you meant readonly :-) > > > > > > > reasoning for this change is that an `as_strided` array often > > > have self > > > > overlapping memory. However, writing to an array where multiple > > > > elements have the identical memory address can be confusing, and > > > the > > > > results are typically unpredictable. > > > > > > > > Considering the danger, the proposal is to add a `readonly=True`. > > > A > > > > poweruser (who that function is designed for anyway), could thus > > > still > > > > get a writeable array. > > > > > > > > For the moment, writing to the result would raise a FutureWarning > > > with > > > > `readonly="warn"`. > > > > > > This should just be a deprecation warning, right? (Because > > > switching > > > an array from writeable->readonly might cause previously correct > > > code > > > to error out, but not to silently start returning different > > > results.) > > > > > > > Do you agree with this, or would it be a major inconvenience? > > > > > > AFAIK the only use cases for as_strided involve self-overlap (for > > > non-self-overlap you can generally use reshape / indexing / etc. > > > and > > > it's much simpler). So +1 from me. > > > > > > -n > > > > > > -- > > > Nathaniel J. Smith -- https://vorpus.org > > > _______________________________________________ > > > NumPy-Discussion mailing list > > > NumPy-Discussion at scipy.org > > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > > _______________________________________________ > > NumPy-Discussion mailing list > > NumPy-Discussion at scipy.org > > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From sebastian at sipsolutions.net Mon Jan 25 03:23:04 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Mon, 25 Jan 2016 08:23:04 +0000 Subject: [Numpy-discussion] Make as_strided result writeonly In-Reply-To: References: <1453584300.6435.6.camel@sipsolutions.net> <1453627049.11352.14.camel@sipsolutions.net> Message-ID: On Mon Jan 25 01:46:55 2016 GMT+0100, Juan Nunez-Iglesias wrote:
> Can you explain this in more detail? I'm writing to the result without
> problem. [rest of quoted message snipped; it appears in full above]

Setting the writeable flag to true when it was previously set to false does not work with the way as_strided works right now, with a DummyArray. I guess it is because its base is somehow not an ndarray. It should be possible to try/except creating it with np.ndarray; then it would be possible in your use case. I do not have experience of being badly bitten, myself. Would you think it is fine if setting the flag to true would work in your case?

[earlier messages in the thread snipped; they are quoted in full above]

From njs at pobox.com Mon Jan 25 04:50:59 2016 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 25 Jan 2016 01:50:59 -0800 Subject: [Numpy-discussion] Make as_strided result writeonly In-Reply-To: References: <1453584300.6435.6.camel@sipsolutions.net> <1453627049.11352.14.camel@sipsolutions.net> Message-ID: On Sun, Jan 24, 2016 at 4:46 PM, Juan Nunez-Iglesias wrote:
> Can you explain this in more detail? I'm writing to the result without
> problem. [...]

I think what Sebastian's saying is:
- right now it returns an array with the writeable flag set to True
- it would make more sense semantically (in the general "try to nudge people towards not shooting themselves in the foot" sense) if by default it returned an array with the writeable flag set to False
- of course it should still be possible to get an array with writeable=True by explicitly requesting it (e.g. with as_strided(..., writeable=True)).
- The transition period then would be that it continues to default to writeable=True, but if you don't specify writeable=True, and then you write to the array, it works but it issues a DeprecationWarning telling you that you ought to update your code to explicitly request writeable=True at some point over the next ~year.
- Often, if you have a readonly ndarray object, you can actually switch it to being writeable by executing `a.flags.writeable = True`. This is a bit hacky, and there are various safety conditions that have to hold, but often it works. Except it turns out that if as_strided is modified to return a readonly array, then some details of those safety checks mean that this trick *doesn't* work -- if it was created readonly, then it will stay that way. This doesn't really matter, though -- it just means that you have to use something like a writeable=True kwarg to ensure that it gets created writeable in the first place, instead of creating it readonly and then fixing it up after the fact. But it does rule this out as a possible alternative API. Oh well, it'd be pretty klunky anyway. -n -- Nathaniel J. Smith -- https://vorpus.org From sturla.molden at gmail.com Mon Jan 25 10:11:03 2016 From: sturla.molden at gmail.com (Sturla Molden) Date: Mon, 25 Jan 2016 16:11:03 +0100 Subject: [Numpy-discussion] Make as_strided result writeonly In-Reply-To: <1453584300.6435.6.camel@sipsolutions.net> References: <1453584300.6435.6.camel@sipsolutions.net> Message-ID: On 23/01/16 22:25, Sebastian Berg wrote: > Do you agree with this, or would it be a major inconvenience? I think any user of as_strided should be considered a power user. This is an inherently dangerous function, that can easily segfault the process. Anyone who uses as_strided should be assumed to have taken all precautions. -1 Sturla From sebastian at sipsolutions.net Mon Jan 25 12:06:33 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Mon, 25 Jan 2016 18:06:33 +0100 Subject: [Numpy-discussion] Make as_strided result writeonly In-Reply-To: References: <1453584300.6435.6.camel@sipsolutions.net> Message-ID: <1453741593.27237.24.camel@sipsolutions.net> On Mo, 2016-01-25 at 16:11 +0100, Sturla Molden wrote: > On 23/01/16 22:25, Sebastian Berg wrote: > > > Do you agree with this, or would it be a major inconvenience? > > I think any user of as_strided should be considered a power user. > This > is an inherently dangerous function, that can easily segfault the > process. Anyone who uses as_strided should be assumed to have taken > all > precautions. > I am ready to accept this notion (and I guess just put repeated warnings in the doc string). However, two things about it, first my impression is that for a lot of "not really power users" this function sometimes seems like a big hammer to solve all their problems. Second, I think even power users have sometimes wrong ideas about numpy's ufuncs and memory overlap. This starts with operations such as `arr[1:] += arr[:-1]` (see [1]) (for which there is at least a start on fixing it), and only gets worse with self-overlapping arrays. That said, I guess I could agree with you in the regard that there are so many *other* awful ways to use as_strided, that maybe it really is just so bad, that improving one thing doesn't actually help anyway ;). I was actually considering adding a UserWarning when it is likely that invalid memory is being addressed. I still dislike that it returns something writable *as default* though. Unless you write a single element, writing to such an array will be in many cases unpredictable, no matter how power user you are. 
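To make that concrete, here is a minimal sketch (the array and window sizes are made up purely for illustration; this is not code from the PR):

    import numpy as np
    from numpy.lib.stride_tricks import as_strided

    a = np.arange(5)
    # Three overlapping length-3 windows onto the same five elements.
    w = as_strided(a, shape=(3, 3), strides=(a.itemsize, a.itemsize))
    # w[i, j] aliases a[i + j], so w[0, 2], w[1, 1] and w[2, 0] are all
    # the same memory as a[2]:
    w[0, 2] = 99
    print a   # [ 0  1 99  3  4] -- one write shows up in three places
    # An in-place ufunc like `w += 1` is worse: how often each element
    # of `a` gets incremented depends on buffering and iteration order,
    # so the result is unpredictable.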
- Sebastian

[1] WARNING: This is a dangerous example, we ideally want it to be identical to arr[1:] += arr[:-1].copy() always:

    In [7]: arr = np.arange(10).reshape(5, 2)
    In [8]: arr[1:] += arr[:-1]
    In [9]: arr
    Out[9]:
    array([[ 0,  1],
           [ 2,  4],
           [ 6,  9],
           [12, 16],
           [20, 25]])

    In [10]: arr = np.arange(10)[::-1].copy()[::-1].reshape(5, 2)
    In [11]: arr[1:] += arr[:-1]  # happens to be "correct"
    In [12]: arr
    Out[12]:
    array([[ 0,  1],
           [ 2,  4],
           [ 6,  8],
           [10, 12],
           [14, 16]])

> -1 > > Sturla > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL:

From sturla.molden at gmail.com Mon Jan 25 12:25:13 2016 From: sturla.molden at gmail.com (Sturla Molden) Date: Mon, 25 Jan 2016 18:25:13 +0100 Subject: [Numpy-discussion] Make as_strided result writeonly In-Reply-To: <1453741593.27237.24.camel@sipsolutions.net> References: <1453584300.6435.6.camel@sipsolutions.net> <1453741593.27237.24.camel@sipsolutions.net> Message-ID: On 25/01/16 18:06, Sebastian Berg wrote:
> That said, I guess I could agree with you in the regard that there are
> so many *other* awful ways to use as_strided, that maybe it really is
> just so bad, that improving one thing doesn't actually help anyway ;).

That is roughly my position on this, yes. :)

Sturla

From gfyoung17 at gmail.com Mon Jan 25 17:37:59 2016 From: gfyoung17 at gmail.com (G Young) Date: Mon, 25 Jan 2016 22:37:59 +0000 Subject: [Numpy-discussion] Appveyor Testing Changes Message-ID: Hello all,

I currently have a branch on my fork (not PR) where I am experimenting with running Appveyor CI via Virtualenv instead of Conda. I have build running here. What do people think of using Virtualenv (as we do on Travis) instead of Conda for testing?

Thanks, Greg
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From njs at pobox.com Mon Jan 25 17:50:07 2016 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 25 Jan 2016 14:50:07 -0800 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: References: Message-ID: On Mon, Jan 25, 2016 at 2:37 PM, G Young wrote:
> Hello all,
>
> I currently have a branch on my fork (not PR) where I am experimenting with
> running Appveyor CI via Virtualenv instead of Conda. I have build running
> here. What do people think of using Virtualenv (as we do on Travis) instead
> of Conda for testing?

Can you summarize the advantages and disadvantages that you're aware of?

-n

-- Nathaniel J. Smith -- https://vorpus.org

From gfyoung17 at gmail.com Mon Jan 25 18:21:28 2016 From: gfyoung17 at gmail.com (G Young) Date: Mon, 25 Jan 2016 23:21:28 +0000 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: References: Message-ID: With regards to testing numpy, both Conda and Pip + Virtualenv work quite well. I have used both to install master and run unit tests, and both pass with flying colors. This chart here illustrates my point nicely as well.
However, if we were to wind the clock slightly back to when we were testing 2.6 - 7, 3.2 - 5, I feel Conda falls short in being able to test on a variety of Python distributions given the nature of Conda releases. Maybe that situation is no longer the case now, but in the long term, it could easily happen again. Greg On Mon, Jan 25, 2016 at 10:50 PM, Nathaniel Smith wrote: > On Mon, Jan 25, 2016 at 2:37 PM, G Young wrote: > > Hello all, > > > > I currently have a branch on my fork (not PR) where I am experimenting > with > > running Appveyor CI via Virtualenv instead of Conda. I have build > running > > here. What do people think of using Virtualenv (as we do on Travis) > instead > > of Conda for testing? > > Can you summarize the advantages and disadvantages that you're aware of? > > -n > > -- > Nathaniel J. Smith -- https://vorpus.org > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jni.soma at gmail.com Mon Jan 25 19:07:46 2016 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Tue, 26 Jan 2016 11:07:46 +1100 Subject: [Numpy-discussion] Make as_strided result writeonly In-Reply-To: References: <1453584300.6435.6.camel@sipsolutions.net> <1453741593.27237.24.camel@sipsolutions.net> Message-ID: I agree that it's not ideal that the return value of as_strided is writable. However, to be clear, this *would* break the API, which should not happen between minor releases when using semantic versioning. Even with a deprecation cycle, for libraries such as scikit-image that want to maintain broad compatibility with multiple numpy versions, we would then have to have some code to detect which version of numpy we're dealing with, and do something different depending on the version. That's a big development cost for something that has not been shown to cause any problems. And btw, although some people might use as_strided that aren't super-amazing level 42 programmers, I would say that by that stage they are probably comfortable enough to troubleshoot the shitstorm that's about to fall on them. =P On Tue, Jan 26, 2016 at 4:25 AM, Sturla Molden wrote: > On 25/01/16 18:06, Sebastian Berg wrote: > > That said, I guess I could agree with you in the regard that there are >> so many *other* awful ways to use as_strided, that maybe it really is >> just so bad, that improving one thing doesn't actually help anyway ;). >> > > That is roughly my position on this, yes. :) > > > Sturla > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From msarahan at gmail.com Mon Jan 25 19:08:49 2016 From: msarahan at gmail.com (Michael Sarahan) Date: Tue, 26 Jan 2016 00:08:49 +0000 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: References: Message-ID: Conda can generally install older versions of python in environments: conda create -n myenv python=3.4 You really don't need any particular initial version of python/conda in order to do this. 
You do, however, need to activate the new environment to use it: activate myenv (For windows, you do not need "source") Hth, Michael On Mon, Jan 25, 2016, 17:21 G Young wrote: > With regards to testing numpy, both Conda and Pip + Virtualenv work quite > well. I have used both to install master and run unit tests, and both pass > with flying colors. This chart here > illustrates > my point nicely as well. > > However, I can't seem to find / access Conda installations for slightly > older versions of Python (e.g. Python 3.4). Perhaps this is not much of an > issue now with the next release (1.12) being written only for Python 2.7 > and Python 3.4 - 5. However, if we were to wind the clock slightly back to > when we were testing 2.6 - 7, 3.2 - 5, I feel Conda falls short in being > able to test on a variety of Python distributions given the nature of Conda > releases. Maybe that situation is no longer the case now, but in the long > term, it could easily happen again. > > Greg > > > On Mon, Jan 25, 2016 at 10:50 PM, Nathaniel Smith wrote: > >> On Mon, Jan 25, 2016 at 2:37 PM, G Young wrote: >> > Hello all, >> > >> > I currently have a branch on my fork (not PR) where I am experimenting >> with >> > running Appveyor CI via Virtualenv instead of Conda. I have build >> running >> > here. What do people think of using Virtualenv (as we do on Travis) >> instead >> > of Conda for testing? >> >> Can you summarize the advantages and disadvantages that you're aware of? >> >> -n >> >> -- >> Nathaniel J. Smith -- https://vorpus.org >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bryanv at continuum.io Mon Jan 25 19:13:25 2016 From: bryanv at continuum.io (Bryan Van de Ven) Date: Mon, 25 Jan 2016 18:13:25 -0600 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: References: Message-ID: <1C88BDDD-238E-4A34-9648-6AAD71B88D53@continuum.io> > On Jan 25, 2016, at 5:21 PM, G Young wrote: > > With regards to testing numpy, both Conda and Pip + Virtualenv work quite well. I have used both to install master and run unit tests, and both pass with flying colors. This chart here illustrates my point nicely as well. > > However, I can't seem to find / access Conda installations for slightly older versions of Python (e.g. Python 3.4). Perhaps this is not much of an issue now with the next release (1.12) being written only for Python 2.7 and Python 3.4 - 5. However, if we were to wind the clock slightly back to when we were testing 2.6 - 7, 3.2 - 5, I feel Conda falls short in being able to test on a variety of Python distributions given the nature of Conda releases. Maybe that situation is no longer the case now, but in the long term, it could easily happen again. Why do you need the installers? The whole point of conda is to be able to create environments with whatever configuration you need. Just pick the newest installer and use "conda create" from there: bryan at 0199-bryanv (git:streaming) ~/work/bokeh/bokeh $ conda create -n py26 python=2.6 Fetching package metadata: .............. Solving package specifications: .......... 
Package plan for installation in environment /Users/bryan/anaconda/envs/py26:

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    setuptools-18.0.1          |           py26_0         343 KB
    pip-7.1.0                  |           py26_0         1.4 MB
    ------------------------------------------------------------
                                           Total:         1.7 MB

The following NEW packages will be INSTALLED:

    openssl:    1.0.1k-1
    pip:        7.1.0-py26_0
    python:     2.6.9-1
    readline:   6.2-2
    setuptools: 18.0.1-py26_0
    sqlite:     3.9.2-0
    tk:         8.5.18-0
    zlib:       1.2.8-0

Proceed ([y]/n)?

From gfyoung17 at gmail.com Mon Jan 25 20:13:32 2016 From: gfyoung17 at gmail.com (G Young) Date: Tue, 26 Jan 2016 01:13:32 +0000 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: <1C88BDDD-238E-4A34-9648-6AAD71B88D53@continuum.io> References: <1C88BDDD-238E-4A34-9648-6AAD71B88D53@continuum.io> Message-ID: Ah, yes, that is true. That point had completely escaped my mind. In light of this, it seems that it's not worth the while to completely switch over to pip + virtualenv. It might actually be better to rewrite the current Appveyor tests to use environments so that the test suite can be expanded, though I'm not sure how prudent that is given how slow Appveyor tests run.

Greg

On Tue, Jan 26, 2016 at 12:13 AM, Bryan Van de Ven wrote:
> [quoted message snipped; it appears in full above]

-------------- next part -------------- An HTML attachment was scrubbed...
URL:

From Jeremy.Solbrig at colostate.edu Tue Jan 26 00:43:34 2016 From: Jeremy.Solbrig at colostate.edu (Solbrig,Jeremy) Date: Tue, 26 Jan 2016 05:43:34 +0000 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X Message-ID: Hello,

Much of what is below was copied from this stack overflow question.

I am attempting to subclass numpy.ma.MaskedArray. I am currently using Python v2.7.10. The problem discussed below does not occur in Numpy v1.9.2, but does occur in all versions of Numpy v1.10.x.

In all versions of Numpy v1.10.x, using mathematical operators on my subclass behaves differently than using the analogous ufunc. When using the ufunc directly (e.g. np.subtract(arr1, arr2)), __array_prepare__, __array_finalize__, and __array_wrap__ are all called as expected, however, when using the symbolic operator (e.g. arr1-arr2) only __array_finalize__ is called. As a consequence, I lose any information stored in arr._optinfo when a mathematical operator is used.

Here is a code snippet that illustrates the issue.

    #!/bin/env python
    import numpy as np
    from numpy.ma import MaskedArray, nomask

    class InfoArray(MaskedArray):
        def __new__(cls, info=None, data=None, mask=nomask, dtype=None,
                    copy=False, subok=True, ndmin=0, fill_value=None,
                    keep_mask=True, hard_mask=None, shrink=True, **kwargs):
            obj = super(InfoArray, cls).__new__(cls, data=data, mask=mask,
                        dtype=dtype, copy=copy, subok=subok, ndmin=ndmin,
                        fill_value=fill_value, hard_mask=hard_mask,
                        shrink=shrink, **kwargs)
            obj._optinfo['info'] = info
            return obj

        def __array_prepare__(self, out, context=None):
            print '__array_prepare__'
            return super(InfoArray, self).__array_prepare__(out, context)

        def __array_wrap__(self, out, context=None):
            print '__array_wrap__'
            return super(InfoArray, self).__array_wrap__(out, context)

        def __array_finalize__(self, obj):
            print '__array_finalize__'
            return super(InfoArray, self).__array_finalize__(obj)

    if __name__ == "__main__":
        arr1 = InfoArray('test', data=[1,2,3,4,5,6])
        arr2 = InfoArray(data=[0,1,2,3,4,5])

        diff1 = np.subtract(arr1, arr2)
        print diff1._optinfo

        diff2 = arr1-arr2
        print diff2._optinfo

If run, the output looks like this:

    $ python test_ma_sub.py
    # Call to np.subtract(arr1, arr2) here
    __array_finalize__
    __array_finalize__
    __array_prepare__
    __array_finalize__
    __array_wrap__
    __array_finalize__
    {'info': 'test'}
    # Executing arr1-arr2 here
    __array_finalize__
    {}

Currently I have simply downgraded to 1.9.2 to solve the problem for myself, but have been having difficulty figuring out where the difference lies between 1.9.2 and 1.10.0.
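In the meantime, one possible stopgap occurred to me (just a sketch based on the observation above that the ufunc path still behaves correctly; I have not tested it exhaustively): explicitly route the symbolic operators back through the ufuncs on the subclass, e.g.

    # inside InfoArray -- send the operators through the ufuncs, which
    # (per the output above) still call the hooks and keep _optinfo
    def __sub__(self, other):
        return np.subtract(self, other)

    def __rsub__(self, other):
        return np.subtract(other, self)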
Thanks, Jeremy
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From charlesr.harris at gmail.com Tue Jan 26 12:11:42 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 10:11:42 -0700 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: References: Message-ID: On Mon, Jan 25, 2016 at 10:43 PM, Solbrig,Jeremy < Jeremy.Solbrig at colostate.edu> wrote:
> [original message, including the code snippet and output, snipped; it appears in full above]

I don't see a difference between 1.9.2 and 1.10.0 in this test, so I suspect it is something else. Could you try 1.10.4 to see if the something else has been fixed?

Chuck
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From charlesr.harris at gmail.com Tue Jan 26 12:17:30 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 10:17:30 -0700 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: References: Message-ID: On Tue, Jan 26, 2016 at 10:11 AM, Charles R Harris < charlesr.harris at gmail.com> wrote:
> [quoted thread snipped]
>
> I don't see a difference between 1.9.2 and 1.10.0 in this test, so I
> suspect it is something else. Could you try 1.10.4 to see if the something
> else has been fixed?

Which is to say, it isn't in the calls to prepare, wrap, and finalize. Now to look in _optinfo.

Chuck
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From Jeremy.Solbrig at colostate.edu Tue Jan 26 12:27:43 2016 From: Jeremy.Solbrig at colostate.edu (Solbrig,Jeremy) Date: Tue, 26 Jan 2016 17:27:43 +0000 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: References: Message-ID: Hello Chuck,

I receive the same result with 1.10.4. I agree that it looks like __array_prepare__, __array_finalize__, and __array_wrap__ have not been changed. I'm starting to dig into the source again, but focusing on the _MaskedBinaryOperation class to try to understand what is going on there.

Jeremy

[quoted thread snipped]
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From charlesr.harris at gmail.com Tue Jan 26 12:35:19 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 10:35:19 -0700 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: References: Message-ID: On Tue, Jan 26, 2016 at 10:27 AM, Solbrig,Jeremy < Jeremy.Solbrig at colostate.edu> wrote:
> [quoted thread snipped]

Please open an issue on github, the mailing list is not a good place to deal with this. I've got a bisect script running, so we should soon know where the change occurred.

Chuck
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From Jeremy.Solbrig at colostate.edu Tue Jan 26 12:36:03 2016 From: Jeremy.Solbrig at colostate.edu (Solbrig,Jeremy) Date: Tue, 26 Jan 2016 17:36:03 +0000 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: References: Message-ID: <02355075-BDDA-4936-B450-02C78A6CD0F0@colostate.edu> Will do. Thanks for looking into this!

[quoted thread snipped]
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From sebastian at sipsolutions.net Tue Jan 26 12:45:04 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Tue, 26 Jan 2016 18:45:04 +0100 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: References: Message-ID: <1453830304.26205.4.camel@sipsolutions.net> On Di, 2016-01-26 at 17:27 +0000, Solbrig,Jeremy wrote:
> Hello Chuck,
>
> I receive the same result with 1.10.4. I agree that it looks like
> __array_prepare__, __array_finalize__, and __array_wrap__ have not
> been changed. I'm starting to dig into the source again, but
> focusing on the _MaskedBinaryOperation class to try to understand
> what is going on there.

Well, there was definitely a change in that code that will cause this, i.e. the code block:

    if isinstance(b, MaskedArray):
        if isinstance(a, MaskedArray):
            result._update_from(a)
        else:
            result._update_from(b)
    elif isinstance(a, MaskedArray):
        result._update_from(a)

was changed to something like:

    masked_result._update_from(result)

I would almost guess that this fixed some other bug (but you should probably check git blame to see why it was put in). It might be that _optinfo should be handled more specifically. It seems like a very weird feature to me though when the info is always copied from the left operand... Is _optinfo even *documented* to exist? Because frankly, unless it is used far more, and the fact that it is hidden away with an underscore, I am not sure we should prioritize that it would keep working, considering that I am unsure that it ever did something very elegantly.

- Sebastian

[rest of quoted thread snipped]
-------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL:

From Jeremy.Solbrig at colostate.edu Tue Jan 26 12:49:10 2016 From: Jeremy.Solbrig at colostate.edu (Solbrig,Jeremy) Date: Tue, 26 Jan 2016 17:49:10 +0000 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: <1453830304.26205.4.camel@sipsolutions.net> References: <1453830304.26205.4.camel@sipsolutions.net> Message-ID: <54F0D60E-6822-4BA7-B83F-96541DD69C0E@colostate.edu> The problem isn't specifically with _optinfo. _optinfo losing information is just a symptom of the fact that __array_prepare__, __array_wrap__, and __array_finalize__ do not appear to be working as documented. So far as I can tell, there is no way to access attributes of subclasses when using mathematical operators, so it is impossible to maintain subclasses that contain additional attributes.
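If it helps narrow this down, the breakage does look specific to the np.ma machinery rather than to the operator protocol itself. A quick contrast (only a sketch, and only lightly tested on my end):

    # For a plain ndarray subclass, the symbolic operator still reaches
    # __array_wrap__, so the hooks themselves work outside of np.ma.
    import numpy as np

    class Tagged(np.ndarray):
        def __array_wrap__(self, out, context=None):
            print '__array_wrap__'
            return super(Tagged, self).__array_wrap__(out, context)

    a = np.arange(3).view(Tagged)
    b = np.ones(3).view(Tagged)
    c = a - b              # prints __array_wrap__
    d = np.subtract(a, b)  # prints __array_wrap__ as well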
On 1/26/16, 10:45 AM, "NumPy-Discussion on behalf of Sebastian Berg" wrote: >was changed to something like: > > masked_result._update_from(result) > >My guess is almost that this fixed some other bug (but you should >probably check git blame to see why it was put in). It might be that >_optinfo should be handled more specifically. It seems like a very >weird feature to me though when the info is always copied from the left >operand... From Jeremy.Solbrig at colostate.edu Tue Jan 26 12:50:56 2016 From: Jeremy.Solbrig at colostate.edu (Solbrig,Jeremy) Date: Tue, 26 Jan 2016 17:50:56 +0000 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: <54F0D60E-6822-4BA7-B83F-96541DD69C0E@colostate.edu> References: <1453830304.26205.4.camel@sipsolutions.net> <54F0D60E-6822-4BA7-B83F-96541DD69C0E@colostate.edu> Message-ID: <0A05D950-3353-4463-B07C-81DCFEE8E728@colostate.edu> New issue submitted here: https://github.com/numpy/numpy/issues/7122 I suggest moving this discussion there. On 1/26/16, 10:49 AM, "NumPy-Discussion on behalf of Solbrig,Jeremy" wrote: >The problem isn?t specifically with _optinfo. _optinfo losing information is just a symptom of the fact that __array_prepare__, __array_wrap__, and __array_finalize__ do not appear to be working as documented. So far as I can tell, there is no way to access attributes of subclasses when using mathematical operators so it is impossible to maintain subclasses that contain additional attributes. > > > >On 1/26/16, 10:45 AM, "NumPy-Discussion on behalf of Sebastian Berg" wrote: > >>was changed to something like: >> >> masked_result._update_from(result) >> >>My guess is almost that this fixed some other bug (but you should >>probably check git blame to see why it was put in). It might be that >>_optinfo should be handled more specifically. It seems like a very >>weird feature to me though when the info is always copied from the left >>operand... >_______________________________________________ >NumPy-Discussion mailing list >NumPy-Discussion at scipy.org >https://mail.scipy.org/mailman/listinfo/numpy-discussion From charlesr.harris at gmail.com Tue Jan 26 13:12:10 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 11:12:10 -0700 Subject: [Numpy-discussion] Inconsistent behavior for ufuncs in numpy v1.10.X In-Reply-To: <1453830304.26205.4.camel@sipsolutions.net> References: <1453830304.26205.4.camel@sipsolutions.net> Message-ID: On Tue, Jan 26, 2016 at 10:45 AM, Sebastian Berg wrote: > On Di, 2016-01-26 at 17:27 +0000, Solbrig,Jeremy wrote: > > Hello Chuck, > > > > I receive the same result with 1.10.4. I agree that it looks like > > __array_prepare__, __array_finalize__, and __array_wrap__ have not > > been changed. I?m starting to dig into the source again, but > > focusing on the _MaskedBinaryOperation class to try to understand > > what is going on there. > > > > Well, there was definitely a change in that code that will cause this, > i.e. the code block: > > if isinstance(b, MaskedArray): > if isinstance(a, MaskedArray): > result._update_from(a) > else: > result._update_from(b) > elif isinstance(a, MaskedArray): > result._update_from(a) > > was changed to something like: > > masked_result._update_from(result) > That looks like it, 3c6b6baba, #3907 . That's old... Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From ralf.gommers at gmail.com Tue Jan 26 13:59:36 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 26 Jan 2016 19:59:36 +0100 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: References: <1C88BDDD-238E-4A34-9648-6AAD71B88D53@continuum.io> Message-ID: On Tue, Jan 26, 2016 at 2:13 AM, G Young wrote:
> Ah, yes, that is true. That point had completely escaped my mind. In
> light of this, it seems that it's not worth the while to completely
> switch over to pip + virtualenv. It might actually be better to rewrite
> the current Appveyor tests to use environments so that the test suite can
> be expanded, though I'm not sure how prudent that is given how slow
> Appveyor tests run.

At the moment Appveyor is already a bit of a bottleneck - it regularly hasn't started yet when TravisCI is already done. This can be solved via a paid account; we should seriously consider that when we have a bit more experience with it (Appveyor tests have been running for less than a month I think). But it does mean we should go for a sparse test matrix, and use a more complete one (all Python versions for example) on TravisCI.

In the near future we'll have to add MingwPy test runs to Appveyor. Beyond that I'm not sure what needs to be added?

Ralf

> Greg
>
> [rest of the quoted thread snipped; it appears in full above]

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From sebastian at sipsolutions.net Tue Jan 26 14:37:04 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Tue, 26 Jan 2016 20:37:04 +0100 Subject: [Numpy-discussion] Testing warnings Message-ID: <1453837024.27411.34.camel@sipsolutions.net> Hi all,

so I have been thinking about this a little more, and I do not think there is a truly nice solution to the python bug: http://bugs.python.org/issue4180 (it does not create problems for new pythons). However, I have been so annoyed by trying to test FutureWarnings or DeprecationWarnings in the past that I want *some* improvement. You can do quite a lot by adding some new features, but there are also some limitations. I think that we must be able to:

o Filter out warnings on the global test run level.
o Find all not explicitly filtered warnings during development easily.
o We should be able to test any (almost any?) warnings, even those that would be filtered globally.

The next line of considerations for me is whether we want:

o To be able to *print* warnings during test runs? (in release mode)
o Be able to not repeat filtering of globally filtered warnings when filtering additional warnings in an individual test?
o Be able to count warnings, but ignore other warnings (not the global ones, though).
o Filter warnings by module? (might be hard to impossible)

And one further option:

o Could we accept that testing warnings is *only* possible reliably in Python 3.4+? It would however even mean that we have to fully *skip* tests that would ensure specific warnings to be given.

The first set of things can be achieved by setting all warnings to errors on the global level and trying to make the local tests as specific as possible. I could go ahead with it. There will likely be some uglier points, but it should work, and it does not require any funny new hacks. For all I can see, the second bunch of things requires new features such as in my current PR. So, I want to know whether we can/want to go ahead with this kind of idea [1].

For me personally, I cannot accept that we do not provide the first points. Of the second bunch, I would like some of them (I do not know about printing warnings in release mode?), and skipping tests on Python 2 seems to me even worse than ugly hacks. Getting there is a bit uglier (it requires a new context manager for all I see), and I tend to think it is worth the trouble, but I don't think it is vital. In other words, I don't care too much about those points, but I want to get somewhere, because I have been bitten often enough by the annoying and in my opinion simply unacceptable (on python 2) use of "ignore" warnings filters in tests. The current state makes finding warnings given in our own tests almost impossible; in the best case they will have to be fixed much, much later when the change actually occurs, and in the worst case we never find our own real bugs.

So where to go? :)

- Sebastian

[1] I need to fix the module filtering point; the module filtering does not work reliably currently. I think it can be fixed at least 99.5%, but it is not too pretty (not that the user should notice).

-------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL:

From charlesr.harris at gmail.com Tue Jan 26 15:49:30 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 13:49:30 -0700 Subject: [Numpy-discussion] Numpy 1.11.0b1 is out Message-ID: Hi All,

I'm pleased to announce that Numpy 1.11.0b1 is now available on sourceforge. This is a source release, as the mingw32 toolchain is broken. Please test it out and report any errors that you discover. Hopefully we can do better with 1.11.0 than we did with 1.10.0 ;)

Chuck
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From derek at astro.physik.uni-goettingen.de Tue Jan 26 18:48:34 2016 From: derek at astro.physik.uni-goettingen.de (Derek Homeier) Date: Wed, 27 Jan 2016 00:48:34 +0100 Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out In-Reply-To: References: Message-ID: Hi Chuck,

> I'm pleased to announce that Numpy 1.11.0b1 is now available on sourceforge. This is a source release as the mingw32 toolchain is broken. Please test it out and report any errors that you discover. Hopefully we can do better with 1.11.0 than we did with 1.10.0 ;)

the tarball seems to be incomplete, hope that does not bode ill ;-)

    adding 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h' to sources.
    executing numpy/core/code_generators/generate_numpy_api.py
    error: [Errno 2] No such file or directory: 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src'

    > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes
    -rw-rw-r-- charris/charris 62563 2016-01-21 20:38 numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h
    -rw-rw-r-- charris/charris   981 2016-01-21 20:38 numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h

FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) builds and passes all tests with Python 2.7.11 and 3.5.1 on Mac OS X 10.10.

Cheers, Derek

From gfyoung17 at gmail.com Tue Jan 26 18:57:18 2016 From: gfyoung17 at gmail.com (G Young) Date: Tue, 26 Jan 2016 23:57:18 +0000 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: References: Message-ID:
>>
>> _______________________________________________
>> NumPy-Discussion mailing list
>> NumPy-Discussion at scipy.org
>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>>
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From sebastian at sipsolutions.net  Tue Jan 26 14:37:04 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Tue, 26 Jan 2016 20:37:04 +0100
Subject: [Numpy-discussion] Testing warnings
Message-ID: <1453837024.27411.34.camel@sipsolutions.net>

Hi all,

so I have been thinking about this a little more, and I do not think there is a truly nice solution to the python bug http://bugs.python.org/issue4180 (it does not create problems for new pythons). However, I have been so annoyed by trying to test FutureWarnings or DeprecationWarnings in the past that I want *some* improvement. You can do quite a lot by adding some new features, but there are also some limitations.

I think that we must be able to:

o Filter out warnings on the global test run level.
o Find all not explicitly filtered warnings easily during development.
o Test any (almost any?) warnings, even those that would be filtered globally.

The next line of considerations for me is whether we want:

o To be able to *print* warnings during test runs? (in release mode)
o To be able to not repeat filtering of globally filtered warnings when filtering additional warnings in an individual test?
o To be able to count warnings, but ignore other warnings (not the global ones, though).
o To filter warnings by module? (might be hard to impossible)

And one further option:

o Could we accept that testing warnings is *only* possible reliably in Python 3.4+? That would however mean that we have to fully *skip* tests that ensure specific warnings are given.

The first set of points can be achieved by setting all warnings to errors on the global level and trying to make the local tests as specific as possible. I could go ahead with it; there will likely be some uglier spots, but it should work, and it does not require funny new hacks.

As far as I can see, the second set of points requires new features such as those in my current PR. So, I want to know whether we can/want to go ahead with this kind of idea [1]. For me personally, I cannot accept that we do not provide the first set. Of the second set I would like some (I do not know about printing warnings in release mode?), and skipping tests on Python 2 seems to me even worse than ugly hacks. Getting there is a bit uglier (it requires a new context manager as far as I can see), and I tend to think it is worth the trouble, but I don't think it is vital. In other words, I don't care too much about those points, but I want to get somewhere, because I have been bitten often enough by the annoying and in my opinion simply unacceptable (on python 2) use of "ignore" warnings filters in tests. The current state makes finding warnings given in our own tests almost impossible; in the best case they have to be fixed much later when the change actually occurs, in the worst case we never find our own real bugs.

So where to go? :)

- Sebastian

[1] I need to fix the module filtering point; it does not work reliably currently. I think it can be fixed at least 99.5%, but it is not too pretty (not that the user should notice).
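As a concrete sketch of the kind of test-local helper this points at (stdlib
only; the name and details are illustrative, not the actual PR):

    import warnings

    class assert_warns_strict(object):
        """Require `category` inside the block; everything else errors."""

        def __init__(self, category):
            self.category = category

        def __enter__(self):
            self._cw = warnings.catch_warnings(record=True)
            self.log = self._cw.__enter__()
            warnings.simplefilter("error")                  # global default
            warnings.simplefilter("always", self.category)  # but record these
            return self.log

        def __exit__(self, exc_type, exc_value, traceback):
            self._cw.__exit__(exc_type, exc_value, traceback)
            if exc_type is None and not any(
                    issubclass(w.category, self.category) for w in self.log):
                raise AssertionError("%s not raised" % self.category.__name__)

    # usage sketch; note that on python 2 the __warningregistry__ bug above
    # can still swallow a warning that was already emitted once elsewhere.
    with assert_warns_strict(FutureWarning):
        warnings.warn("behaviour will change", FutureWarning)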
-------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From charlesr.harris at gmail.com Tue Jan 26 15:49:30 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 13:49:30 -0700 Subject: [Numpy-discussion] Numpy 1.11.0b1 is out Message-ID: Hi All, I'm pleased to announce that Numpy 1.11.0b1 is now available on sourceforge. This is a source release as the mingw32 toolchain is broken. Please test it out and report any errors that you discover. Hopefully we can do better with 1.11.0 than we did with 1.10.0 ;) Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From derek at astro.physik.uni-goettingen.de Tue Jan 26 18:48:34 2016 From: derek at astro.physik.uni-goettingen.de (Derek Homeier) Date: Wed, 27 Jan 2016 00:48:34 +0100 Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out In-Reply-To: References: Message-ID: Hi Chuck, > I'm pleased to announce that Numpy 1.11.0b1 is now available on sourceforge. This is a source release as the mingw32 toolchain is broken. Please test it out and report any errors that you discover. Hopefully we can do better with 1.11.0 than we did with 1.10.0 ;) the tarball seems to be incomplete, hope that does not bode ill ;-) adding 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h' to sources. executing numpy/core/code_generators/generate_numpy_api.py error: [Errno 2] No such file or directory: 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src' > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes -rw-rw-r-- charris/charris 62563 2016-01-21 20:38 numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h -rw-rw-r-- charris/charris 981 2016-01-21 20:38 numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) builds and passes all tests with Python 2.7.11 and 3.5.1 on Mac OS X 10.10. Cheers, Derek From gfyoung17 at gmail.com Tue Jan 26 18:57:18 2016 From: gfyoung17 at gmail.com (G Young) Date: Tue, 26 Jan 2016 23:57:18 +0000 Subject: [Numpy-discussion] Appveyor Testing Changes In-Reply-To: References: <1C88BDDD-238E-4A34-9648-6AAD71B88D53@continuum.io> Message-ID: Perhaps a pip + virtualenv build as well since that's one way that is mentioned in the online docs for installing source code. I can't think of anything else beyond that and what you suggested for the time being. Greg On Tue, Jan 26, 2016 at 6:59 PM, Ralf Gommers wrote: > > > On Tue, Jan 26, 2016 at 2:13 AM, G Young wrote: > >> Ah, yes, that is true. That point had completely escaped my mind. In >> light of this, it seems that it's not worth the while then to completely >> switch over to pip + virtualenv. It's might be better actually to rewrite >> the current Appveyor tests to use environments so that the test suite can >> be expanded, though I'm not sure how prudent that is given how slow >> Appveyor tests run. >> > > At the moment Appveyor is already a bit of a bottleneck - it regularly > hasn't started yet when TravisCI is already done. This can be solved via a > paid account, we should seriously consider that when we have a bit more > experience with it (Appveyor tests have been running for less than a month > I think). But it does mean we should go for a sparse test matrix, and use a > more complete one (all Python versions for example) on TravisCI. 
In the > near future we'll have to add MingwPy test runs to Appveyor. Beyond that > I'm not sure what needs to be added? > > Ralf > > > >> >> Greg >> >> On Tue, Jan 26, 2016 at 12:13 AM, Bryan Van de Ven >> wrote: >> >>> >>> > On Jan 25, 2016, at 5:21 PM, G Young wrote: >>> > >>> > With regards to testing numpy, both Conda and Pip + Virtualenv work >>> quite well. I have used both to install master and run unit tests, and >>> both pass with flying colors. This chart here illustrates my point nicely >>> as well. >>> > >>> > However, I can't seem to find / access Conda installations for >>> slightly older versions of Python (e.g. Python 3.4). Perhaps this is not >>> much of an issue now with the next release (1.12) being written only for >>> Python 2.7 and Python 3.4 - 5. However, if we were to wind the clock >>> slightly back to when we were testing 2.6 - 7, 3.2 - 5, I feel Conda falls >>> short in being able to test on a variety of Python distributions given the >>> nature of Conda releases. Maybe that situation is no longer the case now, >>> but in the long term, it could easily happen again. >>> >>> Why do you need the installers? The whole point of conda is to be able >>> to create environments with whatever configuration you need. Just pick the >>> newest installer and use "conda create" from there: >>> >>> bryan at 0199-bryanv (git:streaming) ~/work/bokeh/bokeh $ conda create -n >>> py26 python=2.6 >>> Fetching package metadata: .............. >>> Solving package specifications: .......... >>> Package plan for installation in environment >>> /Users/bryan/anaconda/envs/py26: >>> >>> The following packages will be downloaded: >>> >>> package | build >>> ---------------------------|----------------- >>> setuptools-18.0.1 | py26_0 343 KB >>> pip-7.1.0 | py26_0 1.4 MB >>> ------------------------------------------------------------ >>> Total: 1.7 MB >>> >>> The following NEW packages will be INSTALLED: >>> >>> openssl: 1.0.1k-1 >>> pip: 7.1.0-py26_0 >>> python: 2.6.9-1 >>> readline: 6.2-2 >>> setuptools: 18.0.1-py26_0 >>> sqlite: 3.9.2-0 >>> tk: 8.5.18-0 >>> zlib: 1.2.8-0 >>> >>> Proceed ([y]/n)? >>> >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >> >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Tue Jan 26 20:58:51 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 18:58:51 -0700 Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out In-Reply-To: References: Message-ID: On Tue, Jan 26, 2016 at 4:48 PM, Derek Homeier < derek at astro.physik.uni-goettingen.de> wrote: > Hi Chuck, > > > I'm pleased to announce that Numpy 1.11.0b1 is now available on > sourceforge. This is a source release as the mingw32 toolchain is broken. > Please test it out and report any errors that you discover. 
Hopefully we > can do better with 1.11.0 than we did with 1.10.0 ;) > > the tarball seems to be incomplete, hope that does not bode ill ;-) > > adding > 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h' > to sources. > executing numpy/core/code_generators/generate_numpy_api.py > error: [Errno 2] No such file or directory: > 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src' > Grr, yes indeed, `paver sdist` doesn't do the right thing, none of the `multiarray/*.c.src` files are included, but it works fine in 1.10.x. The changes are minimal, the only thing that would seem to matter is the removal of setupegg.py. Ralf, any ideas. > > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes > -rw-rw-r-- charris/charris 62563 2016-01-21 20:38 > numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h > -rw-rw-r-- charris/charris 981 2016-01-21 20:38 > numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h > > FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) builds > and passes all tests with Python 2.7.11 > and 3.5.1 on Mac OS X 10.10. > > You probably didn't fetch the tags, if they can't be reached from the branch head they don't download automatically. Try `git fetch --tags upstream` Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Tue Jan 26 21:13:30 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 19:13:30 -0700 Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out In-Reply-To: References: Message-ID: On Tue, Jan 26, 2016 at 6:58 PM, Charles R Harris wrote: > > > On Tue, Jan 26, 2016 at 4:48 PM, Derek Homeier < > derek at astro.physik.uni-goettingen.de> wrote: > >> Hi Chuck, >> >> > I'm pleased to announce that Numpy 1.11.0b1 is now available on >> sourceforge. This is a source release as the mingw32 toolchain is broken. >> Please test it out and report any errors that you discover. Hopefully we >> can do better with 1.11.0 than we did with 1.10.0 ;) >> >> the tarball seems to be incomplete, hope that does not bode ill ;-) >> >> adding >> 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h' >> to sources. >> executing numpy/core/code_generators/generate_numpy_api.py >> error: [Errno 2] No such file or directory: >> 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src' >> > > Grr, yes indeed, `paver sdist` doesn't do the right thing, none of the > `multiarray/*.c.src` files are included, but it works fine in 1.10.x. The > changes are minimal, the only thing that would seem to matter is the > removal of setupegg.py. Ralf, any ideas. > > >> > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes >> -rw-rw-r-- charris/charris 62563 2016-01-21 20:38 >> numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h >> -rw-rw-r-- charris/charris 981 2016-01-21 20:38 >> numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h >> >> FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) >> builds and passes all tests with Python 2.7.11 >> and 3.5.1 on Mac OS X 10.10. >> >> > You probably didn't fetch the tags, if they can't be reached from the > branch head they don't download automatically. Try `git fetch --tags > upstream` > > setupegg.py doesn't seem to matter... Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From charlesr.harris at gmail.com Tue Jan 26 21:45:11 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 19:45:11 -0700 Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out In-Reply-To: References: Message-ID: On Tue, Jan 26, 2016 at 7:13 PM, Charles R Harris wrote: > > > On Tue, Jan 26, 2016 at 6:58 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > >> >> >> On Tue, Jan 26, 2016 at 4:48 PM, Derek Homeier < >> derek at astro.physik.uni-goettingen.de> wrote: >> >>> Hi Chuck, >>> >>> > I'm pleased to announce that Numpy 1.11.0b1 is now available on >>> sourceforge. This is a source release as the mingw32 toolchain is broken. >>> Please test it out and report any errors that you discover. Hopefully we >>> can do better with 1.11.0 than we did with 1.10.0 ;) >>> >>> the tarball seems to be incomplete, hope that does not bode ill ;-) >>> >>> adding >>> 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h' >>> to sources. >>> executing numpy/core/code_generators/generate_numpy_api.py >>> error: [Errno 2] No such file or directory: >>> 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src' >>> >> >> Grr, yes indeed, `paver sdist` doesn't do the right thing, none of the >> `multiarray/*.c.src` files are included, but it works fine in 1.10.x. The >> changes are minimal, the only thing that would seem to matter is the >> removal of setupegg.py. Ralf, any ideas. >> >> >>> > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes >>> -rw-rw-r-- charris/charris 62563 2016-01-21 20:38 >>> numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h >>> -rw-rw-r-- charris/charris 981 2016-01-21 20:38 >>> numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h >>> >>> FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) >>> builds and passes all tests with Python 2.7.11 >>> and 3.5.1 on Mac OS X 10.10. >>> >>> >> You probably didn't fetch the tags, if they can't be reached from the >> branch head they don't download automatically. Try `git fetch --tags >> upstream` >> >> > setupegg.py doesn't seem to matter... > > OK, it is the changes in the root setup.py file, probably the switch to setuptools. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Tue Jan 26 22:47:42 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 26 Jan 2016 20:47:42 -0700 Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out In-Reply-To: References: Message-ID: On Tue, Jan 26, 2016 at 7:45 PM, Charles R Harris wrote: > > > On Tue, Jan 26, 2016 at 7:13 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > >> >> >> On Tue, Jan 26, 2016 at 6:58 PM, Charles R Harris < >> charlesr.harris at gmail.com> wrote: >> >>> >>> >>> On Tue, Jan 26, 2016 at 4:48 PM, Derek Homeier < >>> derek at astro.physik.uni-goettingen.de> wrote: >>> >>>> Hi Chuck, >>>> >>>> > I'm pleased to announce that Numpy 1.11.0b1 is now available on >>>> sourceforge. This is a source release as the mingw32 toolchain is broken. >>>> Please test it out and report any errors that you discover. Hopefully we >>>> can do better with 1.11.0 than we did with 1.10.0 ;) >>>> >>>> the tarball seems to be incomplete, hope that does not bode ill ;-) >>>> >>>> adding >>>> 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h' >>>> to sources. 
>>>> executing numpy/core/code_generators/generate_numpy_api.py >>>> error: [Errno 2] No such file or directory: >>>> 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src' >>>> >>> >>> Grr, yes indeed, `paver sdist` doesn't do the right thing, none of the >>> `multiarray/*.c.src` files are included, but it works fine in 1.10.x. The >>> changes are minimal, the only thing that would seem to matter is the >>> removal of setupegg.py. Ralf, any ideas. >>> >>> >>>> > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes >>>> -rw-rw-r-- charris/charris 62563 2016-01-21 20:38 >>>> numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h >>>> -rw-rw-r-- charris/charris 981 2016-01-21 20:38 >>>> numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h >>>> >>>> FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) >>>> builds and passes all tests with Python 2.7.11 >>>> and 3.5.1 on Mac OS X 10.10. >>>> >>>> >>> You probably didn't fetch the tags, if they can't be reached from the >>> branch head they don't download automatically. Try `git fetch --tags >>> upstream` >>> >>> >> setupegg.py doesn't seem to matter... >> >> > OK, it is the changes in the root setup.py file, probably the switch to > setuptools. > Yep, it's setuptools. If I import sdist from distutils instead, everything works fine. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Jan 27 04:23:31 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 27 Jan 2016 10:23:31 +0100 Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out In-Reply-To: References: Message-ID: On Wed, Jan 27, 2016 at 4:47 AM, Charles R Harris wrote: > > > On Tue, Jan 26, 2016 at 7:45 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > >> >> >> On Tue, Jan 26, 2016 at 7:13 PM, Charles R Harris < >> charlesr.harris at gmail.com> wrote: >> >>> >>> >>> On Tue, Jan 26, 2016 at 6:58 PM, Charles R Harris < >>> charlesr.harris at gmail.com> wrote: >>> >>>> >>>> >>>> On Tue, Jan 26, 2016 at 4:48 PM, Derek Homeier < >>>> derek at astro.physik.uni-goettingen.de> wrote: >>>> >>>>> Hi Chuck, >>>>> >>>>> > I'm pleased to announce that Numpy 1.11.0b1 is now available on >>>>> sourceforge. This is a source release as the mingw32 toolchain is broken. >>>>> Please test it out and report any errors that you discover. Hopefully we >>>>> can do better with 1.11.0 than we did with 1.10.0 ;) >>>>> >>>>> the tarball seems to be incomplete, hope that does not bode ill ;-) >>>>> >>>>> adding >>>>> 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h' >>>>> to sources. >>>>> executing numpy/core/code_generators/generate_numpy_api.py >>>>> error: [Errno 2] No such file or directory: >>>>> 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src' >>>>> >>>> >>>> Grr, yes indeed, `paver sdist` doesn't do the right thing, none of the >>>> `multiarray/*.c.src` files are included, but it works fine in 1.10.x. The >>>> changes are minimal, the only thing that would seem to matter is the >>>> removal of setupegg.py. Ralf, any ideas. >>>> >>>> >>>>> > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes >>>>> -rw-rw-r-- charris/charris 62563 2016-01-21 20:38 >>>>> numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h >>>>> -rw-rw-r-- charris/charris 981 2016-01-21 20:38 >>>>> numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h >>>>> >>>>> FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) 
>>>>> builds and passes all tests with Python 2.7.11
>>>>> and 3.5.1 on Mac OS X 10.10.
>>>>>
>>>>>
>>>> You probably didn't fetch the tags, if they can't be reached from the
>>>> branch head they don't download automatically. Try `git fetch --tags
>>>> upstream`
>>>>
>>>>
>>> setupegg.py doesn't seem to matter...
>>>
>>>
>> OK, it is the changes in the root setup.py file, probably the switch to
>> setuptools.
>>
>
> Yep, it's setuptools. If I import sdist from distutils instead, everything
> works fine.
>

Setuptools starts build_src, don't know why yet, but it doesn't look to me like it should be doing that. Issue for more detailed discussion: https://github.com/numpy/numpy/issues/7127

Ralf
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From gfyoung17 at gmail.com  Wed Jan 27 04:39:46 2016
From: gfyoung17 at gmail.com (G Young)
Date: Wed, 27 Jan 2016 09:39:46 +0000
Subject: [Numpy-discussion] Building Numpy with OpenBLAS
Message-ID:

Hello all,

I'm trying to update the documentation for building Numpy from source, and I've hit a brick wall in trying to build the library using OpenBLAS, because I can't seem to link the libopenblas.dll file. I tried following the suggestion of placing the DLL in numpy/core as suggested here, but it still doesn't pick it up. What am I doing wrong?

Thanks,

Greg
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From nadavh at visionsense.com  Wed Jan 27 06:19:02 2016
From: nadavh at visionsense.com (Nadav Horesh)
Date: Wed, 27 Jan 2016 11:19:02 +0000
Subject: [Numpy-discussion] Numpy 1.11.0b1 is out
In-Reply-To:
References:
Message-ID:

Why is the dot function/method slower than @ on python 3.5.1? Tested from the latest 1.11 maintenance branch.

np.__version__
Out[39]: '1.11.0.dev0+Unknown'

%timeit A @ c
10000 loops, best of 3: 185 µs per loop

%timeit A.dot(c)
1000 loops, best of 3: 526 µs per loop

%timeit np.dot(A,c)
1000 loops, best of 3: 527 µs per loop

A.dtype, A.shape, A.flags
Out[43]:
(dtype('float32'), (100, 100, 3),
 C_CONTIGUOUS : True
 F_CONTIGUOUS : False
 OWNDATA : True
 WRITEABLE : True
 ALIGNED : True
 UPDATEIFCOPY : False)

c.dtype, c.shape, c.flags
Out[44]:
(dtype('float32'), (3, 3),
 C_CONTIGUOUS : True
 F_CONTIGUOUS : False
 OWNDATA : True
 WRITEABLE : True
 ALIGNED : True
 UPDATEIFCOPY : False)

From: NumPy-Discussion on behalf of Charles R Harris
Sent: 26 January 2016 22:49
To: numpy-discussion; SciPy Developers List; SciPy Users List
Subject: [Numpy-discussion] Numpy 1.11.0b1 is out

Hi All,

I'm pleased to announce that Numpy 1.11.0b1 is now available on sourceforge. This is a source release as the mingw32 toolchain is broken. Please test it out and report any errors that you discover. Hopefully we can do better with 1.11.0 than we did with 1.10.0 ;)

Chuck

From sebastian at sipsolutions.net  Wed Jan 27 07:10:36 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Wed, 27 Jan 2016 13:10:36 +0100
Subject: [Numpy-discussion] Numpy 1.11.0b1 is out
In-Reply-To:
References:
Message-ID: <1453896636.18959.2.camel@sipsolutions.net>

On Wed, 2016-01-27 at 11:19 +0000, Nadav Horesh wrote:
> Why is the dot function/method slower than @ on python 3.5.1? Tested
> from the latest 1.11 maintenance branch.
>

The explanation I think is that you do not have a blas optimization. In that case the fallback mode is probably faster in the @ case (since it has SSE2 optimization by using einsum, while np.dot does not do that).

Btw. thanks for all the work getting this done Chuck!
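A quick way to check which BLAS numpy picked up is its build-configuration
report (a sketch; the exact output depends on the install):

    import numpy as np

    # Prints the BLAS/LAPACK libraries numpy was built against; sections
    # reading "NOT AVAILABLE" mean the unoptimized fallback paths are used.
    np.show_config()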
- Sebastian

>
>
> np.__version__
> Out[39]: '1.11.0.dev0+Unknown'
>
>
> %timeit A @ c
> 10000 loops, best of 3: 185 µs per loop
>
>
> %timeit A.dot(c)
> 1000 loops, best of 3: 526 µs per loop
>
>
> %timeit np.dot(A,c)
> 1000 loops, best of 3: 527 µs per loop
>
>
> A.dtype, A.shape, A.flags
> Out[43]:
> (dtype('float32'), (100, 100, 3),
>  C_CONTIGUOUS : True
>  F_CONTIGUOUS : False
>  OWNDATA : True
>  WRITEABLE : True
>  ALIGNED : True
>  UPDATEIFCOPY : False)
>
>
> c.dtype, c.shape, c.flags
> Out[44]:
> (dtype('float32'), (3, 3),
>  C_CONTIGUOUS : True
>  F_CONTIGUOUS : False
>  OWNDATA : True
>  WRITEABLE : True
>  ALIGNED : True
>  UPDATEIFCOPY : False)
>
>
> From: NumPy-Discussion on behalf of Charles R Harris
> Sent: 26 January 2016 22:49
> To: numpy-discussion; SciPy Developers List; SciPy Users List
> Subject: [Numpy-discussion] Numpy 1.11.0b1 is out
>
> Hi All,
>
> I'm pleased to announce that Numpy 1.11.0b1 is now available on
> sourceforge. This is a source release as the mingw32 toolchain is
> broken. Please test it out and report any errors that you discover.
> Hopefully we can do better with 1.11.0 than we did with 1.10.0 ;)
>
> Chuck
>
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL:

From msarahan at gmail.com  Wed Jan 27 08:14:15 2016
From: msarahan at gmail.com (Michael Sarahan)
Date: Wed, 27 Jan 2016 13:14:15 +0000
Subject: [Numpy-discussion] Building Numpy with OpenBLAS
In-Reply-To:
References:
Message-ID:

I'm not sure about the mingw tool chain, but usually on windows at link time you need a .lib file, called the import library. The .dll is used at runtime, not at link time. This is different from *nix, where the .so serves both purposes. The link you posted mentions import files, so I hope this is helpful information.

Best,
Michael

On Wed, Jan 27, 2016, 03:39 G Young wrote:

> Hello all,
>
> I'm trying to update the documentation for building Numpy from source, and
> I've hit a brick wall in trying to build the library using OpenBLAS because
> I can't seem to link the libopenblas.dll file. I tried following the
> suggestion of placing the DLL in numpy/core as suggested here
> but it
> still doesn't pick it up. What am I doing wrong?
>
> Thanks,
>
> Greg
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From gfyoung17 at gmail.com  Wed Jan 27 08:40:28 2016
From: gfyoung17 at gmail.com (G Young)
Date: Wed, 27 Jan 2016 13:40:28 +0000
Subject: [Numpy-discussion] Building Numpy with OpenBLAS
In-Reply-To:
References:
Message-ID:

I do have my site.cfg file pointing to my library which contains a .lib file along with the appropriate include_dirs parameter. However, NumPy can't seem to find / use the DLL file no matter where I put it (numpy/core, same directory as openblas.lib). By the way, I should mention that I am using a slightly dated version of OpenBLAS (0.2.9), but that shouldn't have any effect I would imagine.
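For reference, the configuration in question looks roughly like this (a
sketch; the paths are illustrative, not the actual ones from this thread).
A site.cfg in the numpy source root:

    [openblas]
    libraries = openblas
    library_dirs = C:\opt\openblas\lib
    include_dirs = C:\opt\openblas\include

and, if only libopenblas.dll is at hand, the MinGW-w64 tools can generate
the import library Michael describes (assuming gendef and dlltool are on
the PATH):

    gendef libopenblas.dll
    dlltool --dllname libopenblas.dll --def libopenblas.def --output-lib libopenblas.lib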
Greg On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan wrote: > I'm not sure about the mingw tool chain, but usually on windows at link > time you need a .lib file, called the import library. The .dll is used at > runtime, not at link time. This is different from *nix, where the .so > serves both purposes. The link you posted mentions import files, so I hope > this is helpful information. > > Best, > Michael > > On Wed, Jan 27, 2016, 03:39 G Young wrote: > >> Hello all, >> >> I'm trying to update the documentation for building Numpy from source, >> and I've hit a brick wall in trying to build the library using OpenBLAS >> because I can't seem to link the libopenblas.dll file. I tried following >> the suggestion of placing the DLL in numpy/core as suggested here >> but >> it still doesn't pick it up. What am I doing wrong? >> >> Thanks, >> >> Greg >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From msarahan at gmail.com Wed Jan 27 08:58:10 2016 From: msarahan at gmail.com (Michael Sarahan) Date: Wed, 27 Jan 2016 13:58:10 +0000 Subject: [Numpy-discussion] Building Numpy with OpenBLAS In-Reply-To: References: Message-ID: When you say find/use, can you please clarify whether you have completed the compilation/linking successfully? I'm not clear on exactly when you're having problems. What is the error output? One very helpful tool in diagnosing dll problems is dependency walker: http://www.dependencywalker.com/ It may be that your openblas has a dependency that it can't load for some reason. Dependency walker works on .pyd files as well as .dll files. Hth, Michael On Wed, Jan 27, 2016, 07:40 G Young wrote: > I do have my site.cfg file pointing to my library which contains a .lib > file along with the appropriate include_dirs parameter. However, NumPy > can't seem to find / use the DLL file no matter where I put it (numpy/core, > same directory as openblas.lib). By the way, I should mention that I am > using a slightly dated version of OpenBLAS (0.2.9), but that shouldn't have > any effect I would imagine. > > Greg > > On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan > wrote: > >> I'm not sure about the mingw tool chain, but usually on windows at link >> time you need a .lib file, called the import library. The .dll is used at >> runtime, not at link time. This is different from *nix, where the .so >> serves both purposes. The link you posted mentions import files, so I hope >> this is helpful information. >> >> Best, >> Michael >> >> On Wed, Jan 27, 2016, 03:39 G Young wrote: >> >>> Hello all, >>> >>> I'm trying to update the documentation for building Numpy from source, >>> and I've hit a brick wall in trying to build the library using OpenBLAS >>> because I can't seem to link the libopenblas.dll file. I tried following >>> the suggestion of placing the DLL in numpy/core as suggested here >>> but >>> it still doesn't pick it up. What am I doing wrong? 
>>> >>> Thanks, >>> >>> Greg >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gfyoung17 at gmail.com Wed Jan 27 09:19:52 2016 From: gfyoung17 at gmail.com (G Young) Date: Wed, 27 Jan 2016 14:19:52 +0000 Subject: [Numpy-discussion] Building Numpy with OpenBLAS In-Reply-To: References: Message-ID: NumPy will "build" successfully, but then when I type "import numpy", it cannot import the multiarray PYD file. I am using dependency walker, and that's how I know it's the libopenblas.dll file that it's not linking to properly, hence my original question. Greg On Wed, Jan 27, 2016 at 1:58 PM, Michael Sarahan wrote: > When you say find/use, can you please clarify whether you have completed > the compilation/linking successfully? I'm not clear on exactly when you're > having problems. What is the error output? > > One very helpful tool in diagnosing dll problems is dependency walker: > http://www.dependencywalker.com/ > > It may be that your openblas has a dependency that it can't load for some > reason. Dependency walker works on .pyd files as well as .dll files. > > Hth, > Michael > > On Wed, Jan 27, 2016, 07:40 G Young wrote: > >> I do have my site.cfg file pointing to my library which contains a .lib >> file along with the appropriate include_dirs parameter. However, NumPy >> can't seem to find / use the DLL file no matter where I put it (numpy/core, >> same directory as openblas.lib). By the way, I should mention that I am >> using a slightly dated version of OpenBLAS (0.2.9), but that shouldn't have >> any effect I would imagine. >> >> Greg >> >> On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan >> wrote: >> >>> I'm not sure about the mingw tool chain, but usually on windows at link >>> time you need a .lib file, called the import library. The .dll is used at >>> runtime, not at link time. This is different from *nix, where the .so >>> serves both purposes. The link you posted mentions import files, so I hope >>> this is helpful information. >>> >>> Best, >>> Michael >>> >>> On Wed, Jan 27, 2016, 03:39 G Young wrote: >>> >>>> Hello all, >>>> >>>> I'm trying to update the documentation for building Numpy from source, >>>> and I've hit a brick wall in trying to build the library using OpenBLAS >>>> because I can't seem to link the libopenblas.dll file. I tried following >>>> the suggestion of placing the DLL in numpy/core as suggested here >>>> but >>>> it still doesn't pick it up. What am I doing wrong? 
>>>> >>>> Thanks, >>>> >>>> Greg >>>> _______________________________________________ >>>> NumPy-Discussion mailing list >>>> NumPy-Discussion at scipy.org >>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>> >>> >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >>> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Jan 27 09:48:21 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 27 Jan 2016 15:48:21 +0100 Subject: [Numpy-discussion] Building Numpy with OpenBLAS In-Reply-To: References: Message-ID: On Wed, Jan 27, 2016 at 3:19 PM, G Young wrote: > NumPy will "build" successfully, but then when I type "import numpy", it > cannot import the multiarray PYD file. > > I am using dependency walker, and that's how I know it's the > libopenblas.dll file that it's not linking to properly, hence my original > question. > The support for MingwPy in numpy.distutils had to be temporarily reverted (see https://github.com/numpy/numpy/pull/6536), because the patch caused other issues. So likely it just won't work out of the box now. If you need it, maybe you can reapply that reverted patch. But otherwise I'd wait a little bit; we'll sort out the MingwPy build in the near future. Ralf > Greg > > On Wed, Jan 27, 2016 at 1:58 PM, Michael Sarahan > wrote: > >> When you say find/use, can you please clarify whether you have completed >> the compilation/linking successfully? I'm not clear on exactly when you're >> having problems. What is the error output? >> >> One very helpful tool in diagnosing dll problems is dependency walker: >> http://www.dependencywalker.com/ >> >> It may be that your openblas has a dependency that it can't load for some >> reason. Dependency walker works on .pyd files as well as .dll files. >> >> Hth, >> Michael >> >> On Wed, Jan 27, 2016, 07:40 G Young wrote: >> >>> I do have my site.cfg file pointing to my library which contains a .lib >>> file along with the appropriate include_dirs parameter. However, NumPy >>> can't seem to find / use the DLL file no matter where I put it (numpy/core, >>> same directory as openblas.lib). By the way, I should mention that I am >>> using a slightly dated version of OpenBLAS (0.2.9), but that shouldn't have >>> any effect I would imagine. >>> >>> Greg >>> >>> On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan >>> wrote: >>> >>>> I'm not sure about the mingw tool chain, but usually on windows at link >>>> time you need a .lib file, called the import library. The .dll is used at >>>> runtime, not at link time. This is different from *nix, where the .so >>>> serves both purposes. The link you posted mentions import files, so I hope >>>> this is helpful information. 
>>>> >>>> Best, >>>> Michael >>>> >>>> On Wed, Jan 27, 2016, 03:39 G Young wrote: >>>> >>>>> Hello all, >>>>> >>>>> I'm trying to update the documentation for building Numpy from source, >>>>> and I've hit a brick wall in trying to build the library using OpenBLAS >>>>> because I can't seem to link the libopenblas.dll file. I tried following >>>>> the suggestion of placing the DLL in numpy/core as suggested here >>>>> but >>>>> it still doesn't pick it up. What am I doing wrong? >>>>> >>>>> Thanks, >>>>> >>>>> Greg >>>>> _______________________________________________ >>>>> NumPy-Discussion mailing list >>>>> NumPy-Discussion at scipy.org >>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>>> >>>> >>>> _______________________________________________ >>>> NumPy-Discussion mailing list >>>> NumPy-Discussion at scipy.org >>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>> >>>> >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gfyoung17 at gmail.com Wed Jan 27 09:51:10 2016 From: gfyoung17 at gmail.com (G Young) Date: Wed, 27 Jan 2016 14:51:10 +0000 Subject: [Numpy-discussion] Building Numpy with OpenBLAS In-Reply-To: References: Message-ID: I don't need it at this point. I'm just going through the exercise for purposes of updating building from source on Windows. But that's good to know though. Thanks! Greg On Wed, Jan 27, 2016 at 2:48 PM, Ralf Gommers wrote: > > > On Wed, Jan 27, 2016 at 3:19 PM, G Young wrote: > >> NumPy will "build" successfully, but then when I type "import numpy", it >> cannot import the multiarray PYD file. >> >> I am using dependency walker, and that's how I know it's the >> libopenblas.dll file that it's not linking to properly, hence my original >> question. >> > > The support for MingwPy in numpy.distutils had to be temporarily reverted > (see https://github.com/numpy/numpy/pull/6536), because the patch caused > other issues. So likely it just won't work out of the box now. If you need > it, maybe you can reapply that reverted patch. But otherwise I'd wait a > little bit; we'll sort out the MingwPy build in the near future. > > Ralf > > > >> Greg >> >> On Wed, Jan 27, 2016 at 1:58 PM, Michael Sarahan >> wrote: >> >>> When you say find/use, can you please clarify whether you have completed >>> the compilation/linking successfully? I'm not clear on exactly when you're >>> having problems. What is the error output? >>> >>> One very helpful tool in diagnosing dll problems is dependency walker: >>> http://www.dependencywalker.com/ >>> >>> It may be that your openblas has a dependency that it can't load for >>> some reason. Dependency walker works on .pyd files as well as .dll files. >>> >>> Hth, >>> Michael >>> >>> On Wed, Jan 27, 2016, 07:40 G Young wrote: >>> >>>> I do have my site.cfg file pointing to my library which contains a .lib >>>> file along with the appropriate include_dirs parameter. 
However, NumPy >>>> can't seem to find / use the DLL file no matter where I put it (numpy/core, >>>> same directory as openblas.lib). By the way, I should mention that I am >>>> using a slightly dated version of OpenBLAS (0.2.9), but that shouldn't have >>>> any effect I would imagine. >>>> >>>> Greg >>>> >>>> On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan >>>> wrote: >>>> >>>>> I'm not sure about the mingw tool chain, but usually on windows at >>>>> link time you need a .lib file, called the import library. The .dll is >>>>> used at runtime, not at link time. This is different from *nix, where the >>>>> .so serves both purposes. The link you posted mentions import files, so I >>>>> hope this is helpful information. >>>>> >>>>> Best, >>>>> Michael >>>>> >>>>> On Wed, Jan 27, 2016, 03:39 G Young wrote: >>>>> >>>>>> Hello all, >>>>>> >>>>>> I'm trying to update the documentation for building Numpy from >>>>>> source, and I've hit a brick wall in trying to build the library using >>>>>> OpenBLAS because I can't seem to link the libopenblas.dll file. I tried >>>>>> following the suggestion of placing the DLL in numpy/core as suggested >>>>>> here >>>>>> but >>>>>> it still doesn't pick it up. What am I doing wrong? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Greg >>>>>> _______________________________________________ >>>>>> NumPy-Discussion mailing list >>>>>> NumPy-Discussion at scipy.org >>>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>>>> >>>>> >>>>> _______________________________________________ >>>>> NumPy-Discussion mailing list >>>>> NumPy-Discussion at scipy.org >>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>>> >>>>> >>>> _______________________________________________ >>>> NumPy-Discussion mailing list >>>> NumPy-Discussion at scipy.org >>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>> >>> >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >>> >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Jan 27 10:01:34 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 27 Jan 2016 16:01:34 +0100 Subject: [Numpy-discussion] Building Numpy with OpenBLAS In-Reply-To: References: Message-ID: On Wed, Jan 27, 2016 at 3:51 PM, G Young wrote: > I don't need it at this point. I'm just going through the exercise for > purposes of updating building from source on Windows. But that's good to > know though. Thanks! > That effort is much appreciated by the way. Updating the build info on all platforms on http://scipy.org/scipylib/building/index.html is a significant amount of work, and it has never been in a state one could call complete. So more contributions definitely welcome! Ralf > > Greg > > On Wed, Jan 27, 2016 at 2:48 PM, Ralf Gommers > wrote: > >> >> >> On Wed, Jan 27, 2016 at 3:19 PM, G Young wrote: >> >>> NumPy will "build" successfully, but then when I type "import numpy", it >>> cannot import the multiarray PYD file. 
>>> >>> I am using dependency walker, and that's how I know it's the >>> libopenblas.dll file that it's not linking to properly, hence my original >>> question. >>> >> >> The support for MingwPy in numpy.distutils had to be temporarily reverted >> (see https://github.com/numpy/numpy/pull/6536), because the patch caused >> other issues. So likely it just won't work out of the box now. If you need >> it, maybe you can reapply that reverted patch. But otherwise I'd wait a >> little bit; we'll sort out the MingwPy build in the near future. >> >> Ralf >> >> >> >>> Greg >>> >>> On Wed, Jan 27, 2016 at 1:58 PM, Michael Sarahan >>> wrote: >>> >>>> When you say find/use, can you please clarify whether you have >>>> completed the compilation/linking successfully? I'm not clear on exactly >>>> when you're having problems. What is the error output? >>>> >>>> One very helpful tool in diagnosing dll problems is dependency walker: >>>> http://www.dependencywalker.com/ >>>> >>>> It may be that your openblas has a dependency that it can't load for >>>> some reason. Dependency walker works on .pyd files as well as .dll files. >>>> >>>> Hth, >>>> Michael >>>> >>>> On Wed, Jan 27, 2016, 07:40 G Young wrote: >>>> >>>>> I do have my site.cfg file pointing to my library which contains a >>>>> .lib file along with the appropriate include_dirs parameter. However, >>>>> NumPy can't seem to find / use the DLL file no matter where I put it >>>>> (numpy/core, same directory as openblas.lib). By the way, I should mention >>>>> that I am using a slightly dated version of OpenBLAS (0.2.9), but that >>>>> shouldn't have any effect I would imagine. >>>>> >>>>> Greg >>>>> >>>>> On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan >>>>> wrote: >>>>> >>>>>> I'm not sure about the mingw tool chain, but usually on windows at >>>>>> link time you need a .lib file, called the import library. The .dll is >>>>>> used at runtime, not at link time. This is different from *nix, where the >>>>>> .so serves both purposes. The link you posted mentions import files, so I >>>>>> hope this is helpful information. >>>>>> >>>>>> Best, >>>>>> Michael >>>>>> >>>>>> On Wed, Jan 27, 2016, 03:39 G Young wrote: >>>>>> >>>>>>> Hello all, >>>>>>> >>>>>>> I'm trying to update the documentation for building Numpy from >>>>>>> source, and I've hit a brick wall in trying to build the library using >>>>>>> OpenBLAS because I can't seem to link the libopenblas.dll file. I tried >>>>>>> following the suggestion of placing the DLL in numpy/core as suggested >>>>>>> here >>>>>>> but >>>>>>> it still doesn't pick it up. What am I doing wrong? 
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>> Greg
>>>>>>> _______________________________________________
>>>>>>> NumPy-Discussion mailing list
>>>>>>> NumPy-Discussion at scipy.org
>>>>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From derek at astro.physik.uni-goettingen.de  Wed Jan 27 10:37:12 2016
From: derek at astro.physik.uni-goettingen.de (Derek Homeier)
Date: Wed, 27 Jan 2016 16:37:12 +0100
Subject: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out
In-Reply-To:
References:
Message-ID:

On 27 Jan 2016, at 2:58 AM, Charles R Harris wrote:
>
> FWIW, the maintenance/1.11.x branch (there is no tag for the beta?) builds and passes all tests with Python 2.7.11
> and 3.5.1 on Mac OS X 10.10.
>
>
> You probably didn't fetch the tags, if they can't be reached from the branch head they don't download automatically. Try `git fetch --tags upstream`

Thanks, that did it. Successfully tested v1.11.0b1 on 10.11 and with Python 2.7.8 and 3.4.1 on openSUSE 13.2 as well.

Derek

From sebastian at sipsolutions.net  Wed Jan 27 13:26:17 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Wed, 27 Jan 2016 19:26:17 +0100
Subject: [Numpy-discussion] Bump warning stacklevel
Message-ID: <1453919177.19979.5.camel@sipsolutions.net>

Hi all,

in my PR about warnings suppression, I currently also have a commit which bumps the warning stacklevel to two (or three), i.e. use:

    warnings.warn(..., stacklevel=2)

(almost) everywhere. This means that, for example (taking only the empty-slice warning):

    np.mean([])

would not print:

    /usr/lib/python2.7/dist-packages/numpy/core/_methods.py:55:
    RuntimeWarning: Mean of empty slice.
      warnings.warn("Mean of empty slice.", RuntimeWarning)

but instead print the actual `np.mean([])` code line (the repetition of the warning command is always a bit funny).

The advantage is nicer printing for the user.

The disadvantage would probably mostly be that existing warning filters that use the `module` keyword argument will fail.

Any objections/thoughts about doing this change to try to better report the offending code line? Frankly, I am not sure whether there is a python standard about this, but I would expect that for a library such as numpy it makes sense to change.
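As a sketch of the effect (with a stand-in function, not the actual numpy
internals):

    import warnings

    def toy_mean(seq):
        # stand-in for a library function such as np.mean
        if len(seq) == 0:
            # stacklevel=2 attributes the warning to the caller's line
            # rather than to this line inside the library.
            warnings.warn("Mean of empty slice.", RuntimeWarning,
                          stacklevel=2)
            return float('nan')
        return float(sum(seq)) / len(seq)

    toy_mean([])  # the printed warning now points at this call site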
But, if downstream uses warning filters with modules, we might want to reconsider, for example.

- Sebastian
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 819 bytes
Desc: This is a digitally signed message part
URL:

From cmkleffner at gmail.com  Wed Jan 27 13:27:41 2016
From: cmkleffner at gmail.com (Carl Kleffner)
Date: Wed, 27 Jan 2016 19:27:41 +0100
Subject: [Numpy-discussion] Building Numpy with OpenBLAS
In-Reply-To:
References:
Message-ID:

According to my timeplan, the next mingwpy PR on numpy master should land at the weekend, together with a build description. This PR is targeted at building numpy with OpenBLAS.

Carl

2016-01-27 16:01 GMT+01:00 Ralf Gommers :

>
>
> On Wed, Jan 27, 2016 at 3:51 PM, G Young wrote:
>
>> I don't need it at this point. I'm just going through the exercise for
>> purposes of updating building from source on Windows. But that's good to
>> know though. Thanks!
>>
>
> That effort is much appreciated by the way. Updating the build info on all
> platforms on http://scipy.org/scipylib/building/index.html is a
> significant amount of work, and it has never been in a state one could call
> complete. So more contributions definitely welcome!
>
> Ralf
>
>
>
>>
>> Greg
>>
>> On Wed, Jan 27, 2016 at 2:48 PM, Ralf Gommers
>> wrote:
>>
>>>
>>>
>>> On Wed, Jan 27, 2016 at 3:19 PM, G Young wrote:
>>>
>>>> NumPy will "build" successfully, but then when I type "import numpy",
>>>> it cannot import the multiarray PYD file.
>>>>
>>>> I am using dependency walker, and that's how I know it's the
>>>> libopenblas.dll file that it's not linking to properly, hence my original
>>>> question.
>>>>
>>>
>>> The support for MingwPy in numpy.distutils had to be temporarily
>>> reverted (see https://github.com/numpy/numpy/pull/6536), because the
>>> patch caused other issues. So likely it just won't work out of the box now.
>>> If you need it, maybe you can reapply that reverted patch. But otherwise
>>> I'd wait a little bit; we'll sort out the MingwPy build in the near future.
>>>
>>> Ralf
>>>
>>>
>>>
>>>> Greg
>>>>
>>>> On Wed, Jan 27, 2016 at 1:58 PM, Michael Sarahan
>>>> wrote:
>>>>
>>>>> When you say find/use, can you please clarify whether you have
>>>>> completed the compilation/linking successfully? I'm not clear on exactly
>>>>> when you're having problems. What is the error output?
>>>>>
>>>>> One very helpful tool in diagnosing dll problems is dependency walker:
>>>>> http://www.dependencywalker.com/
>>>>>
>>>>> It may be that your openblas has a dependency that it can't load for
>>>>> some reason. Dependency walker works on .pyd files as well as .dll files.
>>>>>
>>>>> Hth,
>>>>> Michael
>>>>>
>>>>> On Wed, Jan 27, 2016, 07:40 G Young wrote:
>>>>>
>>>>>> I do have my site.cfg file pointing to my library which contains a
>>>>>> .lib file along with the appropriate include_dirs parameter. However,
>>>>>> NumPy can't seem to find / use the DLL file no matter where I put it
>>>>>> (numpy/core, same directory as openblas.lib). By the way, I should mention
>>>>>> that I am using a slightly dated version of OpenBLAS (0.2.9), but that
>>>>>> shouldn't have any effect I would imagine.
>>>>>>
>>>>>> Greg
>>>>>>
>>>>>> On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan
>>>>>> wrote:
>>>>>>
>>>>>>> I'm not sure about the mingw tool chain, but usually on windows at
>>>>>>> link time you need a .lib file, called the import library. The .dll is
>>>>>>> used at runtime, not at link time.
This is different from *nix, where the >>>>>>> .so serves both purposes. The link you posted mentions import files, so I >>>>>>> hope this is helpful information. >>>>>>> >>>>>>> Best, >>>>>>> Michael >>>>>>> >>>>>>> On Wed, Jan 27, 2016, 03:39 G Young wrote: >>>>>>> >>>>>>>> Hello all, >>>>>>>> >>>>>>>> I'm trying to update the documentation for building Numpy from >>>>>>>> source, and I've hit a brick wall in trying to build the library using >>>>>>>> OpenBLAS because I can't seem to link the libopenblas.dll file. I tried >>>>>>>> following the suggestion of placing the DLL in numpy/core as suggested >>>>>>>> here >>>>>>>> but >>>>>>>> it still doesn't pick it up. What am I doing wrong? >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Greg >>>>>>>> _______________________________________________ >>>>>>>> NumPy-Discussion mailing list >>>>>>>> NumPy-Discussion at scipy.org >>>>>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>>>>>> >>>>>>> >>>>>>> _______________________________________________ >>>>>>> NumPy-Discussion mailing list >>>>>>> NumPy-Discussion at scipy.org >>>>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>>>>> >>>>>>> >>>>>> _______________________________________________ >>>>>> NumPy-Discussion mailing list >>>>>> NumPy-Discussion at scipy.org >>>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>>>> >>>>> >>>>> _______________________________________________ >>>>> NumPy-Discussion mailing list >>>>> NumPy-Discussion at scipy.org >>>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>>> >>>>> >>>> >>>> _______________________________________________ >>>> NumPy-Discussion mailing list >>>> NumPy-Discussion at scipy.org >>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>>> >>>> >>> >>> _______________________________________________ >>> NumPy-Discussion mailing list >>> NumPy-Discussion at scipy.org >>> https://mail.scipy.org/mailman/listinfo/numpy-discussion >>> >>> >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion >> >> > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Wed Jan 27 15:01:32 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 27 Jan 2016 21:01:32 +0100 Subject: [Numpy-discussion] Bump warning stacklevel In-Reply-To: <1453919177.19979.5.camel@sipsolutions.net> References: <1453919177.19979.5.camel@sipsolutions.net> Message-ID: On Wed, Jan 27, 2016 at 7:26 PM, Sebastian Berg wrote: > Hi all, > > in my PR about warnings suppression, I currently also have a commit > which bumps the warning stacklevel to two (or three), i.e. use: > > warnings.warn(..., stacklevel=2) > > (almost) everywhere. This means that for example (take only the empty > warning): > > np.mean([]) > > would not print: > > /usr/lib/python2.7/dist-packages/numpy/core/_methods.py:55: > RuntimeWarning: Mean of empty slice. > warnings.warn("Mean of empty slice.", RuntimeWarning) > > but instead print the actual `np.mean([])` code line (the repetition of > the warning command is always a bit funny). > > The advantage is nicer printing for the user. 
>
> The disadvantage would probably mostly be that existing warning filters
> that use the `module` keyword argument will fail.
>
> Any objections/thoughts about making this change to better report the
> offending code line?

This has annoyed me for a long time; it's hard right now to figure out where warnings really come from, especially when running something large like scipy.test(). So +1.

> Frankly, I am not sure whether there is a Python standard about this,
> but I would expect that for a library such as numpy it makes sense to
> change. But, if downstream uses warning filters with modules, we might
> want to reconsider, for example.

There probably are usages of `module`, but I'd expect it to be used a lot less than `category` or `message`. A quick search through the scipy repo gave me only a single case where `module` was used, and that's in deprecated weave code, so soon the count is zero. Also, even for relevant usage, nothing will break in a bad way - some more noise or a spurious test failure in numpy-using code isn't the end of the world, I'd say.

One issue will be how to keep this consistent. `stacklevel` is used so rarely that new PRs will always omit it for new warnings. Will we just rely on code review, or would a private wrapper around `warn` to use inside numpy, plus a test that checks that the wrapper is used everywhere, be helpful here?

Ralf
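To make the effect of `stacklevel` concrete, here is a small self-contained sketch (the function name is made up for illustration; this is not taken from the numpy sources):

    import warnings

    def library_func():
        # Without stacklevel, the warning is attributed to the next line,
        # i.e. to a line inside the "library".  With stacklevel=2 it is
        # attributed to the caller's line instead.
        warnings.warn("Mean of empty slice.", RuntimeWarning, stacklevel=2)

    library_func()  # <- with stacklevel=2, *this* is the reported line

A private numpy wrapper around warnings.warn could enforce the same thing: since the wrapper adds one frame of its own, it would pass the caller-supplied stacklevel bumped by one, so individual call sites cannot forget it.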
From njs at pobox.com  Wed Jan 27 15:23:34 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Wed, 27 Jan 2016 12:23:34 -0800
Subject: [Numpy-discussion] Bump warning stacklevel
In-Reply-To: <1453919177.19979.5.camel@sipsolutions.net>
References: <1453919177.19979.5.camel@sipsolutions.net>
Message-ID:

+1

On Wed, Jan 27, 2016 at 10:26 AM, Sebastian Berg wrote:
> in my PR about warnings suppression, I currently also have a commit
> which bumps the warning stacklevel to two (or three) [...]

-- 
Nathaniel J. Smith -- https://vorpus.org

From gfyoung17 at gmail.com  Wed Jan 27 15:30:34 2016
From: gfyoung17 at gmail.com (G Young)
Date: Wed, 27 Jan 2016 20:30:34 +0000
Subject: [Numpy-discussion] Building Numpy with OpenBLAS
In-Reply-To: References: Message-ID:

That's great! I look forward to seeing that.

Greg

On Wed, Jan 27, 2016 at 6:27 PM, Carl Kleffner wrote:
> According to my plan, the next mingwpy PR against numpy master should
> land this weekend, together with a build description. This PR is
> targeted at building numpy with OpenBLAS. [...]
From sebastian at sipsolutions.net  Wed Jan 27 17:02:04 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Wed, 27 Jan 2016 23:02:04 +0100
Subject: [Numpy-discussion] Bump warning stacklevel
In-Reply-To: References: <1453919177.19979.5.camel@sipsolutions.net>
Message-ID: <2262a25756f1c2f43abe3b343d0ceb1e@sipsolutions.net>

On 2016-01-27 21:01, Ralf Gommers wrote:
> One issue will be how to keep this consistent. `stacklevel` is used so
> rarely that new PRs will always omit it for new warnings. Will we just
> rely on code review, or would a private wrapper around `warn` to use
> inside numpy, plus a test that checks that the wrapper is used
> everywhere, be helpful here?
>> warnings.warn("Mean of empty slice.", RuntimeWarning) >> >> but instead print the actual `np.mean([])` code line (the repetition >> of >> the warning command is always a bit funny). >> >> The advantage is nicer printing for the user. >> >> The disadvantage would probably mostly be that existing warning >> filters >> that use the `module` keyword argument, will fail. >> >> Any objections/thoughts about doing this change to try to better >> report >> the offending code line? > > This has annoyed me for a long time, it's hard now to figure out where > warnings really come from. Especially when running something large > like scipy.test(). So +1. > >> Frankly, I am not sure whether there might be >> a python standard about this, but I would expect that for a library >> such as numpy, it makes sense to change. But, if downstream uses >> warning filters with modules, we might want to reconsider for >> example. > > There probably are usages of `module`, but I'd expect that it's used a > lot less than `category` or `message`. A quick search through the > scipy repo gave me only a single case where `module` was used, and > that's in deprecated weave code so soon the count is zero. Also, even > for relevant usage, nothing will break in a bad way - some more noise > or a spurious test failure in numpy-using code isn't the end of the > world I'd say. > > One issue will be how to keep this consistent. `stacklevel` is used so > rarely that new PRs will always omit it for new warnings. Will we just > rely on code review, or would a private wrapper around `warn` to use > inside numpy plus a test that checks that the wrapper is used > everywhere be helpful here? > Yeah, I mean you could add tests for the individual functions in principle. I am not sure if adding an alias helps much, how are we going to test that warnings.warn is not being used? Seems like quite a bit of voodoo necessary for that. - Sebastian > Ralf > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion From ralf.gommers at gmail.com Wed Jan 27 17:07:33 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 27 Jan 2016 23:07:33 +0100 Subject: [Numpy-discussion] Bump warning stacklevel In-Reply-To: <2262a25756f1c2f43abe3b343d0ceb1e@sipsolutions.net> References: <1453919177.19979.5.camel@sipsolutions.net> <2262a25756f1c2f43abe3b343d0ceb1e@sipsolutions.net> Message-ID: On Wed, Jan 27, 2016 at 11:02 PM, sebastian wrote: > On 2016-01-27 21:01, Ralf Gommers wrote: > >> >> One issue will be how to keep this consistent. `stacklevel` is used so >> rarely that new PRs will always omit it for new warnings. Will we just >> rely on code review, or would a private wrapper around `warn` to use >> inside numpy plus a test that checks that the wrapper is used >> everywhere be helpful here? >> >> > Yeah, I mean you could add tests for the individual functions in principle. > I am not sure if adding an alias helps much, how are we going to test that > warnings.warn is not being used? Seems like quite a bit of voodoo necessary > for that. > I was thinking something along these lines, but with a regexp checking for warnings.warn: https://github.com/scipy/scipy/blob/master/scipy/fftpack/tests/test_import.py Probably more trouble than it's worth though. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
From ben.v.root at gmail.com  Wed Jan 27 17:12:53 2016
From: ben.v.root at gmail.com (Benjamin Root)
Date: Wed, 27 Jan 2016 17:12:53 -0500
Subject: [Numpy-discussion] Bump warning stacklevel
In-Reply-To: References: <1453919177.19979.5.camel@sipsolutions.net> <2262a25756f1c2f43abe3b343d0ceb1e@sipsolutions.net>
Message-ID:

I like the idea of bumping the stacklevel in principle, but I am not sure it is all that practical. For example, if a warning comes up when doing "x / y", I am assuming that it is emitted from within the ufunc np.divide(). So you would need two different stacklevels depending on whether the entry point was the operator or a direct call to np.divide()? Also, I would imagine it might get weird for numpy functions called within other numpy functions. Or perhaps I am not totally understanding how this would be done?

Ben Root

From sebastian at sipsolutions.net  Thu Jan 28 03:18:25 2016
From: sebastian at sipsolutions.net (Sebastian Berg)
Date: Thu, 28 Jan 2016 09:18:25 +0100
Subject: [Numpy-discussion] Bump warning stacklevel
In-Reply-To: References: <1453919177.19979.5.camel@sipsolutions.net> <2262a25756f1c2f43abe3b343d0ceb1e@sipsolutions.net>
Message-ID: <1453969105.19979.13.camel@sipsolutions.net>

On Mi, 2016-01-27 at 17:12 -0500, Benjamin Root wrote:
> I like the idea of bumping the stacklevel in principle, but I am not
> sure it is all that practical. [...]

You are right, of course, and the answer is that I was never planning on fixing that. This is only for warnings issued by functions directly called by the user (or those which always go through exactly one more level, though I did not check the C-api for such funcs). Also, C-api warnings are already correct in this regard automatically.

Anyway, I still think it is worth it, even if in practice a lot of warnings are things like ufunc warnings raised from inside a python function.
And there is no real way to change that, as far as I am aware, unless maybe we add a warning_stacklevel argument to those C-api funcs ;).

- Sebastian

From jfoxrabinovitz at gmail.com  Thu Jan 28 14:00:25 2016
From: jfoxrabinovitz at gmail.com (Joseph Fox-Rabinovitz)
Date: Thu, 28 Jan 2016 14:00:25 -0500
Subject: [Numpy-discussion] Proposal for IQR function
Message-ID:

I have created an IQR function to add to the other dispersion metrics such as the standard deviation. I have described the purpose and nature of the proposal in PR#7137, so I am pasting the text here as well:

Motivation
----------

This function is used in one place in numpy already (to compute the Freedman-Diaconis histogram bin estimator), in addition to being requested on Stack Overflow a couple of times:

- http://stackoverflow.com/questions/23228244/how-do-you-find-the-iqr-in-numpy
- http://stackoverflow.com/questions/27472330/how-should-the-interquartile-range-be-calculated-in-python

It is also used in matplotlib for box and violin plots:
http://matplotlib.org/faq/howto_faq.html#interpreting-box-plots-and-violin-plots.
It is a very simple, common and robust dispersion estimator. There does not appear to be an implementation of it anywhere in numpy or scipy.

About
-----

This function is a convenience combination of `np.percentile` and `np.subtract`. As such, it allows the difference between any two percentiles to be computed, not necessarily (25, 75), which is the default. All of the recent enhancements to percentile are used. The documentation and testing are borrowed heavily from `np.percentile`.

Wikipedia reference: https://en.wikipedia.org/wiki/Interquartile_range

Note
----

The tests will not pass until the bug-fix for the `np.percentile` kwarg `interpolation='midpoint'` (#7129) is incorporated and this PR is rebased.
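Since the PR text describes the function as a convenience combination of `np.percentile` and `np.subtract`, its core can be sketched in a few lines (an illustration of the idea, not the code from PR#7137; the signature here is made up):

    import numpy as np

    def iqr(a, rng=(25, 75), axis=None, interpolation='linear'):
        """Difference between the two given percentiles (default: the IQR)."""
        lo, hi = np.percentile(a, rng, axis=axis, interpolation=interpolation)
        return np.subtract(hi, lo)

    # For 0..9 the 25th/75th percentiles are 2.25 and 6.75, so:
    print(iqr(np.arange(10)))   # -> 4.5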
From charlesr.harris at gmail.com  Thu Jan 28 15:51:49 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Thu, 28 Jan 2016 13:51:49 -0700
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
Message-ID:

Hi All,

I am pleased to announce the Numpy 1.11.0b2 release. The first beta was a damp squib due to missing files in the released source archives; this release fixes that. The new source files may be downloaded from SourceForge; no binaries will be released until the mingw toolchain problems are sorted out.

Please test and report any problems.

Chuck

From matthew.brett at gmail.com  Thu Jan 28 16:29:11 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Thu, 28 Jan 2016 13:29:11 -0800
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

Hi,

On Thu, Jan 28, 2016 at 12:51 PM, Charles R Harris wrote:
> I am pleased to announce the Numpy 1.11.0b2 release. [...]
> Please test and report any problems.

OSX wheels build OK:
https://travis-ci.org/MacPython/numpy-wheels/builds/105521850

Y'all can test with:

    pip install --pre --trusted-host wheels.scipy.org -f http://wheels.scipy.org numpy

Cheers,

Matthew

From njs at pobox.com  Thu Jan 28 16:36:12 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 28 Jan 2016 13:36:12 -0800
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

Maybe we should upload to pypi? This allows us to upload binaries for OSX at least, and in general will make the beta available to anyone who does 'pip install --pre numpy'. (But not regular 'pip install numpy', because pip is clever enough to recognize that this is a prerelease and should not be used by default.)

(For bonus points, start a campaign to convince everyone to add --pre to their CI setups, so that merely uploading a prerelease will ensure that it starts getting tested automatically.)

On Jan 28, 2016 12:51 PM, "Charles R Harris" wrote:
> I am pleased to announce the Numpy 1.11.0b2 release. [...]
From vanzod at accre.vanderbilt.edu  Thu Jan 28 16:43:11 2016
From: vanzod at accre.vanderbilt.edu (Davide Vanzo)
Date: Thu, 28 Jan 2016 15:43:11 -0600
Subject: [Numpy-discussion] Problem with NumPy 1.10.4 with ATLAS on Python 2.7.8
Message-ID: <1454017391.8709.6.camel@accre.vanderbilt.edu>

Hi all,

I recently upgraded NumPy from 1.9.1 to 1.10.4 on Python 2.7.8 by using pip. As always, I specified the paths to BLAS, LAPACK and ATLAS in the respective environment variables, and I used the same compiler I used to compile both Python and the libraries (GCC 4.6.1). The problem is that it always tries to get BLAS symbols from the wrong library:

    >>> import numpy
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/local/python2/2.7.8/x86_64/gcc46/New_build/lib/python2.7/site-packages/numpy/__init__.py", line 180, in <module>
        from . import add_newdocs
      File "/usr/local/python2/2.7.8/x86_64/gcc46/New_build/lib/python2.7/site-packages/numpy/add_newdocs.py", line 13, in <module>
        from numpy.lib import add_newdoc
      File "/usr/local/python2/2.7.8/x86_64/gcc46/New_build/lib/python2.7/site-packages/numpy/lib/__init__.py", line 8, in <module>
        from .type_check import *
      File "/usr/local/python2/2.7.8/x86_64/gcc46/New_build/lib/python2.7/site-packages/numpy/lib/type_check.py", line 11, in <module>
        import numpy.core.numeric as _nx
      File "/usr/local/python2/2.7.8/x86_64/gcc46/New_build/lib/python2.7/site-packages/numpy/core/__init__.py", line 14, in <module>
        from . import multiarray
    ImportError: /usr/local/python2/2.7.8/x86_64/gcc46/New_build/lib/python2.7/site-packages/numpy/core/multiarray.so: undefined symbol: cblas_sgemm

I also tried to install from source instead of pip, but no luck either. The only way to get it to work is to downgrade to 1.9.1. Any idea why?

Thanks,

Davide
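(Not something suggested in the thread itself, but a quick first check in situations like this is to ask numpy which BLAS/LAPACK it thinks it was built against:

    import numpy as np

    # Prints the blas/lapack configuration that numpy recorded at build
    # time, including library names and library_dirs.
    np.show_config()

This works independently of ldd/Dependency Walker and on all platforms.)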
From charlesr.harris at gmail.com  Thu Jan 28 17:03:14 2016
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Thu, 28 Jan 2016 15:03:14 -0700
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

On Thu, Jan 28, 2016 at 2:36 PM, Nathaniel Smith wrote:
> Maybe we should upload to pypi? This allows us to upload binaries for
> OSX at least, and in general will make the beta available to anyone who
> does 'pip install --pre numpy'. [...]

So what happens if I use twine to upload a beta? Mind, I'd give it a try if pypi weren't an irreversible machine of doom.

Chuck

From njs at pobox.com  Thu Jan 28 17:20:25 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 28 Jan 2016 14:20:25 -0800
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

AFAIK beta releases act just like regular releases, except that the pip UI and the pypi UI continue to emphasize the older "real" release.

On Jan 28, 2016 2:03 PM, "Charles R Harris" wrote:
> So what happens if I use twine to upload a beta? [...]

From ralf.gommers at gmail.com  Thu Jan 28 17:23:45 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Thu, 28 Jan 2016 23:23:45 +0100
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

On Thu, Jan 28, 2016 at 11:03 PM, Charles R Harris wrote:
> So what happens if I use twine to upload a beta? Mind, I'd give it a
> try if pypi weren't an irreversible machine of doom.

One of the things that will probably happen, and needs to be avoided, is that 1.11b2 becomes the visible release at https://pypi.python.org/pypi/numpy. By default, I think the status of all releases but the last uploaded one (or the one with the highest version number?) is set to hidden.

Other ways that users can get a pre-release by accident are:
- they have pip <1.4 (released in July 2013)
- other packages have a requirement on numpy with a prerelease version (see https://pip.pypa.io/en/stable/reference/pip_install/#pre-release-versions)

Ralf
From njs at pobox.com  Thu Jan 28 17:23:42 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 28 Jan 2016 14:23:42 -0800
Subject: [Numpy-discussion] Problem with NumPy 1.10.4 with ATLAS on Python 2.7.8
In-Reply-To: <1454017391.8709.6.camel@accre.vanderbilt.edu>
References: <1454017391.8709.6.camel@accre.vanderbilt.edu>
Message-ID:

What does

    ldd /usr/local/python2/2.7.8/x86_64/gcc46/New_build/lib/python2.7/site-packages/numpy/core/multiarray.so

say?

(I'm not a numpy build expert, but that should at least give a hint at which kind of brokenness you're running into... I'm also somewhat curious why you're using such an ancient compiler, but that's unlikely to be the issue.)

From vanzod at accre.vanderbilt.edu  Thu Jan 28 17:31:23 2016
From: vanzod at accre.vanderbilt.edu (Davide Vanzo)
Date: Thu, 28 Jan 2016 16:31:23 -0600
Subject: [Numpy-discussion] Problem with NumPy 1.10.4 with ATLAS on Python 2.7.8
In-Reply-To: References: <1454017391.8709.6.camel@accre.vanderbilt.edu>
Message-ID: <1454020283.10000.6.camel@accre.vanderbilt.edu>

Nathaniel,

thanks for your reply. Here is the output you requested (and a little more).
# ldd /usr/local/python2/2.7.8/x86_64/gcc46/nonet/lib/python2.7/site-packages/numpy/core/multiarray.so
        linux-vdso.so.1 =>  (0x00007fffce3f2000)
        libatlas.so => /usr/local/atlas/latest/x86_64/gcc46/nonet/lib/libatlas.so (0x00007f86253df000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f8625147000)
        libpython2.7.so.1.0 => /usr/local/python2/2.7.8/x86_64/gcc46/nonet/lib/libpython2.7.so.1.0 (0x00007f8624d6c000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8624b4f000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f86247ba000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f8626409000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007f86245b6000)
        libutil.so.1 => /lib64/libutil.so.1 (0x00007f86243b3000)

# nm -D /usr/local/atlas/latest/x86_64/gcc46/nonet/lib/libatlas.so | grep cblas

# nm -D /usr/local/atlas/latest/x86_64/gcc46/nonet/lib/liblapack.so | grep cblas
                 U cblas_cdotc_sub
                 U cblas_cgemm
                 [...]
                 U cblas_sgemm
                 U cblas_sgemv
                 [...]
                 U cblas_ztrsv

I'm pretty familiar with this error, and I have always solved it by pointing the installer to the correct library paths via the BLAS/LAPACK/ATLAS environment variables. However, with 1.10.4, no matter how I define those variables (even by putting the paths in the site.cfg file), there is no way to make it work.

Davide
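(The nm output above already contains a hint: `U` marks symbols that are undefined in - i.e. needed by, not provided by - that library, so neither libatlas.so nor liblapack.so here actually defines cblas_sgemm. A standard binutils way to find which library does provide it, not taken from the thread itself, is `nm -D --defined-only /usr/local/atlas/latest/x86_64/gcc46/nonet/lib/*.so | grep cblas_sgemm`; a `T` entry marks the library that defines the function.)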
From njs at pobox.com  Thu Jan 28 17:57:34 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 28 Jan 2016 14:57:34 -0800
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

On Thu, Jan 28, 2016 at 2:23 PM, Ralf Gommers wrote:
> One of the things that will probably happen, and needs to be avoided,
> is that 1.11b2 becomes the visible release at
> https://pypi.python.org/pypi/numpy. By default, I think the status of
> all releases but the last uploaded one (or the one with the highest
> version number?) is set to hidden.

Huh, I had the impression that if it is ambiguous whether the "latest version" is a pre-release or not, then pypi lists all of them on that page -- at least I know I've seen projects where going to the main pypi URL gives a list of several versions like that. Or maybe the next-to-latest one gets hidden by default and you're supposed to go back and "un-hide" the last release manually.

Could try uploading to

    https://testpypi.python.org/pypi

and see what happens...

> Other ways that users can get a pre-release by accident are:
> - they have pip <1.4 (released in July 2013)

It looks like ~a year ago this was ~20% of users --
https://caremad.io/2015/04/a-year-of-pypi-downloads/

I wouldn't be surprised if it has dropped quite a bit since then, but if this is something that will affect our decision then we can ping @dstufft to ask for updated numbers.

-n

-- 
Nathaniel J. Smith -- https://vorpus.org
From ralf.gommers at gmail.com  Thu Jan 28 18:25:32 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Fri, 29 Jan 2016 00:25:32 +0100
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

On Thu, Jan 28, 2016 at 11:57 PM, Nathaniel Smith wrote:
> Could try uploading to
>
>     https://testpypi.python.org/pypi
>
> and see what happens...

That's worth a try; it would be good to know what the behavior is.

> It looks like ~a year ago this was ~20% of users --
> https://caremad.io/2015/04/a-year-of-pypi-downloads/

Hmm, that's more than I expected. Even if it dropped by a factor of 10 over the last year, that would still be a lot of failed installs for the current beta1. It looks to me like this is a bad trade-off. It would be much better to encourage people to test against numpy master instead of a pre-release (and we were trying to do that anyway). So the benefit is then fairly limited, mostly not having to type the longer line including wheels.scipy.org when someone wants to test a pre-release.
Ralf

From njs at pobox.com  Thu Jan 28 22:21:50 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 28 Jan 2016 19:21:50 -0800
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

On Jan 28, 2016 3:25 PM, "Ralf Gommers" wrote:
> Hmm, that's more than I expected. Even if it dropped by a factor of 10
> over the last year, that would still be a lot of failed installs for
> the current beta1. It looks to me like this is a bad trade-off. It
> would be much better to encourage people to test against numpy master
> instead of a pre-release (and we were trying to do that anyway).
> So the benefit is then fairly limited, mostly not having to type the
> longer line including wheels.scipy.org when someone wants to test a
> pre-release.

After the disastrous lack of testing for the 1.10 prereleases, it might almost be a good thing if we accidentally swept up some pip 1.3 users into doing prerelease testing... I mean, if they don't test it now, they'll just end up testing it later, and at least there will be fewer of them to start with? Plus, all they have to do to opt out is to maintain a vaguely up-to-date environment, which is a good thing for the ecosystem anyway :-). It's bad for everyone if pip and PyPI are collaborating to provide this rather nice, standard feature for distributing and QAing pre-releases, but we can't actually use it because of people not upgrading pip...

Regarding CI setups and testing against master: I think of these as complementary. The fact is that master *will* sometimes just be broken, or contain tentative API changes that get changed before the release, etc. So it's really great that there are some projects who are willing to take on the work of testing numpy master directly as part of their own CI setups, but it is going to be extra work and risk for them, they'll probably have to switch it off sometimes and then turn it back on, and they really need decent channels of communication with us whenever things go wrong, because sometimes the answer will be "doh, we didn't mean to change that, please leave your code alone and we'll fix it on our end". (My nightmare here is that downstream projects start working around bugs in master, and then we find ourselves having to jump through hoops to maintain backcompat with code that was never even released. __numpy_ufunc__ is stuck in this situation -- we know that the final version will have to change its name, because scipy has been shipping code that assumes a different calling convention than the final released version will have.) So testing master is *great*, but also tricky, and not really something I think we should be advocating to all 5000 downstream projects [1].

OTOH, once a project has put up a prerelease, then *everyone* wants to be testing that, because if they don't then things definitely *will* break soon. (And this isn't specific to numpy -- it applies to pretty much all upstream dependencies.) So IMO we should be teaching everyone that their CI setups should just always use --pre when running pip install, and this will automatically improve QA coverage for the whole ecosystem.

...It does help if we run at least some minimal QA against the sdist before uploading it, though, to avoid the 1.11.0b1 problem :-). (Though the new travis test for sdists should cover that.) Something else for the release checklist, I guess...

-n

[1] http://depsy.org/package/python/numpy

From ralf.gommers at gmail.com  Fri Jan 29 02:49:32 2016
From: ralf.gommers at gmail.com (Ralf Gommers)
Date: Fri, 29 Jan 2016 08:49:32 +0100
Subject: [Numpy-discussion] Numpy 1.11.0b2 released
In-Reply-To: References: Message-ID:

On Fri, Jan 29, 2016 at 4:21 AM, Nathaniel Smith wrote:
> After the disastrous lack of testing for the 1.10 prereleases, it might
> almost be a good thing if we accidentally swept up some pip 1.3 users
> into doing prerelease testing... I mean, if they don't test it now,
> they'll just end up testing it later, and at least there will be fewer
> of them to start with?
> Plus, all they have to do to opt out is to maintain a vaguely
> up-to-date environment, which is a good thing for the ecosystem anyway
> :-). It's bad for everyone if pip and PyPI are collaborating to provide
> this rather nice, standard feature for distributing and QAing
> pre-releases, but we can't actually use it because of people not
> upgrading pip...

That's a fair point. And given the amount of brokenness in (especially older versions of) pip, plus how easy it is to upgrade pip, we should probably just say that we expect a recent pip (say, one of the last three major releases).

> So IMO we should be teaching everyone that their CI setups should just
> always use --pre when running pip install, and this will automatically
> improve QA coverage for the whole ecosystem.

OK, persuasive argument. In the past this wouldn't have worked, but our CI setup is much better now. Until we had Appveyor testing, for example, it was almost the rule that MSVC builds were broken for every first beta. So, with some hesitation: let's go for it.

> ...It does help if we run at least some minimal QA against the sdist
> before uploading it, though, to avoid the 1.11.0b1 problem :-). (Though
> the new travis test for sdists should cover that.) Something else for
> the release checklist, I guess...

There are still a large number of ways that one can install numpy that aren't tested (see the list in https://github.com/numpy/numpy/issues/6599), but the only one relevant for pip is when easy_install is triggered by `setup_requires=numpy`. It's actually not too hard to add that to TravisCI testing (just install a dummy package that uses setup_requires). I'll add that to the todo list.

Ralf
Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From t3kcit at gmail.com Fri Jan 29 12:45:54 2016 From: t3kcit at gmail.com (Andreas Mueller) Date: Fri, 29 Jan 2016 12:45:54 -0500 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: References: Message-ID: <56ABA552.1090002@gmail.com> Is this the point when scikit-learn should build against it? Or do we wait for an RC? Also, we need a scipy build against it. Who does that? Our continuous integration doesn't usually build scipy or numpy, so it will be a bit tricky to add to our config. Would you run our master tests? [did we ever finish this discussion?] Andy On 01/28/2016 03:51 PM, Charles R Harris wrote: > Hi All, > > I am pleased to announce the Numpy 1.11.0b2 release. The first > beta was a damp squib due to missing files in the released source > files; this release fixes that. The new source files may be > downloaded from sourceforge; no binaries will be released until the > mingw tool chain problems are sorted. > > Please test and report any problem. > > Chuck > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion -------------- next part -------------- An HTML attachment was scrubbed... URL: From jtaylor.debian at googlemail.com Fri Jan 29 12:53:43 2016 From: jtaylor.debian at googlemail.com (Julian Taylor) Date: Fri, 29 Jan 2016 18:53:43 +0100 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: <56ABA552.1090002@gmail.com> References: <56ABA552.1090002@gmail.com> Message-ID: <56ABA727.7090405@googlemail.com> You most likely don't need a scipy build against it. You should be able to use the oldest scipy your project supports. Numpy does try to not break its reverse dependencies; if stuff breaks, it should only occur in edge cases not affecting functionality of real applications (like warnings or overzealous testing). Of course that only works if people bother to test against the numpy prereleases. On 01/29/2016 06:45 PM, Andreas Mueller wrote: > Is this the point when scikit-learn should build against it? > Or do we wait for an RC? > Also, we need a scipy build against it. Who does that? > Our continuous integration doesn't usually build scipy or numpy, so it > will be a bit tricky to add to our config. > Would you run our master tests? [did we ever finish this discussion?] > > Andy > > On 01/28/2016 03:51 PM, Charles R Harris wrote: >> Hi All, >> >> I am pleased to announce the Numpy 1.11.0b2 release. The first >> beta was a damp squib due to missing files in the released source >> files; this release fixes that. The new source files may be >> downloaded from sourceforge; no binaries will be released until the >> mingw tool chain problems are sorted. >> >> Please test and report any problem.
>> >> Chuck >> >> >> _______________________________________________ >> NumPy-Discussion mailing list >> NumPy-Discussion at scipy.org >> https://mail.scipy.org/mailman/listinfo/numpy-discussion > > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > From njs at pobox.com Fri Jan 29 17:39:54 2016 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 29 Jan 2016 14:39:54 -0800 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: <56ABA552.1090002@gmail.com> References: <56ABA552.1090002@gmail.com> Message-ID: On Jan 29, 2016 9:46 AM, "Andreas Mueller" wrote: > > Is this the point when scikit-learn should build against it? Yes please! > Or do we wait for an RC? This is still all in flux, but I think we might actually want a rule that says it can't become an RC until after we've tested scikit-learn (and a list of similarly prominent packages). On the theory that RC means "we think this is actually good enough to release" :-). OTOH I'm not sure the alpha/beta/RC distinction is very helpful; maybe they should all just be betas. > Also, we need a scipy build against it. Who does that? Like Julian says, it shouldn't be necessary. In fact using old builds of scipy and scikit-learn is even better than rebuilding them, because it tests numpy's ABI compatibility -- if you find you *have* to rebuild something then we *definitely* want to know that. > Our continuous integration doesn't usually build scipy or numpy, so it will be a bit tricky to add to our config. > Would you run our master tests? [did we ever finish this discussion?] We didn't, and probably should... :-) It occurs to me that the best solution might be to put together a .travis.yml for the release branches that does: "for pkg in IMPORTANT_PACKAGES: pip install $pkg; python -c 'import pkg; pkg.test()'" This might not be viable right now, but will be made more viable if pypi starts allowing official Linux wheels, which looks likely to happen before 1.12... (see PEP 513) -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Jan 30 12:27:15 2016 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 30 Jan 2016 18:27:15 +0100 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: References: <56ABA552.1090002@gmail.com> Message-ID: On Fri, Jan 29, 2016 at 11:39 PM, Nathaniel Smith wrote: > It occurs to me that the best solution might be to put together a > .travis.yml for the release branches that does: "for pkg in > IMPORTANT_PACKAGES: pip install $pkg; python -c 'import pkg; pkg.test()'" > This might not be viable right now, but will be made more viable if pypi > starts allowing official Linux wheels, which looks likely to happen before > 1.12... (see PEP 513) > > On Jan 29, 2016 9:46 AM, "Andreas Mueller" wrote: > > > > Is this the point when scikit-learn should build against it? > Yes please! > > > Or do we wait for an RC? > This is still all in flux, but I think we might actually want a rule that > says it can't become an RC until after we've tested scikit-learn (and a > list of similarly prominent packages). On the theory that RC means "we > think this is actually good enough to release" :-). OTOH I'm not sure the > alpha/beta/RC distinction is very helpful; maybe they should all just be > betas. > > > Also, we need a scipy build against it. Who does that? > Like Julian says, it shouldn't be necessary. 
In fact using old builds of > scipy and scikit-learn is even better than rebuilding them, because it > tests numpy's ABI compatibility -- if you find you *have* to rebuild > something then we *definitely* want to know that. > > > Our continuous integration doesn't usually build scipy or numpy, so it > will be a bit tricky to add to our config. > > Would you run our master tests? [did we ever finish this discussion?] > We didn't, and probably should... :-) > > Why would that be necessary if scikit-learn simply tests pre-releases of numpy as you suggested earlier in the thread (with --pre)? There's also https://github.com/MacPython/scipy-stack-osx-testing by the way, which could have scikit-learn and scikit-image added to it. That's two options that are imho both better than adding more workload for the numpy release manager. Also from a principled point of view, packages should test with new versions of their dependencies, not the other way around. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Sat Jan 30 13:16:26 2016 From: njs at pobox.com (Nathaniel Smith) Date: Sat, 30 Jan 2016 10:16:26 -0800 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: References: <56ABA552.1090002@gmail.com> Message-ID: On Jan 30, 2016 9:27 AM, "Ralf Gommers" wrote: > > > > On Fri, Jan 29, 2016 at 11:39 PM, Nathaniel Smith wrote: >> >> It occurs to me that the best solution might be to put together a .travis.yml for the release branches that does: "for pkg in IMPORTANT_PACKAGES: pip install $pkg; python -c 'import pkg; pkg.test()'" >> This might not be viable right now, but will be made more viable if pypi starts allowing official Linux wheels, which looks likely to happen before 1.12... (see PEP 513) >> >> On Jan 29, 2016 9:46 AM, "Andreas Mueller" wrote: >> > >> > Is this the point when scikit-learn should build against it? >> >> Yes please! >> >> > Or do we wait for an RC? >> >> This is still all in flux, but I think we might actually want a rule that says it can't become an RC until after we've tested scikit-learn (and a list of similarly prominent packages). On the theory that RC means "we think this is actually good enough to release" :-). OTOH I'm not sure the alpha/beta/RC distinction is very helpful; maybe they should all just be betas. >> >> > Also, we need a scipy build against it. Who does that? >> >> Like Julian says, it shouldn't be necessary. In fact using old builds of scipy and scikit-learn is even better than rebuilding them, because it tests numpy's ABI compatibility -- if you find you *have* to rebuild something then we *definitely* want to know that. >> >> > Our continuous integration doesn't usually build scipy or numpy, so it will be a bit tricky to add to our config. >> > Would you run our master tests? [did we ever finish this discussion?] >> >> We didn't, and probably should... :-) > > Why would that be necessary if scikit-learn simply tests pre-releases of numpy as you suggested earlier in the thread (with --pre)? > > There's also https://github.com/MacPython/scipy-stack-osx-testing by the way, which could have scikit-learn and scikit-image added to it. > > That's two options that are imho both better than adding more workload for the numpy release manager. Also from a principled point of view, packages should test with new versions of their dependencies, not the other way around. Sorry, that was unclear. I meant that we should finish the discussion, not that we should necessarily be the ones running the tests. 
"The discussion" being this one: https://github.com/numpy/numpy/issues/6462#issuecomment-148094591 https://github.com/numpy/numpy/issues/6494 I'm not saying that the release manager necessarily should be running the tests (though it's one option). But the 1.10 experience seems to indicate that we need *some* process for the release manager to make sure that some basic downstream testing has happened. Another option would be keeping a checklist of downstream projects and making sure they've all checked in and confirmed that they've run tests before making the release. -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From jeffreback at gmail.com Sat Jan 30 13:40:03 2016 From: jeffreback at gmail.com (Jeff Reback) Date: Sat, 30 Jan 2016 13:40:03 -0500 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: References: <56ABA552.1090002@gmail.com> Message-ID: <31866A9B-CA5B-4576-9BF6-9471C438D7F6@gmail.com> just my 2c it's fairly straightforward to add a test to the Travis matrix to grab numpy wheels built numpy wheels (works for conda or pip installs). so in pandas we r testing 2.7/3.5 against numpy master continuously https://github.com/pydata/pandas/blob/master/ci/install-3.5_NUMPY_DEV.sh > On Jan 30, 2016, at 1:16 PM, Nathaniel Smith wrote: > > On Jan 30, 2016 9:27 AM, "Ralf Gommers" wrote: > > > > > > > > On Fri, Jan 29, 2016 at 11:39 PM, Nathaniel Smith wrote: > >> > >> It occurs to me that the best solution might be to put together a .travis.yml for the release branches that does: "for pkg in IMPORTANT_PACKAGES: pip install $pkg; python -c 'import pkg; pkg.test()'" > >> This might not be viable right now, but will be made more viable if pypi starts allowing official Linux wheels, which looks likely to happen before 1.12... (see PEP 513) > >> > >> On Jan 29, 2016 9:46 AM, "Andreas Mueller" wrote: > >> > > >> > Is this the point when scikit-learn should build against it? > >> > >> Yes please! > >> > >> > Or do we wait for an RC? > >> > >> This is still all in flux, but I think we might actually want a rule that says it can't become an RC until after we've tested scikit-learn (and a list of similarly prominent packages). On the theory that RC means "we think this is actually good enough to release" :-). OTOH I'm not sure the alpha/beta/RC distinction is very helpful; maybe they should all just be betas. > >> > >> > Also, we need a scipy build against it. Who does that? > >> > >> Like Julian says, it shouldn't be necessary. In fact using old builds of scipy and scikit-learn is even better than rebuilding them, because it tests numpy's ABI compatibility -- if you find you *have* to rebuild something then we *definitely* want to know that. > >> > >> > Our continuous integration doesn't usually build scipy or numpy, so it will be a bit tricky to add to our config. > >> > Would you run our master tests? [did we ever finish this discussion?] > >> > >> We didn't, and probably should... :-) > > > > Why would that be necessary if scikit-learn simply tests pre-releases of numpy as you suggested earlier in the thread (with --pre)? > > > > There's also https://github.com/MacPython/scipy-stack-osx-testing by the way, which could have scikit-learn and scikit-image added to it. > > > > That's two options that are imho both better than adding more workload for the numpy release manager. Also from a principled point of view, packages should test with new versions of their dependencies, not the other way around. > > Sorry, that was unclear. 
I meant that we should finish the discussion, not that we should necessarily be the ones running the tests. "The discussion" being this one: > > https://github.com/numpy/numpy/issues/6462#issuecomment-148094591 > https://github.com/numpy/numpy/issues/6494 > > I'm not saying that the release manager necessarily should be running the tests (though it's one option). But the 1.10 experience seems to indicate that we need *some* process for the release manager to make sure that some basic downstream testing has happened. Another option would be keeping a checklist of downstream projects and making sure they've all checked in and confirmed that they've run tests before making the release. > > -n > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion -------------- next part -------------- An HTML attachment was scrubbed... URL: From derek at astro.physik.uni-goettingen.de Sat Jan 30 14:27:58 2016 From: derek at astro.physik.uni-goettingen.de (Derek Homeier) Date: Sat, 30 Jan 2016 20:27:58 +0100 Subject: [Numpy-discussion] Numpy 1.11.0b1 is out In-Reply-To: <1453896636.18959.2.camel@sipsolutions.net> References: <1453896636.18959.2.camel@sipsolutions.net> Message-ID: On 27 Jan 2016, at 1:10 pm, Sebastian Berg wrote: > > On Mi, 2016-01-27 at 11:19 +0000, Nadav Horesh wrote: >> Why is the dot function/method slower than @ on python 3.5.1? Tested >> from the latest 1.11 maintenance branch. >> > > The explanation I think is that you do not have a blas optimization. In > which case the fallback mode is probably faster in the @ case (since it > has SSE2 optimization by using einsum, while np.dot does not do that). I am a bit confused now, as A @ c is just short for A.__matmul__(c) or equivalent to np.matmul(A,c), so why would these not use the optimised blas? Also, I am getting almost identical results on my Mac, yet I thought numpy would by default build against the VecLib optimised BLAS. If I build explicitly against ATLAS, I am actually seeing slightly slower results. But I also saw this kind of warning on the first timeit runs: %timeit A.dot(c) The slowest run took 6.91 times longer than the fastest. This could mean that an intermediate result is being cached and when testing much larger arrays, the discrepancy between matmul and dot rather increases, so perhaps this is more an issue of a less memory-efficient implementation in np.dot? Cheers, Derek From sebastian at sipsolutions.net Sun Jan 31 03:48:02 2016 From: sebastian at sipsolutions.net (Sebastian Berg) Date: Sun, 31 Jan 2016 09:48:02 +0100 Subject: [Numpy-discussion] Numpy 1.11.0b1 is out In-Reply-To: References: <1453896636.18959.2.camel@sipsolutions.net> Message-ID: <1454230082.3650.8.camel@sipsolutions.net> On Sa, 2016-01-30 at 20:27 +0100, Derek Homeier wrote: > On 27 Jan 2016, at 1:10 pm, Sebastian Berg < > sebastian at sipsolutions.net> wrote: > > > > On Mi, 2016-01-27 at 11:19 +0000, Nadav Horesh wrote: > > > Why is the dot function/method slower than @ on python 3.5.1? > > > Tested > > > from the latest 1.11 maintenance branch. > > > > > > > The explanation I think is that you do not have a blas > > optimization. In > > which case the fallback mode is probably faster in the @ case > > (since it > > has SSE2 optimization by using einsum, while np.dot does not do > > that).
> > I am a bit confused now, as A @ c is just short for A.__matmul__(c) > or equivalent > to np.matmul(A,c), so why would these not use the optimised blas? > Also, I am getting almost identical results on my Mac, yet I thought > numpy would > by default build against the VecLib optimised BLAS. If I build > explicitly against > ATLAS, I am actually seeing slightly slower results. > But I also saw this kind of warning on the first timeit runs: > > %timeit A.dot(c) > The slowest run took 6.91 times longer than the fastest. This could > mean that an intermediate result is being cached > > and when testing much larger arrays, the discrepancy between matmul > and dot rather > increases, so perhaps this is more an issue of a less memory-efficient implementation > in np.dot? Sorry, I missed the fact that one of the arrays was 3D. In that case I am not even sure which of the functions call into blas or what else they have to do, would have to check. Note that `np.dot` uses a different way of combining high-dimensional arrays. @/matmul broadcasts extra axes, while np.dot will do the outer combination of them, so that the result is:

    As = list(A.shape)   # work on lists, since tuples have no pop()
    As.pop(-1)
    cs = list(c.shape)
    cs.pop(-2)           # if possible
    result_shape = As + cs
    # e.g. for A.shape == (2, 3, 4) and c.shape == (2, 4, 5):
    # np.dot(A, c).shape == (2, 3, 2, 5)  (outer combination)
    # (A @ c).shape      == (2, 3, 5)     (broadcast extra axes)

which happens to be identical only if A.ndim > 2 and c.ndim <= 2. - Sebastian > > Cheers, > Derek > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 819 bytes Desc: This is a digitally signed message part URL: From jtaylor.debian at googlemail.com Sun Jan 31 05:57:45 2016 From: jtaylor.debian at googlemail.com (Julian Taylor) Date: Sun, 31 Jan 2016 11:57:45 +0100 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: References: <56ABA552.1090002@gmail.com> Message-ID: <56ADE8A9.7030901@googlemail.com> On 01/30/2016 06:27 PM, Ralf Gommers wrote: > > > On Fri, Jan 29, 2016 at 11:39 PM, Nathaniel Smith > wrote: > > It occurs to me that the best solution might be to put together a > .travis.yml for the release branches that does: "for pkg in > IMPORTANT_PACKAGES: pip install $pkg; python -c 'import pkg; > pkg.test()'" > This might not be viable right now, but will be made more viable if > pypi starts allowing official Linux wheels, which looks likely to > happen before 1.12... (see PEP 513) > > On Jan 29, 2016 9:46 AM, "Andreas Mueller" > wrote: > > > > Is this the point when scikit-learn should build against it? > > Yes please! > > > Or do we wait for an RC? > > This is still all in flux, but I think we might actually want a rule > that says it can't become an RC until after we've tested > scikit-learn (and a list of similarly prominent packages). On the > theory that RC means "we think this is actually good enough to > release" :-). OTOH I'm not sure the alpha/beta/RC distinction is > very helpful; maybe they should all just be betas. > > > Also, we need a scipy build against it. Who does that? > > Like Julian says, it shouldn't be necessary. In fact using old > builds of scipy and scikit-learn is even better than rebuilding > them, because it tests numpy's ABI compatibility -- if you find you > *have* to rebuild something then we *definitely* want to know that. > > > Our continuous integration doesn't usually build scipy or numpy, so it will be a bit tricky to add to our config. > > Would you run our master tests?
[did we ever finish this discussion?] > > We didn't, and probably should... :-) > > Why would that be necessary if scikit-learn simply tests pre-releases of > numpy as you suggested earlier in the thread (with --pre)? > > There's also https://github.com/MacPython/scipy-stack-osx-testing by the > way, which could have scikit-learn and scikit-image added to it. > > That's two options that are imho both better than adding more workload > for the numpy release manager. Also from a principled point of view, > packages should test with new versions of their dependencies, not the > other way around. It would be nice but it's not realistic; I doubt most upstreams that are not themselves major downstreams are even subscribed to this list. Testing or delegating testing of at least our major downstreams should be the job of the release manager. Thus I also disagree with our move to more frequent releases. It puts too much porting and testing effort on our downstreams and it gets infeasible for a volunteer release manager to handle. I fear by doing this we will end up in a situation where more downstreams put upper bounds on their supported numpy releases like e.g. astropy already did. This has bad consequences like the subclass breaking of linspace that should have been caught months ago but was not because upstreams were discouraging users from upgrading numpy because they could not keep up with porting. From pav at iki.fi Sun Jan 31 07:08:02 2016 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 31 Jan 2016 14:08:02 +0200 Subject: [Numpy-discussion] Downstream integration testing In-Reply-To: <56ADE8A9.7030901@googlemail.com> References: <56ABA552.1090002@gmail.com> <56ADE8A9.7030901@googlemail.com> Message-ID: 31.01.2016, 12:57, Julian Taylor kirjoitti: [clip] > Testing or delegating testing of at least our major downstreams should be > the job of the release manager. > Thus I also disagree with our move to more frequent releases. It puts too much > porting and testing effort on our downstreams and it gets infeasible for > a volunteer release manager to handle. > I fear by doing this we will end up in a situation where more > downstreams put upper bounds on their supported numpy releases like e.g. > astropy already did. > This has bad consequences like the subclass breaking of linspace that > should have been caught months ago but was not because upstreams were > discouraging users from upgrading numpy because they could not keep up > with porting. I'd suggest that some automation could reduce the maintainer burden here. Basically, I think being aware of downstream breakage is something that could be determined without too much manual intervention. For example, an automated test rig that does the following: - run tests of a given downstream project version, against previous numpy version, record output - run tests of a given downstream project version, against numpy master, record output - determine which failures were added by the new numpy version - make this happen with just a single command, e.g. "python run.py", and implement it for several downstream packages and versions. (Probably good to steal ideas from travis-ci dependency matrix etc.) This is probably too time-intensive and a waste of resources for Travis-CI, but could be run by the Numpy maintainer or someone else during the release process, or periodically on some ad-hoc machine if someone is willing to set it up.
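To make that concrete, a minimal sketch of what such a run.py could look like (the package list, version pins, and the pkg.test().wasSuccessful() convention are illustrative assumptions, not a decided design; assumes virtualenv is installed and a posix bin/ layout):

    # run.py -- sketch of a downstream test rig
    import subprocess
    import sys

    DOWNSTREAM = {  # pip requirement -> import name (pins are made up)
        "scipy==0.16.1": "scipy",
        "pandas==0.17.1": "pandas",
        "scikit-learn==0.17": "sklearn",
    }

    def run_tests(numpy_spec):
        # install each downstream package against numpy_spec in a fresh
        # virtualenv and record whether its test suite passes (0 == pass)
        results = {}
        for req, mod in DOWNSTREAM.items():
            env = "env-%s-%s" % (mod, numpy_spec.replace("==", "-"))
            subprocess.check_call([sys.executable, "-m", "virtualenv", env])
            subprocess.check_call([env + "/bin/pip", "install",
                                   numpy_spec, req, "nose"])
            code = ("import sys, %s; "
                    "sys.exit(not %s.test().wasSuccessful())" % (mod, mod))
            results[req] = subprocess.call([env + "/bin/python", "-c", code])
        return results

    if __name__ == "__main__":
        old = run_tests("numpy==1.10.4")
        new = run_tests("numpy==1.11.0b2")  # or a wheel built from master
        for req in DOWNSTREAM:
            if new[req] and not old[req]:
                print("new failures in %s" % req)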
Of course, understanding the cause of breakages would take some understanding of the downstream package, but this would at least ensure we are aware of stuff breaking. Provided it's covered by the downstream test suite, of course. -- Pauli Virtanen From davidmenhur at gmail.com Sun Jan 31 07:41:09 2016 From: davidmenhur at gmail.com (Daπid) Date: Sun, 31 Jan 2016 13:41:09 +0100 Subject: [Numpy-discussion] Downstream integration testing In-Reply-To: References: <56ABA552.1090002@gmail.com> <56ADE8A9.7030901@googlemail.com> Message-ID: On 31 Jan 2016 13:08, "Pauli Virtanen" wrote: > For example, an automated test rig that does the following: > > - run tests of a given downstream project version, against > previous numpy version, record output > > - run tests of a given downstream project version, against > numpy master, record output > > - determine which failures were added by the new numpy version > > - make this happen with just a single command, e.g. "python run.py", > and implement it for several downstream packages and versions. > (Probably good to steal ideas from travis-ci dependency matrix etc.) A simpler idea: build the master branch of a series of projects and run the tests. In case of failure, we can compare with Travis's logs from the project when they use the released numpy. In most cases, the master branch is clean, so an error will likely be a change in behaviour. This can be run automatically once a week, to not hog too much of Travis, and counting the costs in hours of work, it is very cheap to set up, and free to maintain. /David -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Sun Jan 31 08:12:28 2016 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 31 Jan 2016 15:12:28 +0200 Subject: [Numpy-discussion] Downstream integration testing In-Reply-To: References: <56ABA552.1090002@gmail.com> <56ADE8A9.7030901@googlemail.com> Message-ID: 31.01.2016, 14:41, Daπid kirjoitti: > On 31 Jan 2016 13:08, "Pauli Virtanen" wrote: >> For example, an automated test rig that does the following: >> >> - run tests of a given downstream project version, against >> previous numpy version, record output >> >> - run tests of a given downstream project version, against >> numpy master, record output >> >> - determine which failures were added by the new numpy version >> >> - make this happen with just a single command, e.g. "python run.py", >> and implement it for several downstream packages and versions. >> (Probably good to steal ideas from travis-ci dependency matrix etc.) > > A simpler idea: build the master branch of a series of projects and run the > tests. In case of failure, we can compare with Travis's logs from the > project when they use the released numpy. In most cases, the master branch > is clean, so an error will likely be a change in behaviour. If you can assume the tests of a downstream project are in an OK state, then you can skip the build against existing numpy. But it's an additional and unnecessary burden for the Numpy maintainers to compare the logs manually (and check the built versions are the same, and that the difference is not due to differences in build environments). I would also avoid depending on the other projects' Travis-CI configurations, since these may change. I think testing released versions of downstream projects is better than testing their master versions here, as the master branch may contain workarounds for Numpy changes and not be representative of what people get on their computers after Numpy release.
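As for the manual log comparison, even a dumb script that diffs the sets of failing test ids between the two runs would remove most of that burden (a sketch; the log file names are hypothetical and nose-style "FAIL:"/"ERROR:" lines are assumed):

    # report failures added by the new numpy, given two plain-text test logs
    def failed_ids(log_path):
        with open(log_path) as f:
            return set(line.strip() for line in f
                       if line.startswith(("FAIL:", "ERROR:")))

    new_only = (failed_ids("scipy-numpy-master.log")
                - failed_ids("scipy-numpy-1.10.4.log"))
    for test_id in sorted(new_only):
        print(test_id)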
> This can be run automatically once a week, to not hog too much of Travis, > and counting the costs in hours of work, it is very cheap to set up, and free > to maintain. It may be that such a project could be runnable on Travis, if split into per-project runs to work around the 50min timeout. I'm not aware of Travis-CI having support for "automatically once per week" builds. Anyway, having any form of central automated integration testing would be better than the current situation where it's mostly all-manual and relies on the activity of downstream project maintainers. -- Pauli Virtanen From charlesr.harris at gmail.com Sun Jan 31 14:10:31 2016 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 31 Jan 2016 12:10:31 -0700 Subject: [Numpy-discussion] Numpy pull requests getting out of hand. Message-ID: Hi All, There are now 130 open numpy pull requests and it seems almost impossible to keep that number down. My personal decision is that I am going to ignore any new enhancements for the next couple of months and only merge bug fixes, tests, housekeeping (style, docs, deprecations), and old PRs. I would also request that other maintainers start looking at taking care of older PRs, either cleaning them up and merging, or closing them. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From jeffreback at gmail.com Sun Jan 31 14:25:43 2016 From: jeffreback at gmail.com (Jeff Reback) Date: Sun, 31 Jan 2016 14:25:43 -0500 Subject: [Numpy-discussion] Numpy pull requests getting out of hand. In-Reply-To: References: Message-ID: FYI it's also useful to simply close by time - say older than 6 months - with a message for the writer to reopen if they want to work on it. Then you don't get too many stale ones. my 2c > On Jan 31, 2016, at 2:10 PM, Charles R Harris wrote: > > Hi All, > > There are now 130 open numpy pull requests and it seems almost impossible to keep that number down. My personal decision is that I am going to ignore any new enhancements for the next couple of months and only merge bug fixes, tests, housekeeping (style, docs, deprecations), and old PRs. I would also request that other maintainers start looking at taking care of older PRs, either cleaning them up and merging, or closing them. > > Chuck > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion From m.h.vankerkwijk at gmail.com Sun Jan 31 16:52:08 2016 From: m.h.vankerkwijk at gmail.com (Marten van Kerkwijk) Date: Sun, 31 Jan 2016 16:52:08 -0500 Subject: [Numpy-discussion] Numpy 1.11.0b2 released In-Reply-To: <56ADE8A9.7030901@googlemail.com> References: <56ABA552.1090002@gmail.com> <56ADE8A9.7030901@googlemail.com> Message-ID: Hi Julian, While the numpy 1.10 situation was bad, I do want to clarify that the problems we had in astropy were a consequence of *good* changes in `recarray`, which solved many problems, but also broke the work-arounds that had been created in `astropy.io.fits` quite a long time ago (possibly before astropy became as good as it tries to be now at moving issues upstream and perhaps before numpy had become as responsive to what happens downstream as it is now; I think it is fair to say many projects' attitude to testing has changed rather drastically in the last decade!).
I do agree, though, that it just goes to show one has to try to be careful, and like Nathaniel's suggestion of automatic testing with pre-releases -- I just asked on our astropy-dev list whether we can implement it. All the best, Marten -------------- next part -------------- An HTML attachment was scrubbed... URL: