From charlesr.harris at gmail.com Sat Oct 3 15:23:07 2015 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 3 Oct 2015 13:23:07 -0600 Subject: [SciPy-User] Numpy 1.10.0 coming Monday, 5 Oct. Message-ID: Hi All, A heads up about the coming Numpy release. If you have discovered any problems with rc1 or rc2, please report them. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From jeffreback at gmail.com Sat Oct 3 17:33:34 2015 From: jeffreback at gmail.com (Jeff Reback) Date: Sat, 3 Oct 2015 17:33:34 -0400 Subject: [SciPy-User] ANN: pandas v0.17.0rc2 - RELEASE CANDIDATE 2 Message-ID: Hi, I'm pleased to announce the availability of the second release candidate of Pandas 0.17.0. Please try this RC and report any issues here: Pandas Issues We will be releasing officially on October 9. **RELEASE CANDIDATE 2** >From RC 1 we have: - compat for Python 3.5 - compat for matplotlib 1.5.0 - .convert_objects is now restored to the original, and is deprecated This is a major release from 0.16.2 and includes a small number of API changes, several new features, enhancements, and performance improvements along with a large number of bug fixes. We recommend that all users upgrade to this version. Highlights include: - Release the Global Interpreter Lock (GIL) on some cython operations, see here - Plotting methods are now available as attributes of the .plot accessor, see here - The sorting API has been revamped to remove some long-time inconsistencies, see here - Support for a datetime64[ns] with timezones as a first-class dtype, see here - The default for to_datetime will now be to raise when presented with unparseable formats, previously this would return the original input, see here - The default for dropna in HDFStore has changed to False, to store by default all rows even if they are all NaN, see here - Support for Series.dt.strftime to generate formatted strings for datetime-likes, see here - Development installed versions of pandas will now have PEP440 compliant version strings GH9518 - Development support for benchmarking with the Air Speed Velocity library GH8316 - Support for reading SAS xport files, see here - Removal of the automatic TimeSeries broadcasting, deprecated since 0.8.0, see here - Display format with plain text can optionally align with Unicode East Asian Width, see here - Compatibility with Python 3.5 GH11097 - Compatibility with matplotlib 1.5.0 GH11111 See the Whatsnew for much more information. Best way to get this is to install via conda from our development channel. Builds for osx-64,linux-64,win-64 for Python 2.7, Python 3.4, and Python 3.5 (for osx/linux) are all available. conda install pandas -c pandas Thanks to all who made this release happen. It is a very large release! Jeff -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Sun Oct 4 01:35:42 2015 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 3 Oct 2015 22:35:42 -0700 Subject: [SciPy-User] [pydata] ANN: pandas v0.17.0rc2 - RELEASE CANDIDATE 2 In-Reply-To: References: Message-ID: Hi, On Sat, Oct 3, 2015 at 2:33 PM, Jeff Reback wrote: > Hi, > > I'm pleased to announce the availability of the second release candidate of > Pandas 0.17.0. > Please try this RC and report any issues here: Pandas Issues > We will be releasing officially on October 9. 
> > **RELEASE CANDIDATE 2** > > From RC 1 we have: > > compat for Python 3.5 > compat for matplotlib 1.5.0 > .convert_objects is now restored to the original, and is deprecated > > This is a major release from 0.16.2 and includes a small number of API > changes, several new features, enhancements, and performance improvements > along with a large number of bug fixes. We recommend that all users upgrade > to this version. > > Highlights include: > > Release the Global Interpreter Lock (GIL) on some cython operations, see > here > Plotting methods are now available as attributes of the .plot accessor, see > here > The sorting API has been revamped to remove some long-time inconsistencies, > see here > Support for a datetime64[ns] with timezones as a first-class dtype, see here > The default for to_datetime will now be to raise when presented with > unparseable formats, previously this would return the original input, see > here > The default for dropna in HDFStore has changed to False, to store by default > all rows even if they are all NaN, see here > Support for Series.dt.strftime to generate formatted strings for > datetime-likes, see here > Development installed versions of pandas will now have PEP440 compliant > version strings GH9518 > Development support for benchmarking with the Air Speed Velocity library > GH8316 > Support for reading SAS xport files, see here > Removal of the automatic TimeSeries broadcasting, deprecated since 0.8.0, > see here > Display format with plain text can optionally align with Unicode East Asian > Width, see here > Compatibility with Python 3.5 GH11097 > Compatibility with matplotlib 1.5.0 GH11111 > > > See the Whatsnew for much more information. > > Best way to get this is to install via conda from our development channel. > Builds for osx-64,linux-64,win-64 for Python 2.7, Python 3.4, and Python 3.5 > (for osx/linux) are all available. > > conda install pandas -c pandas I built OSX wheels for Pythons 2.7, 3.4, 3.5. To test: pip install --pre -f http://wheels.scipy.org pandas There were some test failures for Python 3.3 - issue here: https://github.com/pydata/pandas/issues/11232 Cheers, Matthew From lev.konst at gmail.com Mon Oct 5 14:23:12 2015 From: lev.konst at gmail.com (Lev Konstantinovskiy) Date: Mon, 5 Oct 2015 19:23:12 +0100 Subject: [SciPy-User] Alternative for scipy.sparse.sparsetools for use from outside of scipy Message-ID: What is the danger of using sparsetools from outside of scipy? I found the Mar 2014 PR that made it internal but would be grateful to know the reasoning. "MAINT: wrap sparsetools manually instead via SWIG#3440" https://github.com/scipy/scipy/pull/3440 Thanks Lev From charlesr.harris at gmail.com Tue Oct 6 00:52:50 2015 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 5 Oct 2015 22:52:50 -0600 Subject: [SciPy-User] Numpy 1.10.0 release Message-ID: Hi All, It is my pleasure to release NumPy 1.10.0. Files may be found at Sourceforge and pypi . This release is the result of 789 non-merge commits made by 160 developers over a period of a year and supports Python 2.6 - 2.7 and 3.2 - 3.5. NumPy 1.10.0 Release Notes ************************** This release supports Python 2.6 - 2.7 and 3.2 - 3.5. Highlights ========== * numpy.distutils now supports parallel compilation via the --parallel/-j argument passed to setup.py build * numpy.distutils now supports additional customization via site.cfg to control compilation parameters, i.e. runtime libraries, extra linking/compilation flags. 
* Addition of *np.linalg.multi_dot*: compute the dot product of two or more arrays in a single function call, while automatically selecting the fastest evaluation order. * The new function `np.stack` provides a general interface for joining a sequence of arrays along a new axis, complementing `np.concatenate` for joining along an existing axis. * Addition of `nanprod` to the set of nanfunctions. * Support for the '@' operator in Python 3.5. Dropped Support: * The _dotblas module has been removed. CBLAS Support is now in Multiarray. * The testcalcs.py file has been removed. * The polytemplate.py file has been removed. * npy_PyFile_Dup and npy_PyFile_DupClose have been removed from npy_3kcompat.h. * splitcmdline has been removed from numpy/distutils/exec_command.py. * try_run and get_output have been removed from numpy/distutils/command/config.py * The a._format attribute is no longer supported for array printing. * Keywords ``skiprows`` and ``missing`` removed from np.genfromtxt. * Keyword ``old_behavior`` removed from np.correlate. Future Changes: * In array comparisons like ``arr1 == arr2``, many corner cases involving strings or structured dtypes that used to return scalars now issue ``FutureWarning`` or ``DeprecationWarning``, and in the future will be change to either perform elementwise comparisons or raise an error. * The SafeEval class will be removed. * The alterdot and restoredot functions will be removed. See below for more details on these changes. Compatibility notes =================== numpy version string ~~~~~~~~~~~~~~~~~~~~ The numpy version string for development builds has been changed from ``x.y.z.dev-githash`` to ``x.y.z.dev0+githash`` (note the +) in order to comply with PEP 440. relaxed stride checking ~~~~~~~~~~~~~~~~~~~~~~~ NPY_RELAXED_STRIDE_CHECKING is now true by default. Concatenation of 1d arrays along any but ``axis=0`` raises ``IndexError`` ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Using axis != 0 has raised a DeprecationWarning since NumPy 1.7, it now raises an error. *np.ravel*, *np.diagonal* and *np.diag* now preserve subtypes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ There was inconsistent behavior between *x.ravel()* and *np.ravel(x)*, as well as between *x.diagonal()* and *np.diagonal(x)*, with the methods preserving subtypes while the functions did not. This has been fixed and the functions now behave like the methods, preserving subtypes except in the case of matrices. Matrices are special cased for backward compatibility and still return 1-D arrays as before. If you need to preserve the matrix subtype, use the methods instead of the functions. *rollaxis* and *swapaxes* always return a view ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Previously, a view was returned except when no change was made in the order of the axes, in which case the input array was returned. A view is now returned in all cases. *nonzero* now returns base ndarrays ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Previously, an inconsistency existed between 1-D inputs (returning a base ndarray) and higher dimensional ones (which preserved subclasses). Behavior has been unified, and the return will now be a base ndarray. Subclasses can still override this behavior by providing their own *nonzero* method. C API ~~~~~ The changes to *swapaxes* also apply to the *PyArray_SwapAxes* C function, which now returns a view in all cases. 
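For the Python-level counterpart of this change (a minimal, editor-added sketch of the new view semantics; behaviour as described in the notes above)::

    import numpy as np

    a = np.arange(6).reshape(2, 3)
    b = a.swapaxes(0, 0)      # a no-op swap: previously returned `a` itself
    print(b is a)             # False: a new view is now returned in all cases
    b[0, 0] = 99
    print(a[0, 0])            # 99: writes through the view reach the original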
The changes to *nonzero* also apply to the *PyArray_Nonzero* C function, which now returns a base ndarray in all cases. The dtype structure (PyArray_Descr) has a new member at the end to cache its hash value. This shouldn't affect any well-written applications. The change to the concatenation function DeprecationWarning also affects PyArray_ConcatenateArrays, recarray field return types ~~~~~~~~~~~~~~~~~~~~~~~~~~~ Previously the returned types for recarray fields accessed by attribute and by index were inconsistent, and fields of string type were returned as chararrays. Now, fields accessed by either attribute or indexing will return an ndarray for fields of non-structured type, and a recarray for fields of structured type. Notably, this affect recarrays containing strings with whitespace, as trailing whitespace is trimmed from chararrays but kept in ndarrays of string type. Also, the dtype.type of nested structured fields is now inherited. recarray views ~~~~~~~~~~~~~~ Viewing an ndarray as a recarray now automatically converts the dtype to np.record. See new record array documentation. Additionally, viewing a recarray with a non-structured dtype no longer converts the result's type to ndarray - the result will remain a recarray. 'out' keyword argument of ufuncs now accepts tuples of arrays ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ When using the 'out' keyword argument of a ufunc, a tuple of arrays, one per ufunc output, can be provided. For ufuncs with a single output a single array is also a valid 'out' keyword argument. Previously a single array could be provided in the 'out' keyword argument, and it would be used as the first output for ufuncs with multiple outputs, is deprecated, and will result in a `DeprecationWarning` now and an error in the future. byte-array indices now raises an IndexError ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Indexing an ndarray using a byte-string in Python 3 now raises an IndexError instead of a ValueError. Masked arrays containing objects with arrays ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ For such (rare) masked arrays, getting a single masked item no longer returns a corrupted masked array, but a fully masked version of the item. Median warns and returns nan when invalid values are encountered ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Similar to mean, median and percentile now emits a Runtime warning and returns `NaN` in slices where a `NaN` is present. To compute the median or percentile while ignoring invalid values use the new `nanmedian` or `nanpercentile` functions. Functions available from numpy.ma.testutils have changed ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ All functions from numpy.testing were once available from numpy.ma.testutils but not all of them were redefined to work with masked arrays. Most of those functions have now been removed from numpy.ma.testutils with a small subset retained in order to preserve backward compatibility. In the long run this should help avoid mistaken use of the wrong functions, but it may cause import problems for some. New Features ============ Reading extra flags from site.cfg ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Previously customization of compilation of dependency libraries and numpy itself was only accomblishable via code changes in the distutils package. 
Now numpy.distutils reads in the following extra flags from each group of the *site.cfg*: * ``runtime_library_dirs/rpath``, sets runtime library directories to override ``LD_LIBRARY_PATH`` * ``extra_compile_args``, add extra flags to the compilation of sources * ``extra_link_args``, add extra flags when linking libraries This should, at least partially, complete user customization. *np.cbrt* to compute cube root for real floats ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.cbrt* wraps the C99 cube root function *cbrt*. Compared to *np.power(x, 1./3.)* it is well defined for negative real floats and a bit faster. numpy.distutils now allows parallel compilation ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ By passing *--parallel=n* or *-j n* to *setup.py build* the compilation of extensions is now performed in *n* parallel processes. The parallelization is limited to files within one extension so projects using Cython will not profit because it builds extensions from single files. *genfromtxt* has a new ``max_rows`` argument ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ A ``max_rows`` argument has been added to *genfromtxt* to limit the number of rows read in a single call. Using this functionality, it is possible to read in multiple arrays stored in a single file by making repeated calls to the function. New function *np.broadcast_to* for invoking array broadcasting ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.broadcast_to* manually broadcasts an array to a given shape according to numpy's broadcasting rules. The functionality is similar to broadcast_arrays, which in fact has been rewritten to use broadcast_to internally, but only a single array is necessary. New context manager *clear_and_catch_warnings* for testing warnings ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ When Python emits a warning, it records that this warning has been emitted in the module that caused the warning, in a module attribute ``__warningregistry__``. Once this has happened, it is not possible to emit the warning again, unless you clear the relevant entry in ``__warningregistry__``. This makes is hard and fragile to test warnings, because if your test comes after another that has already caused the warning, you will not be able to emit the warning or test it. The context manager ``clear_and_catch_warnings`` clears warnings from the module registry on entry and resets them on exit, meaning that warnings can be re-raised. *cov* has new ``fweights`` and ``aweights`` arguments ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The ``fweights`` and ``aweights`` arguments add new functionality to covariance calculations by applying two types of weighting to observation vectors. An array of ``fweights`` indicates the number of repeats of each observation vector, and an array of ``aweights`` provides their relative importance or probability. Support for the '@' operator in Python 3.5+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Python 3.5 adds support for a matrix multiplication operator '@' proposed in PEP465. Preliminary support for that has been implemented, and an equivalent function ``matmul`` has also been added for testing purposes and use in earlier Python versions. The function is preliminary and the order and number of its optional arguments can be expected to change. New argument ``norm`` to fft functions ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The default normalization has the direct transforms unscaled and the inverse transforms are scaled by :math:`1/n`. 
It is possible to obtain unitary transforms by setting the keyword argument ``norm`` to ``"ortho"`` (default is `None`) so that both direct and inverse transforms will be scaled by :math:`1/\\sqrt{n}`. Improvements ============ *np.digitize* using binary search ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.digitize* is now implemented in terms of *np.searchsorted*. This means that a binary search is used to bin the values, which scales much better for larger number of bins than the previous linear search. It also removes the requirement for the input array to be 1-dimensional. *np.poly* now casts integer inputs to float ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.poly* will now cast 1-dimensional input arrays of integer type to double precision floating point, to prevent integer overflow when computing the monic polynomial. It is still possible to obtain higher precision results by passing in an array of object type, filled e.g. with Python ints. *np.interp* can now be used with periodic functions ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.interp* now has a new parameter *period* that supplies the period of the input data *xp*. In such case, the input data is properly normalized to the given period and one end point is added to each extremity of *xp* in order to close the previous and the next period cycles, resulting in the correct interpolation behavior. *np.pad* supports more input types for ``pad_width`` and ``constant_values`` ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ``constant_values`` parameters now accepts NumPy arrays and float values. NumPy arrays are supported as input for ``pad_width``, and an exception is raised if its values are not of integral type. *np.argmax* and *np.argmin* now support an ``out`` argument ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The ``out`` parameter was added to *np.argmax* and *np.argmin* for consistency with *ndarray.argmax* and *ndarray.argmin*. The new parameter behaves exactly as it does in those methods. More system C99 complex functions detected and used ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ All of the functions ``in complex.h`` are now detected. There are new fallback implementations of the following functions. * npy_ctan, * npy_cacos, npy_casin, npy_catan * npy_ccosh, npy_csinh, npy_ctanh, * npy_cacosh, npy_casinh, npy_catanh As a result of these improvements, there will be some small changes in returned values, especially for corner cases. *np.loadtxt* support for the strings produced by the ``float.hex`` method ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The strings produced by ``float.hex`` look like ``0x1.921fb54442d18p+1``, so this is not the hex used to represent unsigned integer types. *np.isclose* properly handles minimal values of integer dtypes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In order to properly handle minimal values of integer types, *np.isclose* will now cast to the float dtype during comparisons. This aligns its behavior with what was provided by *np.allclose*. *np.allclose* uses *np.isclose* internally. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.allclose* now uses *np.isclose* internally and inherits the ability to compare NaNs as equal by setting ``equal_nan=True``. Subclasses, such as *np.ma.MaskedArray*, are also preserved now. 
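A short illustration of the inherited ``equal_nan`` behaviour (an editor-added sketch; defaults as documented above)::

    import numpy as np

    a = np.array([1.0, np.nan])
    b = np.array([1.0, np.nan])
    np.allclose(a, b)                   # False: NaN != NaN by default
    np.allclose(a, b, equal_nan=True)   # True: NaN positions compared as equal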
*np.genfromtxt* now handles large integers correctly ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.genfromtxt* now correctly handles integers larger than ``2**31-1`` on 32-bit systems and larger than ``2**63-1`` on 64-bit systems (it previously crashed with an ``OverflowError`` in these cases). Integers larger than ``2**63-1`` are converted to floating-point values. *np.load*, *np.save* have pickle backward compatibility flags ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The functions *np.load* and *np.save* have additional keyword arguments for controlling backward compatibility of pickled Python objects. This enables Numpy on Python 3 to load npy files containing object arrays that were generated on Python 2. MaskedArray support for more complicated base classes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Built-in assumptions that the baseclass behaved like a plain array are being removed. In particular, setting and getting elements and ranges will respect baseclass overrides of ``__setitem__`` and ``__getitem__``, and arithmetic will respect overrides of ``__add__``, ``__sub__``, etc. Changes ======= dotblas functionality moved to multiarray ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The cblas versions of dot, inner, and vdot have been integrated into the multiarray module. In particular, vdot is now a multiarray function, which it was not before. stricter check of gufunc signature compliance ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Inputs to generalized universal functions are now more strictly checked against the function's signature: all core dimensions are now required to be present in input arrays; core dimensions with the same label must have the exact same size; and output core dimension's must be specified, either by a same label input core dimension or by a passed-in output array. views returned from *np.einsum* are writeable ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Views returned by *np.einsum* will now be writeable whenever the input array is writeable. *np.argmin* skips NaT values ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ *np.argmin* now skips NaT values in datetime64 and timedelta64 arrays, making it consistent with *np.min*, *np.argmax* and *np.max*. Deprecations ============ Array comparisons involving strings or structured dtypes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Normally, comparison operations on arrays perform elementwise comparisons and return arrays of booleans. But in some corner cases, especially involving strings are structured dtypes, NumPy has historically returned a scalar instead. For example:: ### Current behaviour np.arange(2) == "foo" # -> False np.arange(2) < "foo" # -> True on Python 2, error on Python 3 np.ones(2, dtype="i4,i4") == np.ones(2, dtype="i4,i4,i4") # -> False Continuing work started in 1.9, in 1.10 these comparisons will now raise ``FutureWarning`` or ``DeprecationWarning``, and in the future they will be modified to behave more consistently with other comparison operations, e.g.:: ### Future behaviour np.arange(2) == "foo" # -> array([False, False]) np.arange(2) < "foo" # -> error, strings and numbers are not orderable np.ones(2, dtype="i4,i4") == np.ones(2, dtype="i4,i4,i4") # -> [False, False] SafeEval ~~~~~~~~ The SafeEval class in numpy/lib/utils.py is deprecated and will be removed in the next release. alterdot, restoredot ~~~~~~~~~~~~~~~~~~~~ The alterdot and restoredot functions no longer do anything, and are deprecated. 
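Returning to the array-comparison deprecation described earlier in this section, a small transitional check (an editor-added sketch; the exact warning class emitted may be either of the two named above)::

    import warnings
    import numpy as np

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        res = np.arange(2) == "foo"        # still the scalar False in 1.10
    print(res, [w.category.__name__ for w in caught])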
pkgload, PackageLoader ~~~~~~~~~~~~~~~~~~~~~~ These ways of loading packages are now deprecated. bias, ddof arguments to corrcoef ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The values for the ``bias`` and ``ddof`` arguments to the ``corrcoef`` function canceled in the division implied by the correlation coefficient and so had no effect on the returned values. We now deprecate these arguments to ``corrcoef`` and the masked array version ``ma.corrcoef``. Because we are deprecating the ``bias`` argument to ``ma.corrcoef``, we also deprecate the use of the ``allow_masked`` argument as a positional argument, as its position will change with the removal of ``bias``. ``allow_masked`` will in due course become a keyword-only argument. dtype string representation changes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Since 1.6, creating a dtype object from its string representation, e.g. ``'f4'``, would issue a deprecation warning if the size did not correspond to an existing type, and default to creating a dtype of the default size for the type. Starting with this release, this will now raise a ``TypeError``. The only exception is object dtypes, where both ``'O4'`` and ``'O8'`` will still issue a deprecation warning. This platform-dependent representation will raise an error in the next release. In preparation for this upcoming change, the string representation of an object dtype, i.e. ``np.dtype(object).str``, no longer includes the item size, i.e. will return ``'|O'`` instead of ``'|O4'`` or ``'|O8'`` as before. Authors ======= This release contains work by the following people who contributed at least one patch to this release. The names are in alphabetical order by first name. Names followed by a "+" contributed a patch for the first time. Abdul Muneer+ Adam Williams+ Alan Briolat+ Alex Griffing Alex Willmer+ Alexander Belopolsky Alistair Muldal+ Allan Haldane+ Amir Sarabadani+ Andrea Bedini+ Andrew Dawson+ Andrew Nelson+ Antoine Pitrou+ Anton Ovchinnikov+ Antony Lee+ Behzad Nouri+ Bertrand+ Blake Griffith Bob Poekert+ Brian Kearns+ CJ Carey Carl Kleffner+ Chander G+ Charles Harris Chris Hogan+ Chris Kerr Chris Lamb+ Chris Laumann+ Christian Brodbeck+ Christian Brueffer Christoph Gohlke Cimarron Mittelsteadt Daniel da Silva Darsh P. Ranjan+ David Cournapeau David M Fobes+ David Powell+ Didrik Pinte+ Dimas Abreu Dutra Dmitry Zagorny+ Eric Firing Eric Hunsberger+ Eric Martin+ Eric Moore Eric O. LEBIGOT (EOL)+ Erik M. Bray Ernest N. Mamikonyan+ Fei Liu+ Fran?ois Magimel+ Gabor Kovacs+ Gabriel-p+ Garrett-R+ George Castillo+ Gerrit Holl+ Gert-Ludwig Ingold+ Glen Mabey+ Graham Christensen+ Greg Thomsen+ Gregory R. Lee+ Helder Cesar+ Helder Oliveira+ Henning Dickten+ Ian Henriksen+ Jaime Fernandez James Camel+ James Salter+ Jan Schl?ter+ Jarl Haggerty+ Jay Bourque Joel Nothman+ John Kirkham+ John Tyree+ Joris Van den Bossche+ Joseph Martinot-Lagarde Josh Warner (Mac) Juan Luis Cano Rodr?guez Julian Taylor Kreiswolke+ Lars Buitinck Leonardo Donelli+ Lev Abalkin Lev Levitsky+ Malik Woods+ Maniteja Nandana+ Marshall Farrier+ Marten van Kerkwijk Martin Spacek Martin Thoma+ Masud Rahman+ Matt Newville+ Mattheus Ueckermann+ Matthew Brett Matthew Craig+ Michael Currie+ Michael Droettboom Michele Vallisneri+ Mortada Mehyar+ Nate Jensen+ Nathaniel J. 
Smith Nick Papior Andersen+ Nick Papior+ Nils Werner Oliver Eberle+ Patrick Peglar+ Paul Jacobson Pauli Virtanen Peter Iannucci+ Ralf Gommers Richard Barnes+ Ritta Narita+ Robert Johansson+ Robert LU+ Robert McGibbon+ Ryan Blakemore+ Ryan Nelson+ Sandro Tosi Saullo Giovani+ Sebastian Berg Sebastien Gouezel+ Simon Gibbons+ Simon Guillot+ Stefan Eng+ Stefan Otte+ Stefan van der Walt Stephan Hoyer+ Stuart Berg+ Sturla Molden+ Thomas A Caswell+ Thomas Robitaille Tim D. Smith+ Tom Krauss+ Tom Poole+ Toon Verstraelen+ Ulrich Seidl Valentin Haenel Vraj Mohan+ Warren Weckesser Wendell Smith+ Yaroslav Halchenko Yotam Doron Yousef Hamza+ Yuval Langer+ Yuxiang Wang+ Zbigniew J?drzejewski-Szmek+ cel+ chebee7i+ empeeu+ endolith hannaro+ immerrr jmrosen155+ jnothman kanhua+ mbyt+ mlai+ styr+ tdihp+ wim glenn+ yolanda15+ ?smund Hjulstad+ Enjjoy, Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From davidmenhur at gmail.com Tue Oct 6 08:08:36 2015 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Tue, 6 Oct 2015 14:08:36 +0200 Subject: [SciPy-User] [Numpy-discussion] Numpy 1.10.0 release In-Reply-To: References: Message-ID: I don't get any failures on Fedora 22. I have installed it with pip, setting my CFLAGS to "-march=core-avx-i -O2 -pipe -mtune=native" and linking against openblas. With the new Numpy, Scipy full suite shows two errors, I am sorry I didn't think of running that in the RC phase: ====================================================================== FAIL: test_weighting (test_stats.TestHistogram) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/david/.local/virtualenv/py27/lib/python2.7/site-packages/scipy/stats/tests/test_stats.py", line 892, in test_weighting decimal=2) File "/home/david/.local/virtualenv/py27/lib/python2.7/site-packages/numpy/testing/utils.py", line 886, in assert_array_almost_equal precision=decimal) File "/home/david/.local/virtualenv/py27/lib/python2.7/site-packages/numpy/testing/utils.py", line 708, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal to 2 decimals (mismatch 40.0%) x: array([ 4. , 0. , 4.5, -0.9, 0. , 0.3, 110.2, 0. , 0. , 42. ]) y: array([ 4. , 0. , 4.5, -0.9, 0.3, 0. , 7. , 103.2, 0. , 42. ]) ====================================================================== FAIL: test_nanmedian_all_axis (test_stats.TestNanFunc) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/david/.local/virtualenv/py27/lib/python2.7/site-packages/scipy/stats/tests/test_stats.py", line 226, in test_nanmedian_all_axis assert_equal(len(w), 4) File "/home/david/.local/virtualenv/py27/lib/python2.7/site-packages/numpy/testing/utils.py", line 354, in assert_equal raise AssertionError(msg) AssertionError: Items are not equal: ACTUAL: 1 DESIRED: 4 I am almost sure these errors weren't there before. 
On 6 October 2015 at 13:53, Neal Becker wrote: > 1 test failure: > > FAIL: test_blasdot.test_blasdot_used > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in > runTest > self.test(*self.arg) > File "/home/nbecker/.local/lib/python2.7/site- > packages/numpy/testing/decorators.py", line 146, in skipper_func > return f(*args, **kwargs) > File "/home/nbecker/.local/lib/python2.7/site- > packages/numpy/core/tests/test_blasdot.py", line 31, in test_blasdot_used > assert_(dot is _dotblas.dot) > File "/home/nbecker/.local/lib/python2.7/site- > packages/numpy/testing/utils.py", line 53, in assert_ > raise AssertionError(smsg) > AssertionError > > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Tue Oct 6 14:57:20 2015 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 6 Oct 2015 18:57:20 +0000 (UTC) Subject: [SciPy-User] Alternative for scipy.sparse.sparsetools for use from outside of scipy References: Message-ID: Lev Konstantinovskiy gmail.com> writes: > What is the danger of using sparsetools from outside of scipy? It is a private API. Changes can be made to it without needing to go through deprecation cycles. The boundaries of what is private and what was not historically very clear in Scipy (and in Python in general). Clearly, it does not make sense to keep all helper routines as public. https://docs.scipy.org/doc/scipy-dev/reference/api.html#api-definition The case of scipy.sparse is that it will likely need to go through some restructuring in the future, partly because of the need for more array-like sparse matrices, and also partly to improve performance in places where the half-Python half-C++ approach has too large costs. -- Pauli Virtanen From pav at iki.fi Tue Oct 6 15:28:58 2015 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 6 Oct 2015 19:28:58 +0000 (UTC) Subject: [SciPy-User] =?utf-8?q?Alternative_for_scipy=2Esparse=2Esparsetoo?= =?utf-8?q?ls_for_use_from=09outside_of_scipy?= References: Message-ID: Lev Konstantinovskiy gmail.com> writes: > Getting deprecation warning for sparsetools. Is there an alternative to > switch to? > > The use is sparsetools.csc_matvecs in gensim > https://github.com/piskvorky/gensim/blob/9a1c2c954e2f72213023fc01f0e33306001e > 303f/gensim/models/lsimodel.py If I understand correctly, these are use cases that can be expressed in terms of usual sparse matrix operations, y = corpus * o y += corpus * chunk but you are using the internal sparsetools routines instead, because of performance reasons? Is the performance difference big in this case? Is the issue that you want in-place sparse AXPY, or is it due to dealing with small matrices and avoiding overheads? There's currently no sparse axpy available in Scipy. There probably should be though. Gensim is not a pure-Python module, so one relatively straightforward possibility is to just bundle a copy of the current sparsetools module (or just the one routine you need) with it. There's no SWIG nowadays involved, and it's independent of the rest of Scipy, so it's probably doable. 
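[Editorial sketch, hedged: the same update written against the public scipy.sparse API, with made-up shapes for `corpus`, `chunk` and `y`. It gives the same result as the sparsetools.csc_matvecs call discussed above, but allocates a temporary for the product, which is exactly what the in-place routine avoids.]

    import numpy as np
    import scipy.sparse as sp

    corpus = sp.rand(1000, 500, density=0.01, format='csc')  # illustrative sizes
    chunk = np.random.rand(500, 10)
    y = np.zeros((1000, 10))

    y += corpus.dot(chunk)   # public-API equivalent, via a temporary array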
-- Pauli Virtanen From jeffreback at gmail.com Fri Oct 9 14:31:13 2015 From: jeffreback at gmail.com (Jeff Reback) Date: Fri, 9 Oct 2015 14:31:13 -0400 Subject: [SciPy-User] ANN: pandas v0.17.0 released Message-ID: Hi, We are proud to announce v0.17.0 of pandas. This is a major release from 0.16.2 and includes a small number of API changes, several new features, enhancements, and performance improvements along with a large number of bug fixes. We recommend that all users upgrade to this version. This was a release of 4 months with 515 commits by 112 authors encompassing 233 issues and 362 pull-requests. We recommend that all users upgrade to this version. *What is it:* *pandas* is a Python package providing fast, flexible, and expressive data structures designed to make working with ?relational? or ?labeled? data both easy and intuitive. It aims to be the fundamental high-level building block for doing practical, real world data analysis in Python. Additionally, it has the broader goal of becoming the most powerful and flexible open source data analysis / manipulation tool available in any language. *Highlights*: - Release the Global Interpreter Lock (GIL) on some cython operations, see here - Plotting methods are now available as attributes of the .plot accessor, see here - The sorting API has been revamped to remove some long-time inconsistencies, see here - Support for a datetime64[ns] with timezones as a first-class dtype, see here - The default for to_datetime will now be to raise when presented with unparseable formats, previously this would return the original input, see here - The default for dropna in HDFStore has changed to False, to store by default all rows even if they are all NaN, see here - Support for Series.dt.strftime to generate formatted strings for datetime-likes, see here - Development installed versions of pandas will now have PEP440 compliant version strings GH9518 - Development support for benchmarking with the Air Speed Velocity library GH8316 - Support for reading SAS xport files, see here - Removal of the automatic TimeSeries broadcasting, deprecated since 0.8.0, see here - Display format with plain text can optionally align with Unicode East Asian Width, see here - Compatibility with Python 3.5 GH11097 - Compatibility with matplotlib 1.5.0 GH11111 See the Whatsnew for much more information and the full Documentation link. *How to get it:* Source tarballs, windows wheels, macosx wheels are available on PyPI - note that currently PyPi is not accepting 3.5 wheels. Installation via conda is: - conda install pandas windows wheels are courtesy of Christoph Gohlke and are built on Numpy 1.9 macosx wheels are courtesy of Matthew Brett *Issues:* Please report any issues on our issue tracker : Thanks to all who made this release happen. It is a very large release! 
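[Editorial note: two of the highlights above in one small, illustrative snippet; names and data are made up.]

    import pandas as pd

    s = pd.Series(pd.date_range('2015-10-01', periods=3))
    s.dt.strftime('%Y/%m/%d')        # new: Series.dt.strftime for datetime-likes

    df = pd.DataFrame({'a': [1, 2, 3]})
    df.plot.bar()                    # new: plotting methods on the .plot accessor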
Jeff *Thanks to all of the contributors* - Alex Rothberg - Andrea Bedini - Andrew Rosenfeld - Andy Li - Anthonios Partheniou - Artemy Kolchinsky - Bernard Willers - Charlie Clark - Chris - Chris Whelan - Christoph Gohlke - Christopher Whelan - Clark Fitzgerald - Clearfield Christopher - Dan Ringwalt - Daniel Ni - Data & Code Expert Experimenting with Code on Data - David Cottrell - David John Gagne - David Kelly - ETF - Eduardo Schettino - Egor - Egor Panfilov - Evan Wright - Frank Pinter - Gabriel Araujo - Garrett-R - Gianluca Rossi - Guillaume Gay - Guillaume Poulin - Harsh Nisar - Ian Henriksen - Ian Hoegen - Jaidev Deshpande - Jan Rudolph - Jan Schulz - Jason Swails - Jeff Reback - Jonas Buyl - Joris Van den Bossche - Joris Vankerschaver - Josh Levy-Kramer - Julien Danjou - Ka Wo Chen - Karrie Kehoe - Kelsey Jordahl - Kerby Shedden - Kevin Sheppard - Lars Buitinck - Leif Johnson - Luis Ortiz - Mac - Matt Gambogi - Matt Savoie - Matthew Gilbert - Maximilian Roos - Michelangelo D'Agostino - Mortada Mehyar - Nick Eubank - Nipun Batra - Ond?ej ?ert?k - Phillip Cloud - Pratap Vardhan - Rafal Skolasinski - Richard Lewis - Rinoc Johnson - Rob Levy - Robert Gieseke - Safia Abdalla - Samuel Denny - Saumitra Shahapure - Sebastian P?lsterl - Sebastian Rubbert - Sheppard, Kevin - Sinhrks - Siu Kwan Lam - Skipper Seabold - Spencer Carrucciu - Stephan Hoyer - Stephen Hoover - Stephen Pascoe - Terry Santegoeds - Thomas Grainger - Tjerk Santegoeds - Tom Augspurger - Vincent Davis - Winterflower - Yaroslav Halchenko - Yuan Tang (Terry) - agijsberts - ajcr - behzad nouri - cel4 - cyrusmaher - davidovitch - ganego - jreback - juricast - larvian - maximilianr - msund - rekcahpassyla - robertzk - scls19fr - seth-p - sinhrks - springcoil - terrytangyuan - tzinckgraf -------------- next part -------------- An HTML attachment was scrubbed... URL: From fabien.maussion at gmail.com Sat Oct 10 08:54:17 2015 From: fabien.maussion at gmail.com (Fabien) Date: Sat, 10 Oct 2015 14:54:17 +0200 Subject: [SciPy-User] Travis, wheels, and scikit-image Message-ID: Folks, I have a project which requires several packages from the scipy track. I configured travis to use cached wheels in order to reduce build time. Everything worked fine until I had to add scikit-image to my dependencies. Here are the relevant parts of my travis config: language: python sudo: false cache: directories: - ~/.cache/pip env: global: - PIP_WHEEL_DIR=$HOME/.cache/pip/wheels - PIP_FIND_LINKS=file://$HOME/.cache/pip/wheels - NUMPY=numpy [...] before_install: - pip install -U pip - pip install wheel install: - "pip wheel -r requirements.txt" - "pip install -r requirements.txt" - "pip install coveralls" - "pip install nose" - "pip install -e ." 
script: - nosetests And of requirements.txt: numpy scipy scikit-image The problem is that scikit-image wants to import scipy from setup.py, and I get this error from my travis log: 9.50s$ pip wheel -r requirements.txt Collecting numpy (from -r requirements.txt (line 2)) File was already downloaded /home/travis/.cache/pip/wheels/numpy-1.10.0.post2-cp27-none-linux_x86_64.whl Collecting scipy (from -r requirements.txt (line 3)) File was already downloaded /home/travis/.cache/pip/wheels/scipy-0.16.0-cp27-none-linux_x86_64.whl Collecting scikit-image (from -r requirements.txt (line 8)) Downloading scikit-image-0.11.3.tar.gz (18.6MB) 100% |????????????????????????????????| 18.6MB 22kB/s Complete output from command python setup.py egg_info: Traceback (most recent call last): File "", line 20, in File "/tmp/pip-build-WRt8fn/scikit-image/setup.py", line 77, in import scipy ImportError: No module named scipy Is my way to configure travis a bad one? Should I handle my wheels differently? I am also thinking about using conda in travis as explained here (http://conda.pydata.org/docs/travis.html) but I wanted to avoid it as long as not necessary: it seems that it doesn't use the travis cache? Thanks for your advices, Fabien From ralf.gommers at gmail.com Sat Oct 10 09:00:24 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 10 Oct 2015 15:00:24 +0200 Subject: [SciPy-User] Travis, wheels, and scikit-image In-Reply-To: References: Message-ID: On Sat, Oct 10, 2015 at 2:54 PM, Fabien wrote: > Folks, > > I have a project which requires several packages from the scipy track. I > configured travis to use cached wheels in order to reduce build time. > Everything worked fine until I had to add scikit-image to my dependencies. > Here are the relevant parts of my travis config: > requirements.txt is fragile. I suggest to use a pre-install shell script. Example: https://github.com/scikit-image/scikit-image/blob/master/tools/travis_before_install.sh called from: https://github.com/scikit-image/scikit-image/blob/master/.travis.yml#L48 Ralf > language: python > sudo: false > cache: > directories: > - ~/.cache/pip > env: > global: > - PIP_WHEEL_DIR=$HOME/.cache/pip/wheels > - PIP_FIND_LINKS=file://$HOME/.cache/pip/wheels > - NUMPY=numpy > [...] > before_install: > - pip install -U pip > - pip install wheel > install: > - "pip wheel -r requirements.txt" > - "pip install -r requirements.txt" > - "pip install coveralls" > - "pip install nose" > - "pip install -e ." > script: > - nosetests > > And of requirements.txt: > > numpy > scipy > scikit-image > > > The problem is that scikit-image wants to import scipy from setup.py, and > I get this error from my travis log: > > 9.50s$ pip wheel -r requirements.txt > Collecting numpy (from -r requirements.txt (line 2)) > File was already downloaded > /home/travis/.cache/pip/wheels/numpy-1.10.0.post2-cp27-none-linux_x86_64.whl > Collecting scipy (from -r requirements.txt (line 3)) > File was already downloaded > /home/travis/.cache/pip/wheels/scipy-0.16.0-cp27-none-linux_x86_64.whl > Collecting scikit-image (from -r requirements.txt (line 8)) > Downloading scikit-image-0.11.3.tar.gz (18.6MB) > 100% |????????????????????????????????| 18.6MB 22kB/s > Complete output from command python setup.py egg_info: > Traceback (most recent call last): > File "", line 20, in > File "/tmp/pip-build-WRt8fn/scikit-image/setup.py", line 77, in > > import scipy > ImportError: No module named scipy > > > Is my way to configure travis a bad one? Should I handle my wheels > differently? 
I am also thinking about using conda in travis as explained > here (http://conda.pydata.org/docs/travis.html) but I wanted to avoid it > as long as not necessary: it seems that it doesn't use the travis cache? > > Thanks for your advices, > > Fabien > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From lev.konst at gmail.com Mon Oct 12 07:01:34 2015 From: lev.konst at gmail.com (Lev Konstantinovskiy) Date: Mon, 12 Oct 2015 12:01:34 +0100 Subject: [SciPy-User] Alternative for scipy.sparse.sparsetools for use from outside of scipy In-Reply-To: References: Message-ID: Thanks for your reply. Raised a feature request for sparse in-place AXPY https://github.com/scipy/scipy/issues/5348 The reason for using this routine is to save memory. Using sparse.linalg.matmat would double the memory in y = y + chunk * o to create a temporary chunk*o. But it seems the only way to go when sparsetools becomes deprecated. On Tue, Oct 6, 2015 at 8:28 PM, Pauli Virtanen wrote: > Lev Konstantinovskiy gmail.com> writes: >> Getting deprecation warning for sparsetools. Is there an alternative to >> switch to? >> >> The use is sparsetools.csc_matvecs in gensim >> https://github.com/piskvorky/gensim/blob/9a1c2c954e2f72213023fc01f0e33306001e >> 303f/gensim/models/lsimodel.py > > If I understand correctly, these are use cases that can be expressed > in terms of usual sparse matrix operations, > > y = corpus * o > y += corpus * chunk > > but you are using the internal sparsetools routines instead, > because of performance reasons? Is the performance difference > big in this case? Is the issue that you want in-place sparse AXPY, > or is it due to dealing with small matrices and avoiding > overheads? > > There's currently no sparse axpy available in Scipy. > There probably should be though. > > Gensim is not a pure-Python module, so one relatively straightforward > possibility is to just bundle a copy of the current sparsetools > module (or just the one routine you need) with it. There's no SWIG > nowadays involved, and it's independent of the rest of Scipy, > so it's probably doable. > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user From charlesr.harris at gmail.com Mon Oct 12 12:27:08 2015 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 12 Oct 2015 10:27:08 -0600 Subject: [SciPy-User] Numpy 1.10.1 released. Message-ID: Hi All, I'm pleased to announce the release of Numpy 1.10.1. This release fixes some build problems and serves to reset the release number on pipy to something usable. As a note for future release managers, I had to upload these files from the command line, as using the file upload option at pipy resulted in a failure to parse the version. NumPy 1.10.1 Release Notes ************************** This release deals with a few build problems that showed up in 1.10.0. Most users would not have seen these problems. The differences are: * Compiling with msvc9 or msvc10 for 32 bit Windows now requires SSE2. This was the easiest fix for what looked to be some miscompiled code when SSE2 was not used. If you need to compile for 32 bit Windows systems without SSE2 support, mingw32 should still work. 
* Make compiling with VS2008 python2.7 SDK easier * Change Intel compiler options so that code will also be generated to support systems without SSE4.2. * Some _config test functions needed an explicit integer return in order to avoid the openSUSE rpmlinter erring out. * We ran into a problem with pipy not allowing reuse of filenames and a resulting proliferation of *.*.*.postN releases. Not only were the names getting out of hand, some packages were unable to work with the postN suffix. Numpy 1.10.1 supports Python 2.6 - 2.7 and 3.2 - 3.5. Commits: 45a3d84 DEP: Remove warning for `full` when dtype is set. 0c1a5df BLD: import setuptools to allow compile with VS2008 python2.7 sdk 04211c6 BUG: mask nan to 1 in ordered compare 826716f DOC: Document the reason msvc requires SSE2 on 32 bit platforms. 49fa187 BLD: enable SSE2 for 32-bit msvc 9 and 10 compilers dcbc4cc MAINT: remove Wreturn-type warnings from config checks d6564cb BLD: do not build exclusively for SSE4.2 processors 15cb66f BLD: do not build exclusively for SSE4.2 processors c38bc08 DOC: fix var. reference in percentile docstring 78497f4 DOC: Sync 1.10.0-notes.rst in 1.10.x branch with master. Cheers, Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Mon Oct 12 13:15:00 2015 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 12 Oct 2015 10:15:00 -0700 Subject: [SciPy-User] [Numpy-discussion] Numpy 1.10.1 released. In-Reply-To: References: Message-ID: Hi, On Mon, Oct 12, 2015 at 9:27 AM, Charles R Harris wrote: > Hi All, > > I'm pleased to announce the release of Numpy 1.10.1. This release fixes some > build problems and serves to reset the release number on pipy to something > usable. As a note for future release managers, I had to upload these files > from the command line, as using the file upload option at pipy resulted in a > failure to parse the version. > > NumPy 1.10.1 Release Notes > ************************** > > This release deals with a few build problems that showed up in 1.10.0. Most > users would not have seen these problems. The differences are: > > * Compiling with msvc9 or msvc10 for 32 bit Windows now requires SSE2. > This was the easiest fix for what looked to be some miscompiled code when > SSE2 was not used. If you need to compile for 32 bit Windows systems > without SSE2 support, mingw32 should still work. > > * Make compiling with VS2008 python2.7 SDK easier > > * Change Intel compiler options so that code will also be generated to > support systems without SSE4.2. > > * Some _config test functions needed an explicit integer return in > order to avoid the openSUSE rpmlinter erring out. > > * We ran into a problem with pipy not allowing reuse of filenames and a > resulting proliferation of *.*.*.postN releases. Not only were the names > getting out of hand, some packages were unable to work with the postN > suffix. > > > Numpy 1.10.1 supports Python 2.6 - 2.7 and 3.2 - 3.5. > > > Commits: > > 45a3d84 DEP: Remove warning for `full` when dtype is set. > 0c1a5df BLD: import setuptools to allow compile with VS2008 python2.7 sdk > 04211c6 BUG: mask nan to 1 in ordered compare > 826716f DOC: Document the reason msvc requires SSE2 on 32 bit platforms. > 49fa187 BLD: enable SSE2 for 32-bit msvc 9 and 10 compilers > dcbc4cc MAINT: remove Wreturn-type warnings from config checks > d6564cb BLD: do not build exclusively for SSE4.2 processors > 15cb66f BLD: do not build exclusively for SSE4.2 processors > c38bc08 DOC: fix var. 
reference in percentile docstring > 78497f4 DOC: Sync 1.10.0-notes.rst in 1.10.x branch with master. Thanks a lot for guiding this through. I uploaded the OSX wheels to pypi via : https://github.com/MacPython/numpy-wheels Cheers, Matthew From fabien.maussion at gmail.com Tue Oct 13 11:14:20 2015 From: fabien.maussion at gmail.com (Fabien) Date: Tue, 13 Oct 2015 17:14:20 +0200 Subject: [SciPy-User] Travis, wheels, and scikit-image In-Reply-To: References: Message-ID: On 10/10/2015 03:00 PM, Ralf Gommers wrote: > On Sat, Oct 10, 2015 at 2:54 PM, Fabien > wrote: > > Folks, > > I have a project which requires several packages from the scipy > track. I configured travis to use cached wheels in order to reduce > build time. Everything worked fine until I had to add scikit-image > to my dependencies. Here are the relevant parts of my travis config: > > > requirements.txt is fragile. I suggest to use a pre-install shell > script. Example: > https://github.com/scikit-image/scikit-image/blob/master/tools/travis_before_install.sh > called from: > https://github.com/scikit-image/scikit-image/blob/master/.travis.yml#L48 > > Ralf Thanks Ralf. A bit oversized for my needs but I thinks that it will make it. Anyone has experience with conda on Travis? The solution provided here ((http://conda.pydata.org/docs/travis.html)) really seems inefficient since all packages have to be downloaded at each build... From davidmenhur at gmail.com Tue Oct 13 11:39:29 2015 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Tue, 13 Oct 2015 17:39:29 +0200 Subject: [SciPy-User] Travis, wheels, and scikit-image In-Reply-To: References: Message-ID: On 13 October 2015 at 17:14, Fabien wrote: > Thanks Ralf. A bit oversized for my needs but I thinks that it will make > it. A simplest option, specify them manually. Doable if you don't have many packages, and you can use it to pre-install the ones that may give you trouble: before_install: - uname -a - free -m - df -h - ulimit -a - pip install -U pip wheel setuptools - pip install numpy cython nose six matplotlib - pip install -v -U scipy - python -V Note that I added the -v flag for scipy because it takes around 10 min to install on Travis, and Travis kills jobs that don't produce output after that time. This is only relevant once per Scipy release, the rest of the times it will use the cached wheel. The full script is here: https://raw.githubusercontent.com/Dapid/GPy/a02beebf27a95733e7a49a86f1a6642b2cb160fe/.travis.yml Anyway, downloading and installing miniconda takes less than a minute, so inefficient is relative: https://travis-ci.org/SheffieldML/GPy/jobs/85069851 /David. -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Tue Oct 13 11:43:49 2015 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 13 Oct 2015 11:43:49 -0400 Subject: [SciPy-User] Travis, wheels, and scikit-image In-Reply-To: References: Message-ID: On Tue, Oct 13, 2015 at 11:14 AM, Fabien wrote: > On 10/10/2015 03:00 PM, Ralf Gommers wrote: > >> On Sat, Oct 10, 2015 at 2:54 PM, Fabien > > wrote: >> >> Folks, >> >> I have a project which requires several packages from the scipy >> track. I configured travis to use cached wheels in order to reduce >> build time. Everything worked fine until I had to add scikit-image >> to my dependencies. Here are the relevant parts of my travis config: >> >> >> requirements.txt is fragile. I suggest to use a pre-install shell >> script. 
Example: >> >> https://github.com/scikit-image/scikit-image/blob/master/tools/travis_before_install.sh >> called from: >> https://github.com/scikit-image/scikit-image/blob/master/.travis.yml#L48 >> >> Ralf >> > > > Thanks Ralf. A bit oversized for my needs but I thinks that it will make > it. > > Anyone has experience with conda on Travis? The solution provided here (( > http://conda.pydata.org/docs/travis.html)) really seems inefficient since > all packages have to be downloaded at each build... > > statsmodels has been using conda on Travis for a while now. I don't remember seeing any problems related to downloading, so I never paid attention to whether we download on each testrun or not. (Also, we still haven't switched to the new Travis infrastructure. ) In the Travis log, conda reports that it's downloading around 200 MB for that test run. Josef > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Tue Oct 13 11:58:16 2015 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 13 Oct 2015 08:58:16 -0700 Subject: [SciPy-User] Travis, wheels, and scikit-image In-Reply-To: References: Message-ID: Hi, On Tue, Oct 13, 2015 at 8:39 AM, Da?id wrote: > > On 13 October 2015 at 17:14, Fabien wrote: >> >> Thanks Ralf. A bit oversized for my needs but I thinks that it will make >> it. > > > > A simplest option, specify them manually. Doable if you don't have many > packages, and you can use it to pre-install the ones that may give you > trouble: > > before_install: > - uname -a > - free -m > - df -h > - ulimit -a > - pip install -U pip wheel setuptools > - pip install numpy cython nose six matplotlib > - pip install -v -U scipy > - python -V > > Note that I added the -v flag for scipy because it takes around 10 min to > install on Travis, and Travis kills jobs that don't produce output after > that time. This is only relevant once per Scipy release, the rest of the > times it will use the cached wheel. The full script is here: > > https://raw.githubusercontent.com/Dapid/GPy/a02beebf27a95733e7a49a86f1a6642b2cb160fe/.travis.yml There's a good example of using pre-built travis wheels here : https://github.com/matplotlib/matplotlib/blob/master/tools/travis_tools.sh Cheers, Matthew From chris.barker at noaa.gov Wed Oct 14 12:14:46 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 14 Oct 2015 09:14:46 -0700 Subject: [SciPy-User] [Numpy-discussion] Numpy 1.10.1 released. In-Reply-To: References: Message-ID: On Mon, Oct 12, 2015 at 9:27 AM, Charles R Harris wrote: > * Compiling with msvc9 or msvc10 for 32 bit Windows now requires SSE2. > This was the easiest fix for what looked to be some miscompiled code when > SSE2 was not used. > Note that there is discusion right now on pyton-dev about requireing SSE2 for teh python.org build of python3.5 -- it does now, so it's fine for third party pacakges to also require it. But there is some talk of removing that requirement -- still a lot of old machines around, I guess -- particular at schools and the like. Ideally, any binary wheels on PyPi should be compatible with the python.org builds -- so not require SSE2, if the python.org builds don't. 
Though we had this discussion a while back -- and numpy could, and maybe should require more -- did we ever figure out a way to get a meaningful message to the user if they try to run an SSE2 build on a machine without SSE2? -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Wed Oct 14 12:38:45 2015 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 14 Oct 2015 09:38:45 -0700 Subject: [SciPy-User] [Numpy-discussion] Numpy 1.10.1 released. In-Reply-To: References: Message-ID: On Oct 14, 2015 9:15 AM, "Chris Barker" wrote: > > On Mon, Oct 12, 2015 at 9:27 AM, Charles R Harris < charlesr.harris at gmail.com> wrote: >> >> * Compiling with msvc9 or msvc10 for 32 bit Windows now requires SSE2. >> This was the easiest fix for what looked to be some miscompiled code when >> SSE2 was not used. > > > Note that there is discusion right now on pyton-dev about requireing SSE2 for teh python.org build of python3.5 -- it does now, so it's fine for third party pacakges to also require it. But there is some talk of removing that requirement -- still a lot of old machines around, I guess -- particular at schools and the like. Note that the 1.10.1 release announcement is somewhat misleading -- apparently the affected builds have actually required SSE2 since numpy 1.8, and the change here just makes it even more required. I'm not sure if this is all 32 bit builds or only ones using msvc that have been needing SSE2 all along. The change in 1.10.1 only affects msvc, which is not what most people are using (IIUC Enthought Canopy uses msvc, but the pypi, gohlke, and Anaconda builds don't). I'm actually not sure if anyone even uses the 32 bit builds at all :-) > Ideally, any binary wheels on PyPi should be compatible with the python.org builds -- so not require SSE2, if the python.org builds don't. > > Though we had this discussion a while back -- and numpy could, and maybe should require more -- did we ever figure out a way to get a meaningful message to the user if they try to run an SSE2 build on a machine without SSE2? It's not that difficult in principle, just someone has to do it :-). -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed Oct 14 12:47:23 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 14 Oct 2015 09:47:23 -0700 Subject: [SciPy-User] [Numpy-discussion] Numpy 1.10.1 released. In-Reply-To: References: Message-ID: On Wed, Oct 14, 2015 at 9:38 AM, Nathaniel Smith wrote: > The change in 1.10.1 only affects msvc, which is not what most people are > using (IIUC Enthought Canopy uses msvc, but the pypi, gohlke, and Anaconda > builds don't). > Anaconda uses MSVC for the most part -- they _may_ compile numpy itself some other way, no one but continuum knows for sure :-) > I'm actually not sure if anyone even uses the 32 bit builds at all :-) > There's a lot of 32 bit python use out there still, including numpy. We ever figure out a way to get a meaningful message to the user if they > try to run an SSE2 build on a machine without SSE2? > > It's not that difficult in principle, just someone has to do it :-). > yeah, there's always that .... 
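[Editorial sketch of the kind of check being discussed -- purely hypothetical, not anything that exists in numpy today; it would have to run before importing the compiled extension modules.]

    import ctypes
    import sys

    def has_sse2():
        if sys.platform == "win32":
            # PF_XMMI64_INSTRUCTIONS_AVAILABLE == 10
            return bool(ctypes.windll.kernel32.IsProcessorFeaturePresent(10))
        try:
            with open("/proc/cpuinfo") as f:   # Linux only
                return "sse2" in f.read()
        except IOError:
            return True                        # unknown platform: assume modern CPU

    if not has_sse2():
        raise ImportError("this numpy build requires a CPU with SSE2 support")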
-CHB > -n > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Wed Oct 14 16:34:35 2015 From: cournape at gmail.com (David Cournapeau) Date: Wed, 14 Oct 2015 21:34:35 +0100 Subject: [SciPy-User] [Numpy-discussion] Numpy 1.10.1 released. In-Reply-To: References: Message-ID: On Wed, Oct 14, 2015 at 5:38 PM, Nathaniel Smith wrote: > On Oct 14, 2015 9:15 AM, "Chris Barker" wrote: > > > > On Mon, Oct 12, 2015 at 9:27 AM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > >> > >> * Compiling with msvc9 or msvc10 for 32 bit Windows now requires SSE2. > >> This was the easiest fix for what looked to be some miscompiled code > when > >> SSE2 was not used. > > > > > > Note that there is discusion right now on pyton-dev about requireing > SSE2 for teh python.org build of python3.5 -- it does now, so it's fine > for third party pacakges to also require it. But there is some talk of > removing that requirement -- still a lot of old machines around, I guess -- > particular at schools and the like. > > Note that the 1.10.1 release announcement is somewhat misleading -- > apparently the affected builds have actually required SSE2 since numpy 1.8, > and the change here just makes it even more required. I'm not sure if this > is all 32 bit builds or only ones using msvc that have been needing SSE2 > all along. The change in 1.10.1 only affects msvc, which is not what most > people are using (IIUC Enthought Canopy uses msvc, but the pypi, gohlke, > and Anaconda builds don't). > > I'm actually not sure if anyone even uses the 32 bit builds at all :-) > I cannot divulge exact figures for downloads, but for us at Enthought, windows 32 bits is in the same ballpark as OS X and Linux (64 bits) in terms of proportion, windows 64 bits being significantly more popular. Linux 32 bits and OS X 32 bits have been in the 1 % range each of our downloads for a while (we recently stopped support for both). David > > Ideally, any binary wheels on PyPi should be compatible with the > python.org builds -- so not require SSE2, if the python.org builds don't. > > > > Though we had this discussion a while back -- and numpy could, and maybe > should require more -- did we ever figure out a way to get a meaningful > message to the user if they try to run an SSE2 build on a machine without > SSE2? > > It's not that difficult in principle, just someone has to do it :-). > > -n > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > https://mail.scipy.org/mailman/listinfo/numpy-discussion > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From prabhu at aero.iitb.ac.in Fri Oct 16 07:46:59 2015 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Fri, 16 Oct 2015 17:16:59 +0530 Subject: [SciPy-User] [ANN] SciPy India 2015: call for papers Message-ID: <5620E3B3.6010803@aero.iitb.ac.in> Hello, [Apologies for the cross-posting!] The CFP and registration for SciPy India 2015 (http://scipy.in) is open. 
SciPy India 2015 will be held at IIT Bombay between December 14th to December 16th, 2015. Please spread the word! SciPy India is an annual conference on using Python for research and education. The conference is currently in its seventh year. Call for Papers ============= We look forward to your submissions on the use of Python for scientific computing and education. This includes pedagogy, exploration, modeling and analysis from both applied and developmental perspectives. We welcome contributions from academia as well as industry. For details on the paper submission please see here: http://scipy.in/2015/cfp/ Important Dates ================ - Call for proposals end: 24th November 2015 - List of accepted proposals will be published: 1st December 2015. We look forward to seeing you at SciPy India. Regards, Prabhu Ramachandran and Jarrod Millman From toddrjen at gmail.com Fri Oct 16 09:50:36 2015 From: toddrjen at gmail.com (Todd) Date: Fri, 16 Oct 2015 15:50:36 +0200 Subject: [SciPy-User] Playing numpy array over speakers In-Reply-To: References: Message-ID: On Tue, Mar 31, 2015 at 6:02 PM, Todd wrote: > On Tue, Mar 24, 2015 at 9:39 PM, Todd wrote: > >> Is anyone aware of a well-maintained, simple, cross-platform python >> package that can play a numpy array as sound over speakers? >> >> I am aware of https://wiki.python.org/moin/Audio/. However, in all the >> cases there, as far as I can find they either do not support numpy arrays, >> are not cross-platform, cannot playback sound at all, or are unmaintained. >> There is also PySoundCard, which would do what I need but also appears to >> be unmaintained (no release in over a year, and no commits in 5 months, no >> release with serious bugfixes mentioned in commits). >> > > So in terms of raw waveform playback (as opposed to music note playback), > I have done some more searching and I think I have found something that > works. It is the "audio.io" package ( > https://pypi.python.org/pypi/audio.io/). It has a recent release (late > 2014), supports numpy arrays, and is cross-platform through PyAudio. It is > just a VERY thin wrapper around PyAudio (less than 100 lines). However, > there is no website, no issue tracker, essentially no documentation, and > has several projects copied into its tarball (including setputools, about, > and sh). > > Here are the reasonably maintained, reasonably relevant alternatives I > have been able to find: > > PyAudio: maintained, cross-platform, doesn't support numpy. It seems to > be used as a backend by a lot of other projects. > > audiolazy: cross-platform, supports numpy, has not seen a release since > 2013 but its github repo is still seeing commits so it may have more > releases in the future. Uses PyAudio. Provides a lot of other powerful > audio-handling and audio-processing capabilities. > > PySoundCard: cross-platform, supports numpy, has not seen a release in > over a year and its github repo has not seen a commit in 5 months, but > another related project (PySoundFile) has seen commits and releases > recently. The only option amongst these that does NOT rely on PyAudio. > > pydub: maintained, cross-platform, doesn't appear to support numpy but the > audio output is undocumented so I can't be sure. Uses PyAudio or ffmpeg if > PyAudio is not available. > > Just an update on cross-platform, numpy-compatible sound I/O packages: I have found some other possibilities: The "JACK-Client" package (https://pypi.python.org/pypi/JACK-Client/) is the furthest along and most established. 
It has been around for almost a year, has three contributors, and has seen four releases. However, it has gained built-in numpy support since my last update, which is why it hasn't appeared previously. The maintainer seems to be a member of an established auditory research group with a good open-source software track record. It seems to be a traditionally MATLAB group but they are adding more and more python packages. The "sounddevice" package (https://pypi.python.org/pypi/sounddevice/). It only has only been around for a few months and only has one contributor so far. However, the maintainer is the same as the maintainer of the "JACK-Client" package, it has a github repo with continued commits, a couple other people submitting issues. Since "JACK-Client" seems to have done okay, I hope this package will as well. The "hear" package (https://pypi.python.org/pypi/Hear/) is in a similar situation, although with a different maintainer. It has been around about the same amount of time, has about the same number of releases, and only has one contributor. The maintainer seems to have a good track record with open-source software and experience with sound processing, so it has some promise too. Otherwise, there has been no change. None of the other packages I listed that support numpy have seen a release in the last year. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sameer.grover.1 at gmail.com Fri Oct 16 11:42:54 2015 From: sameer.grover.1 at gmail.com (Sameer Grover) Date: Fri, 16 Oct 2015 21:12:54 +0530 Subject: [SciPy-User] Playing numpy array over speakers In-Reply-To: References: Message-ID: <56211AFE.2090107@gmail.com> On 16/10/15 19:20, Todd wrote: > On Tue, Mar 31, 2015 at 6:02 PM, Todd > wrote: > > On Tue, Mar 24, 2015 at 9:39 PM, Todd > wrote: > > Is anyone aware of a well-maintained, simple, cross-platform > python package that can play a numpy array as sound over > speakers? > > I am aware of https://wiki.python.org/moin/Audio/. However, in > all the cases there, as far as I can find they either do not > support numpy arrays, are not cross-platform, cannot playback > sound at all, or are unmaintained. There is also PySoundCard, > which would do what I need but also appears to be unmaintained > (no release in over a year, and no commits in 5 months, no > release with serious bugfixes mentioned in commits). > > > So in terms of raw waveform playback (as opposed to music note > playback), I have done some more searching and I think I have > found something that works. It is the "audio.io > " package > (https://pypi.python.org/pypi/audio.io/). It has a recent release > (late 2014), supports numpy arrays, and is cross-platform through > PyAudio. It is just a VERY thin wrapper around PyAudio (less than > 100 lines). However, there is no website, no issue tracker, > essentially no documentation, and has several projects copied into > its tarball (including setputools, about, and sh). > > Here are the reasonably maintained, reasonably relevant > alternatives I have been able to find: > > PyAudio: maintained, cross-platform, doesn't support numpy. It > seems to be used as a backend by a lot of other projects. > > audiolazy: cross-platform, supports numpy, has not seen a release > since 2013 but its github repo is still seeing commits so it may > have more releases in the future. Uses PyAudio. Provides a lot > of other powerful audio-handling and audio-processing capabilities. 
> > PySoundCard: cross-platform, supports numpy, has not seen a > release in over a year and its github repo has not seen a commit > in 5 months, but another related project (PySoundFile) has seen > commits and releases recently. The only option amongst these that > does NOT rely on PyAudio. > > pydub: maintained, cross-platform, doesn't appear to support numpy > but the audio output is undocumented so I can't be sure. Uses > PyAudio or ffmpeg if PyAudio is not available. > > > > Just an update on cross-platform, numpy-compatible sound I/O packages: > > I have found some other possibilities: > > The "JACK-Client" package (https://pypi.python.org/pypi/JACK-Client/) > is the furthest along and > most established. It has been around for almost a year, has three > contributors, and has seen four releases. However, it has gained > built-in numpy support since my last update, which is why it hasn't > appeared previously. The maintainer seems to be a member of an > established auditory research group with a good open-source software > track record. It seems to be a traditionally MATLAB group but they > are adding more and more python packages. > > The "sounddevice" package (https://pypi.python.org/pypi/sounddevice/). > It only has only been around for a few months and only has one > contributor so far. However, the maintainer is the same as the > maintainer of the "JACK-Client" package, it has a github repo with > continued commits, a couple other people submitting issues. Since > "JACK-Client" seems to have done okay, I hope this package will as well. > > The "hear" package (https://pypi.python.org/pypi/Hear/) is in a > similar situation, although with a different maintainer. It has been > around about the same amount of time, has about the same number of > releases, and only has one contributor. The maintainer seems to have > a good track record with open-source software and experience with > sound processing, so it has some promise too. > > Otherwise, there has been no change. None of the other packages I > listed that support numpy have seen a release in the last year. > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user https://github.com/standarddeviant/sound4python I've used this on linux. According to the documentation, it is supposed to work on windows too. -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Sat Oct 17 02:19:26 2015 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 16 Oct 2015 23:19:26 -0700 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' Message-ID: Hi all, Just wanted to bring this to the attention of a wider audience -- Right now, if you're using pip to manage packages in a virtualenv, then upgrading to a new version of some package is *super annoying*, because you run pip install -U and then it starts upgrading not just the package you requested, but all the packages that it depends on. And if one of those packages is numpy or scipy or whatever, then suddenly your laptop is overheating and what should have been a 10 second process takes an hour and might fail and arrrghh frustrating. Then this also has nasty knock-on effects at the ecosystem level, because it motivates lots of numerical python packages to simply lie about what their dependencies are, on the theory that 'pip install -U' can't mess up a dependency that it can't see. 
Which resolves the immediate problem, but is obviously not a good long-term solution. >From some recent discussions on distutils-sig, one outcome was that the pip maintainers are actually totally happy to fix this if anyone wants to write a patch -- basically the idea would be to add a 'pip upgrade' command that works the way you'd expect (only upgrades the named package, plus whatever other packages are minimally required in order to satisfy its new dependencies -- basically the equivalent of 'pip install foo='), and deprecate 'install -U': https://github.com/pypa/pip/issues/59#issuecomment-147149208 So if anyone is feeling inspired to make a highly-visible fix to one of the most-widely-used Python programs, and earn the undying gratitude of lots of maintainers and users, then writing a patch to implement 'pip upgrade' and submitting it upstream would be a great opportunity for that ;-) -n -- Nathaniel J. Smith -- http://vorpus.org From mansingtakale at gmail.com Sat Oct 17 03:28:15 2015 From: mansingtakale at gmail.com (mansing takale) Date: Sat, 17 Oct 2015 12:58:15 +0530 Subject: [SciPy-User] [ANN] SciPy India 2015: call for papers In-Reply-To: <5620E3B3.6010803@aero.iitb.ac.in> References: <5620E3B3.6010803@aero.iitb.ac.in> Message-ID: Dear Sir, Its pleasure to receive your mail. I will give this mail wide publicity inside and outside the campus for FOSSEE lovers. Thank you. Bye with regards Dr. Mansing V Takale, Assistant Prof. Department of Physics, Shivaji University,Kolhapur-416004 On Fri, Oct 16, 2015 at 5:16 PM, Prabhu Ramachandran wrote: > Hello, > > [Apologies for the cross-posting!] > > The CFP and registration for SciPy India 2015 (http://scipy.in) is open. > SciPy > India 2015 will be held at IIT Bombay between December 14th to December > 16th, 2015. > > Please spread the word! > > SciPy India is an annual conference on using Python for research and > education. > The conference is currently in its seventh year. > > Call for Papers > ============= > > We look forward to your submissions on the use of Python for scientific > computing and education. This includes pedagogy, exploration, modeling and > analysis from both applied and developmental perspectives. We welcome > contributions from academia as well as industry. > > For details on the paper submission please see here: > > http://scipy.in/2015/cfp/ > > Important Dates > ================ > > - Call for proposals end: 24th November 2015 > - List of accepted proposals will be published: 1st December 2015. > > > We look forward to seeing you at SciPy India. > > Regards, > Prabhu Ramachandran and Jarrod Millman > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user > -- *Dr. Mansing V. Takale*, Assistant Professor, Department of Physics, Shivaji University, Kolhapur-416004 India (M.S.) Contact: +91-9673041222 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From toddrjen at gmail.com Sat Oct 17 10:52:39 2015 From: toddrjen at gmail.com (Todd) Date: Sat, 17 Oct 2015 16:52:39 +0200 Subject: [SciPy-User] Playing numpy array over speakers In-Reply-To: <56211AFE.2090107@gmail.com> References: <56211AFE.2090107@gmail.com> Message-ID: On Oct 16, 2015 17:45, "Sameer Grover" wrote: > > On 16/10/15 19:20, Todd wrote: >> >> On Tue, Mar 31, 2015 at 6:02 PM, Todd wrote: >>> >>> On Tue, Mar 24, 2015 at 9:39 PM, Todd wrote: >>>> >>>> Is anyone aware of a well-maintained, simple, cross-platform python package that can play a numpy array as sound over speakers? >>>> >>>> I am aware of https://wiki.python.org/moin/Audio/. However, in all the cases there, as far as I can find they either do not support numpy arrays, are not cross-platform, cannot playback sound at all, or are unmaintained. There is also PySoundCard, which would do what I need but also appears to be unmaintained (no release in over a year, and no commits in 5 months, no release with serious bugfixes mentioned in commits). >>> >>> >>> So in terms of raw waveform playback (as opposed to music note playback), I have done some more searching and I think I have found something that works. It is the "audio.io" package ( https://pypi.python.org/pypi/audio.io/). It has a recent release (late 2014), supports numpy arrays, and is cross-platform through PyAudio. It is just a VERY thin wrapper around PyAudio (less than 100 lines). However, there is no website, no issue tracker, essentially no documentation, and has several projects copied into its tarball (including setputools, about, and sh). >>> >>> Here are the reasonably maintained, reasonably relevant alternatives I have been able to find: >>> >>> PyAudio: maintained, cross-platform, doesn't support numpy. It seems to be used as a backend by a lot of other projects. >>> >>> audiolazy: cross-platform, supports numpy, has not seen a release since 2013 but its github repo is still seeing commits so it may have more releases in the future. Uses PyAudio. Provides a lot of other powerful audio-handling and audio-processing capabilities. >>> >>> PySoundCard: cross-platform, supports numpy, has not seen a release in over a year and its github repo has not seen a commit in 5 months, but another related project (PySoundFile) has seen commits and releases recently. The only option amongst these that does NOT rely on PyAudio. >>> >>> pydub: maintained, cross-platform, doesn't appear to support numpy but the audio output is undocumented so I can't be sure. Uses PyAudio or ffmpeg if PyAudio is not available. >>> >> >> >> Just an update on cross-platform, numpy-compatible sound I/O packages: >> >> I have found some other possibilities: >> >> The "JACK-Client" package (https://pypi.python.org/pypi/JACK-Client/) is the furthest along and most established. It has been around for almost a year, has three contributors, and has seen four releases. However, it has gained built-in numpy support since my last update, which is why it hasn't appeared previously. The maintainer seems to be a member of an established auditory research group with a good open-source software track record. It seems to be a traditionally MATLAB group but they are adding more and more python packages. >> >> The "sounddevice" package (https://pypi.python.org/pypi/sounddevice/). It only has only been around for a few months and only has one contributor so far. 
However, the maintainer is the same as the maintainer of the "JACK-Client" package, it has a github repo with continued commits, a couple other people submitting issues. Since "JACK-Client" seems to have done okay, I hope this package will as well. >> >> The "hear" package (https://pypi.python.org/pypi/Hear/) is in a similar situation, although with a different maintainer. It has been around about the same amount of time, has about the same number of releases, and only has one contributor. The maintainer seems to have a good track record with open-source software and experience with sound processing, so it has some promise too. >> >> Otherwise, there has been no change. None of the other packages I listed that support numpy have seen a release in the last year. >> >> >> > > https://github.com/standarddeviant/sound4python > > I've used this on linux. According to the documentation, it is supposed to work on windows too. It looks like it hasn't received any commits in the last year, though. -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sat Oct 17 13:54:04 2015 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 17 Oct 2015 13:54:04 -0400 Subject: [SciPy-User] If you use GPL in our neighborhood then you break the chain, ... Message-ID: ... and something bad will happen to us. Dear developer, One of the great features of development in our neighborhood is that almost all code is BSD licensed or has a license that is compatible with it. This allows us to freely use and copy (with attribution) each others code, which makes some packages a great source of inspiration or provides code snippets for reuse. I think this is also a reason why Python became so successful. Not only is the code readable but everybody is also legally allowed to copy the code and adjust it to individual special requirements. GPL breaks the mutual sharing, because we cannot benefit back from the changes, improvements, adjustments and extensions. If you really want or need to license your code as GPL to protect a specific application or a GUI, then please consider dual licensing in a BSD compatible way the backend, the statistical backend as far as I'm concerned. Thank you for your attention. Josef -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Oct 18 08:07:30 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 18 Oct 2015 14:07:30 +0200 Subject: [SciPy-User] new feature: optimize.curve_fit with bounds Message-ID: Hi all, Curve fitting with bounds is a feature that has often been requested by users in the past, so I'd like to advertise the excellent GSoC work of Nikolay that's now included in Scipy master and will be in 0.17.0: 1. A new function optimize.least_squares: https://github.com/scipy/scipy/pull/5044 2. Adding bounds to optimize.curve_fit: https://github.com/scipy/scipy/pull/5147 Up-to-date html docs: http://scipy.github.io/devdocs/optimize.html Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
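To make the new API concrete, a small sketch against the development docs
linked above -- the model, data, and bound values are invented for the
example, and bounds-aware fitting needs the in-development 0.17 code, where
curve_fit hands the constrained problem to the new least_squares machinery:

import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    # simple two-parameter exponential decay, used only as a demo model
    return a * np.exp(-b * x)

rng = np.random.RandomState(0)
xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 2.5, 1.3) + 0.05 * rng.randn(xdata.size)

# constrain 0 <= a <= 5 and 0 <= b <= 2; with bounds given, the 'trf' method is used
popt, pcov = curve_fit(model, xdata, ydata, p0=(1.0, 1.0),
                       bounds=([0.0, 0.0], [5.0, 2.0]))
print(popt)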
URL: From jks at iki.fi Sun Oct 18 14:27:02 2015 From: jks at iki.fi (=?iso-8859-1?Q?Jouni_K=2E_Sepp=E4nen?=) Date: Sun, 18 Oct 2015 21:27:02 +0300 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' References: Message-ID: Nathaniel Smith writes: > basically the idea would be to add a 'pip upgrade' command that works > the way you'd expect (only upgrades the named package, plus whatever > other packages are minimally required in order to satisfy its new > dependencies -- basically the equivalent of 'pip install foo= version>') I submitted a patch that attempts to do this: https://github.com/pypa/pip/pull/3194 -- Jouni K. Sepp?nen http://www.iki.fi/jks From njs at pobox.com Mon Oct 19 02:47:30 2015 From: njs at pobox.com (Nathaniel Smith) Date: Sun, 18 Oct 2015 23:47:30 -0700 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' In-Reply-To: References: Message-ID: On Sun, Oct 18, 2015 at 11:27 AM, Jouni K. Sepp?nen wrote: > Nathaniel Smith writes: > >> basically the idea would be to add a 'pip upgrade' command that works >> the way you'd expect (only upgrades the named package, plus whatever >> other packages are minimally required in order to satisfy its new >> dependencies -- basically the equivalent of 'pip install foo=> version>') > > I submitted a patch that attempts to do this: > https://github.com/pypa/pip/pull/3194 Awesome! -n -- Nathaniel J. Smith -- http://vorpus.org From baker.alexander at gmail.com Mon Oct 19 03:10:45 2015 From: baker.alexander at gmail.com (Alexander Baker) Date: Mon, 19 Oct 2015 08:10:45 +0100 Subject: [SciPy-User] new feature: optimize.curve_fit with bounds In-Reply-To: References: Message-ID: Hi Ralf & Nikolay, Thanks, this looks very useful indeed, great work. Regards, Alex baker > On 18 Oct 2015, at 13:07, Ralf Gommers wrote: > > Hi all, > > Curve fitting with bounds is a feature that has often been requested by users in the past, so I'd like to advertise the excellent GSoC work of Nikolay that's now included in Scipy master and will be in 0.17.0: > > 1. A new function optimize.least_squares: https://github.com/scipy/scipy/pull/5044 > 2. Adding bounds to optimize.curve_fit: https://github.com/scipy/scipy/pull/5147 > > Up-to-date html docs: http://scipy.github.io/devdocs/optimize.html > > Cheers, > Ralf > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From ndbecker2 at gmail.com Mon Oct 19 07:58:32 2015 From: ndbecker2 at gmail.com (Neal Becker) Date: Mon, 19 Oct 2015 07:58:32 -0400 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' References: Message-ID: Nathaniel Smith wrote: > On Sun, Oct 18, 2015 at 11:27 AM, Jouni K. Sepp?nen wrote: >> Nathaniel Smith writes: >> >>> basically the idea would be to add a 'pip upgrade' command that works >>> the way you'd expect (only upgrades the named package, plus whatever >>> other packages are minimally required in order to satisfy its new >>> dependencies -- basically the equivalent of 'pip install foo=>> version>') >> >> I submitted a patch that attempts to do this: >> https://github.com/pypa/pip/pull/3194 > > Awesome! > > -n > Perhaps related, I am running linux/fedora. My vendor (Fedora) provides may python packages, but they aren't always the latest. I find the most convenient approach is to use: pip install foo --user But in many cases, this won't work correctly. 
I often find that although I have the latest numpy in my local --user, pip is confused about the older one install into the standard sys location. It attempts to build another copy of numpy (for example), and eventually fails on trying to remove the system copy (pip running un-privileged). This has been a big annoyance for a long time. From davidmenhur at gmail.com Mon Oct 19 08:22:13 2015 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Mon, 19 Oct 2015 14:22:13 +0200 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' In-Reply-To: References: Message-ID: On 19 October 2015 at 13:58, Neal Becker wrote: > > Perhaps related, > I am running linux/fedora. My vendor (Fedora) provides may python > packages, > but they aren't always the latest. I find the most convenient approach is > to use: > pip install foo --user I install everything into a virtualenv and ignore system's Python. This way I have much better control over what is going on (if I install inkscape, it drags along a possibly outdated Numpy version, for example), and risk free updates. And if anything ever goes wrong, it is much easier to nuke (remove a folder) and start it all over. /David. -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Mon Oct 19 12:23:31 2015 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 19 Oct 2015 09:23:31 -0700 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' In-Reply-To: References: Message-ID: On Oct 19, 2015 4:58 AM, "Neal Becker" wrote: > > Nathaniel Smith wrote: > > > On Sun, Oct 18, 2015 at 11:27 AM, Jouni K. Sepp?nen wrote: > >> Nathaniel Smith writes: > >> > >>> basically the idea would be to add a 'pip upgrade' command that works > >>> the way you'd expect (only upgrades the named package, plus whatever > >>> other packages are minimally required in order to satisfy its new > >>> dependencies -- basically the equivalent of 'pip install foo= >>> version>') > >> > >> I submitted a patch that attempts to do this: > >> https://github.com/pypa/pip/pull/3194 > > > > Awesome! > > > > -n > > > > Perhaps related, > I am running linux/fedora. My vendor (Fedora) provides may python packages, > but they aren't always the latest. I find the most convenient approach is > to use: > pip install foo --user > > But in many cases, this won't work correctly. Not sure what to say without more details about what these "many cases" are. IIUC this should work and if it doesn't then it's a bug (though there are adjacent things that aren't expected to work, e.g. 'setup.py install --user'). -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From ndbecker2 at gmail.com Mon Oct 19 14:18:28 2015 From: ndbecker2 at gmail.com (Neal Becker) Date: Mon, 19 Oct 2015 14:18:28 -0400 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' References: Message-ID: Nathaniel Smith wrote: > On Oct 19, 2015 4:58 AM, "Neal Becker" wrote: >> >> Nathaniel Smith wrote: >> >> > On Sun, Oct 18, 2015 at 11:27 AM, Jouni K. Sepp?nen wrote: >> >> Nathaniel Smith writes: >> >> >> >>> basically the idea would be to add a 'pip upgrade' command that works >> >>> the way you'd expect (only upgrades the named package, plus whatever >> >>> other packages are minimally required in order to satisfy its new >> >>> dependencies -- basically the equivalent of 'pip install foo=> >>> version>') >> >> >> >> I submitted a patch that attempts to do this: >> >> https://github.com/pypa/pip/pull/3194 >> > >> > Awesome! 
>> > >> > -n >> > >> >> Perhaps related, >> I am running linux/fedora. My vendor (Fedora) provides may python > packages, >> but they aren't always the latest. I find the most convenient approach >> is to use: >> pip install foo --user >> >> But in many cases, this won't work correctly. > > Not sure what to say without more details about what these "many cases" > are. IIUC this should work and if it doesn't then it's a bug (though there > are adjacent things that aren't expected to work, e.g. 'setup.py install > --user'). > Well I had just installed numpy-1.10.1 and pandas-0.17.0, and here's what happens: $ pip install --user --up pandas Requirement already up-to-date: pandas in /home/nbecker/.local/lib/python2.7/site-packages Requirement already up-to-date: python-dateutil in /home/nbecker/.local/lib/python2.7/site-packages (from pandas) Requirement already up-to-date: pytz>=2011k in /home/nbecker/.local/lib/python2.7/site-packages (from pandas) Collecting numpy>=1.7.0 (from pandas) Using cached numpy-1.10.1.tar.gz Collecting six>=1.5 (from python-dateutil->pandas) Using cached six-1.10.0-py2.py3-none-any.whl Installing collected packages: numpy, six Found existing installation: numpy 1.8.2 DEPRECATION: Uninstalling a distutils installed project (numpy) has been deprecated and will be removed in a future version. This is due to the fact that uninstalling a distutils project will only partially uninstall the project. Uninstalling numpy-1.8.2: Successfully uninstalled numpy-1.8.2 Running setup.py install for numpy C-c C-c Rolling back uninstall of numpy As you see, it seems to be installing numpy-1.10.1 again, even though it's already installed /home/nbecker/.local/lib/python2.7/site-packages: total used in directory 4264 available 781585920 drwx------. 1 nbecker nbecker 15272 Oct 19 14:15 . drwxrwxr-x. 1 nbecker nbecker 190 Oct 15 10:44 pandas-0.17.0- py2.7.egg-info drwxrwxr-x. 1 nbecker nbecker 506 Oct 15 10:44 pandas -rw-rw-r--. 1 nbecker nbecker 869 Oct 15 10:41 easy-install.pth -rw-rw-r--. 1 nbecker nbecker 2060 Oct 15 10:37 numpy-1.10.1-py2.7.egg- info drwxrwxr-x. 1 nbecker nbecker 572 Oct 15 10:37 numpy drwxrwxr-x. 1 nbecker nbecker 136 Oct 15 10:32 pytz-2015.6.dist-info drwxrwxr-x. 1 nbecker nbecker 272 Oct 15 10:32 pytz From njs at pobox.com Mon Oct 19 14:55:30 2015 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 19 Oct 2015 11:55:30 -0700 Subject: [SciPy-User] Opportunity to contribute: 'pip upgrade' In-Reply-To: References: Message-ID: On Oct 19, 2015 11:18 AM, "Neal Becker" wrote: > > Nathaniel Smith wrote: > > > On Oct 19, 2015 4:58 AM, "Neal Becker" wrote: > >> > >> Nathaniel Smith wrote: > >> > >> > On Sun, Oct 18, 2015 at 11:27 AM, Jouni K. Sepp?nen wrote: > >> >> Nathaniel Smith writes: > >> >> > >> >>> basically the idea would be to add a 'pip upgrade' command that works > >> >>> the way you'd expect (only upgrades the named package, plus whatever > >> >>> other packages are minimally required in order to satisfy its new > >> >>> dependencies -- basically the equivalent of 'pip install foo= >> >>> version>') > >> >> > >> >> I submitted a patch that attempts to do this: > >> >> https://github.com/pypa/pip/pull/3194 > >> > > >> > Awesome! > >> > > >> > -n > >> > > >> > >> Perhaps related, > >> I am running linux/fedora. My vendor (Fedora) provides may python > > packages, > >> but they aren't always the latest. I find the most convenient approach > >> is to use: > >> pip install foo --user > >> > >> But in many cases, this won't work correctly. 
> > > > Not sure what to say without more details about what these "many cases" > > are. IIUC this should work and if it doesn't then it's a bug (though there > > are adjacent things that aren't expected to work, e.g. 'setup.py install > > --user'). > > > > Well I had just installed numpy-1.10.1 and pandas-0.17.0, and here's what > happens: > > $ pip install --user --up pandas > Requirement already up-to-date: pandas in > /home/nbecker/.local/lib/python2.7/site-packages > Requirement already up-to-date: python-dateutil in > /home/nbecker/.local/lib/python2.7/site-packages (from pandas) > Requirement already up-to-date: pytz>=2011k in > /home/nbecker/.local/lib/python2.7/site-packages (from pandas) > Collecting numpy>=1.7.0 (from pandas) > Using cached numpy-1.10.1.tar.gz > Collecting six>=1.5 (from python-dateutil->pandas) > Using cached six-1.10.0-py2.py3-none-any.whl > Installing collected packages: numpy, six > Found existing installation: numpy 1.8.2 > DEPRECATION: Uninstalling a distutils installed project (numpy) has been > deprecated and will be removed in a future version. This is due to the fact > that uninstalling a distutils project will only partially uninstall the > project. > Uninstalling numpy-1.8.2: > Successfully uninstalled numpy-1.8.2 > Running setup.py install for numpy > C-c C-c Rolling back uninstall of numpy > > As you see, it seems to be installing numpy-1.10.1 again, even though it's > already installed > > /home/nbecker/.local/lib/python2.7/site-packages: > total used in directory 4264 available 781585920 > drwx------. 1 nbecker nbecker 15272 Oct 19 14:15 . > drwxrwxr-x. 1 nbecker nbecker 190 Oct 15 10:44 pandas-0.17.0- > py2.7.egg-info > drwxrwxr-x. 1 nbecker nbecker 506 Oct 15 10:44 pandas > -rw-rw-r--. 1 nbecker nbecker 869 Oct 15 10:41 easy-install.pth > -rw-rw-r--. 1 nbecker nbecker 2060 Oct 15 10:37 numpy-1.10.1-py2.7.egg- > info > drwxrwxr-x. 1 nbecker nbecker 572 Oct 15 10:37 numpy > drwxrwxr-x. 1 nbecker nbecker 136 Oct 15 10:32 pytz-2015.6.dist-info > drwxrwxr-x. 1 nbecker nbecker 272 Oct 15 10:32 pytz I'm struck that pip seems to have found a copy of numpy 1.8.2 somewhere, yet we can tell that it's neither in your regular user install directory (or it would be listed), nor is it a fedora-installed version (because if it were then the uninstall would have errored out for lack of write permissions). But in a default setup these are the only two places an install can go. What's going on with your sys.path? Where is this 1.8.2? -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From jturner at gemini.edu Tue Oct 20 20:01:29 2015 From: jturner at gemini.edu (James E.H. Turner) Date: Tue, 20 Oct 2015 21:01:29 -0300 Subject: [SciPy-User] If you use GPL in our neighborhood then you break the chain, ... In-Reply-To: References: Message-ID: <5626D5D9.7090300@gemini.edu> Hi Josef, I feel compelled to comment on this, even as a somewhat peripheral member of the SciPy community -- not because I disagree with your request, but because it's basically spreading misinformation (which it seems has been pointed out by one other person at most). Everyone is allowed to copy and adjust GPL code. The GPL is designed explicitly to ensure that you "benefit back from the changes, improvements, adjustments and extensions" and does a better job of that than BSD does, since it prohibits making proprietary derivatives. The only reason the community can't use GPL code is that it chooses not to make everything GPL. 
You can peruse the licensing details at http://www.gnu.org/licenses/license-list.html or http://opensource.org/licenses/gpl-license. Now I think BSD is a fine licence that has been overwhelmingly endorsed by the scientific software community. I have no intention of discouraging its use. It's nice and simple and I understand that some companies in our community develop proprietary software for their customers while also providing long-standing support for the SciPy stack. My only objection is that to say "GPL breaks the sharing" is simply a FUD-like error of fact and grossly unfair to those behind it. Cheers, James. On 17/10/15 14:54, josef.pktd at gmail.com wrote: > ... and something bad will happen to us. > > > Dear developer, > > One of the great features of development in our neighborhood is that almost all > code is BSD licensed or has a license that is compatible with it. This allows us > to freely use and copy (with attribution) each others code, which makes some > packages a great source of inspiration or provides code snippets for reuse. > > I think this is also a reason why Python became so successful. Not only is the > code readable but everybody is also legally allowed to copy the code and adjust > it to individual special requirements. > > GPL breaks the mutual sharing, because we cannot benefit back from the changes, > improvements, adjustments and extensions. > > If you really want or need to license your code as GPL to protect a specific > application or a GUI, then please consider dual licensing in a BSD compatible > way the backend, the statistical backend as far as I'm concerned. > > > Thank you for your attention. > > Josef > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user From david.mikolas1 at gmail.com Wed Oct 21 01:09:39 2015 From: david.mikolas1 at gmail.com (David Mikolas) Date: Wed, 21 Oct 2015 13:09:39 +0800 Subject: [SciPy-User] access to odeint intermediate results for later interpolation Message-ID: I've read that sometimes one saves the intermediate results from numerical integration so that it can be re-interpolated for other points in time. For example, an efficient integration would use variable step size, and when it detects that the current step has passed one or more of the time points requested, it would use the same coefficients to calculate results at the requested time point instead of recalculating. If this is happening in odeint - where can I read about it. Maybe a "classic" reference to the odepack, or maybe the FORTRAN (though it may not be commented). Is there any possibility this could be accessed within python? -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.mikolas1 at gmail.com Wed Oct 21 01:54:59 2015 From: david.mikolas1 at gmail.com (David Mikolas) Date: Wed, 21 Oct 2015 13:54:59 +0800 Subject: [SciPy-User] access to odeint intermediate results for later interpolation In-Reply-To: References: Message-ID: I have found that the documentation for integrate.ode (the broader range of methods) lists the book HNW93 as a reference for the dopri45 method - I will go read that now for background. On Wed, Oct 21, 2015 at 1:09 PM, David Mikolas wrote: > I've read that sometimes one saves the intermediate results from numerical > integration so that it can be re-interpolated for other points in time. 
> > For example, an efficient integration would use variable step size, and > when it detects that the current step has passed one or more of the time > points requested, it would use the same coefficients to calculate results > at the requested time point instead of recalculating. > > If this is happening in odeint - where can I read about it. Maybe a > "classic" reference to the odepack, or maybe the FORTRAN (though it may not > be commented). > > Is there any possibility this could be accessed within python? > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ctmedra at unizar.es Wed Oct 21 12:23:38 2015 From: ctmedra at unizar.es (Carlos Medrano) Date: Wed, 21 Oct 2015 18:23:38 +0200 Subject: [SciPy-User] fromstring and int64 Message-ID: <5627BC0A.2040100@unizar.es> Hello, I am having an issue with function fromstring and long integers (int64). It seems it cannot read properly long numbers. I use scipy version 0.14.0 and numpy 1.8.2. I have the problem also if I use numpy directly. import scipy s='0 1445367600061 -35960 39671 79230' scipy.fromstring(s, sep=' ', dtype=scipy.int64) I get this array([ 0, 2147483647, -35960, 39671, 79230], dtype=int64) which is wrong in the second element However if I write to a file the string s and use loadtxt, I get the right values: f=open('dumf.txt','w') f.write(s) f.close() scipy.loadtxt('dumf.txt', dtype=scipy.int64) I get: array([0, 1445367600061, -35960, 39671, 79230], dtype=int64) I think this can be a bug a I would like to check on the mailing list first and see if other people have the same behaviour. Regards Carlos Medrano From rob.clewley at gmail.com Thu Oct 22 12:08:22 2015 From: rob.clewley at gmail.com (Rob Clewley) Date: Thu, 22 Oct 2015 12:08:22 -0400 Subject: [SciPy-User] access to odeint intermediate results for later interpolation In-Reply-To: References: Message-ID: > > On Wed, Oct 21, 2015 at 1:09 PM, David Mikolas > wrote: >> >> I've read that sometimes one saves the intermediate results from numerical >> integration so that it can be re-interpolated for other points in time. >> >> For example, an efficient integration would use variable step size, and >> when it detects that the current step has passed one or more of the time >> points requested, it would use the same coefficients to calculate results at >> the requested time point instead of recalculating. I think these features are not in the current scipy codes. However, PyDSTool's interface to Dopri and Radau gives access to those polynomials. It also augments the original codes with a way to force time points from an array. The output is an interpolation object that can use the underlying polynomial directly. An example is at https://github.com/robclewley/pydstool/blob/master/examples/poly_interp_test.py From chris.barker at noaa.gov Thu Oct 22 12:57:59 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 22 Oct 2015 09:57:59 -0700 Subject: [SciPy-User] fromstring and int64 In-Reply-To: <5627BC0A.2040100@unizar.es> References: <5627BC0A.2040100@unizar.es> Message-ID: On Wed, Oct 21, 2015 at 9:23 AM, Carlos Medrano wrote: > I am having an issue with function fromstring and long integers (int64). > It seems it cannot read properly long numbers. I use scipy version 0.14.0 > and numpy 1.8.2. I have the problem also if I use numpy directly. 
> > import scipy
> s='0 1445367600061 -35960 39671 79230'
> scipy.fromstring(s, sep=' ', dtype=scipy.int64)
>
> I get this
>
> array([ 0, 2147483647, -35960, 39671, 79230], dtype=int64)
>

it works for me:

In [1]: %paste
import scipy
s='0 1445367600061 -35960 39671 79230'
scipy.fromstring(s, sep=' ', dtype=scipy.int64)
## -- End pasted text --
Out[1]: array([ 0, 1445367600061, -35960, 39671, 79230])

numpy version 1.9.3

However, I suspect it's a platform thing -- I'm on 64 bit OS-X, which uses 64
bit integers for a long -- 32 bit platforms and Windows64 don't.

scipy.fromstring is numpy.fromstring, and numpy.fromstring is kludgy, ugly and
pretty broken. It also punts the string parsing to the C atoi(), with a bit of
standard python plugged in there. That's why I think it's a C long problem.
Even if you are running on a "proper" 64 bit platform, it may be broken, but I
can guarantee you it will be a pain to fix.

So the short answer is -- don't use it.

scipy.loadtxt('dumf.txt', dtype=scipy.int64)
>

yes, loadtxt is a lot smarter. However, fromstring is a lot faster, so it's too
bad. If you really need very fast reading of numbers from text files, I'd look
at the pandas CSV reader -- I hear it's pretty sweet.

Also -- I have some Cython code that's blazingly fast -- only floats right now,
but it wouldn't be hard to adapt to integers... I can send it to you if you
want.

-CHB

-- Christopher Barker, Ph.D. Oceanographer Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329
fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov
-------------- next part -------------- An HTML attachment was scrubbed...
URL: From david.mikolas1 at gmail.com Sat Oct 24 02:46:53 2015
From: david.mikolas1 at gmail.com (David Mikolas)
Date: Sat, 24 Oct 2015 14:46:53 +0800
Subject: [SciPy-User] access to odeint intermediate results for later interpolation
In-Reply-To: References: Message-ID:

That's great! I like the packaging of the dense output as a Python
interpolation object.

I can see it might take a little time to install and learn. Documentation
at the following site looks very helpful.

http://www.ni.gsu.edu/~rclewley/PyDSTool/FrontPage.html

I wish there was an introductory video (setup and working a problem) -
would be great for squeamish people like myself.

Thanks for the help!
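For the archives, a rough sketch of the "store it once, interpolate later"
workflow using only what is already in released scipy (integrate.ode plus
interpolate.BPoly, so roughly 0.14 and up). This is not the solver's internal
Dopri dense-output polynomial that PyDSTool exposes -- just a C1 Hermite
reconstruction from the values and slopes saved at the times actually
integrated; the right-hand side, tolerances, and grids are made up for the
example:

import numpy as np
from scipy.integrate import ode
from scipy.interpolate import BPoly

def f(t, y):
    # toy scalar right-hand side
    return -0.5 * y + np.sin(t)

t_nodes = np.linspace(0.0, 10.0, 21)   # coarse grid that is actually integrated
solver = ode(f).set_integrator('dopri5', rtol=1e-8, atol=1e-10)
solver.set_initial_value(1.0, t_nodes[0])

ys = [1.0]
slopes = [f(t_nodes[0], 1.0)]
for t in t_nodes[1:]:
    y = solver.integrate(t)[0]         # advance to the next stored node
    ys.append(y)
    slopes.append(f(t, y))

# yi[i] = [value, slope] at t_nodes[i] -> piecewise cubic Hermite interpolant
dense = BPoly.from_derivatives(t_nodes, np.column_stack([ys, slopes]))

t_fine = np.linspace(0.0, 10.0, 500)   # evaluate anywhere later, without re-integrating
y_fine = dense(t_fine)

It is only as accurate as the coarse grid allows, which is exactly why access
to the integrator's own interpolation coefficients (as in the PyDSTool
interface above) is the more attractive option.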
An example is at > > https://github.com/robclewley/pydstool/blob/master/examples/poly_interp_test.py > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhmerchant at gmail.com Sat Oct 24 15:04:41 2015 From: bhmerchant at gmail.com (Brian Merchant) Date: Sat, 24 Oct 2015 12:04:41 -0700 Subject: [SciPy-User] access to odeint intermediate results for later interpolation In-Reply-To: References: Message-ID: Hi David, I have used PyDSTool before and can highly recommend it. I too was worried about how it would be difficult to get started, so only poked around for a bit, before actually beginning to use it for school assignments (where I used it to handily beat XPPAUT, except when it came to automatic bifurcation continuation). As you mention, it's scary to start off because there isn't super-complete, hand-holding, documentation just yet. However, once I dove in, I found three things were very helpful: 1) Rob Clewley (the developer's) immense (seriously, not an exaggeration) kindness when it comes to answering questions/making things clearer, 2) the Sourceforge user group's activity to similarly help (I believe Rob participates there as well), 3) the current examples on the documentation page and the many undocumented examples within the package itself (I forget the exact path, since it has been a while since I used it) I remember that the following documentation example got me up to speed really quickly in terms of the very basics: http://www.ni.gsu.edu/~rclewley/PyDSTool/Tutorial/Tutorial_Calcium.html I think this one is kind of new (so I am not sure if I have read it), but it seems to be very helpful too: http://www.ni.gsu.edu/~rclewley/PyDSTool/Tutorial/Tutorial_linear.html Overall, I'd say part of the difficulty of using PyDSTool is the difficulty of the subject matter (dynamical systems, numerical integration) itself. Numerical integration is "easy" as long as you are using a black box, and then suddenly and unexpectedly seems to become "difficult" once the black boxes encounter something that isn't working so well. Dynamical systems is a new field (its only within the last 30-40 years that serious progress has been made since some initial needling in the age-before-computers by folks like Poincare), and some of the most cutting edge tools (e.g. XPPAUT) are horrendously un-user-friendly. So, sticking through the initial "oh god, I wish I could just download all this info into my brain" is important, and you'll soon find yourself being able to ask the right questions (at the very least). Hopefully, you can also help develop some documentation if you have the time :) Kind regards, Brian On Fri, Oct 23, 2015 at 11:46 PM, David Mikolas wrote: > That's great! I like the packaging of the dense output as a Python > interpolation object. > > I can see it might take a little time to install and learn. Documentation > at the following site looks very helpful. > > http://www.ni.gsu.edu/~rclewley/PyDSTool/FrontPage.html > > I wish there was an introductory video (setup and working a problem) - > would be great for squeamish people like myself. > > Thanks for the help! 
> > Just as reference, I've also found some discussion of possible > implementation of dense output in SciPy itself here: > > http://comments.gmane.org/gmane.comp.python.scientific.devel/19635 > https://mail.scipy.org/pipermail/scipy-user/2015-October/thread.html > > > On Fri, Oct 23, 2015 at 12:08 AM, Rob Clewley > wrote: > >> > >> > On Wed, Oct 21, 2015 at 1:09 PM, David Mikolas < >> david.mikolas1 at gmail.com> >> > wrote: >> >> >> >> I've read that sometimes one saves the intermediate results from >> numerical >> >> integration so that it can be re-interpolated for other points in time. >> >> >> >> For example, an efficient integration would use variable step size, and >> >> when it detects that the current step has passed one or more of the >> time >> >> points requested, it would use the same coefficients to calculate >> results at >> >> the requested time point instead of recalculating. >> >> I think these features are not in the current scipy codes. However, >> PyDSTool's interface to Dopri and Radau gives access to those >> polynomials. It also augments the original codes with a way to force >> time points from an array. The output is an interpolation object that >> can use the underlying polynomial directly. An example is at >> >> https://github.com/robclewley/pydstool/blob/master/examples/poly_interp_test.py >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at scipy.org >> https://mail.scipy.org/mailman/listinfo/scipy-user >> > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > https://mail.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Oct 25 06:25:26 2015 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 25 Oct 2015 11:25:26 +0100 Subject: [SciPy-User] ANN: Scipy 0.16.1 release Message-ID: Hi all, I'm happy to announce the availability of the Scipy 0.16.1 release. This is a bugfix only release; it contains no new features compared to 0.16.0. The sources and binary installers can be found at: - Source tarballs: at https://github.com/scipy/scipy/releases and on PyPi. - OS X: there are wheels on PyPi, so simply install with pip. - Windows: .exe installers can be found on https://github.com/scipy/scipy/releases Cheers, Ralf ========================== SciPy 0.16.1 Release Notes ========================== SciPy 0.16.1 is a bug-fix release with no new features compared to 0.16.0. Issues closed for 0.16.1 ------------------------ - `#5077 `__: cKDTree not indexing properly for arrays with too many elements - `#5127 `__: Regression in 0.16.0: solve_banded errors out in patsy test suite - `#5149 `__: linalg tests apparently cause python to crash with numpy 1.10.0b1 - `#5154 `__: 0.16.0 fails to build on OS X; can't find Python.h - `#5173 `__: failing stats.histogram test with numpy 1.10 - `#5191 `__: Scipy 0.16.x - TypeError: _asarray_validated() got an unexpected... - `#5195 `__: tarballs missing documentation source - `#5363 `__: FAIL: test_orthogonal.test_j_roots, test_orthogonal.test_js_roots Pull requests for 0.16.1 ------------------------ - `#5088 `__: BUG: fix logic error in cKDTree.sparse_distance_matrix - `#5089 `__: BUG: Don't overwrite b in lfilter's FIR path - `#5128 `__: BUG: solve_banded failed when solving 1x1 systems - `#5155 `__: BLD: fix missing Python include for Homebrew builds. 
- `#5192 `__: BUG: backport as_inexact kwarg to _asarray_validated - `#5203 `__: BUG: fix uninitialized use in lartg 0.16 backport - `#5204 `__: BUG: properly return error to fortran from ode_jacobian_function... - `#5207 `__: TST: Fix TestCtypesQuad failure on Python 3.5 for Windows - `#5352 `__: TST: sparse: silence warnings about boolean indexing - `#5355 `__: MAINT: backports for 0.16.1 release - `#5356 `__: REL: update Paver file to ensure sdist contents are OK for releases. - `#5382 `__: 0.16.x backport: MAINT: work around a possible numpy ufunc loop... - `#5393 `__: TST:special: bump tolerance levels for test_j_roots and test_js_roots - `#5417