From hturesson at gmail.com Fri Aug 1 14:49:06 2014 From: hturesson at gmail.com (Hjalmar Turesson) Date: Fri, 1 Aug 2014 20:49:06 +0200 Subject: [SciPy-User] Spatio-temporal interpolation In-Reply-To: References: Message-ID: Hi Abrham, You will need to get your excel data over to a numpy array. To do this, you can save the excel sheets as .csv (comma separated values) and then read in these files to Python using genfromtxt() ( http://docs.scipy.org/doc/numpy/reference/generated/numpy.genfromtxt.html). Make sure that you are using consistent delimiters (e.g. commas). Since your data are in 50 different sheets you will probably need to concatenate them into one 4-d numpy array containing all data or three arrays with date, location and values separated. I found a simple and clear example of 2-d Kriging interpolation using Python here: http://connor-johnson.com/2014/03/20/simple-kriging-in-python/ It seems to be possible to do this using sklearn too. Possibly 3-d. http://scikit-learn.org/stable/modules/generated/sklearn.gaussian_process.GaussianProcess.html Yet, another example: http://girs.googlecode.com/svn/trunk/maths/kriging/kriging.py Good luck, Hjalmar On Wed, Jul 30, 2014 at 12:31 AM, Abrham Melesse wrote: > Dear users, > I am new to python and trying to work hard in to it. > I am trying to interpolate daily station climate data to the defined grid > points. I have more than 20 years of daily climate data for 50 stations > that I want to interpolate to the daily grid point values. I could handle > how to interpolate one spatial data in to grid points. My data is very > bulky. > > I organize my station data in one excel file with 50 sheets, each sheet > is daily climate data for the station. The data in each sheet has 4 columns > ( date, latitude, longitude, value/data) > > My grid points are given in another excel file that has 3 columns (Id, > latitude, longitude) > > > I want the interpolation output be one excel/text file in the form as > below. I prefer kriging interpolation if someone of you already have the > code. > > > date value1 value2 ?.... valueN > > > Here Value 1 ...valueN are grid point interpolated values for each grid > point Id from 1 to N. > > Thank you, > > Abraham > > -- > Abrham Melesse > Arba Minch University > Water Resource and Irrigation Engineering Department > PO Box 2552 > Cell Phone +251 912491406 > Arba Minch > Ethiopia > > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zseder.hlt at gmail.com Tue Aug 5 04:56:45 2014 From: zseder.hlt at gmail.com (Attila Zseder) Date: Tue, 5 Aug 2014 10:56:45 +0200 Subject: [SciPy-User] parallelization of cdist with multiprocessing Message-ID: Hi, I would like to run scipy.spatial.distance.cdist in parallel, and I created a stackoverflow question about this. http://stackoverflow.com/questions/25119316/scipy-parallel-cdist-with-multiprocessing Do you have any suggestions? Thank you! Attila -------------- next part -------------- An HTML attachment was scrubbed... 
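One common recipe for this (a sketch, not taken from the linked StackOverflow thread) is to split the rows of the first input array across a multiprocessing pool, compute cdist for each chunk against the full second array, and stack the partial results. The function and parameter names below (parallel_cdist, n_procs) are illustrative only:

import numpy as np
from multiprocessing import Pool
from scipy.spatial.distance import cdist

def _cdist_chunk(args):
    # Worker: distances between one row-chunk of XA and all of XB.
    chunk, XB, metric = args
    return cdist(chunk, XB, metric=metric)

def parallel_cdist(XA, XB, metric='euclidean', n_procs=4):
    # Split XA row-wise, map the chunks over a process pool, then stack.
    chunks = np.array_split(XA, n_procs)
    pool = Pool(n_procs)
    try:
        parts = pool.map(_cdist_chunk, [(c, XB, metric) for c in chunks])
    finally:
        pool.close()
        pool.join()
    return np.vstack(parts)

if __name__ == '__main__':
    XA = np.random.rand(2000, 10)
    XB = np.random.rand(500, 10)
    D = parallel_cdist(XA, XB)
    print(D.shape)  # (2000, 500)

Note that every worker receives its own pickled copy of XB, so for very large arrays the pickling overhead can eat up the speed-up; shared memory, or chunked processing that avoids copying, may then be preferable.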
URL: 
From david_baddeley at yahoo.com.au Wed Aug 6 09:51:23 2014
From: david_baddeley at yahoo.com.au (David Baddeley)
Date: Wed, 6 Aug 2014 06:51:23 -0700
Subject: [SciPy-User] matlab "uigetfile" analog for ipython
In-Reply-To: 
References: 
Message-ID: <1407333083.70662.YahooMailNeo@web163903.mail.gq1.yahoo.com>

It will depend on which GUI framework you are using. Hjalmar and Antonino have given answers for tk and Qt respectively. Under wx you'd use wx.FileDialog, e.g. (copied and pasted from a file I had open, hence the slightly irrelevant message and wildcard):

import wx
fdialog = wx.FileDialog(None, 'Open 3D scene ...', wildcard='*.stl', style=wx.OPEN)
succ = fdialog.ShowModal()
if (succ == wx.ID_OK):
    fname = fdialog.GetPath().encode()

If you're using an IPython notebook in a browser (which it sounds like you might be), you're also going to need to tell IPython to start your GUI backend of choice using a gui 'magic', e.g.: %gui wx

cheers,
David

On Tuesday, 15 July 2014 6:22 AM, Antonino Ingargiola wrote:

Hi,

I usually define a function that uses Qt (either PySide or PyQt) to open a dialog:

try:
    from PySide import QtCore, QtGui
except ImportError:
    from PyQt4 import QtCore, QtGui

def gui_fname(dir=None):
    """Select a file via a dialog and returns the file name."""
    if dir is None: dir = './'
    fname = QtGui.QFileDialog.getOpenFileName(None, "Select data file...",
                                              dir, filter="All files (*)")
    return fname[0]

Antonino

On Sun, Jul 13, 2014 at 4:27 AM, Hjalmar Turesson wrote:
> Hi,
>
> I used tkFileDialog for that type of thing.
>
> Hjalmar
>
> On Fri, Jul 11, 2014 at 9:39 AM, Antonelli Maria Rosaria wrote:
>> Hi,
>>
>> Is there any way to interactively select the folder/directory where the data that I want to analyse in my notebook are stored?
>> The Matlab command would be 'uigetfile'. This opens a window and I select the file I want from it.
>>
>> Thank you very much.
>> Best,
>> Rosa
>> _______________________________________________
>> SciPy-User mailing list
>> SciPy-User at scipy.org
>> http://mail.scipy.org/mailman/listinfo/scipy-user
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user

_______________________________________________
SciPy-User mailing list
SciPy-User at scipy.org
http://mail.scipy.org/mailman/listinfo/scipy-user

From camillechambon at yahoo.fr Thu Aug 7 04:32:56 2014
From: camillechambon at yahoo.fr (Camille)
Date: Thu, 7 Aug 2014 08:32:56 +0000 (UTC)
Subject: [SciPy-User] integrate.ode sets t0 values outside of my data range
References: <1406642365.28550.YahooMailNeo@web28904.mail.ir2.yahoo.com>
Message-ID: 

Oleksandr Huziy gmail.com> writes:
> Salut Camille:
>
> Just don't ask for the solution at the last point, since in order to
> calculate a value of the solution at that point it has to go a bit
> beyond (y should be defined on both sides of ti for the derivative
> to exist at ti (this is a necessary condition)), but interpolation
> does not really define values beyond the right limit.
>
> Here I've modified your example:
> http://nbviewer.ipython.org/github/guziy/PyNotebooks/blob/master/ode_demo.ipynb
>
> Cheers

Hello Oleksandr,

Thanks for your answer. But I do need the solution at the last point. Furthermore, in your modifications you redefine data(t) as: [1, 2, 3, 4]=data([0.,1.33333333,2.66666667,4.], which is not the same as: [1, 2, 3, 4]=data([0.,1.,2.,3.].
I need data(t) to remain as: [1, 2, 3, 4]=data([0.,1.,2.,3.]. Cheers, Camille From camillechambon at yahoo.fr Thu Aug 7 05:05:19 2014 From: camillechambon at yahoo.fr (Camille) Date: Thu, 7 Aug 2014 09:05:19 +0000 (UTC) Subject: [SciPy-User] =?utf-8?q?integrate=2Eode_sets_t0_values_outside_of_?= =?utf-8?q?my_data=09range?= References: <1406642365.28550.YahooMailNeo@web28904.mail.ir2.yahoo.com> Message-ID: Warren Weckesser gmail.com> writes: > > Camille, > > I added an answer to your question on StackOverflow: http://stackoverflow.com/questions/25031966/integrate-ode-sets-t0-values-outside-of-my-data-range/ > > Summary: > > * It is normal for the ODE solver to evaluate your function at points beyond the last requested time value. > > * One way to avoid the interpolation problem is to extend the interpolation data linearly, using the last two data point. > > Warren? > > Hello Warren, Thanks for your answer. Actually, data(t) is a sinusoidal function. Thus I can not extend the interpolation data linearly. I edited my question on StackOverflow accordingly. Cheers, Camille From davidmenhur at gmail.com Thu Aug 7 07:13:25 2014 From: davidmenhur at gmail.com (=?UTF-8?B?RGHPgGlk?=) Date: Thu, 7 Aug 2014 13:13:25 +0200 Subject: [SciPy-User] integrate.ode sets t0 values outside of my data range In-Reply-To: References: <1406642365.28550.YahooMailNeo@web28904.mail.ir2.yahoo.com> Message-ID: On 7 August 2014 11:05, Camille wrote: > Thanks for your answer. Actually, data(t) is a sinusoidal function. Thus I > can not extend the interpolation data linearly. I edited my question on > StackOverflow accordingly. Note that you are evaluating it very close to the boundary, so the interpolation effects will not be so bad. You can check the sensibility comparing the results with a purposefully "wrong" interpolation, like the same linear interpolation but with the opposite slope; but I bet the differences are going to be slim. If you know your data is sinusoidal, you can use that to make an even better estimation of the next value. Essentially, you need to provide the ODE solver with a way to estimate the derivatives of your function at any point. /David -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at gmail.com Thu Aug 7 09:23:43 2014 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Thu, 7 Aug 2014 09:23:43 -0400 Subject: [SciPy-User] integrate.ode sets t0 values outside of my data range In-Reply-To: References: <1406642365.28550.YahooMailNeo@web28904.mail.ir2.yahoo.com> Message-ID: On 8/7/14, Da?id wrote: > On 7 August 2014 11:05, Camille wrote: > >> Thanks for your answer. Actually, data(t) is a sinusoidal function. Thus >> I >> can not extend the interpolation data linearly. I edited my question on >> StackOverflow accordingly. > > > Note that you are evaluating it very close to the boundary, so the > interpolation effects will not be so bad. You can check the sensibility > comparing the results with a purposefully "wrong" interpolation, like the > same linear interpolation but with the opposite slope; but I bet the > differences are going to be slim. > > If you know your data is sinusoidal, you can use that to make an even > better estimation of the next value. Essentially, you need to provide the > ODE solver with a way to estimate the derivatives of your function at any > point. > > > /David > I agree with David. Use whatever extrapolation method is appropriate for your function. 
The main point is to expect your function to be evaluated a little bit beyond the final time requested in odeint. I updated my answer on StackOverflow with a suggestion to use either the "dopri5" or "dop853" solver of the scipy.integrate.ode class. It appears that these solvers do not evaluate your function at times beyond the requested time. Check out the SO answer for the sample code: http://stackoverflow.com/questions/25031966/integrate-ode-sets-t0-values-outside-of-my-data-range Warren From camillechambon at yahoo.fr Fri Aug 8 04:32:07 2014 From: camillechambon at yahoo.fr (CamilleC) Date: Fri, 8 Aug 2014 01:32:07 -0700 (PDT) Subject: [SciPy-User] integrate.ode sets t0 values outside of my data range In-Reply-To: References: <1406642365.28550.YahooMailNeo@web28904.mail.ir2.yahoo.com> Message-ID: <1407486727716-19667.post@n7.nabble.com> Thanks Warren and David. The function to be integrated might not be sinusoidal all the time and it's not possible to find an extrapolation method which is valid all the time. Thus I will use the "dopri5" or "dop853" solver of the scipy.integrate.ode class. Thanks again. Camille -- View this message in context: http://scipy-user.10969.n7.nabble.com/SciPy-User-integrate-ode-sets-t0-values-outside-of-my-data-range-tp19651p19667.html Sent from the Scipy-User mailing list archive at Nabble.com. From matthew.brett at gmail.com Sat Aug 9 20:23:54 2014 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 9 Aug 2014 17:23:54 -0700 Subject: [SciPy-User] [Numpy-discussion] ANN: NumPy 1.8.2 bugfix release In-Reply-To: <53E6162A.8050809@googlemail.com> References: <53E6162A.8050809@googlemail.com> Message-ID: On Sat, Aug 9, 2014 at 5:38 AM, Julian Taylor wrote: > Hello, > > I am pleased to announce the release of NumPy 1.8.2, a > pure bugfix release for the 1.8.x series. > https://sourceforge.net/projects/numpy/files/NumPy/1.8.2/ > The upgrade is recommended for all users of the 1.8.x series. > > Following issues have been fixed: > * gh-4836: partition produces wrong results for multiple selections in > equal ranges > * gh-4656: Make fftpack._raw_fft threadsafe > * gh-4628: incorrect argument order to _copyto in in np.nanmax, np.nanmin > * gh-4642: Hold GIL for converting dtypes types with fields > * gh-4733: fix np.linalg.svd(b, compute_uv=False) > * gh-4853: avoid unaligned simd load on reductions on i386 > * gh-4722: Fix seg fault converting empty string to object > * gh-4613: Fix lack of NULL check in array_richcompare > * gh-4774: avoid unaligned access for strided byteswap > * gh-650: Prevent division by zero when creating arrays from some buffers > * gh-4602: ifort has issues with optimization flag O2, use O1 > > > The source distributions have been uploaded to PyPI. 
The Windows > installers, documentation and release notes can be found at: > https://sourceforge.net/projects/numpy/files/NumPy/1.8.2/ OSX wheels now also up on pypi, please let us know of any problems, Cheers, Matthew From rgrant at enthought.com Mon Aug 11 18:20:16 2014 From: rgrant at enthought.com (Robert Grant) Date: Mon, 11 Aug 2014 17:20:16 -0500 Subject: [SciPy-User] ANN: DistArray 0.5 Release Message-ID: ============================================================================= DistArray 0.5 release ============================================================================= **Mailing List:** distarray at googlegroups.com **Documentation:** http://distarray.readthedocs.org **License:** Three-clause BSD **Python versions:** 2.7, 3.3, and 3.4 **OS support:** \*nix and Mac OS X What is DistArray? ------------------ DistArray aims to bring the ease-of-use of NumPy to data-parallel high-performance computing. It provides distributed multi-dimensional NumPy arrays, distributed ufuncs, and distributed IO capabilities. It can efficiently interoperate with external distributed libraries like Trilinos. DistArray works with NumPy and builds on top of it in a flexible and natural way. 0.5 Release ----------- Noteworthy improvements in this release include: * closer alignment with NumPy's API, * support for Python 3.4 (existing support for Python 2.7 and 3.3), * a performance-oriented MPI-only mode for deployment on clusters and supercomputers, * a way to register user-defined functions to be callable locally on worker processes, * more consistent naming of sub-packages, * testing with MPICH2 (already tested against OpenMPI), * improved and expanded examples, * installed version testable via ``distarray.test()``, and * performance and scaling improvements. With this release, DistArray ready for real-world testing and deployment. The project is still evolving rapidly and we appreciate the continued input from the larger scientific-Python community. Existing features ----------------- DistArray: * supports NumPy-like slicing, reductions, and ufuncs on distributed multidimensional arrays; * has a client-engine process design -- data resides on the worker processes, commands are initiated from master; * allows full control over what is executed on the worker processes and integrates transparently with the master process; * allows direct communication between workers, bypassing the master process for scalability; * integrates with IPython.parallel for interactive creation and exploration of distributed data; * supports distributed ufuncs (currently without broadcasting); * builds on and leverages MPI via MPI4Py in a transparent and user-friendly way; * has basic support for unstructured arrays; * supports user-controllable array distributions across workers (block, cyclic, block-cyclic, and unstructured) on a per-axis basis; * has a straightforward API to control how an array is distributed; * has basic plotting support for visualization of array distributions; * separates the array?s distribution from the array?s data -- useful for slicing, reductions, redistribution, broadcasting, and other operations; * implements distributed random arrays; * supports ``.npy``-like flat-file IO and hdf5 parallel IO (via ``h5py``); leverages MPI-based IO parallelism in an easy-to-use and transparent way; and * supports the distributed array protocol [protocol]_, which allows independently developed parallel libraries to share distributed arrays without copying, analogous to the PEP-3118 new buffer protocol. 
Planned features and roadmap ---------------------------- Near-term features and improvements include: * array re-distribution capabilities; * lazy evaluation and deferred computation for latency hiding; * interoperation with Trilinos [Trilinos]_; and * distributed broadcasting support. The longer-term roadmap includes: * Integration with other packages [petsc]_ that subscribe to the distributed array protocol [protocol]_; * Distributed fancy indexing; * Out-of-core computations; * Support for distributed sorting and other non-trivial distributed algorithms; and * End-user control over communication and temporary array creation, and other performance aspects of distributed computations. History and funding ------------------- Brian Granger started DistArray as a NASA-funded SBIR project in 2008. Enthought picked it up as part of a DOE Phase II SBIR [SBIR]_ to provide a generally useful distributed array package. It builds on NumPy, MPI, MPI4Py, IPython, IPython.parallel, and interfaces with the Trilinos suite of distributed HPC solvers (via PyTrilinos [Trilinos]_). This material is based upon work supported by the Department of Energy under Award Number DE-SC0007699. This report was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof. .. [protocol] http://distributed-array-protocol.readthedocs.org/en/rel-0.10.0/ .. [Trilinos] http://trilinos.org/ .. [petsc] http://www.mcs.anl.gov/petsc/ .. [SBIR] http://www.sbir.gov/sbirsearch/detail/410257 From ralf.gommers at gmail.com Mon Aug 18 18:20:22 2014 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 19 Aug 2014 00:20:22 +0200 Subject: [SciPy-User] sprint @ EuroSciPy, Aug 31 Message-ID: Hi all, Here is a reminder that on Sunday 31 August, there will be a Scipy sprint at EuroSciPy (in Cambridge, UK). Details can be found at https://www.euroscipy.org/2014/program/sprints/ Newcomers to Scipy development are very welcome; actually one of the main goals of the sprint is to help new people to get started. Last year's sprint was excellent - 20 people joined and we still have all-time highs in the commits per month and contributors per month graph to show for it: https://www.openhub.net/p/scipy If you have time and will be at EuroSciPy: please join! Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 
From pav at iki.fi Sun Aug 17 14:23:49 2014
From: pav at iki.fi (Pauli Virtanen)
Date: Sun, 17 Aug 2014 18:23:49 +0000 (UTC)
Subject: [SciPy-User] how to share weave.inline generated files across machines?
References: <5CC618D74A5E2747BD49AEB4F19B85AE1A6AECC0@HKW20030007.gbl.ad.hedani.net>
Message-ID: 

Jadhav, Alok credit-suisse.com> writes:
> According to the scipy.weave.inline documentation, weave
> stores the catalogue of already compiled code strings in
> an on-disk cache. What happens if the location of the weave-generated
> pyd files is common between 2 machines (on a network drive)?
> Will the catalogue of this data be shared between the 2 machines? To
> be more precise, if 1 machine generates a pyd file and we
> copy this pyd file to a different machine and run the same weave.inline
> function, will the other machine try to recompile the code?

Weave caches compiled code on disk, using cache keys derived from the code string to be compiled. If compiled code is found in the cache, no recompilation is done. With network file systems, I'm not sure how carefully locking is implemented in Weave, so race conditions may exist if two processes try to write the same code simultaneously.

-- 
Pauli Virtanen

From jaakko.luttinen at aalto.fi Thu Aug 21 07:58:30 2014
From: jaakko.luttinen at aalto.fi (Jaakko Luttinen)
Date: Thu, 21 Aug 2014 14:58:30 +0300
Subject: [SciPy-User] ANN: BayesPy 0.2
Message-ID: <53F5DEE6.5050204@aalto.fi>

Dear all,

I am pleased to announce the release of BayesPy version 0.2. BayesPy provides tools for Bayesian inference in Python. In particular, it implements the variational message passing framework, which enables a modular and efficient way to construct models and perform approximate posterior inference.

Download: https://pypi.python.org/pypi/bayespy/
Documentation: http://www.bayespy.org
Repository: https://github.com/bayespy/bayespy

Comments, feedback and contributions are welcome.

Best regards,
Jaakko

From parrenin at ujf-grenoble.fr Fri Aug 22 05:50:26 2014
From: parrenin at ujf-grenoble.fr (Frédéric Parrenin)
Date: Fri, 22 Aug 2014 11:50:26 +0200
Subject: [SciPy-User] time-frequency diagram with wavelet transform
Message-ID: 

Dear all,

I have an irregularly sampled time series and I would like to plot its time-frequency diagram using a wavelet transform. (I am relatively new to wavelets, especially in Python.) Is there any example of how to do this? (I could not find any clear example on the internet.)

Thanks in advance, best regards,
Frédéric

-------------- next part --------------
An HTML attachment was scrubbed...
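For reference, one simple recipe for an irregularly sampled series is to resample it onto a uniform grid first and then compute a continuous wavelet transform, e.g. with scipy.signal.cwt. The sketch below uses synthetic data and a Ricker wavelet purely for illustration; if the sampling is very uneven, the interpolation step can distort the spectrum, and a dedicated wavelet package (e.g. PyWavelets) or a least-squares spectral method may be a better fit.

import numpy as np
import matplotlib.pyplot as plt
from scipy import signal

# Stand-in for the real irregularly sampled series (t, y).
t = np.sort(np.random.uniform(0, 100, 300))
y = np.sin(2 * np.pi * t / 10.0) + 0.3 * np.random.randn(t.size)

# 1) Resample onto a uniform time grid (plain linear interpolation).
t_u = np.linspace(t.min(), t.max(), 1024)
y_u = np.interp(t_u, t, y)

# 2) Continuous wavelet transform; rows correspond to the widths (scales).
widths = np.arange(1, 64)
coeffs = signal.cwt(y_u, signal.ricker, widths)

# 3) Time-scale diagram (larger widths correspond to lower frequencies).
plt.imshow(np.abs(coeffs), aspect='auto', origin='lower',
           extent=[t_u[0], t_u[-1], widths[0], widths[-1]])
plt.xlabel('time')
plt.ylabel('wavelet width (scale)')
plt.colorbar()
plt.show()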
URL: From mailinglists at xgm.de Wed Aug 27 10:23:30 2014 From: mailinglists at xgm.de (Florian Lindner) Date: Wed, 27 Aug 2014 16:23:30 +0200 Subject: [SciPy-User] loadmat: Unknown mat file type, version 49 Message-ID: Hello, when trying to read a matlab file, writen bei petSC, I get this error: Traceback (most recent call last): File "petsc_plot.py", line 9, in rbf_results = scipy.io.loadmat("rbf_results") File "/usr/lib/python3.4/site-packages/scipy/io/matlab/mio.py", line 131, in loadmat MR = mat_reader_factory(file_name, appendmat, **kwargs) File "/usr/lib/python3.4/site-packages/scipy/io/matlab/mio.py", line 55, in mat_reader_factory mjv, mnv = get_matfile_version(byte_stream) File "/usr/lib/python3.4/site-packages/scipy/io/matlab/miobase.py", line 237, in get_matfile_version % ret) ValueError: Unknown mat file type, version 49, 49 The file contains a matrix and a vector and looks like that: %Mat Object:Nodes 1 MPI processes % type: seqdense % Size = 10 2 Nodes = zeros(10,2); Nodes = [ 7.2003197397953400e-01 4.0114727059134836e-01 6.1793966542126100e-02 5.2764865382548720e-01 1.0022337819588500e-02 1.4405427480322786e-01 1.4463931936456476e-01 9.9650445216117589e-01 3.9777780919128602e-01 1.0677308875937896e-01 7.3036588248200474e-02 9.8905332488367748e-01 1.0386628927366459e-01 7.3131275919223881e-01 2.5078039364333193e-01 3.0846024921541471e-01 9.8405227390177075e-01 1.8055644102282642e-01 7.3851607000756303e-01 9.8604536855510361e-01 ]; %Vec Object:Coefficients 1 MPI processes % type: seq Coefficients = [ 5.0245251254608746e+04 2.3659864438721524e+04 -1.4592196188522011e+05 1.4189477757354762e+04 7.8494387129370181e+04 2.5176418131682707e+05 -4.3384330330706594e+04 -1.7567379893104360e+05 6.5349663489607065e+02 -5.4052285933927575e+04 ]; The petsc C code: PetscViewer viewer; PetscViewerASCIIOpen(PETSC_COMM_WORLD, "rbf_results", &viewer); PetscViewerSetFormat(viewer, PETSC_VIEWER_ASCII_MATLAB); MatView(Nodes, viewer); VecView(c, viewer); PetscViewerDestroy(&viewer); Any idea what could be the problem here? System is ArchLinux, SciPy 0.14, NumPy 1.8.2, Python 3.4.1 Thanks, Florian From dphinnstuart at gmail.com Wed Aug 27 11:08:43 2014 From: dphinnstuart at gmail.com (phinn stuart) Date: Thu, 28 Aug 2014 00:08:43 +0900 Subject: [SciPy-User] Convert 3d NumPy array into 2d Message-ID: Hi everyone, how can I convert (1L, 480L, 1440L) shaped numpy array into (480L, 1440L)? Thanks in the advance. phinn -------------- next part -------------- An HTML attachment was scrubbed... URL: From maximilian.albert at gmail.com Wed Aug 27 12:18:17 2014 From: maximilian.albert at gmail.com (Maximilian Albert) Date: Wed, 27 Aug 2014 17:18:17 +0100 Subject: [SciPy-User] Convert 3d NumPy array into 2d In-Reply-To: References: Message-ID: [source] 2014-08-27 16:08 GMT+01:00 phinn stuart : > Hi everyone, how can I convert (1L, 480L, 1440L) shaped numpy array into > (480L, 1440L)? > If 'a' is your array then 'a.squeeze()' or 'a.squeeze(axis=0)' will do the job (see the docs at [1]). You can also say a.reshape(480, 1440) but obviously this is only useful if you know the exact dimensions of your array. Cheers, Max [1] http://docs.scipy.org/doc/numpy/reference/generated/numpy.squeeze.html -------------- next part -------------- An HTML attachment was scrubbed... 
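For completeness, a short self-contained illustration of the options above (the zero-filled array is a stand-in for the real data):

import numpy as np

a = np.zeros((1, 480, 1440))   # same shape as the (1L, 480L, 1440L) array
b = a.squeeze(axis=0)          # drop only the leading length-1 axis
print(b.shape)                 # (480, 1440)

# Equivalent alternatives:
c = a[0]                       # index away the single leading slice (a view)
d = a.reshape(480, 1440)       # explicit reshape; needs the exact dimensions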
URL: From matthew.brett at gmail.com Wed Aug 27 15:38:11 2014 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 27 Aug 2014 15:38:11 -0400 Subject: [SciPy-User] loadmat: Unknown mat file type, version 49 In-Reply-To: References: Message-ID: Hi, On Wed, Aug 27, 2014 at 10:23 AM, Florian Lindner wrote: > Hello, > > when trying to read a matlab file, writen bei petSC, I get this error: > > Traceback (most recent call last): > File "petsc_plot.py", line 9, in > rbf_results = scipy.io.loadmat("rbf_results") > File "/usr/lib/python3.4/site-packages/scipy/io/matlab/mio.py", line 131, > in loadmat > MR = mat_reader_factory(file_name, appendmat, **kwargs) > File "/usr/lib/python3.4/site-packages/scipy/io/matlab/mio.py", line 55, > in mat_reader_factory > mjv, mnv = get_matfile_version(byte_stream) > File "/usr/lib/python3.4/site-packages/scipy/io/matlab/miobase.py", line > 237, in get_matfile_version > % ret) > ValueError: Unknown mat file type, version 49, 49 > > > The file contains a matrix and a vector and looks like that: > > %Mat Object:Nodes 1 MPI processes > % type: seqdense > % Size = 10 2 > Nodes = zeros(10,2); > Nodes = [ > 7.2003197397953400e-01 4.0114727059134836e-01 > 6.1793966542126100e-02 5.2764865382548720e-01 > 1.0022337819588500e-02 1.4405427480322786e-01 > 1.4463931936456476e-01 9.9650445216117589e-01 > 3.9777780919128602e-01 1.0677308875937896e-01 > 7.3036588248200474e-02 9.8905332488367748e-01 > 1.0386628927366459e-01 7.3131275919223881e-01 > 2.5078039364333193e-01 3.0846024921541471e-01 > 9.8405227390177075e-01 1.8055644102282642e-01 > 7.3851607000756303e-01 9.8604536855510361e-01 > ]; > %Vec Object:Coefficients 1 MPI processes > % type: seq > Coefficients = [ > 5.0245251254608746e+04 > 2.3659864438721524e+04 > -1.4592196188522011e+05 > 1.4189477757354762e+04 > 7.8494387129370181e+04 > 2.5176418131682707e+05 > -4.3384330330706594e+04 > -1.7567379893104360e+05 > 6.5349663489607065e+02 > -5.4052285933927575e+04 > ]; > > > The petsc C code: > > PetscViewer viewer; > PetscViewerASCIIOpen(PETSC_COMM_WORLD, "rbf_results", &viewer); > PetscViewerSetFormat(viewer, PETSC_VIEWER_ASCII_MATLAB); > MatView(Nodes, viewer); > VecView(c, viewer); > PetscViewerDestroy(&viewer); > > Any idea what could be the problem here? > > System is ArchLinux, SciPy 0.14, NumPy 1.8.2, Python 3.4.1 Sorry - I wasn't sure from what you said whether this is actually a matlab .mat file in matlab 4 / 5 / 6 binary format? Cheers, matthew From mailinglists at xgm.de Wed Aug 27 16:33:24 2014 From: mailinglists at xgm.de (Florian Lindner) Date: Wed, 27 Aug 2014 22:33:24 +0200 Subject: [SciPy-User] loadmat: Unknown mat file type, version 49 References: Message-ID: Matthew Brett wrote: > Hi, > > On Wed, Aug 27, 2014 at 10:23 AM, Florian Lindner > wrote: >> Hello, >> The petsc C code: >> >> PetscViewer viewer; >> PetscViewerASCIIOpen(PETSC_COMM_WORLD, "rbf_results", &viewer); >> PetscViewerSetFormat(viewer, PETSC_VIEWER_ASCII_MATLAB); >> MatView(Nodes, viewer); >> VecView(c, viewer); >> PetscViewerDestroy(&viewer); >> >> Any idea what could be the problem here? >> >> System is ArchLinux, SciPy 0.14, NumPy 1.8.2, Python 3.4.1 > > Sorry - I wasn't sure from what you said whether this is actually a > matlab .mat file in matlab 4 / 5 / 6 binary format? Actually I got no idea. petSC gives no information about the specific version. I've pasted the relevant file how it's cat'ted into my console. 
Thx, Florian From pav at iki.fi Wed Aug 27 17:16:46 2014 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 28 Aug 2014 00:16:46 +0300 Subject: [SciPy-User] loadmat: Unknown mat file type, version 49 In-Reply-To: References: Message-ID: 27.08.2014, 23:33, Florian Lindner kirjoitti: > Matthew Brett wrote: [clip] >>> PetscViewerSetFormat(viewer, PETSC_VIEWER_ASCII_MATLAB); [clip] >> Sorry - I wasn't sure from what you said whether this is actually a >> matlab .mat file in matlab 4 / 5 / 6 binary format? Given that it says PETSC_VIEWER_ASCII_MATLAB, it's probably saved just as a text file. scipy.io.loadmat loads files in the matlab binary format. From ndbecker2 at gmail.com Fri Aug 29 08:50:12 2014 From: ndbecker2 at gmail.com (Neal Becker) Date: Fri, 29 Aug 2014 08:50:12 -0400 Subject: [SciPy-User] ANN: NumPy 1.9.0 release candidate 1 available References: <53FE104C.2020006@googlemail.com> Message-ID: OK, it's fixed by doing: rm -rf ~/.local/lib/python2.7/site-packages/numpy* python setup.py install --user I guess something was not cleaned out from previous packages From ben.root at ou.edu Fri Aug 29 09:26:47 2014 From: ben.root at ou.edu (Benjamin Root) Date: Fri, 29 Aug 2014 09:26:47 -0400 Subject: [SciPy-User] [Numpy-discussion] ANN: NumPy 1.9.0 release candidate 1 available In-Reply-To: References: <53FE104C.2020006@googlemail.com> Message-ID: It is generally a good idea when switching between releases to execute "git clean -fxd" prior to rebuilding. Admittedly, I don't know how cleaning out that directory in .local could have impacted things. Go figure. Cheers! Ben Root On Fri, Aug 29, 2014 at 8:50 AM, Neal Becker wrote: > OK, it's fixed by doing: > > rm -rf ~/.local/lib/python2.7/site-packages/numpy* > python setup.py install --user > > I guess something was not cleaned out from previous packages > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > http://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cyrille.rossant at gmail.com Sat Aug 30 09:51:23 2014 From: cyrille.rossant at gmail.com (Cyrille Rossant) Date: Sat, 30 Aug 2014 14:51:23 +0100 Subject: [SciPy-User] Fwd: ANN: Vispy 0.3, high-performance visualization in Python In-Reply-To: References: Message-ID: Hi, We are pleased to announce the release of Vispy version 0.3! This release provides new graphics API (including a scene graph), a basic OpenGL backend for matplotlib, and experimental integration in the IPython notebook. Further, we have improved the existing modules and added several application backends (e.g. wx and sdl2). We are now focusing our efforts on: * Consolidating the higher-level graphics APIs: visuals, transforms, scene graph. * Improving the performance of the scene graph, notably through batch rendering (Collections system). * Adding more features to the matplotlib backend. * Improving the integration in the IPython notebook: better performance, improved event loop integration, client-side backend with WebGL. Eventually, scientists with no knowledge of OpenGL will be able to easily create fast and scalable interactive visualizations. We will greatly appreciate any feedback, bug reports, and feature requests. See http://vispy.org for downloads and more information. The Vispy team