From samuel_maybury at me.com Fri Mar 2 15:46:46 2018 From: samuel_maybury at me.com (Samuel Maybury) Date: Fri, 02 Mar 2018 20:46:46 +0000 Subject: [SciPy-User] STL from nArray (Delauney) Message-ID: <6B3FF804-3B0A-459C-AA63-C7F00C53A70E@me.com> Hi all, I?m self-taught in Python for my university honours project and its been one hell of a journey. Simply put I?m designing a 3D scanner-printer hybrid to aid in the creation of prosthetic limbs. So long story short I?m running the system on an RPi and you can find more information here, if you want to help: https://github.com/AcrimoniousMirth/Project-Adam-3D-SCANNER-CODE So, in essence I have an array of form [[x1 y1 z1] [x2 y2 z2]? [xn yn zn]] and I need to turn it into an STL mesh. A forum member pointed me in the direction of scipy?s Delauney triangulation. Now, the values were appended into the array such that each group (of unknown size) will have the same z value and the next will have the closest z value to that. Due to how it was generated (with OpenCv) I can?t be entirely sure but I believe each group with the same z value was written in the same order and so the first of z1 should be close to the first of z2. Your help is greatly appreciated as this is very new territory to me and I?m not sure where to start. This is the final hurdle in the way of me graduating with a pretty good accreditation and two charities are interested in the idea so it?d be a blessing to get it working! Many thanks, Sam -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Fri Mar 2 17:38:03 2018 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 2 Mar 2018 14:38:03 -0800 Subject: [SciPy-User] STL from nArray (Delauney) In-Reply-To: <6B3FF804-3B0A-459C-AA63-C7F00C53A70E@me.com> References: <6B3FF804-3B0A-459C-AA63-C7F00C53A70E@me.com> Message-ID: On Fri, Mar 2, 2018 at 12:46 PM, Samuel Maybury wrote: > > Hi all, > I?m self-taught in Python for my university honours project and its been one hell of a journey. Simply put I?m designing a 3D scanner-printer hybrid to aid in the creation of prosthetic limbs. So long story short I?m running the system on an RPi and you can find more information here, if you want to help: https://github.com/AcrimoniousMirth/Project-Adam-3D-SCANNER-CODE > > So, in essence I have an array of form [[x1 y1 z1] [x2 y2 z2]? [xn yn zn]] and I need to turn it into an STL mesh. A forum member pointed me in the direction of scipy?s Delauney triangulation. > Now, the values were appended into the array such that each group (of unknown size) will have the same z value and the next will have the closest z value to that. Due to how it was generated (with OpenCv) I can?t be entirely sure but I believe each group with the same z value was written in the same order and so the first of z1 should be close to the first of z2. > > Your help is greatly appreciated as this is very new territory to me and I?m not sure where to start. This is the final hurdle in the way of me graduating with a pretty good accreditation and two charities are interested in the idea so it?d be a blessing to get it working! A Delaunay triangulation does not get you very far. We don't have much in scipy for doing mesh reconstruction from point clouds. Typically, you want to do this in an interactive environment so you can tweak parameters. Consider using MeshLab for this purpose: http://www.meshlab.net/ -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... 
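For readers following along, a minimal sketch (the point array below is a random stand-in for the scan) of what scipy's Delaunay actually returns for 3-D input, and why it does not directly give a surface mesh: the simplices are tetrahedra filling the volume, and the only triangle surface it exposes directly is the convex hull, which for a non-convex object like a limb will bridge across the concave parts.

import numpy as np
from scipy.spatial import Delaunay

points = np.random.rand(500, 3)   # stand-in for the scanned (n, 3) cloud
tri = Delaunay(points)
print(tri.simplices.shape)        # (m, 4): tetrahedra, not surface triangles
print(tri.convex_hull.shape)      # (k, 3): triangles, but only of the convex hull

That gap is what MeshLab-style surface reconstruction fills.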
URL: From samuel_maybury at me.com Fri Mar 2 17:44:04 2018 From: samuel_maybury at me.com (Samuel Maybury) Date: Fri, 02 Mar 2018 22:44:04 +0000 Subject: [SciPy-User] STL from nArray (Delauney) In-Reply-To: References: <6B3FF804-3B0A-459C-AA63-C7F00C53A70E@me.com> Message-ID: Hi Robert, Thanks for your reply! I?ll take a look. The shapes that would be scanned are very simple: cylindrical with rounded end and no holes. Wouldn?t the triangulation be able to handle the tessellation of that? Many thanks, Sam Sent from my iPad > On 2 Mar 2018, at 10:38 pm, Robert Kern wrote: > > On Fri, Mar 2, 2018 at 12:46 PM, Samuel Maybury wrote: > > > > Hi all, > > I?m self-taught in Python for my university honours project and its been one hell of a journey. Simply put I?m designing a 3D scanner-printer hybrid to aid in the creation of prosthetic limbs. So long story short I?m running the system on an RPi and you can find more information here, if you want to help: https://github.com/AcrimoniousMirth/Project-Adam-3D-SCANNER-CODE > > > > So, in essence I have an array of form [[x1 y1 z1] [x2 y2 z2]? [xn yn zn]] and I need to turn it into an STL mesh. A forum member pointed me in the direction of scipy?s Delauney triangulation. > > Now, the values were appended into the array such that each group (of unknown size) will have the same z value and the next will have the closest z value to that. Due to how it was generated (with OpenCv) I can?t be entirely sure but I believe each group with the same z value was written in the same order and so the first of z1 should be close to the first of z2. > > > > Your help is greatly appreciated as this is very new territory to me and I?m not sure where to start. This is the final hurdle in the way of me graduating with a pretty good accreditation and two charities are interested in the idea so it?d be a blessing to get it working! > > A Delaunay triangulation does not get you very far. We don't have much in scipy for doing mesh reconstruction from point clouds. Typically, you want to do this in an interactive environment so you can tweak parameters. Consider using MeshLab for this purpose: > > http://www.meshlab.net/ > > -- > Robert Kern > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From prabhu at aero.iitb.ac.in Fri Mar 2 23:08:12 2018 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Sat, 3 Mar 2018 09:38:12 +0530 Subject: [SciPy-User] STL from nArray (Delauney) In-Reply-To: <6B3FF804-3B0A-459C-AA63-C7F00C53A70E@me.com> References: <6B3FF804-3B0A-459C-AA63-C7F00C53A70E@me.com> Message-ID: <5ec34bf2-f312-cae3-0f58-adc7a79cc568@aero.iitb.ac.in> On 3/3/18 2:16 AM, Samuel Maybury wrote: > Hi all, > I?m self-taught in Python for my university honours project and its been one > hell of a journey. Simply put I?m designing a 3D scanner-printer hybrid to aid > in the creation of prosthetic limbs. So long story short I?m running the > system on an RPi and you can find more information here, if you want to > help:?https://github.com/AcrimoniousMirth/Project-Adam-3D-SCANNER-CODE > > So, in essence I have an array of form*[[x1 y1 z1] [x2 y2 z2]? [xn yn zn]] > *and I need to turn it into an STL mesh. A forum member pointed me in the > direction of scipy?s Delauney triangulation.? 
> Now, the values were appended into the array such that each group (of unknown > size) will have the same *z *value and the next will have the closest *z > *value to that. Due to how it was generated (with OpenCv) I can?t be entirely > sure but I believe each group with the same z value was written in the same > order and so the first of *z1* should be close to the first of *z2.* > * > * > Your help is greatly appreciated as this is very new?territory to me and I?m > not sure where to start. This is the final hurdle in the way of me graduating > with a pretty good accreditation and two charities are interested in the idea > so it?d be a blessing to get it working! This isn't going to do the triangulation but is handy to manipulate STL files from Python: https://github.com/WoLpH/numpy-stl cheers, Prabhu -------------- next part -------------- An HTML attachment was scrubbed... URL: From hritiknarayan at gmail.com Sat Mar 3 03:43:28 2018 From: hritiknarayan at gmail.com (Hritik Narayan) Date: Sat, 3 Mar 2018 14:13:28 +0530 Subject: [SciPy-User] Proposals, Query Message-ID: Are proposals to be posted on this mailing list, or should they be directly mailed to a mentor? -- -------------- next part -------------- An HTML attachment was scrubbed... URL: From samuel_maybury at me.com Sat Mar 3 04:22:30 2018 From: samuel_maybury at me.com (Samuel Maybury) Date: Sat, 03 Mar 2018 09:22:30 +0000 Subject: [SciPy-User] Proposals, Query In-Reply-To: References: Message-ID: <0A7E3298-090D-4FF4-A6AC-DB3E86682786@me.com> Hi Hritik, Feels a bit odd giving advice when I just signed up yesterday but if I?m not mistaken you want the scipy-dev mailing list. You can sign up here: https://www.scipy.org/scipylib/mailing-lists.html Best wishes, Sam > On 3 Mar 2018, at 8:43 am, Hritik Narayan wrote: > > Are proposals to be posted on this mailing list, or should they be directly mailed to a mentor? > > > -- > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From samuel_maybury at me.com Sat Mar 3 03:43:33 2018 From: samuel_maybury at me.com (Samuel Maybury) Date: Sat, 03 Mar 2018 08:43:33 +0000 Subject: [SciPy-User] STL from nArray (Delauney) In-Reply-To: <5ec34bf2-f312-cae3-0f58-adc7a79cc568@aero.iitb.ac.in> References: <6B3FF804-3B0A-459C-AA63-C7F00C53A70E@me.com> <5ec34bf2-f312-cae3-0f58-adc7a79cc568@aero.iitb.ac.in> Message-ID: <088AD7AA-0ECF-4F20-AA23-EBFD579746AD@me.com> Hi Prabhu, Yes, I looked at numpy-STL first, it was actually the creator, Rick Van Hattem, who suggested I look at Delaunay :) > On 3 Mar 2018, at 4:08 am, Prabhu Ramachandran wrote: > >> On 3/3/18 2:16 AM, Samuel Maybury wrote: >> Hi all, >> I?m self-taught in Python for my university honours project and its been one hell of a journey. Simply put I?m designing a 3D scanner-printer hybrid to aid in the creation of prosthetic limbs. So long story short I?m running the system on an RPi and you can find more information here, if you want to help: https://github.com/AcrimoniousMirth/Project-Adam-3D-SCANNER-CODEhttps://github.com/AcrimoniousMirth/Project-Adam-3D-SCANNER-CODE >> >> So, in essence I have an array of form [[x1 y1 z1] [x2 y2 z2]? [xn yn zn]] and I need to turn it into an STL mesh. A forum member pointed me in the direction of scipy?s Delauney triangulation. 
>> Now, the values were appended into the array such that each group (of unknown size) will have the same z value and the next will have the closest z value to that. Due to how it was generated (with OpenCv) I can?t be entirely sure but I believe each group with the same z value was written in the same order and so the first of z1 should be close to the first of z2. >> >> Your help is greatly appreciated as this is very new territory to me and I?m not sure where to start. This is the final hurdle in the way of me graduating with a pretty good accreditation and two charities are interested in the idea so it?d be a blessing to get it working! > This isn't going to do the triangulation but is handy to manipulate STL files from Python: > > https://github.com/WoLpH/numpy-stl > > cheers, > Prabhu > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From hbabcock at mac.com Sat Mar 3 13:08:10 2018 From: hbabcock at mac.com (Hazen Babcock) Date: Sat, 03 Mar 2018 13:08:10 -0500 Subject: [SciPy-User] STL from nArray (Delauney) In-Reply-To: References: Message-ID: On 03/02/2018 11:17 PM, scipy-user-request at python.org wrote: > > Message: 3 > Date: Fri, 02 Mar 2018 22:44:04 +0000 > From: Samuel Maybury > To: SciPy Users List > Subject: Re: [SciPy-User] STL from nArray (Delauney) > Message-ID: > Content-Type: text/plain; charset="utf-8" > > Hi Robert, > Thanks for your reply! I?ll take a look. The shapes that would be scanned are very simple: cylindrical with rounded end and no holes. Wouldn?t the triangulation be able to handle the tessellation of that? > > Many thanks, > Sam > If they are really simple you might consider programmatically creating an OpenSCAD file with the appropriate shapes (cylinder + sphere on the end?), then using OpenSCAD to create the STL file. http://www.openscad.org/ -Hazen From samuel_maybury at me.com Sat Mar 3 14:12:05 2018 From: samuel_maybury at me.com (Samuel Maybury) Date: Sat, 03 Mar 2018 19:12:05 +0000 Subject: [SciPy-User] STL from nArray (Delauney) In-Reply-To: References: Message-ID: <39F585FB-C470-477A-9F54-C432BE1AFED1@me.com> Well the illustration of cylinder + sphere was simply to give an idea of the general shape of the objects being scanned (and stored in the numpy array). Creating one would serve no purpose. I?ve done more research and I believe the pyntcloud Library will work much better. Currently struggling to get numba to install on my Pi though. Many thanks, Sam > On 3 Mar 2018, at 6:08 pm, Hazen Babcock wrote: > >> On 03/02/2018 11:17 PM, scipy-user-request at python.org wrote: >> Message: 3 >> Date: Fri, 02 Mar 2018 22:44:04 +0000 >> From: Samuel Maybury >> To: SciPy Users List >> Subject: Re: [SciPy-User] STL from nArray (Delauney) >> Message-ID: >> Content-Type: text/plain; charset="utf-8" >> Hi Robert, >> Thanks for your reply! I?ll take a look. The shapes that would be scanned are very simple: cylindrical with rounded end and no holes. Wouldn?t the triangulation be able to handle the tessellation of that? >> Many thanks, >> Sam > > If they are really simple you might consider programmatically creating an OpenSCAD file with the appropriate shapes (cylinder + sphere on the end?), then using OpenSCAD to create the STL file. 
> > http://www.openscad.org/ > > -Hazen > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user From cimrman3 at ntc.zcu.cz Tue Mar 6 06:06:20 2018 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 6 Mar 2018 12:06:20 +0100 Subject: [SciPy-User] ANN: SfePy 2018.1 Message-ID: <07f7c4f6-abae-a7f4-5e87-e29dbab2e296@ntc.zcu.cz> I am pleased to announce release 2018.1 of SfePy. Description ----------- SfePy (simple finite elements in Python) is a software for solving systems of coupled partial differential equations by the finite element method or by the isogeometric analysis (limited support). It is distributed under the new BSD license. Home page: http://sfepy.org Mailing list: https://mail.python.org/mm3/mailman3/lists/sfepy.python.org/ Git (source) repository, issue tracker: https://github.com/sfepy/sfepy Highlights of this release -------------------------- - major update of time-stepping solvers and solver handling - Newmark and Bathe elastodynamics solvers - interface to MUMPS linear solver - new examples: - iron plate impact problem (elastodynamics) - incompressible Mooney-Rivlin material model (hyperelasticity) as a script For full release notes see http://docs.sfepy.org/doc/release_notes.html#id1 (rather long and technical). Cheers, Robert Cimrman --- Contributors to this release in alphabetical order: Robert Cimrman Jan Heczko Jan Kopacka Vladimir Lukes From ndbecker2 at gmail.com Wed Mar 7 08:52:30 2018 From: ndbecker2 at gmail.com (Neal Becker) Date: Wed, 07 Mar 2018 13:52:30 +0000 Subject: [SciPy-User] what is signal.wiener? Message-ID: The wiener filter theory I'm familiar with requires specification of the autocorrelation matrix of the observed response and the cross-correlation between the target and the observation. What is it that signal.wiener actually does? -------------- next part -------------- An HTML attachment was scrubbed... URL: From samuel_maybury at me.com Wed Mar 7 17:15:17 2018 From: samuel_maybury at me.com (Samuel Maybury) Date: Wed, 07 Mar 2018 22:15:17 +0000 Subject: [SciPy-User] Tessellating PointCloud (ConvexHull? Delaunay) Message-ID: <2CBDD279-35F6-4DAD-A4F2-D7A1A65CE9A0@me.com> Hi guys, Been looking to tesselate a 3D point cloud from array of form [[x1 y1 z1][x2 y2 z2]...[xn yn zn]]. I tried scipy.spatial.convexhull which worked decently, but has the notable limitation of not handling concaves. The STL I then extracted only had half the triangles, but I think I need to take that up with the maker of numpy-stl. Still kinda new to scipy (and programming in general) so could someone informed please help? From what I understand Delaunay triangulates every point in the cloud. Is there a way to limit it to only Tessellating to the nearest? Or a better method for doing this? Many thanks, Sam 3D Prints: 3dhubs.com/service/Mirths-Hub Connect: linkedin.com/samueljmaybury Watch: youtube.com/user/AcrimoniousMirth CV: engineering.community/profile/samuel-maybury/about -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Wed Mar 7 17:32:45 2018 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 7 Mar 2018 14:32:45 -0800 Subject: [SciPy-User] Tessellating PointCloud (ConvexHull? 
Delaunay) In-Reply-To: <2CBDD279-35F6-4DAD-A4F2-D7A1A65CE9A0@me.com> References: <2CBDD279-35F6-4DAD-A4F2-D7A1A65CE9A0@me.com> Message-ID: On Wed, Mar 7, 2018 at 2:15 PM, Samuel Maybury wrote: > > Hi guys, > Been looking to tesselate a 3D point cloud from array of form [[x1 y1 z1][x2 y2 z2]...[xn yn zn]]. > I tried scipy.spatial.convexhull which worked decently, but has the notable limitation of not handling concaves. The STL I then extracted only had half the triangles, but I think I need to take that up with the maker of numpy-stl. > Still kinda new to scipy (and programming in general) so could someone informed please help? > From what I understand Delaunay triangulates every point in the cloud. Is there a way to limit it to only Tessellating to the nearest? > Or a better method for doing this? As I said before, Delaunay triangulation doesn't do much for reconstructing a surface from a point cloud. Following the MeshLab references[1], it looks like variants of a technique called Poisson reconstruction are standard these days. Googling finds these Python projects, but I have no experience with them: https://github.com/mmolero/pypoisson https://gist.github.com/jackdoerner/b9b5e62a4c3893c76e4c [1] http://www.meshlab.net/#references -- Robert Kern -------------- next part -------------- An HTML attachment was scrubbed... URL: From samuel_maybury at me.com Wed Mar 7 17:41:02 2018 From: samuel_maybury at me.com (Samuel Maybury) Date: Wed, 07 Mar 2018 22:41:02 +0000 Subject: [SciPy-User] Tessellating PointCloud (ConvexHull? Delaunay) In-Reply-To: References: <2CBDD279-35F6-4DAD-A4F2-D7A1A65CE9A0@me.com> Message-ID: <84CCC7F5-5942-4BE3-83FE-0D62F14BECBE@me.com> Thanks Robert, I can honestly say I hadn?t seen that before! I?ve looked at so many libraries but that may just be the ultimate answer! I did extract a near-usable STL with convex hull so glad to say it almost works. I?ll give that a go and you have have this honour student?s thanks :) Best wishes, Sam > On 7 Mar 2018, at 22:32, Robert Kern wrote: > > On Wed, Mar 7, 2018 at 2:15 PM, Samuel Maybury > wrote: > > > > Hi guys, > > Been looking to tesselate a 3D point cloud from array of form [[x1 y1 z1][x2 y2 z2]...[xn yn zn]]. > > I tried scipy.spatial.convexhull which worked decently, but has the notable limitation of not handling concaves. The STL I then extracted only had half the triangles, but I think I need to take that up with the maker of numpy-stl. > > Still kinda new to scipy (and programming in general) so could someone informed please help? > > From what I understand Delaunay triangulates every point in the cloud. Is there a way to limit it to only Tessellating to the nearest? > > Or a better method for doing this? > > As I said before, Delaunay triangulation doesn't do much for reconstructing a surface from a point cloud. Following the MeshLab references[1], it looks like variants of a technique called Poisson reconstruction are standard these days. Googling finds these Python projects, but I have no experience with them: > > https://github.com/mmolero/pypoisson > https://gist.github.com/jackdoerner/b9b5e62a4c3893c76e4c > > [1] http://www.meshlab.net/#references > > -- > Robert Kern > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... 
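Since the convex-hull route already got Samuel close, here is a minimal sketch of writing ConvexHull facets out through numpy-stl (assuming numpy-stl's Mesh interface; the input file name is a placeholder). It only handles the convex case and does not try to fix facet winding, which some viewers and slicers are picky about, but it is a quick way to check whether the missing triangles come from the hull itself or from the export step.

import numpy as np
from scipy.spatial import ConvexHull
from stl import mesh   # numpy-stl

points = np.loadtxt("scan.xyz")   # placeholder for the (n, 3) point cloud
hull = ConvexHull(points)

surface = mesh.Mesh(np.zeros(len(hull.simplices), dtype=mesh.Mesh.dtype))
for i, facet in enumerate(hull.simplices):
    surface.vectors[i] = points[facet]   # three (x, y, z) corners per facet
surface.save("hull.stl")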
URL: From charlesr.harris at gmail.com Mon Mar 12 14:25:42 2018 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 12 Mar 2018 12:25:42 -0600 Subject: [SciPy-User] NumPy 1.14.2 released Message-ID: Hi All, I am pleased to announce the release of NumPy 1.14.2. This is a bugfix release for some bugs reported following the 1.14.1 release. The major problems dealt with are as follows. - Residual bugs in the new array printing functionality. - Regression resulting in a relocation problem with shared library. - Improved PyPy compatibility. This release supports Python 2.7 and 3.4 - 3.6. Wheels for the release are available on PyPI. Source tarballs, zipfiles, release notes, and the changelog are available on github . The Python 3.6 wheels available from PIP are built with Python 3.6.2 and should be compatible with all previous versions of Python 3.6. The source releases were cythonized with Cython 0.26.1, which is known to *not* support the upcoming Python 3.7 release. People who wish to run Python 3.7 should check out the NumPy repo and try building with the, as yet, unreleased master branch of Cython. Contributors ============ A total of 4 people contributed to this release. People with a "+" by their names contributed a patch for the first time. * Allan Haldane * Charles Harris * Eric Wieser * Pauli Virtanen Pull requests merged ==================== A total of 5 pull requests were merged for this release. * `#10674 `__: BUG: Further back-compat fix for subclassed array repr * `#10725 `__: BUG: dragon4 fractional output mode adds too many trailing zeros * `#10726 `__: BUG: Fix f2py generated code to work on PyPy * `#10727 `__: BUG: Fix missing NPY_VISIBILITY_HIDDEN on npy_longdouble_to_PyLong * `#10729 `__: DOC: Create 1.14.2 notes and changelog. Cheers, Charles Harris -------------- next part -------------- An HTML attachment was scrubbed... URL: From franckkalala at googlemail.com Wed Mar 14 12:21:20 2018 From: franckkalala at googlemail.com (Franck Kalala) Date: Wed, 14 Mar 2018 16:21:20 +0000 Subject: [SciPy-User] objects with their Delaunay graphs overlaid. Message-ID: Hello All I am not sure this a good place to ask this. I am just making a try. I have and image and I would like to reproduced the Delaunay graphs overlaid. See the attached image for an idea. I try the following code: from matplotlib import pyplot as plt from skimage.io import imread from skimage.feature import corner_harris, corner_subpix, corner_peaks, peak_local_max from scipy.spatial import Delaunay import matplotlib.pyplot as plt image = imread("cup.png", as_grey='True') #points = corner_peaks(corner_harris(image), min_distance=1) points = peak_local_max(corner_harris(image), min_distance=2) #coords_subpix = corner_subpix(image, coords, window_size=13) tri = Delaunay(points) #imgplot = plt.imshow(image,cmap='gray') #plt.triplot(points[:,0], points[:,1], tri.simplices.copy()) #plt.plot(points[:,0], points[:,1], 'o') #plt.show() #print(image) coords = peak_local_max(corner_harris(image), min_distance=2) coords_subpix = corner_subpix(image, coords, window_size=13) fig, ax = plt.subplots() ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) ax.plot(coords[:, 1], coords[:, 0], '.b', markersize=3) #ax.plot(coords_subpix[:, 1], coords_subpix[:, 0], '+r', markersize=15) #ax.axis((0, 350, 350, 0)) plt.show() this code does not help. I attach also the cup image for a try. best franck -------------- next part -------------- An HTML attachment was scrubbed... 
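Picking up the commented-out triplot lines in Franck's script, a minimal sketch of the overlay using scipy's Delaunay directly. The detail that bites here is that peak_local_max returns (row, column) pairs while imshow puts columns on the x axis, so the two point columns have to be swapped when plotting. Note that the triangulation still covers the full convex hull of the corner points, so some triangles can span concave parts of the object.

import matplotlib.pyplot as plt
from scipy.spatial import Delaunay
from skimage.io import imread
from skimage.feature import corner_harris, peak_local_max

image = imread("cup.png", as_grey=True)
points = peak_local_max(corner_harris(image), min_distance=2)   # (row, col)
tri = Delaunay(points)

fig, ax = plt.subplots()
ax.imshow(image, cmap=plt.cm.gray)
# swap columns: x = col, y = row
ax.triplot(points[:, 1], points[:, 0], tri.simplices, color='r', lw=0.5)
plt.show()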
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: objet_and_delaunay_graph_overlaid.png Type: image/png Size: 581973 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: cup.png Type: image/png Size: 21126 bytes Desc: not available URL: From tfmoraes at cti.gov.br Wed Mar 14 12:42:41 2018 From: tfmoraes at cti.gov.br (Thiago Franco Moraes) Date: Wed, 14 Mar 2018 16:42:41 +0000 Subject: [SciPy-User] objects with their Delaunay graphs overlaid. In-Reply-To: References: Message-ID: You have to use plt.triplot. Also, you can use matplotlib.tri.Triangulation to triangulate using the delaunay. Something like this: from matplotlib import pyplot as plt from skimage.io import imread from skimage.feature import corner_harris, corner_subpix, corner_peaks, peak_local_max import matplotlib.tri as tri import matplotlib.pyplot as plt image = imread("cup.png", as_grey='True') points = peak_local_max(corner_harris(image), min_distance=2) triangles = tri.Triangulation(points[:,0], points[:,1]) fig, ax = plt.subplots() ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) ax.triplot(triangles) plt.show() On Wed, Mar 14, 2018 at 1:24 PM Franck Kalala wrote: > Hello All > > I am not sure this a good place to ask this. I am just making a try. > I have and image and I would like to reproduced the Delaunay graphs > overlaid. > See the attached image for an idea. > > I try the following code: > > from matplotlib import pyplot as plt > > from skimage.io import imread > from skimage.feature import corner_harris, corner_subpix, corner_peaks, > peak_local_max > from scipy.spatial import Delaunay > import matplotlib.pyplot as plt > > image = imread("cup.png", as_grey='True') > > > #points = corner_peaks(corner_harris(image), min_distance=1) > points = peak_local_max(corner_harris(image), min_distance=2) > #coords_subpix = corner_subpix(image, coords, window_size=13) > tri = Delaunay(points) > > #imgplot = plt.imshow(image,cmap='gray') > #plt.triplot(points[:,0], points[:,1], tri.simplices.copy()) > #plt.plot(points[:,0], points[:,1], 'o') > #plt.show() > #print(image) > > coords = peak_local_max(corner_harris(image), min_distance=2) > coords_subpix = corner_subpix(image, coords, window_size=13) > > fig, ax = plt.subplots() > ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) > ax.plot(coords[:, 1], coords[:, 0], '.b', markersize=3) > #ax.plot(coords_subpix[:, 1], coords_subpix[:, 0], '+r', markersize=15) > #ax.axis((0, 350, 350, 0)) > plt.show() > > > this code does not help. I attach also the cup image for a try. > > best > > franck > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From franckm at aims.ac.za Wed Mar 14 13:14:31 2018 From: franckm at aims.ac.za (Franck Kalala Mutombo) Date: Wed, 14 Mar 2018 17:14:31 +0000 Subject: [SciPy-User] objects with their Delaunay graphs overlaid. In-Reply-To: References: Message-ID: Thank you Thiago, It is working but the triangulation goes out of the object for some points and is that I wanted to avoid compare the image I attached. Best - Franck Kalala Mutombo, (PhD. 
Mathematics ) +243(04)844140 411 | +27(0)7646 91608 skype: franckm4 "*No one knows the future, however, but this does not prevent us to project** in it and to act as if we control it*" On 14 March 2018 at 16:42, Thiago Franco Moraes wrote: > You have to use plt.triplot. Also, you can use > matplotlib.tri.Triangulation to triangulate using the delaunay. Something > like this: > > from matplotlib import pyplot as plt > > from skimage.io import imread > from skimage.feature import corner_harris, corner_subpix, corner_peaks, > peak_local_max > import matplotlib.tri as tri > import matplotlib.pyplot as plt > > image = imread("cup.png", as_grey='True') > points = peak_local_max(corner_harris(image), min_distance=2) > triangles = tri.Triangulation(points[:,0], points[:,1]) > fig, ax = plt.subplots() > ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) > ax.triplot(triangles) > plt.show() > > > > > On Wed, Mar 14, 2018 at 1:24 PM Franck Kalala > wrote: > >> Hello All >> >> I am not sure this a good place to ask this. I am just making a try. >> I have and image and I would like to reproduced the Delaunay graphs >> overlaid. >> See the attached image for an idea. >> >> I try the following code: >> >> from matplotlib import pyplot as plt >> >> from skimage.io import imread >> from skimage.feature import corner_harris, corner_subpix, corner_peaks, >> peak_local_max >> from scipy.spatial import Delaunay >> import matplotlib.pyplot as plt >> >> image = imread("cup.png", as_grey='True') >> >> >> #points = corner_peaks(corner_harris(image), min_distance=1) >> points = peak_local_max(corner_harris(image), min_distance=2) >> #coords_subpix = corner_subpix(image, coords, window_size=13) >> tri = Delaunay(points) >> >> #imgplot = plt.imshow(image,cmap='gray') >> #plt.triplot(points[:,0], points[:,1], tri.simplices.copy()) >> #plt.plot(points[:,0], points[:,1], 'o') >> #plt.show() >> #print(image) >> >> coords = peak_local_max(corner_harris(image), min_distance=2) >> coords_subpix = corner_subpix(image, coords, window_size=13) >> >> fig, ax = plt.subplots() >> ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) >> ax.plot(coords[:, 1], coords[:, 0], '.b', markersize=3) >> #ax.plot(coords_subpix[:, 1], coords_subpix[:, 0], '+r', markersize=15) >> #ax.axis((0, 350, 350, 0)) >> plt.show() >> >> >> this code does not help. I attach also the cup image for a try. >> >> best >> >> franck >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From tfmoraes at cti.gov.br Wed Mar 14 13:54:00 2018 From: tfmoraes at cti.gov.br (Thiago Franco Moraes) Date: Wed, 14 Mar 2018 17:54:00 +0000 Subject: [SciPy-User] objects with their Delaunay graphs overlaid. In-Reply-To: References: Message-ID: Try to invert the points: triangles = tri.Triangulation(points[:,1], points[:,0]) On Wed, Mar 14, 2018 at 2:16 PM Franck Kalala Mutombo wrote: > Thank you Thiago, > > It is working but the triangulation goes out of the object for some points > and is that I wanted to avoid compare the image I attached. > > Best > > - > Franck Kalala Mutombo, (PhD. 
Mathematics ) > +243(04)844140 411 | +27(0)7646 91608 > skype: franckm4 > > "*No one knows the future, however, but this does not prevent us to > project** in it and to act as if we control it*" > > > On 14 March 2018 at 16:42, Thiago Franco Moraes > wrote: > >> You have to use plt.triplot. Also, you can use >> matplotlib.tri.Triangulation to triangulate using the delaunay. Something >> like this: >> >> from matplotlib import pyplot as plt >> >> from skimage.io import imread >> from skimage.feature import corner_harris, corner_subpix, corner_peaks, >> peak_local_max >> import matplotlib.tri as tri >> import matplotlib.pyplot as plt >> >> image = imread("cup.png", as_grey='True') >> points = peak_local_max(corner_harris(image), min_distance=2) >> triangles = tri.Triangulation(points[:,0], points[:,1]) >> fig, ax = plt.subplots() >> ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) >> ax.triplot(triangles) >> plt.show() >> >> >> >> >> On Wed, Mar 14, 2018 at 1:24 PM Franck Kalala < >> franckkalala at googlemail.com> wrote: >> >>> Hello All >>> >>> I am not sure this a good place to ask this. I am just making a try. >>> I have and image and I would like to reproduced the Delaunay graphs >>> overlaid. >>> See the attached image for an idea. >>> >>> I try the following code: >>> >>> from matplotlib import pyplot as plt >>> >>> from skimage.io import imread >>> from skimage.feature import corner_harris, corner_subpix, corner_peaks, >>> peak_local_max >>> from scipy.spatial import Delaunay >>> import matplotlib.pyplot as plt >>> >>> image = imread("cup.png", as_grey='True') >>> >>> >>> #points = corner_peaks(corner_harris(image), min_distance=1) >>> points = peak_local_max(corner_harris(image), min_distance=2) >>> #coords_subpix = corner_subpix(image, coords, window_size=13) >>> tri = Delaunay(points) >>> >>> #imgplot = plt.imshow(image,cmap='gray') >>> #plt.triplot(points[:,0], points[:,1], tri.simplices.copy()) >>> #plt.plot(points[:,0], points[:,1], 'o') >>> #plt.show() >>> #print(image) >>> >>> coords = peak_local_max(corner_harris(image), min_distance=2) >>> coords_subpix = corner_subpix(image, coords, window_size=13) >>> >>> fig, ax = plt.subplots() >>> ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) >>> ax.plot(coords[:, 1], coords[:, 0], '.b', markersize=3) >>> #ax.plot(coords_subpix[:, 1], coords_subpix[:, 0], '+r', markersize=15) >>> #ax.axis((0, 350, 350, 0)) >>> plt.show() >>> >>> >>> this code does not help. I attach also the cup image for a try. >>> >>> best >>> >>> franck >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From franckkalala at googlemail.com Wed Mar 14 14:35:01 2018 From: franckkalala at googlemail.com (Franck Kalala) Date: Wed, 14 Mar 2018 18:35:01 +0000 Subject: [SciPy-User] objects with their Delaunay graphs overlaid. In-Reply-To: References: Message-ID: It does not bring a big change.... 
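That is partly expected: Delaunay always triangulates the full convex hull of the corner points, so some triangles will cross the background no matter how the axes are ordered. A crude workaround, sketched below with a made-up pixel threshold to tune, is to drop triangles that have a long edge before plotting (a stand-in for a proper concave-hull or alpha-shape tool, which scipy does not provide).

import numpy as np

def short_edge_triangles(points, simplices, max_edge=30.0):
    # keep only simplices whose longest edge is under max_edge pixels
    keep = []
    for simplex in simplices:
        p = points[simplex].astype(float)
        edges = np.linalg.norm(p - np.roll(p, 1, axis=0), axis=1)
        if edges.max() < max_edge:
            keep.append(simplex)
    return np.array(keep)

# usage with the plotting snippet earlier in the thread:
#   good = short_edge_triangles(points, tri.simplices)
#   ax.triplot(points[:, 1], points[:, 0], good, color='r', lw=0.5)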
2018-03-14 17:54 GMT+00:00 Thiago Franco Moraes : > Try to invert the points: > > triangles = tri.Triangulation(points[:,1], points[:,0]) > > On Wed, Mar 14, 2018 at 2:16 PM Franck Kalala Mutombo > wrote: > >> Thank you Thiago, >> >> It is working but the triangulation goes out of the object for some >> points and is that I wanted to avoid compare the image I attached. >> >> Best >> >> - >> Franck Kalala Mutombo, (PhD. Mathematics ) >> +243(04)844140 411 | +27(0)7646 91608 >> skype: franckm4 >> >> "*No one knows the future, however, but this does not prevent us to >> project** in it and to act as if we control it*" >> >> >> On 14 March 2018 at 16:42, Thiago Franco Moraes >> wrote: >> >>> You have to use plt.triplot. Also, you can use >>> matplotlib.tri.Triangulation to triangulate using the delaunay. Something >>> like this: >>> >>> from matplotlib import pyplot as plt >>> >>> from skimage.io import imread >>> from skimage.feature import corner_harris, corner_subpix, corner_peaks, >>> peak_local_max >>> import matplotlib.tri as tri >>> import matplotlib.pyplot as plt >>> >>> image = imread("cup.png", as_grey='True') >>> points = peak_local_max(corner_harris(image), min_distance=2) >>> triangles = tri.Triangulation(points[:,0], points[:,1]) >>> fig, ax = plt.subplots() >>> ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) >>> ax.triplot(triangles) >>> plt.show() >>> >>> >>> >>> >>> On Wed, Mar 14, 2018 at 1:24 PM Franck Kalala < >>> franckkalala at googlemail.com> wrote: >>> >>>> Hello All >>>> >>>> I am not sure this a good place to ask this. I am just making a try. >>>> I have and image and I would like to reproduced the Delaunay graphs >>>> overlaid. >>>> See the attached image for an idea. >>>> >>>> I try the following code: >>>> >>>> from matplotlib import pyplot as plt >>>> >>>> from skimage.io import imread >>>> from skimage.feature import corner_harris, corner_subpix, corner_peaks, >>>> peak_local_max >>>> from scipy.spatial import Delaunay >>>> import matplotlib.pyplot as plt >>>> >>>> image = imread("cup.png", as_grey='True') >>>> >>>> >>>> #points = corner_peaks(corner_harris(image), min_distance=1) >>>> points = peak_local_max(corner_harris(image), min_distance=2) >>>> #coords_subpix = corner_subpix(image, coords, window_size=13) >>>> tri = Delaunay(points) >>>> >>>> #imgplot = plt.imshow(image,cmap='gray') >>>> #plt.triplot(points[:,0], points[:,1], tri.simplices.copy()) >>>> #plt.plot(points[:,0], points[:,1], 'o') >>>> #plt.show() >>>> #print(image) >>>> >>>> coords = peak_local_max(corner_harris(image), min_distance=2) >>>> coords_subpix = corner_subpix(image, coords, window_size=13) >>>> >>>> fig, ax = plt.subplots() >>>> ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) >>>> ax.plot(coords[:, 1], coords[:, 0], '.b', markersize=3) >>>> #ax.plot(coords_subpix[:, 1], coords_subpix[:, 0], '+r', markersize=15) >>>> #ax.axis((0, 350, 350, 0)) >>>> plt.show() >>>> >>>> >>>> this code does not help. I attach also the cup image for a try. 
>>>> >>>> best >>>> >>>> franck >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From spiff007 at gmail.com Fri Mar 16 03:14:43 2018 From: spiff007 at gmail.com (spiff007) Date: Fri, 16 Mar 2018 12:44:43 +0530 Subject: [SciPy-User] Using scipy to fit a bimodal distribution Message-ID: Hi Folks, I have some data, which is bimodally distributed. The initial part of the data(in red, in the figure below), seems to be exponentially distributed and the latter part(in blue, in the figure below), seems to be a gamma distribution. See image below (& attached) : ? What scipy function, i could use to fit this data, knowing that, the first mode is exponential, and the latter, seems to be gamma. If its not possible to do it using scipy, is there another library/module i could use to do this ? Any suggestions/pointers/advice/hacks ? Thanks a ton, ashish -- ?The best minds of my generation are thinking about how to make people click ads. That sucks.? - Jeff Hammerbacher (early fb engineer) -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screenshot from 2018-03-16 12-38-02.png Type: image/png Size: 81078 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: bimodal_curve_fitting_png.png Type: image/png Size: 81078 bytes Desc: not available URL: From pawel.kw at gmail.com Fri Mar 16 06:21:26 2018 From: pawel.kw at gmail.com (=?UTF-8?Q?Pawe=C5=82_Kwa=C5=9Bniewski?=) Date: Fri, 16 Mar 2018 11:21:26 +0100 Subject: [SciPy-User] Using scipy to fit a bimodal distribution In-Reply-To: References: Message-ID: Hi Ashish, I can recommend lmfit: https://lmfit.github.io It's and additional module, so you'll need to install it, but it's worth it. It's well documented, easy to use and provides the possibility of creating composite models ( https://lmfit.github.io/lmfit-py/model.html#composite-models-adding-or-multiplying-models). I use it whenever I need to fit a double peak. It will also work with two different distributions. Cheers, Pawe? 2018-03-16 8:14 GMT+01:00 spiff007 : > Hi Folks, > > I have some data, which is bimodally distributed. > > The initial part of the data(in red, in the figure below), seems to be > exponentially distributed and the latter part(in blue, in the figure > below), seems to be a gamma distribution. > > See image below (& attached) : > > > ? > What scipy function, i could use to fit this data, knowing that, the first > mode is exponential, and the latter, seems to be gamma. > If its not possible to do it using scipy, is there another library/module > i could use to do this ? > > Any suggestions/pointers/advice/hacks ? 
> > Thanks a ton, > ashish > > -- > ?The best minds of my generation are thinking about how to make people > click ads. That sucks.? > - Jeff Hammerbacher (early fb engineer) > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screenshot from 2018-03-16 12-38-02.png Type: image/png Size: 81078 bytes Desc: not available URL: From ergun.bicici at boun.edu.tr Tue Mar 20 09:50:42 2018 From: ergun.bicici at boun.edu.tr (Ergun Bicici) Date: Tue, 20 Mar 2018 16:50:42 +0300 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied Message-ID: After multiplying a coo_matrix from scipy.sparse with its copy, the copy's format changes to column-major and hence its data change while the original data remains the same. Example: import numpy, scipy from scipy.sparse import coo_matrix, csc_matrix a = numpy.random.randn(4,7) print ('a:', a) b = coo_matrix(a) c = b.copy() print ('c before (coo_matrix):', c) print ('c data before (coo_matrix):', c.data) b.multiply(c) print ('c after (coo_matrix):', c) print ('c data after (coo_matrix):', c.data) a: [[ 0.80168251 -1.39944231 0.25345622 1.42371329 -0.12530735 0.50449369 1.07289013] [ 0.45875588 -0.15916679 -0.42632756 0.56484416 -1.46640872 1.42488646 0.03532664] [ 0.72752298 1.34140445 -0.37043453 0.78127267 0.01084105 -2.08957963 -0.10899932] [-0.02555597 -0.87197938 -0.76835675 0.82776032 -2.06292275 -1.2911363 -0.72253431]] c before (coo_matrix): (0, 0) 0.801682509603 (0, 1) -1.39944230956 (0, 2) 0.253456223565 (0, 3) 1.4237132942 (0, 4) -0.125307347426 (0, 5) 0.504493692248 (0, 6) 1.07289013113 (1, 0) 0.458755875333 (1, 1) -0.159166794146 (1, 2) -0.426327564398 (1, 3) 0.564844163879 (1, 4) -1.46640871661 (1, 5) 1.42488646441 (1, 6) 0.0353266410166 (2, 0) 0.727522979291 (2, 1) 1.34140444948 (2, 2) -0.370434529077 (2, 3) 0.781272667369 (2, 4) 0.0108410499861 (2, 5) -2.08957962801 (2, 6) -0.108999316299 (3, 0) -0.0255559745129 (3, 1) -0.871979375733 (3, 2) -0.768356754242 (3, 3) 0.827760318972 (3, 4) -2.0629227477 (3, 5) -1.29113629796 (3, 6) -0.722534309773 c data before (coo_matrix): [ 0.80168251 -1.39944231 0.25345622 1.42371329 -0.12530735 0.50449369 1.07289013 0.45875588 -0.15916679 -0.42632756 0.56484416 -1.46640872 1.42488646 0.03532664 0.72752298 1.34140445 -0.37043453 0.78127267 0.01084105 -2.08957963 -0.10899932 -0.02555597 -0.87197938 -0.76835675 0.82776032 -2.06292275 -1.2911363 -0.72253431] c after (coo_matrix): (0, 0) 0.801682509603 (1, 0) 0.458755875333 (2, 0) 0.727522979291 (3, 0) -0.0255559745129 (0, 1) -1.39944230956 (1, 1) -0.159166794146 (2, 1) 1.34140444948 (3, 1) -0.871979375733 (0, 2) 0.253456223565 (1, 2) -0.426327564398 (2, 2) -0.370434529077 (3, 2) -0.768356754242 (0, 3) 1.4237132942 (1, 3) 0.564844163879 (2, 3) 0.781272667369 (3, 3) 0.827760318972 (0, 4) -0.125307347426 (1, 4) -1.46640871661 (2, 4) 0.0108410499861 (3, 4) -2.0629227477 (0, 5) 0.504493692248 (1, 5) 1.42488646441 (2, 5) -2.08957962801 (3, 5) -1.29113629796 (0, 6) 1.07289013113 (1, 6) 0.0353266410166 (2, 6) -0.108999316299 (3, 6) -0.722534309773 c data after (coo_matrix): [ 0.80168251 0.45875588 0.72752298 -0.02555597 -1.39944231 -0.15916679 1.34140445 -0.87197938 0.25345622 -0.42632756 -0.37043453 -0.76835675 1.42371329 
0.56484416 0.78127267 0.82776032 -0.12530735 -1.46640872 0.01084105 -2.06292275 0.50449369 1.42488646 -2.08957963 -1.2911363 1.07289013 0.03532664 -0.10899932 -0.72253431] Best Regards, Ergun Ergun Bi?ici http://bicici.github.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Tue Mar 20 20:43:47 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 21 Mar 2018 00:43:47 +0000 Subject: [SciPy-User] objects with their Delaunay graphs overlaid. In-Reply-To: References: Message-ID: first of all, I think scipy's triangulate only does 2D -- those appeared to be 3D images. But in any case, it only does a convex hull -- if your points are not convex, you need constrained Delaney, which I don't know of a good open-source code for. But triangle is a good one if you can deal with the unclear license: here is one wrapper: https://github.com/drufat/triangle I have no idea if it works. -CHB On Wed, Mar 14, 2018 at 6:35 PM, Franck Kalala wrote: > It does not bring a big change.... > > 2018-03-14 17:54 GMT+00:00 Thiago Franco Moraes : > >> Try to invert the points: >> >> triangles = tri.Triangulation(points[:,1], points[:,0]) >> >> On Wed, Mar 14, 2018 at 2:16 PM Franck Kalala Mutombo >> wrote: >> >>> Thank you Thiago, >>> >>> It is working but the triangulation goes out of the object for some >>> points and is that I wanted to avoid compare the image I attached. >>> >>> Best >>> >>> - >>> Franck Kalala Mutombo, (PhD. Mathematics ) >>> +243(04)844140 411 | +27(0)7646 91608 >>> skype: franckm4 >>> >>> "*No one knows the future, however, but this does not prevent us to >>> project** in it and to act as if we control it*" >>> >>> >>> On 14 March 2018 at 16:42, Thiago Franco Moraes >>> wrote: >>> >>>> You have to use plt.triplot. Also, you can use >>>> matplotlib.tri.Triangulation to triangulate using the delaunay. Something >>>> like this: >>>> >>>> from matplotlib import pyplot as plt >>>> >>>> from skimage.io import imread >>>> from skimage.feature import corner_harris, corner_subpix, corner_peaks, >>>> peak_local_max >>>> import matplotlib.tri as tri >>>> import matplotlib.pyplot as plt >>>> >>>> image = imread("cup.png", as_grey='True') >>>> points = peak_local_max(corner_harris(image), min_distance=2) >>>> triangles = tri.Triangulation(points[:,0], points[:,1]) >>>> fig, ax = plt.subplots() >>>> ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) >>>> ax.triplot(triangles) >>>> plt.show() >>>> >>>> >>>> >>>> >>>> On Wed, Mar 14, 2018 at 1:24 PM Franck Kalala < >>>> franckkalala at googlemail.com> wrote: >>>> >>>>> Hello All >>>>> >>>>> I am not sure this a good place to ask this. I am just making a try. >>>>> I have and image and I would like to reproduced the Delaunay graphs >>>>> overlaid. >>>>> See the attached image for an idea. 
>>>>> >>>>> I try the following code: >>>>> >>>>> from matplotlib import pyplot as plt >>>>> >>>>> from skimage.io import imread >>>>> from skimage.feature import corner_harris, corner_subpix, >>>>> corner_peaks, peak_local_max >>>>> from scipy.spatial import Delaunay >>>>> import matplotlib.pyplot as plt >>>>> >>>>> image = imread("cup.png", as_grey='True') >>>>> >>>>> >>>>> #points = corner_peaks(corner_harris(image), min_distance=1) >>>>> points = peak_local_max(corner_harris(image), min_distance=2) >>>>> #coords_subpix = corner_subpix(image, coords, window_size=13) >>>>> tri = Delaunay(points) >>>>> >>>>> #imgplot = plt.imshow(image,cmap='gray') >>>>> #plt.triplot(points[:,0], points[:,1], tri.simplices.copy()) >>>>> #plt.plot(points[:,0], points[:,1], 'o') >>>>> #plt.show() >>>>> #print(image) >>>>> >>>>> coords = peak_local_max(corner_harris(image), min_distance=2) >>>>> coords_subpix = corner_subpix(image, coords, window_size=13) >>>>> >>>>> fig, ax = plt.subplots() >>>>> ax.imshow(image, interpolation='nearest', cmap=plt.cm.gray) >>>>> ax.plot(coords[:, 1], coords[:, 0], '.b', markersize=3) >>>>> #ax.plot(coords_subpix[:, 1], coords_subpix[:, 0], '+r', markersize=15) >>>>> #ax.axis((0, 350, 350, 0)) >>>>> plt.show() >>>>> >>>>> >>>>> this code does not help. I attach also the cup image for a try. >>>>> >>>>> best >>>>> >>>>> franck >>>>> _______________________________________________ >>>>> SciPy-User mailing list >>>>> SciPy-User at python.org >>>>> https://mail.python.org/mailman/listinfo/scipy-user >>>>> >>>> >>>> _______________________________________________ >>>> SciPy-User mailing list >>>> SciPy-User at python.org >>>> https://mail.python.org/mailman/listinfo/scipy-user >>>> >>>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Wed Mar 21 06:48:14 2018 From: david at drhagen.com (David Hagen) Date: Wed, 21 Mar 2018 06:48:14 -0400 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: References: Message-ID: What version are you using? I am on 1.0.0 on Windows 7 and can't reproduce this. 
In [1]: import numpy, scipy ...: from scipy.sparse import coo_matrix, csc_matrix ...: ...: a = numpy.random.randn(4,7) ...: print ('a:', a) ...: b = coo_matrix(a) ...: c = b.copy() ...: print ('c before (coo_matrix):', c) ...: print ('c data before (coo_matrix):', c.data) ...: b.multiply(c) ...: print ('c after (coo_matrix):', c) ...: print ('c data after (coo_matrix):', c.data) a: [[ 0.13836937 -0.44769877 -0.67486584 -0.18079949 0.68758066 -0.77756412 0.77819897] [-0.07750618 0.45723152 0.45801638 0.69517378 0.03544761 0.09793786 0.13236508] [-0.65393023 1.1196121 0.45519321 0.27690164 -1.28679679 -0.46791624 0.59063279] [ 1.43179076 0.38644705 -0.27769729 1.09228815 -1.38605657 -1.39493734 -1.06365405]] c before (coo_matrix): (0, 0) 0.13836937102833774 (0, 1) -0.44769876721981233 (0, 2) -0.6748658374897415 (0, 3) -0.18079948538318338 (0, 4) 0.6875806622877358 (0, 5) -0.7775641155122853 (0, 6) 0.7781989700714793 (1, 0) -0.07750617928808383 (1, 1) 0.4572315207281202 (1, 2) 0.4580163825027164 (1, 3) 0.6951737836620446 (1, 4) 0.035447606524930446 (1, 5) 0.09793785855383845 (1, 6) 0.13236508271088915 (2, 0) -0.6539302277704122 (2, 1) 1.1196120969391556 (2, 2) 0.45519320523897316 (2, 3) 0.27690163859596667 (2, 4) -1.2867967876681026 (2, 5) -0.4679162379777137 (2, 6) 0.5906327892567922 (3, 0) 1.4317907565386616 (3, 1) 0.38644704937601526 (3, 2) -0.2776972862955177 (3, 3) 1.0922881499931638 (3, 4) -1.3860565664680384 (3, 5) -1.3949373395890408 (3, 6) -1.0636540485066541 c data before (coo_matrix): [ 0.13836937 -0.44769877 -0.67486584 -0.18079949 0.68758066 -0.77756412 0.77819897 -0.07750618 0.45723152 0.45801638 0.69517378 0.03544761 0.09793786 0.13236508 -0.65393023 1.1196121 0.45519321 0.27690164 -1.28679679 -0.46791624 0.59063279 1.43179076 0.38644705 -0.27769729 1.09228815 -1.38605657 -1.39493734 -1.06365405] c after (coo_matrix): (0, 0) 0.13836937102833774 (0, 1) -0.44769876721981233 (0, 2) -0.6748658374897415 (0, 3) -0.18079948538318338 (0, 4) 0.6875806622877358 (0, 5) -0.7775641155122853 (0, 6) 0.7781989700714793 (1, 0) -0.07750617928808383 (1, 1) 0.4572315207281202 (1, 2) 0.4580163825027164 (1, 3) 0.6951737836620446 (1, 4) 0.035447606524930446 (1, 5) 0.09793785855383845 (1, 6) 0.13236508271088915 (2, 0) -0.6539302277704122 (2, 1) 1.1196120969391556 (2, 2) 0.45519320523897316 (2, 3) 0.27690163859596667 (2, 4) -1.2867967876681026 (2, 5) -0.4679162379777137 (2, 6) 0.5906327892567922 (3, 0) 1.4317907565386616 (3, 1) 0.38644704937601526 (3, 2) -0.2776972862955177 (3, 3) 1.0922881499931638 (3, 4) -1.3860565664680384 (3, 5) -1.3949373395890408 (3, 6) -1.0636540485066541 c data after (coo_matrix): [ 0.13836937 -0.44769877 -0.67486584 -0.18079949 0.68758066 -0.77756412 0.77819897 -0.07750618 0.45723152 0.45801638 0.69517378 0.03544761 0.09793786 0.13236508 -0.65393023 1.1196121 0.45519321 0.27690164 -1.28679679 -0.46791624 0.59063279 1.43179076 0.38644705 -0.27769729 1.09228815 -1.38605657 -1.39493734 -1.06365405] In [2]: scipy.version.version Out[2]: '1.0.0' -------------- next part -------------- An HTML attachment was scrubbed... URL: From ergun.bicici at boun.edu.tr Wed Mar 21 12:02:08 2018 From: ergun.bicici at boun.edu.tr (Ergun Bicici) Date: Wed, 21 Mar 2018 19:02:08 +0300 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: References: Message-ID: Thank you. The versions I use on linux are scipy.__version__ '0.19.1' numpy.__version__ '1.13.3' ?Can you try on linux as well?? 
Best Regards, Ergun Ergun Bi?ici http://bicici.github.com/ On Wed, Mar 21, 2018 at 1:48 PM, David Hagen wrote: > What version are you using? I am on 1.0.0 on Windows 7 and can't reproduce > this. > > > In [1]: import numpy, scipy > ...: from scipy.sparse import coo_matrix, csc_matrix > ...: > ...: a = numpy.random.randn(4,7) > ...: print ('a:', a) > ...: b = coo_matrix(a) > ...: c = b.copy() > ...: print ('c before (coo_matrix):', c) > ...: print ('c data before (coo_matrix):', c.data) > ...: b.multiply(c) > ...: print ('c after (coo_matrix):', c) > ...: print ('c data after (coo_matrix):', c.data) > > a: [[ 0.13836937 -0.44769877 -0.67486584 -0.18079949 0.68758066 > -0.77756412 0.77819897] > [-0.07750618 0.45723152 0.45801638 0.69517378 0.03544761 0.09793786 > 0.13236508] > [-0.65393023 1.1196121 0.45519321 0.27690164 -1.28679679 -0.46791624 > 0.59063279] > [ 1.43179076 0.38644705 -0.27769729 1.09228815 -1.38605657 -1.39493734 > -1.06365405]] > c before (coo_matrix): (0, 0) 0.13836937102833774 > (0, 1) -0.44769876721981233 > (0, 2) -0.6748658374897415 > (0, 3) -0.18079948538318338 > (0, 4) 0.6875806622877358 > (0, 5) -0.7775641155122853 > (0, 6) 0.7781989700714793 > (1, 0) -0.07750617928808383 > (1, 1) 0.4572315207281202 > (1, 2) 0.4580163825027164 > (1, 3) 0.6951737836620446 > (1, 4) 0.035447606524930446 > (1, 5) 0.09793785855383845 > (1, 6) 0.13236508271088915 > (2, 0) -0.6539302277704122 > (2, 1) 1.1196120969391556 > (2, 2) 0.45519320523897316 > (2, 3) 0.27690163859596667 > (2, 4) -1.2867967876681026 > (2, 5) -0.4679162379777137 > (2, 6) 0.5906327892567922 > (3, 0) 1.4317907565386616 > (3, 1) 0.38644704937601526 > (3, 2) -0.2776972862955177 > (3, 3) 1.0922881499931638 > (3, 4) -1.3860565664680384 > (3, 5) -1.3949373395890408 > (3, 6) -1.0636540485066541 > c data before (coo_matrix): [ 0.13836937 -0.44769877 -0.67486584 > -0.18079949 0.68758066 -0.77756412 > 0.77819897 -0.07750618 0.45723152 0.45801638 0.69517378 0.03544761 > 0.09793786 0.13236508 -0.65393023 1.1196121 0.45519321 0.27690164 > -1.28679679 -0.46791624 0.59063279 1.43179076 0.38644705 -0.27769729 > 1.09228815 -1.38605657 -1.39493734 -1.06365405] > c after (coo_matrix): (0, 0) 0.13836937102833774 > (0, 1) -0.44769876721981233 > (0, 2) -0.6748658374897415 > (0, 3) -0.18079948538318338 > (0, 4) 0.6875806622877358 > (0, 5) -0.7775641155122853 > (0, 6) 0.7781989700714793 > (1, 0) -0.07750617928808383 > (1, 1) 0.4572315207281202 > (1, 2) 0.4580163825027164 > (1, 3) 0.6951737836620446 > (1, 4) 0.035447606524930446 > (1, 5) 0.09793785855383845 > (1, 6) 0.13236508271088915 > (2, 0) -0.6539302277704122 > (2, 1) 1.1196120969391556 > (2, 2) 0.45519320523897316 > (2, 3) 0.27690163859596667 > (2, 4) -1.2867967876681026 > (2, 5) -0.4679162379777137 > (2, 6) 0.5906327892567922 > (3, 0) 1.4317907565386616 > (3, 1) 0.38644704937601526 > (3, 2) -0.2776972862955177 > (3, 3) 1.0922881499931638 > (3, 4) -1.3860565664680384 > (3, 5) -1.3949373395890408 > (3, 6) -1.0636540485066541 > c data after (coo_matrix): [ 0.13836937 -0.44769877 -0.67486584 > -0.18079949 0.68758066 -0.77756412 > 0.77819897 -0.07750618 0.45723152 0.45801638 0.69517378 0.03544761 > 0.09793786 0.13236508 -0.65393023 1.1196121 0.45519321 0.27690164 > -1.28679679 -0.46791624 0.59063279 1.43179076 0.38644705 -0.27769729 > 1.09228815 -1.38605657 -1.39493734 -1.06365405] > > In [2]: scipy.version.version > Out[2]: '1.0.0' > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From david.mikolas1 at gmail.com Thu Mar 22 07:46:21 2018 From: david.mikolas1 at gmail.com (David Mikolas) Date: Thu, 22 Mar 2018 19:46:21 +0800 Subject: [SciPy-User] instantiating interp1d() is pathologically slow for me, why? Message-ID: I've just posted this question in SO as well, happy with an answer either place: https://stackoverflow.com/q/49427533/3904031 The following takes over a minute for a few thousand points, whereas it seems it should be taking milliseconds. np.__version__ '1.13.0' scipy.__version__ '0.17.0' https://i.stack.imgur.com/3C69J.png import time import numpy as np import matplotlib.pyplot as plt from scipy.interpolate import interp1d times = [] for n in np.logspace(1, 3.5, 6).astype(int): x = np.arange(n, dtype=float) y = np.vstack((np.cos(x), np.sin(x))) start = time.clock() bob = interp1d(x, y, kind='quadratic', assume_sorted=True) times.append((n, time.clock() - start)) n, tim = zip(*times) plt.figure() plt.plot(n, tim) plt.xscale('log') plt.yscale('log') plt.show() -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at drhagen.com Thu Mar 22 08:16:49 2018 From: david at drhagen.com (David Hagen) Date: Thu, 22 Mar 2018 08:16:49 -0400 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: References: Message-ID: > ?Can you try on linux as well?? Sorry, I do not have access to a linux box right now. Can you try 1.0.0 in virtual environment? -------------- next part -------------- An HTML attachment was scrubbed... URL: From ergun.bicici at boun.edu.tr Thu Mar 22 10:57:55 2018 From: ergun.bicici at boun.edu.tr (Ergun Bicici) Date: Thu, 22 Mar 2018 17:57:55 +0300 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: References: Message-ID: I'll wait for response from others. Your environment looks different. Thank you. Best Regards, Ergun Ergun Bi?ici http://bicici.github.com/ On Thu, Mar 22, 2018 at 3:16 PM, David Hagen wrote: > > ?Can you try on linux as well?? > > Sorry, I do not have access to a linux box right now. Can you try 1.0.0 in > virtual environment? > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jni.soma at gmail.com Thu Mar 22 16:41:45 2018 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Thu, 22 Mar 2018 16:41:45 -0400 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: References: Message-ID: <63d5cf40-7878-4770-b7b7-1831f1e6469f@Spark> Hi Ergun, You should probably update your SciPy version to 1.0. It is more likely that the ?fixed? behaviour reported by David is indeed fixed in 1.0, rather than an OS difference. Juan. On 22 Mar 2018, 10:59 AM -0400, Ergun Bicici , wrote: > > I'll wait for response from others. Your environment looks different. Thank you. > > > Best Regards, > Ergun > > Ergun Bi?ici > http://bicici.github.com/ > > > > On Thu, Mar 22, 2018 at 3:16 PM, David Hagen wrote: > > > >? ?Can you try on linux as well? > > > > > > Sorry, I do not have access to a linux box right now. Can you try 1.0.0 in virtual environment? 
> > > > > > _______________________________________________ > > > SciPy-User mailing list > > > SciPy-User at python.org > > > https://mail.python.org/mailman/listinfo/scipy-user > > > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From ergun.bicici at boun.edu.tr Thu Mar 22 18:53:23 2018 From: ergun.bicici at boun.edu.tr (Ergun Bicici) Date: Fri, 23 Mar 2018 01:53:23 +0300 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: <63d5cf40-7878-4770-b7b7-1831f1e6469f@Spark> References: <63d5cf40-7878-4770-b7b7-1831f1e6469f@Spark> Message-ID: I installed newer scipy using pip3 install scipy and there is no error this time. The test case might still be useful to test sparse matrices and their data. Best Regards, Ergun Ergun Bi?ici http://bicici.github.com/ On Thu, Mar 22, 2018 at 11:41 PM, Juan Nunez-Iglesias wrote: > Hi Ergun, > > You should probably update your SciPy version to 1.0. It is more likely > that the ?fixed? behaviour reported by David is indeed fixed in 1.0, rather > than an OS difference. > > Juan. > > On 22 Mar 2018, 10:59 AM -0400, Ergun Bicici , > wrote: > > > I'll wait for response from others. Your environment looks different. > Thank you. > > > Best Regards, > Ergun > > Ergun Bi?ici > http://bicici.github.com/ > > > On Thu, Mar 22, 2018 at 3:16 PM, David Hagen wrote: > >> > ?Can you try on linux as well?? >> >> Sorry, I do not have access to a linux box right now. Can you try 1.0.0 >> in virtual environment? >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jni.soma at gmail.com Thu Mar 22 19:00:08 2018 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Thu, 22 Mar 2018 19:00:08 -0400 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: References: <63d5cf40-7878-4770-b7b7-1831f1e6469f@Spark> Message-ID: <229ca40c-fc2c-4d32-9962-edd35957acf0@Spark> Actually, I hadn?t, and I realised I hadn?t updated scipy to 1.0, so it was a good chance to try both. I?m on macOS. Indeed, on 0.19 I see the same issue you do, while on 1.0 it is gone. Incidentally, there?s a few issues related to this that might have been resolved without closing the corresponding GH issue: https://github.com/scipy/scipy/issues/6902 https://github.com/scipy/scipy/issues/6603 These may have been fixed here: https://github.com/scipy/scipy/issues/7077 https://github.com/scipy/scipy/pull/7078 Any devs want to comment on these? Juan. On 22 Mar 2018, 6:22 PM -0400, Ergun Bicici , wrote: > > Hi Juan, > > Ok I am installing: > ?pip3 install scipy > > How about you? Did you try? 
> > > Best Regards, > Ergun > > Ergun Bi?ici > http://bicici.github.com/ > > > > On Thu, Mar 22, 2018 at 11:41 PM, Juan Nunez-Iglesias < > > > > jni.soma at gmail.com> wrote: > > > Hi Ergun, > > > > > > You should probably update your SciPy version to 1.0. It is more likely that the ?fixed? behaviour reported by David is indeed fixed in 1.0, rather than an OS difference. > > > > > > Juan. > > > > > > On 22 Mar 2018, 10:59 AM -0400, Ergun Bicici , wrote: > > > > > > > > I'll wait for response from others. Your environment looks different. Thank you. > > > > > > > > > > > > Best Regards, > > > > Ergun > > > > > > > > Ergun Bi?ici > > > > http://bicici.github.com/ > > > > > > > > > > > > > On Thu, Mar 22, 2018 at 3:16 PM, David Hagen wrote: > > > > > > >? ?Can you try on linux as well? > > > > > > > > > > > > Sorry, I do not have access to a linux box right now. Can you try 1.0.0 in virtual environment? > > > > > > > > > > > > _______________________________________________ > > > > > > SciPy-User mailing list > > > > > > SciPy-User at python.org > > > > > > https://mail.python.org/mailman/listinfo/scipy-user > > > > > > > > > > > > > > _______________________________________________ > > > > SciPy-User mailing list > > > > SciPy-User at python.org > > > > https://mail.python.org/mailman/listinfo/scipy-user > > > > > > _______________________________________________ > > > SciPy-User mailing list > > > SciPy-User at python.org > > > https://mail.python.org/mailman/listinfo/scipy-user > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mikofski at berkeley.edu Thu Mar 22 23:28:02 2018 From: mikofski at berkeley.edu (Mark Alexander Mikofski) Date: Fri, 23 Mar 2018 03:28:02 +0000 Subject: [SciPy-User] instantiating interp1d() is pathologically slow for me, why? References: Message-ID: Why not profile it using the cprofile module in Python? (Look for it in the docs.) Then you can see exactly where the bottleneck is. You can view the output directly or use either snakeviz or cprofilev from pypi to view results interactively. (Search Google for "snakeviz" or "ymichael cprofilev".) On Thu, Mar 22, 2018, 4:46 AM David Mikolas wrote: > I've just posted this question in SO as well, happy with an answer either > place: > > https://stackoverflow.com/q/49427533/3904031 > > The following takes over a minute for a few thousand points, whereas it > seems it should be taking milliseconds. > > np.__version__ '1.13.0' > scipy.__version__ '0.17.0' > > > https://i.stack.imgur.com/3C69J.png > > import time > import numpy as np > import matplotlib.pyplot as plt > from scipy.interpolate import interp1d > > times = [] > for n in np.logspace(1, 3.5, 6).astype(int): > x = np.arange(n, dtype=float) > y = np.vstack((np.cos(x), np.sin(x))) > start = time.clock() > bob = interp1d(x, y, kind='quadratic', assume_sorted=True) > times.append((n, time.clock() - start)) > > n, tim = zip(*times) > > plt.figure() > plt.plot(n, tim) > plt.xscale('log') > plt.yscale('log') > plt.show() > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... 
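A minimal sketch of the profiling suggestion above, applied to the interp1d construction from the quoted post (the point count n is illustrative):

```
# Profile a single interp1d construction with the standard-library cProfile
# module; sorting by cumulative time shows where the setup time is spent.
import cProfile

import numpy as np
from scipy.interpolate import interp1d

n = 3000
x = np.arange(n, dtype=float)
y = np.vstack((np.cos(x), np.sin(x)))

# cProfile.run executes the command string in the __main__ namespace, so run
# this as a script; x, y and interp1d defined above are then visible to it.
cProfile.run("interp1d(x, y, kind='quadratic', assume_sorted=True)",
             sort='cumulative')
```

Writing the stats to a file instead, e.g. cProfile.run(command, 'interp1d.prof'), should let snakeviz display them interactively via "snakeviz interp1d.prof".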
URL: From opossumnano at gmail.com Fri Mar 23 04:24:12 2018 From: opossumnano at gmail.com (Python School Organizers) Date: Fri, 23 Mar 2018 01:24:12 -0700 (PDT) Subject: [SciPy-User] =?utf-8?b?W0FOTl0gMTHhtZfKsCBBZHZhbmNlZCBTY2llbnRp?= =?utf-8?q?fic_Programming_in_Python_in_Camerino=2C_Italy=2C_3=E2=80=948_S?= =?utf-8?q?eptember=2C_2018?= Message-ID: <5ab4b9ac.08c41c0a.55881.885c@mx.google.com> 11?? Advanced Scientific Programming in Python ============================================== a Summer School by the G-Node and the University of Camerino https://python.g-node.org Scientists spend more and more time writing, maintaining, and debugging software. While techniques for doing this efficiently have evolved, only few scientists have been trained to use them. As a result, instead of doing their research, they spend far too much time writing deficient code and reinventing the wheel. In this course we will present a selection of advanced programming techniques and best practices which are standard in the industry, but especially tailored to the needs of a programming scientist. Lectures are devised to be interactive and to give the students enough time to acquire direct hands-on experience with the materials. Students will work in pairs throughout the school and will team up to practice the newly learned skills in a real programming project ? an entertaining computer game. We use the Python programming language for the entire course. Python works as a simple programming language for beginners, but more importantly, it also works great in scientific simulations and data analysis. We show how clean language design, ease of extensibility, and the great wealth of open source libraries for scientific computing and data visualization are driving Python to become a standard tool for the programming scientist. This school is targeted at Master or PhD students and Post-docs from all areas of science. Competence in Python or in another language such as Java, C/C++, MATLAB, or Mathematica is absolutely required. Basic knowledge of Python and of a version control system such as git, subversion, mercurial, or bazaar is assumed. Participants without any prior experience with Python and/or git should work through the proposed introductory material before the course. We are striving hard to get a pool of students which is international and gender-balanced: see how far we got in previous years ! Date & Location =============== 3?8 September, 2018. Camerino, Italy. Application =========== You can apply online: https://python.g-node.org/wiki/applications Application deadline: 23:59 UTC, 31 May, 2018. There will be no deadline extension, so be sure to apply on time. Be sure to read the FAQ before applying: https://python.g-node.org/wiki/faq Participation is for free, i.e. no fee is charged! Participants however should take care of travel, living, and accommodation expenses by themselves. Program ======= ? Version control with git and how to contribute to open source projects with GitHub ? Best practices in data visualization ? Organizing, documenting, and distributing scientific code ? Testing scientific code ? Profiling scientific code ? Advanced NumPy ? Advanced scientific Python: decorators, context managers, generators, and elements of object oriented programming ? Writing parallel applications in Python ? Speeding up scientific code with Cython and numba ? Memory-bound computations and the memory hierarchy ? 
Programming in teams Also see the detailed day-by-day schedule: https://python.g-node.org/wiki/schedule Faculty ======= ? Ashwin Trikuta Srinath, Cyberinfrastructure Technology Integration, Clemson University, SC USA ? Jenni Rinker, Department of Wind Energy, Technical University of Denmark, Roskilde Denmark ? Juan Nunez-Iglesias, Melbourne Bioinformatics, University of Melbourne Australia ? Nicolas P. Rougier, Inria Bordeaux Sud-Ouest, Institute of Neurodegenerative Disease, University of Bordeaux France ? Pietro Berkes, NAGRA Kudelski, Lausanne Switzerland ? Rike-Benjamin Schuppner, Institute for Theoretical Biology, Humboldt-Universit?t zu Berlin Germany ? Tiziano Zito, freelance consultant, Berlin Germany ? Zbigniew J?drzejewski-Szmek, Red Hat Inc., Warsaw Poland Organizers ========== For the German Neuroinformatics Node of the INCF (G-Node) Germany: ? Tiziano Zito, freelance consultant, Berlin Germany ? Caterina Buizza, Personal Robotics Lab, Imperial College London UK ? Zbigniew J?drzejewski-Szmek, Red Hat Inc., Warsaw Poland ? Jakob Jordan, Department of Physiology, University of Bern, Switzerland Switzerland For the University of Camerino Italy: ? Flavio Corradini, Computer Science Division, School of Science and Technology, University of Camerino Italy ? Barbara Re, Computer Science Division, School of Science and Technology, University of Camerino Italy Website: https://python.g-node.org Contact: python-info at g-node.org From zolotovo at gmail.com Fri Mar 23 12:24:58 2018 From: zolotovo at gmail.com (O.Zolotov) Date: Fri, 23 Mar 2018 19:24:58 +0300 Subject: [SciPy-User] Problem of 'rotated' data from mat-file when plotted Message-ID: Dear All, I try to ?plot? data from mat-file. The matlab file?s URL is https://agupubs.onlinelibrary.wiley.com/action/downloadSupplement?doi=10.1002%2F2016JA023579&attachmentId=2190682395 (~80Mb!) When I plot the figure, it looks rotated over the original one. Is there a way to fix this? I tested it using SciLab also, and it produced the figure as expected, so the problem is unlikely in the file itself. I use the following commands to plot the figure import scipy.io as s import numpy as np ds2 = s.loadmat(r"jgra53413-sup-0003-DataSetS2.mat") jx2, jy2, jz2 = ds2['jx'], ds2['jy'], ds2['jz'] x2, y2, z2 = ds2['x'][0], ds2['y'][0], ds2['z'][0] X2, Y2 = np.meshgrid(x2,y2) import matplotlib.pyplot as plt plt.subplot(2,2,1) plt.pcolor(X2, Y2, jz2[:,:,0]) plt.colorbar() plt.show() The original and plotted images are attached. Regards, Oleg -------------- next part -------------- A non-text attachment was scrubbed... Name: original.png Type: image/png Size: 33626 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: rotated(unexpectedly).png Type: image/png Size: 5076 bytes Desc: not available URL: From jni.soma at gmail.com Fri Mar 23 15:26:06 2018 From: jni.soma at gmail.com (Juan Nunez-Iglesias) Date: Fri, 23 Mar 2018 15:26:06 -0400 Subject: [SciPy-User] Problem of 'rotated' data from mat-file when plotted In-Reply-To: References: Message-ID: Hi Oleg, I haven?t gone carefully over your code but I suspect it has to do with mixing row/column coordinates with x/y (cartesian) coordinates. See this document for a brief explanation in the context of images: http://scikit-image.org/docs/dev/user_guide/numpy_images.html#coordinate-conventions So you probably have to mess with x2 and y2 somewhere to transpose them. 
Probably in the call to meshgrid, where NumPy would expect row/column and you are providing x/y. Juan. On 23 Mar 2018, 12:26 PM -0400, O.Zolotov , wrote: > Dear All, > > I try to ?plot? data from mat-file. The matlab file?s URL is > https://agupubs.onlinelibrary.wiley.com/action/downloadSupplement?doi=10.1002%2F2016JA023579&attachmentId=2190682395 > > (~80Mb!) > > When I plot the figure, it looks rotated over the original one. Is > there a way to fix this? I tested it using SciLab also, and it > produced the figure as expected, so the problem is unlikely in the > file itself. I use the following commands to plot the figure > > import scipy.io as s > import numpy as np > > ds2 = s.loadmat(r"jgra53413-sup-0003-DataSetS2.mat") > jx2, jy2, jz2 = ds2['jx'], ds2['jy'], ds2['jz'] > x2, y2, z2 = ds2['x'][0], ds2['y'][0], ds2['z'][0] > X2, Y2 = np.meshgrid(x2,y2) > > import matplotlib.pyplot as plt > plt.subplot(2,2,1) > plt.pcolor(X2, Y2, jz2[:,:,0]) > plt.colorbar() > plt.show() > > The original and plotted images are attached. > > Regards, > Oleg > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Mar 24 23:33:21 2018 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 24 Mar 2018 20:33:21 -0700 Subject: [SciPy-User] ANN: SciPy 1.0.1 released Message-ID: On behalf of the SciPy development team I am pleased to announce the availability of Scipy 1.0.1. This is a maintenance release, no new features with respect to 1.0.0. See the release notes below for details. Wheels and sources can be found on PyPI (https://pypi.python.org/pypi/scipy) and on Github (https://github.com/scipy/scipy/releases/tag/v1.0.1). The conda-forge channel will be up to date within a couple of hours. Thanks to everyone who contributed to this release! Cheers, Ralf SciPy 1.0.1 Release Notes ==================== SciPy 1.0.1 is a bug-fix release with no new features compared to 1.0.0. Probably the most important change is a fix for an incompatibility between SciPy 1.0.0 and ``numpy.f2py`` in the NumPy master branch. Authors ======= * Saurabh Agarwal + * Alessandro Pietro Bardelli * Philip DeBoer * Ralf Gommers * Matt Haberland * Eric Larson * Denis Laxalde * Mihai Capot? + * Andrew Nelson * Oleksandr Pavlyk * Ilhan Polat * Anant Prakash + * Pauli Virtanen * Warren Weckesser * @xoviat * Ted Ying + A total of 16 people contributed to this release. People with a "+" by their names contributed a patch for the first time. This list of names is automatically generated, and may not be fully complete. Issues closed for 1.0.1 ----------------------- - `#7493 `__: `ndimage.morphology` functions are broken with numpy 1.13.0 - `#8118 `__: minimize_cobyla broken if `disp=True` passed - `#8142 `__: scipy-v1.0.0 pdist with metric=`minkowski` raises `ValueError:... - `#8173 `__: `scipy.stats.ortho_group` produces all negative determinants... - `#8207 `__: gaussian_filter seg faults on float16 numpy arrays - `#8234 `__: `scipy.optimize.linprog` `interior-point` presolve bug with trivial... 
- `#8243 `__: Make csgraph importable again via `from scipy.sparse import*` - `#8320 `__: scipy.root segfaults with optimizer 'lm' Pull requests for 1.0.1 ----------------------- - `#8068 `__: BUG: fix numpy deprecation test failures - `#8082 `__: BUG: fix solve_lyapunov import - `#8144 `__: MRG: Fix for cobyla - `#8150 `__: MAINT: resolve UPDATEIFCOPY deprecation errors - `#8156 `__: BUG: missing check on minkowski w kwarg - `#8187 `__: BUG: Sign of elements in random orthogonal 2D matrices in "ortho_group_gen"... - `#8197 `__: CI: uninstall oclint - `#8215 `__: Fixes Numpy datatype compatibility issues - `#8237 `__: BUG: optimize: fix bug when variables fixed by bounds are inconsistent... - `#8248 `__: BUG: declare "gfk" variable before call of terminate() in newton-cg - `#8280 `__: REV: reintroduce csgraph import in scipy.sparse - `#8322 `__: MAINT: prevent scipy.optimize.root segfault closes #8320 - `#8334 `__: TST: stats: don't use exact equality check for hdmedian test - `#8477 `__: BUG: signal/signaltools: fix wrong refcounting in PyArray_OrderFilterND - `#8530 `__: BUG: linalg: Fixed typo in flapack.pyf.src. - `#8566 `__: CI: Temporarily pin Cython version to 0.27.3 - `#8573 `__: Backports for 1.0.1 - `#8581 `__: Fix Cython 0.28 build break of qhull.pyx -------------- next part -------------- An HTML attachment was scrubbed... URL: From ergun.bicici at boun.edu.tr Mon Mar 26 02:46:09 2018 From: ergun.bicici at boun.edu.tr (Ergun Bicici) Date: Mon, 26 Mar 2018 09:46:09 +0300 Subject: [SciPy-User] Error in scipy sparse: multiplication with coo_matrix copy changes the data and the format of the multiplied In-Reply-To: <229ca40c-fc2c-4d32-9962-edd35957acf0@Spark> References: <63d5cf40-7878-4770-b7b7-1831f1e6469f@Spark> <229ca40c-fc2c-4d32-9962-edd35957acf0@Spark> Message-ID: Hi Juan, Thank you for your confirmation. The issues you mentioned are also different. Best Regards, Ergun Ergun Bi?ici http://bicici.github.com/ On Fri, Mar 23, 2018 at 2:00 AM, Juan Nunez-Iglesias wrote: > Actually, I hadn?t, and I realised I hadn?t updated scipy to 1.0, so it > was a good chance to try both. I?m on macOS. Indeed, on 0.19 I see the same > issue you do, while on 1.0 it is gone. > > Incidentally, there?s a few issues related to this that might have been > resolved without closing the corresponding GH issue: > https://github.com/scipy/scipy/issues/6902 > https://github.com/scipy/scipy/issues/6603 > > These may have been fixed here: > https://github.com/scipy/scipy/issues/7077 > https://github.com/scipy/scipy/pull/7078 > > Any devs want to comment on these? > > Juan. > > On 22 Mar 2018, 6:22 PM -0400, Ergun Bicici , > wrote: > > > Hi Juan, > > Ok I am installing: > ?pip3 install scipy > ? > How about you? Did you try? > > > Best Regards, > Ergun > > Ergun Bi?ici > http://bicici.github.com/ > > > On Thu, Mar 22, 2018 at 11:41 PM, Juan Nunez-Iglesias < > ?? > jni.soma at gmail.com> wrote: > >> Hi Ergun, >> >> You should probably update your SciPy version to 1.0. It is more likely >> that the ?fixed? behaviour reported by David is indeed fixed in 1.0, rather >> than an OS difference. >> >> Juan. >> >> On 22 Mar 2018, 10:59 AM -0400, Ergun Bicici , >> wrote: >> >> >> I'll wait for response from others. Your environment looks different. >> Thank you. >> >> >> Best Regards, >> Ergun >> >> Ergun Bi?ici >> http://bicici.github.com/ >> >> >> On Thu, Mar 22, 2018 at 3:16 PM, David Hagen wrote: >> >>> > ?Can you try on linux as well?? >>> >>> Sorry, I do not have access to a linux box right now. 
Can you try 1.0.0 >>> in virtual environment? >>> >>> _______________________________________________ >>> SciPy-User mailing list >>> SciPy-User at python.org >>> https://mail.python.org/mailman/listinfo/scipy-user >>> >>> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From andyfaff at gmail.com Mon Mar 26 19:57:00 2018 From: andyfaff at gmail.com (Andrew Nelson) Date: Tue, 27 Mar 2018 10:57:00 +1100 Subject: [SciPy-User] calculating the jacobian for a least-squares problem Message-ID: I would like to calculate the Jacobian for a least squares problem, followed by a Hessian estimation, then the covariance matrix from that Hessian. With my current approach I sometimes experience issues with the covariance matrix in that it's sometimes not positive semi-definite. I am using the covariance matrix to seed a MCMC sampling process by supplying it to `np.random.multivariate_normal` to get initial positions for the MC chain. I am using the following code: ``` from scipy.optimize._numdiff import approx_derivative jac = approx_derivative(residuals_func, x0) hess = np.matmul(jac.T, jac) covar = np.linalg.inv(hess) ``` Note that x0 may not be at a minimum. - would this be the usual way of estimating the Hessian, is there anything incorrect with the approach? - what is the recommended way (i.e. numerically stable) of inverting the Hessian in such a situation? - does `optimize.leastsq` do anything different? - if `x0` is not at a minimum should the covariance matrix be expected to be positive semi-definite anyway? -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.mikolas1 at gmail.com Tue Mar 27 00:38:53 2018 From: david.mikolas1 at gmail.com (David Mikolas) Date: Tue, 27 Mar 2018 12:38:53 +0800 Subject: [SciPy-User] instantiating interp1d() is pathologically slow for me, why? In-Reply-To: References: Message-ID: Mark thank you for the suggestion. I'd forgotten that I posted this here. It turns out, much to my surprise, that even though interp1d has been in SciPy for many years, it was refactored and this was included in v0.19, leading to literally a half-dozen orders of magnitude speed up for say 10,000 points. So all is well. See this SO answer: https://stackoverflow.com/a/49428804/3904031 On Fri, Mar 23, 2018 at 11:28 AM, Mark Alexander Mikofski < mikofski at berkeley.edu> wrote: > Why not profile it using the cprofile module in Python? (Look for it in > the docs.) Then you can see exactly where the bottleneck is. You can view > the output directly or use either snakeviz or cprofilev from pypi to view > results interactively. (Search Google for "snakeviz" or "ymichael > cprofilev".) > > On Thu, Mar 22, 2018, 4:46 AM David Mikolas > wrote: > >> I've just posted this question in SO as well, happy with an answer either >> place: >> >> https://stackoverflow.com/q/49427533/3904031 >> >> The following takes over a minute for a few thousand points, whereas it >> seems it should be taking milliseconds. 
>> >> np.__version__ '1.13.0' >> scipy.__version__ '0.17.0' >> >> >> https://i.stack.imgur.com/3C69J.png >> >> import time >> import numpy as np >> import matplotlib.pyplot as plt >> from scipy.interpolate import interp1d >> >> times = [] >> for n in np.logspace(1, 3.5, 6).astype(int): >> x = np.arange(n, dtype=float) >> y = np.vstack((np.cos(x), np.sin(x))) >> start = time.clock() >> bob = interp1d(x, y, kind='quadratic', assume_sorted=True) >> times.append((n, time.clock() - start)) >> >> n, tim = zip(*times) >> >> plt.figure() >> plt.plot(n, tim) >> plt.xscale('log') >> plt.yscale('log') >> plt.show() >> >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gregor.thalhammer at gmail.com Tue Mar 27 05:18:26 2018 From: gregor.thalhammer at gmail.com (Gregor Thalhammer) Date: Tue, 27 Mar 2018 11:18:26 +0200 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: References: Message-ID: <58FD82CC-9BA5-4104-9017-CA2A7A982B55@gmail.com> > Am 27.03.2018 um 01:57 schrieb Andrew Nelson : > > I would like to calculate the Jacobian for a least squares problem, followed by a Hessian estimation, then the covariance matrix from that Hessian. > > With my current approach I sometimes experience issues with the covariance matrix in that it's sometimes not positive semi-definite. I am using the covariance matrix to seed a MCMC sampling process by supplying it to `np.random.multivariate_normal` to get initial positions for the MC chain. I am using the following code: > > ``` > from scipy.optimize._numdiff import approx_derivative > jac = approx_derivative(residuals_func, x0) > hess = np.matmul(jac.T, jac) > covar = np.linalg.inv(hess) > ``` > > Note that x0 may not be at a minimum. > > - would this be the usual way of estimating the Hessian, is there anything incorrect with the approach? Your straightforward approach is ok, especially since you don't require the highest precision. An alternative would be to use automatic differentiation to calculate the derivatives accurately, e.g. using algopy, theano, or TensorFlow. > - what is the recommended way (i.e. numerically stable) of inverting the Hessian in such a situation? If your hess matrix is close to being singular, you could gain some precision by using the QR decomposition of the Jacobian. In general, to solve a linear system it is recommended to avoid calculating the inverse matrix. > - does `optimize.leastsq` do anything different? leastsq wraps the MINPACK library, which brings its own carefully tuned numeric differentiation routines, and it uses QR decomposition. > - if `x0` is not at a minimum should the covariance matrix be expected to be positive semi-definite anyway? If x0 is not a minimum, then there is no guarantee. Even if x0 is a minimum, this might be violated due to numerical errors.
best Gregor > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user From evgeny.burovskiy at gmail.com Wed Mar 28 02:37:57 2018 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Wed, 28 Mar 2018 06:37:57 +0000 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: <58FD82CC-9BA5-4104-9017-CA2A7A982B55@gmail.com> References: <58FD82CC-9BA5-4104-9017-CA2A7A982B55@gmail.com> Message-ID: Hi, Additionally to what Gregor said: - finite-differences estimation of the derivatives should really be a last resort; best is paper-and-pencil, or algorithmic differentiation (algopy et al). If that is not possible, I'd try some higher-order finite differences. E.g. approx_derivative with method '3-point' or 'cs' (if that works). - approx_derivative is more sophisticated than minpack actually. IIUC minpack only does the simplest two-point forward scheme, https://github.com/scipy/scipy/blob/master/scipy/optimize/minpack/fdjac2.f - linalg.inv(matrix) is generally better spelled as solve(matrix, identity_matrix) - in this case, it's indeed best to use QR or SVD. curve_fit does a pseudoinverse: https://github.com/scipy/scipy/blob/v1.0.0/scipy/optimize/minpack.py#L502-L790 (IIRC this was written by Nikolay, and he cited Ceres or some other industry-class optimization software). Cheers, Evgeni On Tue, Mar 27, 2018, 12:19 PM Gregor Thalhammer < gregor.thalhammer at gmail.com> wrote: > > > > Am 27.03.2018 um 01:57 schrieb Andrew Nelson : > > > > I would like to calculate the Jacobian for a least squares problem, > followed by a Hessian estimation, then the covariance matrix from that > Hessian. > > > > With my current approach I sometimes experience issues with the > covariance matrix in that it's sometimes not positive semi-definite. I am > using the covariance matrix to seed a MCMC sampling process by supplying it > to `np.random.multivariate_normal` to get initial positions for the MC > chain. I am using the following code: > > > > ``` > > from scipy.optimize._numdiff import approx_derivative > > jac = approx_derivative(residuals_func, x0) > > hess = np.matmul(jac.T, jac) > > covar = np.linalg.inv(hess) > > ``` > > > > Note that x0 may not be at a minimum. > > > > - would this be the usual way of estimating the Hessian, is there > anything incorrect with the approach? > your straightforward approach is ok, especially since you don't require > the highest precision. An alternative would be to use automatic > differentiation to calculate the derivatives accurately, e.g. using algopy, > theano or tensor flow > > > - what is the recommended way (i.e. numerically stable) of inverting the > Hessian in such a situation? > > If your hess matrix is close to being singular, you could gain some > precision by using the QR decomposition of the jacobian. In general to > solve a linear system it is recommended to avoid calculating the the > inverse matrix. > > > - does `optimize.leastsq` do anything different? > > leastsq wraps the MINPACK library, which brings it own carefully tuned > numeric differentiation routines, and it uses QR decomposition. > > > - if `x0` is not at a minimum should the covariance matrix be expected > to be positive semi-definite anyway? > If x0 is not a minimum, then there is no guarantee. Even if x0 is a > minimum this might by violated due to numerical errors.
> > best > Gregor > > > _______________________________________________ > > SciPy-User mailing list > > SciPy-User at python.org > > https://mail.python.org/mailman/listinfo/scipy-user > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mikofski at berkeley.edu Wed Mar 28 11:21:30 2018 From: mikofski at berkeley.edu (Mark Alexander Mikofski) Date: Wed, 28 Mar 2018 15:21:30 +0000 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: References: <58FD82CC-9BA5-4104-9017-CA2A7A982B55@gmail.com> Message-ID: Hi Andrew, The covariance can be calculated from the Jacobian as Cij = Jij * Sij * Jij^T where S is the covariance of the inputs, C is the covariance of the outputs, and J^T is the transpose of the Jacobian Jij = dxi/dyj. Sorry if this is redundant or unhelpful. Best, Mark On Tue, Mar 27, 2018, 11:38 PM Evgeni Burovski wrote: > Hi, > > Additionally to what Gregor said: > > - finite-differences estimation of the derivatives should really be a last > resort; best is paper-and-pencil, or algorithmic differentiation (algopy et > al). If that is not possible, I'd try some higher-order finite differences. > E.g. approx_derivatives with method '3-point' or 'cs' (if that works). > > - approx_derivative is more sophisticated than fitpack actually. IIUC > minpack only does the simplest two-point forward scheme, > https://github.com/scipy/scipy/blob/master/scipy/optimize/minpack/fdjac2.f > > - linalg.inv(matrix) is generally better spelled as solve(matrix, > identity_matrix) > > - in this case, it's indeed best to use QR or SVD. curve_fit does a > pseudoinverse: > > > https://github.com/scipy/scipy/blob/v1.0.0/scipy/optimize/minpack.py#L502-L790 > > (IIRC this was written by Nikolay, and he cited Ceres or some other > industry-class optimization software). > > Cheers, > > Evgeni > > > On Tue, Mar 27, 2018, 12:19 PM Gregor Thalhammer < > gregor.thalhammer at gmail.com> wrote: > >> >> >> > Am 27.03.2018 um 01:57 schrieb Andrew Nelson : >> > >> > I would like to calculate the Jacobian for a least squares problem, >> followed by a Hessian estimation, then the covariance matrix from that >> Hessian. >> > >> > With my current approach I sometimes experience issues with the >> covariance matrix in that it's sometimes not positive semi-definite. I am >> using the covariance matrix to seed a MCMC sampling process by supplying it >> to `np.random.multivariate_normal` to get initial positions for the MC >> chain. I am using the following code: >> > >> > ``` >> > from scipy.optimize._numdiff import approx_derivative >> > jac = approx_derivative(residuals_func, x0) >> > hess = np.matmul(jac.T, jac) >> > covar = np.linalg.inv(hess) >> > ``` >> > >> > Note that x0 may not be at a minimum. >> > >> > - would this be the usual way of estimating the Hessian, is there >> anything incorrect with the approach? >> your straightforward approach is ok, especially since you don?t require >> the highest precision. An alternative would be to use automatic >> differentiation to calculate the derivatives accurately, e.g. using algopy, >> theano or tensor flow >> >> > - what is the recommended way (i.e. numerically stable) of inverting >> the Hessian in such a situation? >> >> If your hess matrix is close to being singular, you could gain some >> precision by using the QR decomposition of the jacobian. 
In general to >> solve a linear system it is recommended to avoid calculating the the >> inverse matrix. >> >> > - does `optimize.leastsq` do anything different? >> >> leastsq wraps the MINPACK library, which brings it own carefully tuned >> numeric differentiation routines, and it uses QR decomposition. >> >> > - if `x0` is not at a minimum should the covariance matrix be expected >> to be positive semi-definite anyway? >> If x0 is not a minimum, then there is no guarantee. Even if x0 is a >> minimum this might by violated due to numerical errors. >> >> best >> Gregor >> >> > _______________________________________________ >> > SciPy-User mailing list >> > SciPy-User at python.org >> > https://mail.python.org/mailman/listinfo/scipy-user >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user >> > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Wed Mar 28 11:57:59 2018 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 28 Mar 2018 11:57:59 -0400 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: References: <58FD82CC-9BA5-4104-9017-CA2A7A982B55@gmail.com> Message-ID: On Wed, Mar 28, 2018 at 2:37 AM, Evgeni Burovski wrote: > Hi, > > Additionally to what Gregor said: > > - finite-differences estimation of the derivatives should really be a last > resort; best is paper-and-pencil, or algorithmic differentiation (algopy et > al). If that is not possible, I'd try some higher-order finite differences. > E.g. approx_derivatives with method '3-point' or 'cs' (if that works). > > - approx_derivative is more sophisticated than fitpack actually. IIUC > minpack only does the simplest two-point forward scheme, > https://github.com/scipy/scipy/blob/master/scipy/optimize/minpack/fdjac2.f > > - linalg.inv(matrix) is generally better spelled as solve(matrix, > identity_matrix) > > - in this case, it's indeed best to use QR or SVD. curve_fit does a > pseudoinverse: > > https://github.com/scipy/scipy/blob/v1.0.0/scipy/optimize/minpack.py#L502-L790 > > (IIRC this was written by Nikolay, and he cited Ceres or some other > industry-class optimization software). Depending on the use of the hessian, I would regularize the hessian if it is not positive definite or positive semi-definite. Simplest is to clip singular values to a threshold at or above zero or to add a Ridge factor. (I added a Ridge factor to help home made Newton method in statsmodels to avoid at least some invertibility or singularity problems.) statsmodels also has some function to find the nearest positive (semi-) definite matrix. (We don't regularize the final hessian used for the covariance of the parameter estimates in nonlinear maximum likelihood models, because if that is not positive definite, then there are more serious problems with the model or the data.) Josef > > Cheers, > > Evgeni > > > On Tue, Mar 27, 2018, 12:19 PM Gregor Thalhammer > wrote: >> >> >> >> > Am 27.03.2018 um 01:57 schrieb Andrew Nelson : >> > >> > I would like to calculate the Jacobian for a least squares problem, >> > followed by a Hessian estimation, then the covariance matrix from that >> > Hessian. 
>> > >> > With my current approach I sometimes experience issues with the >> > covariance matrix in that it's sometimes not positive semi-definite. I am >> > using the covariance matrix to seed a MCMC sampling process by supplying it >> > to `np.random.multivariate_normal` to get initial positions for the MC >> > chain. I am using the following code: >> > >> > ``` >> > from scipy.optimize._numdiff import approx_derivative >> > jac = approx_derivative(residuals_func, x0) >> > hess = np.matmul(jac.T, jac) >> > covar = np.linalg.inv(hess) >> > ``` >> > >> > Note that x0 may not be at a minimum. >> > >> > - would this be the usual way of estimating the Hessian, is there >> > anything incorrect with the approach? >> your straightforward approach is ok, especially since you don?t require >> the highest precision. An alternative would be to use automatic >> differentiation to calculate the derivatives accurately, e.g. using algopy, >> theano or tensor flow >> >> > - what is the recommended way (i.e. numerically stable) of inverting the >> > Hessian in such a situation? >> >> If your hess matrix is close to being singular, you could gain some >> precision by using the QR decomposition of the jacobian. In general to solve >> a linear system it is recommended to avoid calculating the the inverse >> matrix. >> >> > - does `optimize.leastsq` do anything different? >> >> leastsq wraps the MINPACK library, which brings it own carefully tuned >> numeric differentiation routines, and it uses QR decomposition. >> >> > - if `x0` is not at a minimum should the covariance matrix be expected >> > to be positive semi-definite anyway? >> If x0 is not a minimum, then there is no guarantee. Even if x0 is a >> minimum this might by violated due to numerical errors. >> >> best >> Gregor >> >> > _______________________________________________ >> > SciPy-User mailing list >> > SciPy-User at python.org >> > https://mail.python.org/mailman/listinfo/scipy-user >> >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > From josef.pktd at gmail.com Wed Mar 28 12:05:35 2018 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 28 Mar 2018 12:05:35 -0400 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: References: Message-ID: On Mon, Mar 26, 2018 at 7:57 PM, Andrew Nelson wrote: > I would like to calculate the Jacobian for a least squares problem, followed > by a Hessian estimation, then the covariance matrix from that Hessian. > > With my current approach I sometimes experience issues with the covariance > matrix in that it's sometimes not positive semi-definite. I am using the > covariance matrix to seed a MCMC sampling process by supplying it to > `np.random.multivariate_normal` to get initial positions for the MC chain. I never looked much at the details of MCMC. But if your data or starting point doesn't provide good information about the Hessian, then, I think, you could shrink the hessian to or combine it with the prior covariance matrix, e.g. use a weighted average. 
Josef I > am using the following code: > > ``` > from scipy.optimize._numdiff import approx_derivative > jac = approx_derivative(residuals_func, x0) > hess = np.matmul(jac.T, jac) > covar = np.linalg.inv(hess) > ``` > > Note that x0 may not be at a minimum. > > - would this be the usual way of estimating the Hessian, is there anything > incorrect with the approach? > - what is the recommended way (i.e. numerically stable) of inverting the > Hessian in such a situation? > - does `optimize.leastsq` do anything different? > - if `x0` is not at a minimum should the covariance matrix be expected to be > positive semi-definite anyway? > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > From andyfaff at gmail.com Wed Mar 28 19:33:56 2018 From: andyfaff at gmail.com (Andrew Nelson) Date: Thu, 29 Mar 2018 10:33:56 +1100 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: References: Message-ID: I'm using the Hessian to calculate the covariance matrix for parameter estimates in least squares, i.e. the equivalent of `pcov` in `curve_fit` (I don't want to do a fit, I just want the covariance around the current location). On 29 March 2018 at 03:05, wrote: > On Mon, Mar 26, 2018 at 7:57 PM, Andrew Nelson wrote: > > I would like to calculate the Jacobian for a least squares problem, > followed > > by a Hessian estimation, then the covariance matrix from that Hessian. > > > > With my current approach I sometimes experience issues with the > covariance > > matrix in that it's sometimes not positive semi-definite. I am using the > > covariance matrix to seed a MCMC sampling process by supplying it to > > `np.random.multivariate_normal` to get initial positions for the MC > chain. > > I never looked much at the details of MCMC. > But if your data or starting point doesn't provide good information about > the > Hessian, then, I think, you could shrink the hessian to or combine it with > the > prior covariance matrix, e.g. use a weighted average. > > Josef > > > I > > am using the following code: > > > > ``` > > from scipy.optimize._numdiff import approx_derivative > > jac = approx_derivative(residuals_func, x0) > > hess = np.matmul(jac.T, jac) > > covar = np.linalg.inv(hess) > > ``` > > > > Note that x0 may not be at a minimum. > > > > - would this be the usual way of estimating the Hessian, is there > anything > > incorrect with the approach? > > - what is the recommended way (i.e. numerically stable) of inverting the > > Hessian in such a situation? > > - does `optimize.leastsq` do anything different? > > - if `x0` is not at a minimum should the covariance matrix be expected > to be > > positive semi-definite anyway? > > > > _______________________________________________ > > SciPy-User mailing list > > SciPy-User at python.org > > https://mail.python.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > -- _____________________________________ Dr. Andrew Nelson _____________________________________ -------------- next part -------------- An HTML attachment was scrubbed... 
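For reference, a sketch of the SVD/pseudoinverse route mentioned earlier in the thread, which is essentially how `curve_fit` obtains `pcov`; `residuals_func` and `x0` stand for the objects from the original post:

```
import numpy as np
from scipy.optimize._numdiff import approx_derivative

# Jacobian of the residuals at the current location; 3-point differences are
# somewhat more accurate than the default forward scheme.
jac = approx_derivative(residuals_func, x0, method='3-point')

# Covariance as the pseudoinverse of J^T J, computed from the SVD of J so that
# J^T J is never formed or inverted explicitly. Singular values below a
# relative threshold are dropped, which keeps the result positive
# semi-definite even if the Jacobian is (nearly) rank deficient.
_, s, VT = np.linalg.svd(jac, full_matrices=False)
threshold = np.finfo(float).eps * max(jac.shape) * s[0]
s = s[s > threshold]
VT = VT[:s.size]
covar = np.dot(VT.T / s**2, VT)
```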
URL: From josef.pktd at gmail.com Wed Mar 28 20:33:18 2018 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 28 Mar 2018 20:33:18 -0400 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: References: Message-ID: On Wed, Mar 28, 2018 at 7:33 PM, Andrew Nelson wrote: > I'm using the Hessian to calculate the covariance matrix for parameter > estimates in least squares, i.e. the equivalent of `pcov` in `curve_fit` (I > don't want to do a fit, I just want the covariance around the current > location). But if you are just using it to initialize a Bayesian estimator, then you don't need the covariance from the data by itself. What I guess: You are using the cross-product of the jacobian (outer product of gradient). This should in general be positive definite. If it is not positive definite, then locally at least one of the parameters is not identified, i.e. your X in analogy to linear regression is singular. You can fiddle with numerical precision to maybe get it noisily PSD, and then you have just an almost singular jacobian with a very noise inverse. pinv(jac) Josef > > On 29 March 2018 at 03:05, wrote: >> >> On Mon, Mar 26, 2018 at 7:57 PM, Andrew Nelson wrote: >> > I would like to calculate the Jacobian for a least squares problem, >> > followed >> > by a Hessian estimation, then the covariance matrix from that Hessian. >> > >> > With my current approach I sometimes experience issues with the >> > covariance >> > matrix in that it's sometimes not positive semi-definite. I am using the >> > covariance matrix to seed a MCMC sampling process by supplying it to >> > `np.random.multivariate_normal` to get initial positions for the MC >> > chain. >> >> I never looked much at the details of MCMC. >> But if your data or starting point doesn't provide good information about >> the >> Hessian, then, I think, you could shrink the hessian to or combine it with >> the >> prior covariance matrix, e.g. use a weighted average. >> >> Josef >> >> >> I >> > am using the following code: >> > >> > ``` >> > from scipy.optimize._numdiff import approx_derivative >> > jac = approx_derivative(residuals_func, x0) >> > hess = np.matmul(jac.T, jac) >> > covar = np.linalg.inv(hess) >> > ``` >> > >> > Note that x0 may not be at a minimum. >> > >> > - would this be the usual way of estimating the Hessian, is there >> > anything >> > incorrect with the approach? >> > - what is the recommended way (i.e. numerically stable) of inverting the >> > Hessian in such a situation? >> > - does `optimize.leastsq` do anything different? >> > - if `x0` is not at a minimum should the covariance matrix be expected >> > to be >> > positive semi-definite anyway? >> > >> > _______________________________________________ >> > SciPy-User mailing list >> > SciPy-User at python.org >> > https://mail.python.org/mailman/listinfo/scipy-user >> > >> _______________________________________________ >> SciPy-User mailing list >> SciPy-User at python.org >> https://mail.python.org/mailman/listinfo/scipy-user > > > > > -- > _____________________________________ > Dr. 
Andrew Nelson > > > _____________________________________ > > _______________________________________________ > SciPy-User mailing list > SciPy-User at python.org > https://mail.python.org/mailman/listinfo/scipy-user > From andyfaff at gmail.com Wed Mar 28 20:40:49 2018 From: andyfaff at gmail.com (Andrew Nelson) Date: Thu, 29 Mar 2018 11:40:49 +1100 Subject: [SciPy-User] calculating the jacobian for a least-squares problem In-Reply-To: References: Message-ID: > You are using the cross-product of the jacobian (outer product of gradient). This is correct. I discovered a bug in the way I was calculating the covariance matrix. I was initially scaling all parameters to unity, and unwinding that scaling after inverting the Hessian. I was multiplying by the incorrect values when I did so. The diagonal terms in the covariance matrix were fine, but the off diagonal terms were incorrect, leading to problems when using np.random.multivariate_normal. -------------- next part -------------- An HTML attachment was scrubbed... URL:
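A sketch of the rescaling step described in that last message, assuming the covariance was computed for parameters divided element-wise by `scale` (the exact scaling convention is an assumption; the point is that rows and columns both need rescaling, not just the diagonal):

```
import numpy as np

def unscale_covariance(covar_scaled, scale):
    """Convert a covariance computed for x/scale back to the units of x.

    If x = scale * x_scaled element-wise, then
    cov(x)[i, j] = scale[i] * scale[j] * cov(x_scaled)[i, j].
    """
    scale = np.asarray(scale, dtype=float)
    return covar_scaled * np.outer(scale, scale)

# Rescaling only the diagonal (the bug described above) leaves the
# off-diagonal terms in the wrong units, which is what breaks
# np.random.multivariate_normal downstream.
```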