From peridot.faceted at gmail.com Fri Sep 1 17:27:51 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Fri, 1 Sep 2006 17:27:51 -0400 Subject: [SciPy-user] fancy indexing and += Message-ID: Hi, I'm trying to figure out how to use fancy indexing for ordinary (non-sparse) arrays. On reading, it's clear enough what is meant: a = arange(3) a[[1,1,2]] -> array([1,1,2]) But when writing it's not so clear what should happen: a[[1,1,2]] = array([1,2,3]) a -> array([0,2,3]) All right, the assignment a[1]=... has been done twice, so I only see the effects of the second. Fair enough. But: a[[1,1,2]] += 1 a -> array([0,3,4]) That is, a[[1,1,2]] += 1 is equivalent to a[[1,1,2]] = a[[1,1,2]] + 1 and not for i in [1,1,2]: a[i]+=1 Is this really intended? It's surprising, it seems like that forces numpy to make a copy, and it leaves me with the question of how to implement the second without a loop... Well, for a better example: x = array([...]) i1 = ravel(ix_(arange(len(x)),arange(len(x)))[...,0]) i2 = ravel(ix_(arange(len(x)),arange(len(x)))[...,1]) dx = f(x[i1],x[i2]) # compute all the pairwise interactions # x[i1] += dx for (i,a) in enumerate(i1): x[a]+=dx[i] How do I write something like this without a loop? Sort i1 and concoct something that adds all the values with the same index before adding it to x? Thanks, A. M. Archibald From robert.kern at gmail.com Fri Sep 1 17:43:26 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 01 Sep 2006 16:43:26 -0500 Subject: [SciPy-user] fancy indexing and += In-Reply-To: References: Message-ID: <44F8A97E.2050703@gmail.com> A. M. Archibald wrote: > Hi, > > I'm trying to figure out how to use fancy indexing for ordinary > (non-sparse) arrays. On reading, it's clear enough what is meant: > > a = arange(3) > a[[1,1,2]] > -> array([1,1,2]) > > But when writing it's not so clear what should happen: > > a[[1,1,2]] = array([1,2,3]) > a > -> array([0,2,3]) > > All right, the assignment a[1]=... 
has been done twice, so I only see > the effects of the second. Fair enough. But: > > a[[1,1,2]] += 1 > a > -> array([0,3,4]) > > That is, a[[1,1,2]] += 1 is equivalent to > > a[[1,1,2]] = a[[1,1,2]] + 1 > > and not > > for i in [1,1,2]: > a[i]+=1 > > Is this really intended? It's surprising, it seems like that forces > numpy to make a copy, and it leaves me with the question of how to > implement the second without a loop... We've gone over this before on the numpy-discussion list. Given the way that Python implements augmented assignment, there is no way for the array object to know that you wanted to do some kind of loop. Python implements that statement as something equivalent to tmp = a.__getitem__([1,1,2]) tmp.__iadd__(1) a.__setitem__([1,1,2], tmp) There is no place to implement the kind of loop that you want to see. Nor should we. According to the reference manual, augmented assignment methods should be implemented such that x += y is nearly equivalent to x = x + y with the exception of possible in-place modification of x. http://docs.python.org/ref/augassign.html -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From peridot.faceted at gmail.com Fri Sep 1 21:06:27 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Fri, 1 Sep 2006 21:06:27 -0400 Subject: [SciPy-user] fancy indexing and += In-Reply-To: <44F8A97E.2050703@gmail.com> References: <44F8A97E.2050703@gmail.com> Message-ID: On 01/09/06, Robert Kern wrote: > Nor should we. According to the reference manual, augmented assignment methods > should be implemented such that > > x += y > > is nearly equivalent to > > x = x + y > > with the exception of possible in-place modification of x. One could argue that what I was asking for is the appropriate notion of "in-place modification" in this context, but that's a technicality. 
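The per-index accumulation Archibald is after can be done without a Python loop by summing the duplicate contributions before assigning (np.bincount with weights) or, in NumPy 1.8 and later, with the unbuffered np.add.at. A sketch in modern NumPy syntax (neither spelling existed in this form at the time of the thread):

```python
import numpy as np

a = np.arange(3)

# Buffered fancy-index augmented assignment: duplicate indices collapse,
# so index 1 only receives the increment once.
a[[1, 1, 2]] += 1
print(a)  # -> [0 2 3]

# Per-index accumulation: sum the contributions for duplicate indices
# first, then add once per index position.
b = np.arange(3)
idx = np.array([1, 1, 2])
dx = np.array([1, 1, 1])
b += np.bincount(idx, weights=dx, minlength=len(b)).astype(b.dtype)
print(b)  # -> [0 3 3]

# NumPy >= 1.8 ships the unbuffered form directly:
c = np.arange(3)
np.add.at(c, idx, 1)
print(c)  # -> [0 3 3]
```

The bincount form also answers the pairwise-interaction example: bin dx by i1 and add the binned sums to x in one vectorized step.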
I suppose either version will be surprising to somebody. Thanks for your prompt reply. A.M. Archibald From robert.kern at gmail.com Fri Sep 1 23:09:18 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 01 Sep 2006 22:09:18 -0500 Subject: [SciPy-user] fancy indexing and += In-Reply-To: References: <44F8A97E.2050703@gmail.com> Message-ID: <44F8F5DE.5030404@gmail.com> A. M. Archibald wrote: > On 01/09/06, Robert Kern wrote: > >> Nor should we. According to the reference manual, augmented assignment methods >> should be implemented such that >> >> x += y >> >> is nearly equivalent to >> >> x = x + y >> >> with the exception of possible in-place modification of x. > > One could argue that what I was asking for is the appropriate notion > of "in-place modification" in this context, but that's a technicality. Well, not quite. The idea that the manual is trying to express is that both forms, "x += y" and "x = x + y" should result in final values of "x" that are equal to each other. The only difference should be that the *identities* of the objects finally referenced by "x" might be different. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
-- Umberto Eco From jdhunter at ace.bsd.uchicago.edu Mon Sep 4 22:31:51 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Mon, 04 Sep 2006 21:31:51 -0500 Subject: [SciPy-user] Triangulation of L-shaped domains In-Reply-To: <44AEA608.4050101@gmail.com> (Robert Kern's message of "Fri, 07 Jul 2006 13:20:56 -0500") References: <44A4E961.8000208@iam.uni-stuttgart.de> <87wtaysrof.fsf@peds-pc311.bsd.uchicago.edu> <44A52CA8.6060303@iam.uni-stuttgart.de> <44A5521C.6060206@gmail.com> <2239236B-934B-435C-A715-DE66EC6B74FA@tamu.edu> <44AEA608.4050101@gmail.com> Message-ID: <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Robert" == Robert Kern writes: Robert> Raycast with simulation of simplicity to handle degeneracy Robert> is probably your best bet. Robert> Robert> http://www.ecse.rpi.edu/Homepages/wrf/Research/Short_Notes/pnpoly.html Robert> John's "add up the angles" approach is not really a good Robert> one. I frequently find it referred to in the literature as Robert> "the worst thing you could possibly do." :-) OK, I could only stand Robert's taunt for so long. I added the pnpoly routine to matplotlib svn extension code: there is a function to test whether a single x, y point is inside a polygon (matplotlib.nxutils.pnpoly) and a function which takes a list of points and returns a mask for the points inside (matplotlib.nxutils.points_inside_poly). When profiling against my original "worst thing you could possibly do" implementation, this new version is 15-35 times faster, in addition to being a better algorithm in terms of handling the degenerate cases. It is also considerably faster than the pure numpy pnpoly implementation, since it avoids all the temporaries and extra passes through the data of doing things at the numeric level.
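For reference, the crossing-number test can also be broadcast over all candidate points at once at the pure-numpy level. This removes the per-point Python loop, although it still builds the M x N temporaries that a C extension avoids. A sketch with a hypothetical helper name (this is not the matplotlib code):

```python
import numpy as np

def points_inside_poly_np(points, verts):
    """Even-odd (crossing-number) test broadcast over many points.

    points - Mx2 array of candidate points
    verts  - Nx2 array of polygon vertices
    Returns a length-M boolean mask. Points exactly on an edge may
    land on either side (no simulation of simplicity here).
    """
    x, y = points[:, 0:1], points[:, 1:2]        # M x 1 columns
    xpi, ypi = verts[:, 0], verts[:, 1]          # N vertices
    xpj, ypj = np.roll(xpi, 1), np.roll(ypi, 1)  # previous vertex of each edge

    # Edges whose y-span straddles each point's y (M x N boolean).
    straddle = ((ypi <= y) & (y < ypj)) | ((ypj <= y) & (y < ypi))
    # x-coordinate where each edge crosses the horizontal line at y.
    # Horizontal edges divide by zero; they are masked out by `straddle`,
    # so the resulting inf/nan values never affect the count.
    with np.errstate(divide='ignore', invalid='ignore'):
        xcross = (xpj - xpi) * (y - ypi) / (ypj - ypi) + xpi
    crossings = straddle & (x < xcross)
    # Odd number of crossings to the right of the point => inside.
    return crossings.sum(axis=1) % 2 == 1

# A unit square and a few test points:
square = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
pts = np.array([[0.5, 0.5], [1.5, 0.5], [0.25, 0.75]])
print(points_inside_poly_np(pts, square))  # [ True False  True]
```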
For 50 vertices and 1000 candidate inclusion points: nxutils.pnpoly vs pure numpy pnpoly: 50x speedup nxutils.points_inside_poly vs pure numpy pnpoly: 250x speedup #!/usr/bin/env python import time import matplotlib.nxutils as nxutils import matplotlib.numerix as nx import numpy as N def pnpoly(x, y, verts): """Check whether the point x, y is in the polygon defined by verts. verts - Nx2 array of polygon vertices See http://www.ecse.rpi.edu/Homepages/wrf/Research/Short_Notes/pnpoly.html """ verts = verts.astype(float) xpi = verts[:,0] ypi = verts[:,1] # shift xpj = xpi[N.arange(xpi.size)-1] ypj = ypi[N.arange(ypi.size)-1] possible_crossings = ((ypi <= y) & (y < ypj)) | ((ypj <= y) & (y < ypi)) xpi = xpi[possible_crossings] ypi = ypi[possible_crossings] xpj = xpj[possible_crossings] ypj = ypj[possible_crossings] crossings = x < (xpj-xpi)*(y - ypi) / (ypj - ypi) + xpi return sum(crossings) % 2 numtrials, numverts, numpoints = 10, 50, 1000 verts = nx.mlab.rand(numverts, 2) points = nx.mlab.rand(numpoints, 2) mask1 = nx.zeros((numpoints, )) mask2 = nx.zeros((numpoints, )) t0 = time.time() for i in range(numtrials): for j in range(numpoints): x, y = points[j] b = pnpoly(x, y, verts) mask1[j] = b told = time.time() - t0 t0 = time.time() for i in range(numtrials): for j in range(numpoints): x, y = points[j] b = nxutils.pnpoly(x, y, verts) mask2[j] = b tnew = time.time() - t0 t0 = time.time() for i in range(numtrials): mask3 = nxutils.points_inside_poly(points, verts) tmany = time.time() - t0 print told, tnew, told/tnew print told, tmany, told/tmany for v0, v1, v2 in zip(mask1, mask2, mask3): assert(v0==v1) assert(v1==v2) JDH
<44A52CA8.6060303@iam.uni-stuttgart.de> <44A5521C.6060206@gmail.com> <2239236B-934B-435C-A715-DE66EC6B74FA@tamu.edu> <44AEA608.4050101@gmail.com> <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <44FCED90.8010301@gmail.com> John Hunter wrote: > OK, I could only stand Robert's taunt for so long. I can get you to do what I want by *taunting* you? Fan*tast*ic! -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From willemjagter at gmail.com Tue Sep 5 02:34:56 2006 From: willemjagter at gmail.com (William Hunter) Date: Tue, 5 Sep 2006 08:34:56 +0200 Subject: [SciPy-user] amd64 specific error in fitpack spline interpolation In-Reply-To: References: <1156932026.713.36.camel@amcnote2.mcmorland.mph.auckland.ac.nz> Message-ID: <8b3894bc0609042334j38880903w2531d2ad99a7036c@mail.gmail.com> Said I'd reply... Works fine on my machine. On 30/08/06, George Nurser wrote: > > On 30 Aug 2006, at 11:00, Angus McMorland wrote: > > > Hi all, > > > > I posted about this problem a few months ago, but we didn't get very > > far, but the annoyance level of the problem is enough now for me to > > have > > another go. > > > > I'm trying to run the spline example from the cookbook (my exact code, > > called spl.py, is inlined below). On an ix86 computer this works fine, > > and returns the expected graphs, but on an amd64 machine I get the > > error > > below. Both computers are running debian etch, with recent svn > > numpy and > > scipy. 
> > > > TypeError: array cannot be safely cast to required type > >> /usr/lib/python2.4/site-packages/scipy/interpolate/fitpack.py(217) > >> splprep() > > 216 > > t,c,o=_fitpack._parcur(ravel(transpose(x)),w,u,ub,ue,k,task,ipar,s,t, > > --> 217 nest,wrk,iwrk,per) > > 218 _parcur_cache['u']=o['u'] > > > > I believe the error is generated by the _parcur call -> once in pdb, > > calling it again re-raises the same error. > > > > ipdb>_fitpack._parcur(ravel(transpose(x)),w,u,ub,ue,k,task,ipar,s,t, > > nest,wrk,iwrk,per) > > Same problem on our Opterons. > numpy v 2631 > scipy v 1614 > built & linked to acml > > [8]nohow@/noc/users/agn/python> python amdtest.py > Traceback (most recent call last): > File "amdtest.py", line 23, in ? > tckp,u = splprep([x,y,z],s=s,k=k,nest=-1) > File "/data/jrd/mod1/agn/ext/Linux/lib64/python2.3/site-packages/ > scipy/interpolate/fitpack.py", line 217, in splprep > TypeError: array cannot be safely cast to required type > > > -George. > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From stefan at sun.ac.za Tue Sep 5 04:15:44 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 5 Sep 2006 10:15:44 +0200 Subject: [SciPy-user] Triangulation of L-shaped domains In-Reply-To: <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> References: <44A4E961.8000208@iam.uni-stuttgart.de> <87wtaysrof.fsf@peds-pc311.bsd.uchicago.edu> <44A52CA8.6060303@iam.uni-stuttgart.de> <44A5521C.6060206@gmail.com> <2239236B-934B-435C-A715-DE66EC6B74FA@tamu.edu> <44AEA608.4050101@gmail.com> <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <20060905081544.GJ26842@mentat.za.net> Hi John On Mon, Sep 04, 2006 at 09:31:51PM -0500, John Hunter wrote: > Robert> Raycast with simulation of simplicity to handle degeneracy > Robert> is probably your best bet. 
> > Robert> > Robert> http://www.ecse.rpi.edu/Homepages/wrf/Research/Short_Notes/pnpoly.html > > Robert> John's "add up the angles" approach is not really a good > Robert> one. I frequently find it referred to in the literature as > Robert> "the worst thing you could possibly do." :-) > > OK, I could only stand Robert's taunt for so long. I added the pnpoly > routine to matplotlib svn extension code: there is a function to test > whether a single x, y point is inside a polygon > (matplotlib.nxutils.pnpoly) and a function which takes a list of > points and returns a mask for the points inside > (matplotlib.nxutils.points_inside_poly). When profiling against my > original "worst thing you could possibly do" implementation, this new > version is 15-35 times faster, in addition to being a better algorithm > in terms of handling the degenerate cases. It is also considerably > faster than the pure numpy pnpoly implementation, since it avoids all > the temporaries and extra passes through the data of doing things at > the numeric level. I am glad to see this included. The python implementation is my first choice, but if you are looking for speed specifically, I also wrote weave and ctypes versions.
Regards Stéfan From jdhunter at ace.bsd.uchicago.edu Tue Sep 5 07:36:37 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Tue, 05 Sep 2006 06:36:37 -0500 Subject: [SciPy-user] Triangulation of L-shaped domains In-Reply-To: <20060905081544.GJ26842@mentat.za.net> (Stefan van der Walt's message of "Tue, 5 Sep 2006 10:15:44 +0200") References: <44A4E961.8000208@iam.uni-stuttgart.de> <87wtaysrof.fsf@peds-pc311.bsd.uchicago.edu> <44A52CA8.6060303@iam.uni-stuttgart.de> <44A5521C.6060206@gmail.com> <2239236B-934B-435C-A715-DE66EC6B74FA@tamu.edu> <44AEA608.4050101@gmail.com> <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> <20060905081544.GJ26842@mentat.za.net> Message-ID: <87odtuh7ei.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Stefan" == Stefan van der Walt writes: Stefan> I am glad to see this included. The python implementation Stefan> is my first choice, but if you are looking for speed Stefan> specifically, I also wrote weave and ctypes versions. Hey Stefan, I have been working on a lasso tool for matplotlib, and the lasso polygon might have more than 100 vertices and the plot might have more than 100,000 markers, so I am definitely looking for top speed. Since it is for matplotlib, I also need maximum portability, so generally I can't count on the presence of weave (no scipy dependency) or ctypes (no python 2.4 dependency). But having your python implementation was a big help as a motivator and point of reference. Eventually, when we no longer need to support Numeric and numarray, which should be getting close, I'd like to see this kind of thing go into numpy if there is a suitable place for it there. At that point, we might also be able to jettison 2.3 support, and a ctypes solution would be an excellent choice. I didn't see your ctypes or weave implementations in the original thread -- could you post a link or send them my way? Thanks!
JDH From maik.troemel at maitro.net Tue Sep 5 08:05:38 2006 From: maik.troemel at maitro.net (Maik Trömel) Date: Tue, 05 Sep 2006 14:05:38 +0200 Subject: [SciPy-user] Interpolate 1D Message-ID: <44FD6812.70106@maitro.net> Hello list, when I try to use scipy.interpolate.interpolate.interp1d( olddims[-1], a, kind='cubic' ) I get an ERROR: File "/usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py", line 118, in __init__ raise NotImplementedError, "Only linear supported for now. Use "\ NotImplementedError: Only linear supported for now. Use fitpack routines for other types. But under http://www.scipy.org/doc/api_docs/scipy.interpolate.interpolate.interp1d.html kind = 'cubic' is listed. What's wrong with the command? Or is cubic not implemented yet? Thanks for your help! Greetings Maik From jdhunter at ace.bsd.uchicago.edu Tue Sep 5 10:55:53 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Tue, 05 Sep 2006 09:55:53 -0500 Subject: [SciPy-user] Triangulation of L-shaped domains In-Reply-To: <87odtuh7ei.fsf@peds-pc311.bsd.uchicago.edu> (John Hunter's message of "Tue, 05 Sep 2006 06:36:37 -0500") References: <44A4E961.8000208@iam.uni-stuttgart.de> <87wtaysrof.fsf@peds-pc311.bsd.uchicago.edu> <44A52CA8.6060303@iam.uni-stuttgart.de> <44A5521C.6060206@gmail.com> <2239236B-934B-435C-A715-DE66EC6B74FA@tamu.edu> <44AEA608.4050101@gmail.com> <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> <20060905081544.GJ26842@mentat.za.net> <87odtuh7ei.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <87fyf6uzuu.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "John" == John Hunter writes: John> need maximum portability, so generally I can't count on the John> presence of weave (no scipy dependency) or ctypes (no python John> 2.4 dependency). But having your python implementation was Hmm, it now occurs to me that ctypes is only in python 2.5. Note to self: never post before the morning coffee has kicked in.
JDH From peridot.faceted at gmail.com Tue Sep 5 11:29:49 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Tue, 5 Sep 2006 11:29:49 -0400 Subject: [SciPy-user] Interpolate 1D In-Reply-To: <44FD6812.70106@maitro.net> References: <44FD6812.70106@maitro.net> Message-ID: On 05/09/06, Maik Trömel wrote: > Hello list, > > when I try to use scipy.interpolate.interpolate.interp1d( olddims[-1], > a, kind='cubic' ) I get an ERROR: > > File > "/usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py", > line 118, in __init__ > raise NotImplementedError, "Only linear supported for now. Use "\ > NotImplementedError: Only linear supported for now. Use fitpack routines > for other types. > > But under > http://www.scipy.org/doc/api_docs/scipy.interpolate.interpolate.interp1d.html > kind = 'cubic' is listed. What's wrong with the command? Or is cubic not > implemented yet? > > Thanks for your help! It seems not to be implemented, but you can use scipy.interpolate.fitpack2.InterpolatedUnivariateSpline to do the same thing for splines (of various orders). I think it also provides all the handy extras like derivatives and root-finding. It doesn't seem to be able to raise exceptions or insert NaNs for out-of-bound values, it just extrapolates. But otherwise it seems to be the right tool for the job. A. M.
Archibald From sransom at nrao.edu Tue Sep 5 14:29:10 2006 From: sransom at nrao.edu (Scott Ransom) Date: Tue, 5 Sep 2006 14:29:10 -0400 Subject: [SciPy-user] small bug in filter_design.py Message-ID: <200609051429.10428.sransom@nrao.edu> Hi All, There is a small bug in the initial conditions parameter in ellipap() in the filter_design.py module: File "/users/sransom/lib/python2.3/site-packages/scipy/signal/filter_design.py", line 1049, in ellipap disp=0) File "/users/sransom/lib/python2.3/site-packages/scipy/optimize/optimize.py", line 140, in fmin N = len(x0) TypeError: len() of unsized object Here is the "fix": 1048c1048 < m = optimize.fmin(kratio, 0.5, args=(krat,), maxfun=250, maxiter=250, --- > m = optimize.fmin(kratio, [0.5], args=(krat,), maxfun=250, maxiter=250, Scott -- Scott M. Ransom Address: NRAO Phone: (434) 296-0320 520 Edgemont Rd. email: sransom at nrao.edu Charlottesville, VA 22903 USA GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From stefan at sun.ac.za Tue Sep 5 20:08:00 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 6 Sep 2006 02:08:00 +0200 Subject: [SciPy-user] Triangulation of L-shaped domains In-Reply-To: <87fyf6uzuu.fsf@peds-pc311.bsd.uchicago.edu> References: <44A4E961.8000208@iam.uni-stuttgart.de> <87wtaysrof.fsf@peds-pc311.bsd.uchicago.edu> <44A52CA8.6060303@iam.uni-stuttgart.de> <44A5521C.6060206@gmail.com> <2239236B-934B-435C-A715-DE66EC6B74FA@tamu.edu> <44AEA608.4050101@gmail.com> <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> <20060905081544.GJ26842@mentat.za.net> <87odtuh7ei.fsf@peds-pc311.bsd.uchicago.edu> <87fyf6uzuu.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <20060906000800.GJ20816@mentat.za.net> On Tue, Sep 05, 2006 at 09:55:53AM -0500, John Hunter wrote: > >>>>> "John" == John Hunter writes: > > John> need maximum portability, so generally I can't count on the > John> presence of weave (no scipy dependency) or ctypes (no python > John> 2.4 dependency). 
But having your python implementation was > > Hmm, it now occurs to me that ctypes is only in python 2.5. Note to > self: never post before the morning coffee has kicked in. Either way, the code can be found at http://mentat.za.net I guess a Numpy C-API wrapper would be easy to write -- I just never got round to it, since I had the ctypes and weave versions working. Regards St?fan From stefan at sun.ac.za Tue Sep 5 20:32:30 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 6 Sep 2006 02:32:30 +0200 Subject: [SciPy-user] Triangulation of L-shaped domains In-Reply-To: <20060906000800.GJ20816@mentat.za.net> References: <87wtaysrof.fsf@peds-pc311.bsd.uchicago.edu> <44A52CA8.6060303@iam.uni-stuttgart.de> <44A5521C.6060206@gmail.com> <2239236B-934B-435C-A715-DE66EC6B74FA@tamu.edu> <44AEA608.4050101@gmail.com> <87bqpvniw8.fsf@peds-pc311.bsd.uchicago.edu> <20060905081544.GJ26842@mentat.za.net> <87odtuh7ei.fsf@peds-pc311.bsd.uchicago.edu> <87fyf6uzuu.fsf@peds-pc311.bsd.uchicago.edu> <20060906000800.GJ20816@mentat.za.net> Message-ID: <20060906003230.GK20816@mentat.za.net> On Wed, Sep 06, 2006 at 02:08:00AM +0200, Stefan van der Walt wrote: > On Tue, Sep 05, 2006 at 09:55:53AM -0500, John Hunter wrote: > > >>>>> "John" == John Hunter writes: > > > > John> need maximum portability, so generally I can't count on the > > John> presence of weave (no scipy dependency) or ctypes (no python > > John> 2.4 dependency). But having your python implementation was > > > > Hmm, it now occurs to me that ctypes is only in python 2.5. Note to > > self: never post before the morning coffee has kicked in. > > Either way, the code can be found at > > http://mentat.za.net > > I guess a Numpy C-API wrapper would be easy to write -- I just never > got round to it, since I had the ctypes and weave versions working. You guys don't waste much time! I see this has already been done in matplotlib SVN. 
Stéfan From stefan at sun.ac.za Tue Sep 5 20:47:33 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 6 Sep 2006 02:47:33 +0200 Subject: [SciPy-user] small bug in filter_design.py In-Reply-To: <200609051429.10428.sransom@nrao.edu> References: <200609051429.10428.sransom@nrao.edu> Message-ID: <20060906004733.GM20816@mentat.za.net> On Tue, Sep 05, 2006 at 02:29:10PM -0400, Scott Ransom wrote: > Hi All, > > There is a small bug in the initial conditions parameter in ellipap() in > the filter_design.py module: > > File "/users/sransom/lib/python2.3/site-packages/scipy/signal/filter_design.py", > line 1049, in ellipap > disp=0) > File "/users/sransom/lib/python2.3/site-packages/scipy/optimize/optimize.py", > line 140, in fmin > N = len(x0) > TypeError: len() of unsized object Fixed in SVN. Would you mind writing a test to see whether ellipap gives the correct answer? Thanks Stéfan From pgmdevlist at gmail.com Tue Sep 5 23:03:42 2006 From: pgmdevlist at gmail.com (PGM) Date: Tue, 5 Sep 2006 23:03:42 -0400 Subject: [SciPy-user] Correlation Message-ID: <200609052303.44081.pgmdevlist@gmail.com> Folks, I'd need to compute the cross-correlation of two series. For the 1D case, I use numpy.correlate and select the last half of the result to get what I want. I need to work with 2D arrays, either by row, or by column, or on the flattened array. I'm a bit at a loss about what commands to use: scipy.signal.signaltools does not allow to work per axis. Any comments will be welcome Thanks in advance P. From stefan at sun.ac.za Wed Sep 6 04:29:44 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 6 Sep 2006 10:29:44 +0200 Subject: [SciPy-user] Correlation In-Reply-To: <200609052303.44081.pgmdevlist@gmail.com> References: <200609052303.44081.pgmdevlist@gmail.com> Message-ID: <20060906082944.GR20816@mentat.za.net> On Tue, Sep 05, 2006 at 11:03:42PM -0400, PGM wrote: > Folks, > I'd need to compute the cross-correlation of two series.
> For the 1D case, I use numpy.correlate and select the last half of the result > to get what I want. > I need to work with 2D arrays, either by row, or by column, or on the flattened > array. I'm a bit at a loss about what commands to use: > scipy.signal.signaltools does not allow to work per axis. You can try computing cross-correlation axis-wise using the FFT (by reversing one of the input arguments). I attach an example of how to do convolution. Regards Stéfan -------------- next part -------------- A non-text attachment was scrubbed... Name: cfilt.py Type: text/x-python Size: 447 bytes Desc: not available URL: From erickt at dslextreme.com Wed Sep 6 04:42:16 2006 From: erickt at dslextreme.com (Erick Tryzelaar) Date: Wed, 06 Sep 2006 01:42:16 -0700 Subject: [SciPy-user] test failures in 0.5.1 Message-ID: <44FE89E8.2050905@dslextreme.com> Hello, I just installed 0.5.1, and I got these errors. Any idea what's going on? -e ====================================================================== ERROR: check loadmat case cellnest ---------------------------------------------------------------------- Traceback (most recent call last): File "/opt/local/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 80, in cc self._check_case(name, expected) File "/opt/local/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 75, in _check_case self._check_level(k_label, expected, matdict[k]) File "/opt/local/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 33, in _check_level self._check_level(level_label, ev, actual[i]) File "/opt/local/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 30, in _check_level assert len(expected) == len(actual), "Different list lengths at %s" % label TypeError: len() of unsized object ====================================================================== FAIL: check loadmat case vec ---------------------------------------------------------------------- Traceback (most recent call last): File
"/opt/local/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 80, in cc self._check_case(name, expected) File "/opt/local/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 74, in _check_case assert k in matdict, "Missing key at %s" % k_label AssertionError: Missing key at Test 'vec', file:/opt/local/lib/python2.4/site-packages/scipy/io/tests/./data/testvec_4_GLNX86.mat, variable fit_params ====================================================================== FAIL: check_dot (scipy.lib.tests.test_blas.test_fblas1_simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/opt/local/lib/python2.4/site-packages/scipy/lib/blas/tests/test_blas.py", line 76, in check_dot assert_almost_equal(f([3j,-4,3-4j],[2,3,1]),-9+2j) File "/opt/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 156, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: ACTUAL: (-1.998720645904541+5.8724710243597094e-37j) DESIRED: (-9+2j) ====================================================================== FAIL: check_dot (scipy.linalg.tests.test_blas.test_fblas1_simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/opt/local/lib/python2.4/site-packages/scipy/linalg/tests/test_blas.py", line 75, in check_dot assert_almost_equal(f([3j,-4,3-4j],[2,3,1]),-9+2j) File "/opt/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 156, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: ACTUAL: (-1.998720645904541+5.8724710243597094e-37j) DESIRED: (-9+2j) ---------------------------------------------------------------------- Ran 1569 tests in 12.867s FAILED (failures=3, errors=1) From Clinton.Evans at comdev.ca Wed Sep 6 09:09:15 2006 From: Clinton.Evans at comdev.ca (Clinton Evans) Date: Wed, 06 Sep 2006 09:09:15 
-0400 Subject: [SciPy-user] Genetic Algorithm Module * ga Message-ID: I have fixed the ga module sufficiently to run the example code and run my old code. I am not an "official" scipy developer so I can't post it to subversion. However, if anyone wants a copy, please let me know. Clinton From robert.kern at gmail.com Wed Sep 6 11:27:39 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 06 Sep 2006 10:27:39 -0500 Subject: [SciPy-user] Genetic Algorithm Module * ga In-Reply-To: References: Message-ID: <44FEE8EB.2030206@gmail.com> Clinton Evans wrote: > I have fixed the ga module sufficiently to run the example code and run my old code. I am not an "official" scipy developer so I can't post it to subversion. However, if anyone wants a copy, please let me know. You can, however, post a patch to the Trac after registering. http://projects.scipy.org/scipy/scipy -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From josegomez at gmx.net Wed Sep 6 12:44:04 2006 From: josegomez at gmx.net (Jose Luis Gomez Dans) Date: Wed, 06 Sep 2006 18:44:04 +0200 Subject: [SciPy-user] Fitting a distribution to some data Message-ID: <20060906164404.59450@gmx.net> Hi, I am trying to fit a distribution to some data points (actually, I want to test which distribution is best to model the data). Moreover, I want to estimate the distributions' parameters. I am sure that the stats module has many functions to help me with this, but I am not sure I understand how to use them. As a test, I create some RVs using say y=stats.distributions.norm.rvs ( size=100). I can then find the mean and standard deviation using (mu,sigma)=stats.distributions.norm.fit(y) which works fine. However, some methods do not work. For example, the pdf(self,x..) method returns an array of 0s, and similarly for the cdf() method.
However, the _pdf() and _cdf() methods give the desired result. Is this a bug? Am I supposed to use the underscore methods or the public methods? Also, is there some example of how to use kstest? It might be related, but if I try to test the previous data, I get the following error: stats.kstest(y,'norm',args=(mu,sigma)) --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /home/ggjlgd/ /usr/lib/python2.4/site-packages/scipy/stats/stats.py in kstest(rvs, cdf, args, N) 1721 # D = max(D1,D2) 1722 D = D1 -> 1723 return D, distributions.ksone.sf(D,N) 1724 1725 def chisquare(f_obs, f_exp=None): /usr/lib/python2.4/site-packages/scipy/stats/distributions.py in sf(self, x, *args, **kwds) 521 output = zeros(shape(cond),'d') 522 insert(output,(1-cond0)*(cond1==cond1),self.badvalue) --> 523 insert(output,cond2,1.0) 524 goodargs = argsreduce(cond, *((x,)+args)) 525 insert(output,cond,self._sf(*goodargs)) /usr/lib/python2.4/site-packages/numpy/lib/function_base.py in insert(arr, obj, values, axis) 1190 1191 obj = asarray(obj, dtype=intp) -> 1192 numnew = len(obj) 1193 index1 = obj + arange(numnew) 1194 index2 = setdiff1d(arange(numnew+N),index1) TypeError: len() of unsized object Am I doing something wrong here? Is it a bug? My versions are Scipy: 0.5.0.2178 Numpy: 1.0b5 and I run on Kubuntu Dapper on Linux. I use Andrew Straw's packages. Thanks Jose From atriolo at research.telcordia.com Wed Sep 6 14:51:08 2006 From: atriolo at research.telcordia.com (Tony Triolo) Date: Wed, 06 Sep 2006 14:51:08 -0400 Subject: [SciPy-user] Problem with Windows Scipy 0.5 In-Reply-To: References: Message-ID: <44FF189C.3080202@research.telcordia.com> David, Thanks for the info.
The naming convention you mentioned, i.e., adding the numpy version which was used for compiling scipy to the filename, would have saved me some time and effort. Oh well... The enthought edition worked well on both my WinXP and Win2000 machines. I am now happily on my way. Thanks again. -Tony > Hi Tony! > > You have to use a (C-API) compatible Numpy version which should be > numpy-1.0b1.win32-py2.4.exe for your scipy download. ...but it seems > like this is no longer available on SF for download. If you don't want > to compile yourself you may also give http://code.enthought.com/enthon/ > a try. > > The download page on the wiki is not pointing out this C-API > compatibility issue clearly. To avoid confusion it would be good to add > the numpy version which was used for compiling scipy into the name of > the binary in the future. E.g. > scipy-0.5.0-numpy1.0.1b1-win32-py2.4.exe > > Regards, > David > > > ------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > End of SciPy-user Digest, Vol 37, Issue 1 > ***************************************** > From fiepye at atlas.cz Wed Sep 6 15:13:54 2006 From: fiepye at atlas.cz (Fie Pye) Date: Wed, 06 Sep 2006 21:13:54 +0200 Subject: [SciPy-user] Fw: Scientific computing and data visualization. Message-ID: Hello I would like to have high-class open source tools for scientific computing and powerful 2D and 3D data visualisation. Therefore I chose python, numpy and scipy as a base. Now I am in search of a visualisation tool. I tried matplotlib and py_opendx with OpenDx. OpenDx seems very good to me, but the py_opendx project looks closed. After py_opendx installation and subsequent testing I got an error that needs discussion with the author or an experienced user. Unfortunately, a mail to the author was returned as undeliverable. Does anybody know about a suitable visualisation tool?
Does anybody have experience with OpenDx and py_opendx installation? Thanks for your response. fiepye From oliphant.travis at ieee.org Wed Sep 6 15:29:45 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 06 Sep 2006 13:29:45 -0600 Subject: [SciPy-user] Fitting a distribution to some data In-Reply-To: <20060906164404.59450@gmx.net> References: <20060906164404.59450@gmx.net> Message-ID: <44FF21A9.40807@ieee.org> Jose Luis Gomez Dans wrote: > Hi, > I am trying to fit a distribution to some data points (actually, I want to test which distribution is best to model the data). Moreover, I want to estimate the distributions parameters. I am sure that the stats module has many functions to help me with this, but I am not sure I understand how to use them. > > As a test, I create some RVs using say y=stats.distributions.norm.rvs ( size=100). > I can then find the mean and standard deviation using > (mu,sigma)=stats.distributions.norm.fit(y) > which works fine. However, some methods do not work. For example, the pdf(self,x..) method returns an array of 0s, and similarly for the cdf() method. Are you using these functions correctly? They seem to work for me. For example, you don't pass in "self" as the first argument. Here is an example usage: x = r_[-10:10:100j] y = stats.distributions.norm.pdf(x, loc=0.3, scale=2.0) > Also, is there some example of how to use kstest? it might be related, but if I try to test the previous data, I get the following error: > stats.kstest(y,'norm',args=(mu,sigma)) > Hmm.. This works for me. y=stats.distributions.norm.rvs(size=100) mu,sigma = stats.distributions.norm.fit(y) stats.kstest(y,'norm',args=(mu,sigma)) I am running scipy 0.5.1, but I don't know if that should matter in this case. -Travis From fredantispam at free.fr Wed Sep 6 15:41:34 2006 From: fredantispam at free.fr (fred) Date: Wed, 06 Sep 2006 21:41:34 +0200 Subject: [SciPy-user] Fw: Scientific computing and data visualization.
In-Reply-To: References: Message-ID: <44FF246E.3030505@free.fr> Fie Pye a écrit : > Hallo > > I would like to have a high class open source tools for scientific computing and powerful 2D and 3D data visualisation. Therefore I chose python, numpy and scipy as a base. Now I am in search for a visualisation tool. I tried matplotlib and py_opendx with OpenDx. OpenDx seems to me very good but the project py_opendx looks like closed. After py_opendx instalation and subsequent testing I got an error that needs discussion with author or an experienced user. Unfortunately a mail to author returned as undeliverable. > > Does anybody now about suitable visualisation tool? > > Does anybody have an experience with OpenDx and py_opendx instalation? > > Thanks for your response. Mayavi2, sure ! :-) http://scipy.org/Cookbook/MayaVi -- Fred. From josegomez at gmx.net Wed Sep 6 18:11:31 2006 From: josegomez at gmx.net (Jose Luis Gomez Dans) Date: Thu, 07 Sep 2006 00:11:31 +0200 Subject: [SciPy-user] Fitting a distribution to some data In-Reply-To: <44FF21A9.40807@ieee.org> References: <20060906164404.59450@gmx.net> <44FF21A9.40807@ieee.org> Message-ID: <20060906221131.292200@gmx.net> Hi Travis, Many thanks for your quick reply! After reading some other messages it dawned on me that the problem was the Numpy version. I had 1.0b5, and this caused the problems. Using 1.0b2 works fine now. Many thanks again! José -- Der GMX SmartSurfer hilft bis zu 70% Ihrer Onlinekosten zu sparen! Ideal für Modem und ISDN: http://www.gmx.net/de/go/smartsurfer From brendansimons at yahoo.ca Thu Sep 7 01:08:39 2006 From: brendansimons at yahoo.ca (Brendan Simons) Date: Thu, 7 Sep 2006 01:08:39 -0400 Subject: [SciPy-user] Efficient submatrix of a sparse matrix In-Reply-To: References: Message-ID: <88FE1B9D-B712-4201-860A-0E713C49A81E@yahoo.ca> Hi all, I have a neat little problem for which I think sparse matrices are the answer.
I need to store a bunch of overlapping "intervals", (a, b pairs for which any a <= val < b crosses that interval) in a manner which makes "stabbing inquiries" (which intervals cross a specified value) easy. The canonical way to do this is with interval trees (a good summary here: http://w3.jouy.inra.fr/unites/miaj/public/vigneron/cs4235/l5cs4235.pdf), however I think I can simplify things as follows: 1) perform an argsort on the interval a and b endpoints. This gives the position of each interval in an "order" matrix M, where each row and each column contains only one interval, and intervals are sorted in rows by start point, and in columns by endpoint 2) for a "stab" point x, do a binary search to find the row i and column j where x could be inserted into M and maintain its ordering. The submatrix of M[:i, j:] would contain all those intervals which start before x, and end after x. Since the order matrix M is mostly empty it would make sense to use a sparse matrix storage scheme, however the only one in scipy.sparse that allows 2d slicing is the dok_matrix, and the slicing algorithm is O(n), which is much too expensive for my purposes (I have to do thousands of stabbing inquiries on a space containing thousands of intervals). Is there no more efficient algorithm for 2d slicing of a sparse matrix? -Brendan -- Brendan Simons __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com From maik.troemel at maitro.net Thu Sep 7 06:00:02 2006 From: maik.troemel at maitro.net (=?ISO-8859-1?Q?Maik_Tr=F6mel?=) Date: Thu, 07 Sep 2006 12:00:02 +0200 Subject: [SciPy-user] Interpolate 1D In-Reply-To: <44FFDAE6.3060709@maitro.net> References: <44FFDAE6.3060709@maitro.net> Message-ID: <44FFEDA2.3070406@maitro.net> Hello, scipy.interpolate.fitpack2.InterpolatedUnivariateSpline seems to be a 1d interpolation method. I used the Cookbook/Rebinning-Example to interpolate.
I don't know if it works with InterpolatedUnivariateSpline. I also tried out scipy.interpolate.interpolate.interp2d. But I get an error. Probably somebody knows what I'm doing wrong. Here's the script: import scipy.interpolate #just some data a = numpy.zeros((4,4), dtype=''Float32') oldx= numpy.arange(4) oldy = numpy.arange(4) newx = numpy.zeros((4), dtype=''Float32') newy =numpy.zeros((4), dtype=''Float32') for s in range(8): newx[s] = s * 0.5 newy[s] = s * 0.5 #interpolation intinst = scipy.interpolate.interpolate.interp2d(oldx, oldy, a, kind = 'cubic') interpolated = intinst(newx, newy) And here's the error: /usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py in __call__(self, x, y, dx, dy) 62 x = atleast_1d(x) 63 y = atleast_1d(y) ---> 64 z,ier=fitpack._fitpack._bispev(*(self.tck+[x,y,dx,dy])) 65 if ier==10: raise ValueError,"Invalid input data" 66 if ier: raise TypeError,"An error occurred" AttributeError: interp2d instance has no attribute 'tck' Thanks for your help. Greetings Maik Maik Tr?mel wrote: > Date: Tue, 5 Sep 2006 11:29:49 -0400 > > From: "A. M. Archibald" > Subject: Re: [SciPy-user] Interpolate 1D > To: "SciPy Users List" > Message-ID: > > Content-Type: text/plain; charset=ISO-8859-1; format=flowed > > On 05/09/06, Maik Tr?mel wrote: > >> > Hello list, >> > >> > when I try to use scipy.interpolate.interpolate.interp1d( olddims[-1], >> > a, kind='cubic' ) I get an ERROR: >> > >> > File >> > "/usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py", >> > line 118, in __init__ >> > raise NotImplementedError, "Only linear supported for now. Use "\ >> > NotImplementedError: Only linear supported for now. Use fitpack >> routines >> > for other types. >> > >> > But under >> > >> http://www.scipy.org/doc/api_docs/scipy.interpolate.interpolate.interp1d.html >> >> > kind = 'cubic' is listed. Whats wrong with the command? Or is cubic >> not >> > implemented yet? >> > >> > Thanks for your help! 
>> > > It seems not to be implemented, but you can use > scipy.interpolate.fitpack2.InterpolatedUnivariateSpline to do the same > thing for splines (of various orders). I think it also provides all > the handy extras like derivatives and root-finding. It doesn't seem to > be able to raise exceptions or insert NaNs for out-of-bound values, it > just extrapolates. But otherwise it seems to be the right tool for the > job. > > A. M. Archibald > > > ------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > End of SciPy-user Digest, Vol 37, Issue 3 > ***************************************** > > From amcmorl at gmail.com Thu Sep 7 07:09:42 2006 From: amcmorl at gmail.com (Angus McMorland) Date: Thu, 7 Sep 2006 23:09:42 +1200 Subject: [SciPy-user] Interpolate 1D In-Reply-To: <44FFEDA2.3070406@maitro.net> References: <44FFDAE6.3060709@maitro.net> <44FFEDA2.3070406@maitro.net> Message-ID: Hi Maik, On 07/09/06, Maik Tr?mel wrote: > > Hello, I also tried out scipy.interpolate.interpolate.interp2d. But I get an > error. Probably somebody knows what I'm doing wrong. 
Here's the script: > > import scipy.interpolate > > #just some data > a = numpy.zeros((4,4), dtype=''Float32') > oldx= numpy.arange(4) > oldy = numpy.arange(4) > newx = numpy.zeros((4), dtype=''Float32') > newy =numpy.zeros((4), dtype=''Float32') > for s in range(8): > newx[s] = s * 0.5 > newy[s] = s * 0.5 > > #interpolation > intinst = scipy.interpolate.interpolate.interp2d(oldx, oldy, a, kind = > 'cubic') > interpolated = intinst(newx, newy) > > > And here's the error: > /usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py in > __call__(self, x, y, dx, dy) > 62 x = atleast_1d(x) > 63 y = atleast_1d(y) > ---> 64 z,ier=fitpack._fitpack._bispev(*(self.tck+[x,y,dx,dy])) > 65 if ier==10: raise ValueError,"Invalid input data" > 66 if ier: raise TypeError,"An error occurred" > > AttributeError: interp2d instance has no attribute 'tck' > > Thanks for your help. > > Greetings Maik I've tried your script, and get several different errors, so are you sure that your error messages are coming from this script? The biggest conceptual jump needed with your script, as it runs on my system, (scipy 0.5.0.2177, numpy 1.0b4.dev3055) is that oldx and oldy need to be the same length as the flattened 'a' - that is an (x,y) pair for each value of a, which itself could also be flat. 
Here is, I think, a working version: import scipy.interpolate import numpy #just some data a = numpy.random.random_sample((4,4)).astype('Float32') oldinds = numpy.indices((4,4)) oldx = oldinds[1].flatten() # because indices[1] is the x values oldy = oldinds[0].flatten() newx = numpy.linspace(0,3,8).astype('Float32') newy = numpy.linspace(0,3,8).astype('Float32') #interpolation intinst = scipy.interpolate.interp2d(oldx, oldy, a, kind = 'cubic') interpolated = intinst(newx, newy) #display the result - can ignore if you don't have matplotlib installed import pylab pylab.subplot(121) pylab.imshow(a,interpolation='nearest') pylab.subplot(122) pylab.imshow(interpolated, interpolation='nearest') -- AJC McMorland, PhD Student Physiology, University of Auckland Armourer, Auckland University Fencing Secretary, Fencing North Inc. -------------- next part -------------- An HTML attachment was scrubbed... URL: From willemjagter at gmail.com Thu Sep 7 11:10:58 2006 From: willemjagter at gmail.com (William Hunter) Date: Thu, 7 Sep 2006 17:10:58 +0200 Subject: [SciPy-user] Efficient submatrix of a sparse matrix In-Reply-To: <88FE1B9D-B712-4201-860A-0E713C49A81E@yahoo.ca> References: <88FE1B9D-B712-4201-860A-0E713C49A81E@yahoo.ca> Message-ID: <8b3894bc0609070810v2963d9ddlb02ef108ab2e5de3@mail.gmail.com> You might consider making use of UMFPACK (.../scipy/linsolve/umfpack)? I'm still figuring it out myself, so this is all I can offer :( I think it will require you to do some of your work in C. See http://www.cise.ufl.edu/research/sparse in order to download the sources (and more). The instructions on the website are very clear and well written. Hope this helps (a bit?) -- WH On 07/09/06, Brendan Simons wrote: > Hi all, > > I have a neat little problem for which I think sparse matrices are > the answer. 
I need to store a bunch of overlapping "intervals", (a, > b pairs for which any a <= val < b crosses that interval) in a manner > which makes "stabbing inquiries" (which intervals cross a specified > value) easy. The canonical way to to this is with interval trees (a > good summary here: http://w3.jouy.inra.fr/unites/miaj/public/vigneron/ > cs4235/l5cs4235.pdf), however I think I can simplify things as follows: > > 1) perform an argsort on the interval a and b endpoints. This gives > the position of each interval in an "order" matrix M, where each row > and each column contains only one interval, and intervals are sorted > in rows by start point, and in columns by endpoint > > 2) for a "stab" point x, do a binary search to find the row i and > column j where x could be inserted into M and maintain its ordering. > The submatrix of M[:i, j:] would contains all those intervals which > start before x, and end after x. > > Since the order matrix M is mostly empty it would make sense to use a > sparse matrix storage scheme, however the only one in scipy.sparse > that allows 2d slicing is the dok_matrix, and the slicing algorithm > is O(n), which is much too expensive for my purposes (I have to do > thousands of stabbing inquiries on a space containing thousands of > intervals). > > Is there no more efficient algorithm for 2d slicing of a sparse matrix? > > -Brendan > -- > Brendan Simons > __________________________________________________ > Do You Yahoo!? > Tired of spam? Yahoo! 
Mail has the best spam protection around > http://mail.yahoo.com > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From jstrunk at enthought.com Thu Sep 7 19:16:52 2006 From: jstrunk at enthought.com (Jeff Strunk) Date: Thu, 7 Sep 2006 18:16:52 -0500 Subject: [SciPy-user] ISP changeover Tuesday 9/12 7:00 PM Central Message-ID: <200609071816.53060.jstrunk@enthought.com> Good afternoon, Unfortunately, our recent change in internet service providers is not working out. We will be switching to a more reliable provider on Tuesday 9/12 at 7:00 PM Central. Please allow for up to two hours of downtime. I will send an email announcing the start and completion of this maintenance. This will affect all Enthought servers as well as the SciPy server which hosts many Open Source projects. The problem was that our in-building internet service provider neglected to mention that our internet connection was over a wireless link to a different building. This link had very high latency. They have fixed this problem, but we feel that wireless is not stable enough for our needs. In order to provide higher quality service, we will be using 3 bonded T1s from AT&T after this switchover. Please pass this message along to people that I have missed. If you have any questions, please direct them to me. We apologize for the inconvenience. Jeff Strunk Enthought, Inc. From rshepard at appl-ecosys.com Thu Sep 7 19:43:15 2006 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Thu, 7 Sep 2006 16:43:15 -0700 (PDT) Subject: [SciPy-user] ISP changeover Tuesday 9/12 7:00 PM Central In-Reply-To: <200609071816.53060.jstrunk@enthought.com> References: <200609071816.53060.jstrunk@enthought.com> Message-ID: On Thu, 7 Sep 2006, Jeff Strunk wrote: > Unfortunately, our recent change in internet service providers is not > working out.
We will be switching to a more reliable provider on Tuesday > 9/12 at 7:00 PM Central. Please allow for up to two hours of downtime. I > will send an email announcing the start and completion of this > maintenance. Thank you, Jeff, for your consideration and attention to detail and quality. Those of us on the mail list appreciate your efforts. I'll head to the local brewpub Tuesday evening, then. :-) Rich -- Richard B. Shepard, Ph.D. | The Environmental Permitting Applied Ecosystem Services, Inc.(TM) | Accelerator Voice: 503-667-4517 Fax: 503-667-8863 From fccoelho at gmail.com Thu Sep 7 21:35:13 2006 From: fccoelho at gmail.com (Flavio Coelho) Date: Thu, 7 Sep 2006 22:35:13 -0300 Subject: [SciPy-user] f2py and xmingw Message-ID: Hi, I am trying to compile a Fortran extension module for Windows using SciPy. I want to know if it is possible to do this with xmingw (from my Gentoo Linux box). Under Linux I can automate the generation of binary packages with setup.py bdist (and a properly written setup.py). What is the recommended way to generate binary packages (with Fortran extensions) for Windows? I have tried to run f2py specifying the cross-compilers from xmingw, but it failed: command line: f2py -c --compiler=/opt/xmingw/bin/i386-mingw32msvc-gcc --f77exec=opt/xmingw/bin/i386-mingw32msvc-g77 -m flib flib.f Error: error: don't know how to compile C/C++ code on platform 'posix' with '/opt/xmingw/bin/i386-mingw32msvc-gcc' compiler It looks like I may be missing some compiler flags... Since my experience with compiled languages is somewhat limited, I will gladly accept any suggestions. thanks, -- Flávio Codeço Coelho registered Linux user # 386432 --------------------------- "Laws are like sausages. It's better not to see them being made." Otto von Bismarck -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ryanlists at gmail.com Thu Sep 7 22:55:29 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Thu, 7 Sep 2006 21:55:29 -0500 Subject: [SciPy-user] signal.lti Message-ID: I am having a rough time trying to use the lti portion of scipy.signal. For starters, I just wanted to generate a step response for an integrator (1/s). I tried creating this system two ways, and neither seems to work correctly. I think this system has a numerator of 1 and a denominator of [1,0]. If I create the system based on the num, den I get: In [51]: mysys=scipy.signal.lti(1,[1,0]) In [52]: mysys.A Out[52]: NumPy array, format: long [[-0.]] In [53]: mysys.B Out[53]: NumPy array, format: long [[1]] In [54]: mysys.C Out[54]: NumPy array, format: long [[ 1.]] In [55]: mysys.D Out[55]: NumPy array, format: long [ 0.] I get an error when I try to find the step response of this system. I think part of the problem is that the matrices aren't right. I think they should be In [46]: A=array([[0,1],[0,0]]) In [47]: B=array([[0],[1]]) In [48]: C=array([[1,0]]) In [49]: D=array([[0]]) but for some reason I can't create a system with these matrices: In [50]: mysys=scipy.signal.lti(A,B,C,D) --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /home/ryan/siue/mechatronics/system_modeling/ /usr/lib/python2.4/site-packages/scipy/signal/ltisys.py in __init__(self, *args, **kwords) 223 self.__dict__['C'], \ 224 self.__dict__['D'] = abcd_normalize(*args)--> 225 self.__dict__['zeros'], self.__dict__['poles'], \ 226 self.__dict__['gain'] = ss2zpk(*args) 227 self.__dict__['num'], self.__dict__['den'] = ss2tf(*args) /usr/lib/python2.4/site-packages/scipy/signal/ltisys.py in ss2zpk(A, B, C, D, input) 183 z, p, k -- zeros and poles in sequences and gain constant. 
184 """ --> 185 return tf2zpk(*ss2tf(A,B,C,D,input=input)) 186 187 class lti: /usr/lib/python2.4/site-packages/scipy/signal/ltisys.py in ss2tf(A, B, C, D, input) 154 for k in range(nout): 155 Ck = atleast_2d(C[k,:]) --> 156 num[k] = poly(A - dot(B,Ck)) + (D[k]-1)*den 157 158 return num, den TypeError: array cannot be safely cast to required type What am I doing wrong? Thanks, Ryan From willemjagter at gmail.com Fri Sep 8 01:52:14 2006 From: willemjagter at gmail.com (William Hunter) Date: Fri, 8 Sep 2006 07:52:14 +0200 Subject: [SciPy-user] Efficient submatrix of a sparse matrix In-Reply-To: <88FE1B9D-B712-4201-860A-0E713C49A81E@yahoo.ca> References: <88FE1B9D-B712-4201-860A-0E713C49A81E@yahoo.ca> Message-ID: <8b3894bc0609072252k6f5cdee5p367695c3dd96371f@mail.gmail.com> Brendan; It occured to me after I sent the mail that you're probably not into solving sparse systems, but actually just slicing sparse matrices... If so, my previous post is obviously not of much meaning to you. -- WH On 07/09/06, Brendan Simons wrote: > Hi all, > > I have a neat little problem for which I think sparse matrices are > the answer. I need to store a bunch of overlapping "intervals", (a, > b pairs for which any a <= val < b crosses that interval) in a manner > which makes "stabbing inquiries" (which intervals cross a specified > value) easy. The canonical way to to this is with interval trees (a > good summary here: http://w3.jouy.inra.fr/unites/miaj/public/vigneron/ > cs4235/l5cs4235.pdf), however I think I can simplify things as follows: > > 1) perform an argsort on the interval a and b endpoints. This gives > the position of each interval in an "order" matrix M, where each row > and each column contains only one interval, and intervals are sorted > in rows by start point, and in columns by endpoint > > 2) for a "stab" point x, do a binary search to find the row i and > column j where x could be inserted into M and maintain its ordering. 
> The submatrix of M[:i, j:] would contains all those intervals which > start before x, and end after x. > > Since the order matrix M is mostly empty it would make sense to use a > sparse matrix storage scheme, however the only one in scipy.sparse > that allows 2d slicing is the dok_matrix, and the slicing algorithm > is O(n), which is much too expensive for my purposes (I have to do > thousands of stabbing inquiries on a space containing thousands of > intervals). > > Is there no more efficient algorithm for 2d slicing of a sparse matrix? > > -Brendan > -- > Brendan Simons > __________________________________________________ > Do You Yahoo!? > Tired of spam? Yahoo! Mail has the best spam protection around > http://mail.yahoo.com > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From willemjagter at gmail.com Fri Sep 8 01:53:59 2006 From: willemjagter at gmail.com (William Hunter) Date: Fri, 8 Sep 2006 07:53:59 +0200 Subject: [SciPy-user] 3D plotting in Python Message-ID: <8b3894bc0609072253m68c46e7fgb9ca402669555614@mail.gmail.com> See http://www.scipy.org/WilnaDuToit for those who are interested, don't think it was posted to this ML. -- WH From maik.troemel at maitro.net Fri Sep 8 03:49:53 2006 From: maik.troemel at maitro.net (=?ISO-8859-1?Q?Maik_Tr=F6mel?=) Date: Fri, 08 Sep 2006 09:49:53 +0200 Subject: [SciPy-user] Interpolate 1D (2D) Message-ID: <450120A1.9070906@maitro.net> Thanks for your help! But there was still the same problem: /usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py in __call__(self, x, y, dx, dy) 62 x = atleast_1d(x) 63 y = atleast_1d(y) ---> 64 z,ier=fitpack._fitpack._bispev(*(self.tck+[x,y,dx,dy])) 65 if ier==10: raise ValueError,"Invalid input data" 66 if ier: raise TypeError,"An error occurred" AttributeError: interp2d instance has no attribute 'tck' I tried it on another system, but there is the same problem.
But there were the same scipy and numpy versions. So, I've updated to the same scipy and numpy version as you. New (or same?) problem: /usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py in __call__(self, x, y, dx, dy) 62 Output: 63 z - 2-d array of interpolated values of shape (len(y), len(x)). ---> 64 """ 65 x = atleast_1d(x) 66 y = atleast_1d(y) AttributeError: interp2d instance has no attribute 'tck' ?????????????????? Where's the problem? Does anybody have an idea? Greetings Maik Maik Trömel wrote: > Message: 8 > Date: Thu, 7 Sep 2006 23:09:42 +1200 > From: "Angus McMorland" > Subject: Re: [SciPy-user] Interpolate 1D > To: "SciPy Users List" > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > Hi Maik, > > On 07/09/06, Maik Trömel wrote: > >> > >> > Hello, >> > > > > > I also tried out scipy.interpolate.interpolate.interp2d. But I get an > >> > error. Probably somebody knows what I'm doing wrong. Here's the >> script: >> > >> > import scipy.interpolate >> > >> > #just some data >> > a = numpy.zeros((4,4), dtype=''Float32') >> > oldx= numpy.arange(4) >> > oldy = numpy.arange(4) >> > newx = numpy.zeros((4), dtype=''Float32') >> > newy =numpy.zeros((4), dtype=''Float32') >> > for s in range(8): >> > newx[s] = s * 0.5 >> > newy[s] = s * 0.5 >> > >> > #interpolation >> > intinst = scipy.interpolate.interpolate.interp2d(oldx, oldy, a, kind = >> > 'cubic') >> > interpolated = intinst(newx, newy) >> > >> > >> > And here's the error: >> > /usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py in >> > __call__(self, x, y, dx, dy) >> > 62 x = atleast_1d(x) >> > 63 y = atleast_1d(y) >> > ---> 64 >> z,ier=fitpack._fitpack._bispev(*(self.tck+[x,y,dx,dy])) >> > 65 if ier==10: raise ValueError,"Invalid input data" >> > 66 if ier: raise TypeError,"An error occurred" >> > >> > AttributeError: interp2d instance has no attribute 'tck' >> > >> > Thanks for your help.
>> > >> > Greetings Maik >> > > > I've tried your script, and get several different errors, so are you sure > that your error messages are coming from this script? > > The biggest conceptual jump needed with your script, as it runs on my > system, (scipy 0.5.0.2177, numpy 1.0b4.dev3055) is that oldx and oldy > need > to be the same length as the flattened 'a' - that is an (x,y) pair for > each > value of a, which itself could also be flat. Here is, I think, a working > version: > > import scipy.interpolate > import numpy > > #just some data > a = numpy.random.random_sample((4,4)).astype('Float32') > oldinds = numpy.indices((4,4)) > oldx = oldinds[1].flatten() # because indices[1] is the x values > oldy = oldinds[0].flatten() > newx = numpy.linspace(0,3,8).astype('Float32') > newy = numpy.linspace(0,3,8).astype('Float32') > > #interpolation > intinst = scipy.interpolate.interp2d(oldx, oldy, a, kind = 'cubic') > interpolated = intinst(newx, newy) > > #display the result - can ignore if you don't have matplotlib installed > import pylab > > pylab.subplot(121) > pylab.imshow(a,interpolation='nearest') > pylab.subplot(122) > pylab.imshow(interpolated, interpolation='nearest') > > > From stefan at sun.ac.za Fri Sep 8 06:22:27 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Fri, 8 Sep 2006 12:22:27 +0200 Subject: [SciPy-user] signal.lti In-Reply-To: References: Message-ID: <20060908102227.GH20816@mentat.za.net> Hi Ryan On Thu, Sep 07, 2006 at 09:55:29PM -0500, Ryan Krauss wrote: > I am having a rough time trying to use the lti portion of > scipy.signal. For starters, I just wanted to generate a step response > for an integrator (1/s). I tried creating this system two ways, and > neither seems to work correctly. I think this system has a numerator > of 1 and a denominator of [1,0]. 
> In [46]: A=array([[0,1],[0,0]]) > > In [47]: B=array([[0],[1]]) > > In [48]: C=array([[1,0]]) > > In [49]: D=array([[0]]) > > but for some reason I can't create a system with these matrices: > > In [50]: mysys=scipy.signal.lti(A,B,C,D) > TypeError: array cannot be safely cast to required type Are you running a 64-bit platform? Maybe try import numpy as N import scipy.signal A=N.array([[0,1],[0,0]],dtype=N.int32) B=N.array([[0],[1]],dtype=N.int32) C=N.array([[1,0]],dtype=N.int32) D=N.array([[0]],dtype=N.int32) mysys=scipy.signal.lti(A,B,C,D) Regards Stéfan
From stefan at sun.ac.za Fri Sep 8 06:30:56 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Fri, 8 Sep 2006 12:30:56 +0200 Subject: [SciPy-user] Interpolate 1D (2D) In-Reply-To: <450120A1.9070906@maitro.net> References: <450120A1.9070906@maitro.net> Message-ID: <20060908103056.GI20816@mentat.za.net> Hi Maik On Fri, Sep 08, 2006 at 09:49:53AM +0200, Maik Trömel wrote: > Thanks for your help! > But there was still the same problem: > > /usr/lib/python2.3/site-packages/scipy/interpolate/interpolate.py in > __call__(self, x, y, dx, dy) > 62 x = atleast_1d(x) > 63 y = atleast_1d(y) > ---> 64 z,ier=fitpack._fitpack._bispev(*(self.tck+[x,y,dx,dy])) > 65 if ier==10: raise ValueError,"Invalid input data" > 66 if ier: raise TypeError,"An error occurred" > > AttributeError: interp2d instance has no attribute 'tck' This looks like the error that used to be present in older versions of scipy. Are you sure you are running the latest version? It works here, with In [17]: numpy.__version__ Out[17]: '1.0rc1.dev3131' In [18]: scipy.__version__ Out[18]: '0.5.2.dev2190' Regards Stéfan From ryanlists at gmail.com Fri Sep 8 08:40:28 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 8 Sep 2006 07:40:28 -0500 Subject: [SciPy-user] signal.lti In-Reply-To: <20060908102227.GH20816@mentat.za.net> References: <20060908102227.GH20816@mentat.za.net> Message-ID: Thanks for your thought Stefan. No, I am not running on a 64 bit machine. I tried your idea and tried dtype=N.float32 (which is what I think it should be). Same error message. Ryan On 9/8/06, Stefan van der Walt wrote: > Hi Ryan > > On Thu, Sep 07, 2006 at 09:55:29PM -0500, Ryan Krauss wrote: > > I am having a rough time trying to use the lti portion of > > scipy.signal.
For starters, I just wanted to generate a step response > > for an integrator (1/s). I tried creating this system two ways, and > > neither seems to work correctly. I think this system has a numerator > > of 1 and a denominator of [1,0]. > > > > > In [46]: A=array([[0,1],[0,0]]) > > > > In [47]: B=array([[0],[1]]) > > > > In [48]: C=array([[1,0]]) > > > > In [49]: D=array([[0]]) > > > > but for some reason I can't create a system with these matrices: > > > > In [50]: mysys=scipy.signal.lti(A,B,C,D) > > > > > TypeError: array cannot be safely cast to required type > > Are you running a 64-bit platform? Maybe try > > import numpy as N > import scipy.signal > > A=N.array([[0,1],[0,0]],dtype=N.int32) > B=N.array([[0],[1]],dtype=N.int32) > C=N.array([[1,0]],dtype=N.int32) > D=N.array([[0]],dtype=N.int32) > mysys=scipy.signal.lti(A,B,C,D) > > Regards > St?fan > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From brendansimons at yahoo.ca Fri Sep 8 10:27:22 2006 From: brendansimons at yahoo.ca (Brendan Simons) Date: Fri, 8 Sep 2006 07:27:22 -0700 (PDT) Subject: [SciPy-user] SciPy-user Digest, Vol 37, Issue 6 In-Reply-To: Message-ID: <20060908142722.24733.qmail@web31115.mail.mud.yahoo.com> That's OK William. I gave it some more thought, and I don't think there's any sparse storage scheme which would allow me to do constant or log time slicing over two axes. The results from slicing the first axis would always have to be sorted, or individually tested, giving on average a complexity of O(n*log(n)) for the one axis sort and O(k*(log(n) + n/2)) for the lookups, where n is the number of intervals, and k is the number of stab inquiries. Since both k and n are high for my case, I've decided to go the traditional route and use an interval tree, which has a complexity of O(n*log(n)) for pre-processing, and only O(k * log(n)) for the lookups. 
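For reference, the brute-force stab inquiry that the interval tree is meant to beat can be written as a single vectorized test in numpy. It costs O(n) per inquiry rather than O(log(n)), so it loses badly at the problem sizes Brendan describes, but it makes a handy correctness baseline when testing a tree implementation (a sketch; the function and variable names here are mine, not from the thread):

```python
import numpy as np

def stab(starts, ends, x):
    """Indices of the half-open intervals [starts[i], ends[i])
    that contain the stab point x.  Brute force: O(n) per inquiry."""
    starts = np.asarray(starts)
    ends = np.asarray(ends)
    return np.flatnonzero((starts <= x) & (x < ends))

starts = np.array([0.0, 2.0, 5.0])
ends = np.array([4.0, 3.0, 9.0])
print(stab(starts, ends, 2.5))   # intervals 0 and 1 both cross x = 2.5
```

Each inquiry scans every interval once, which is exactly the O(k*n) total cost that the O(k*log(n)) interval-tree lookups avoid.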
Thanks anyway, Brendan ------------------------------ Message: 5 Date: Fri, 8 Sep 2006 07:52:14 +0200 From: "William Hunter" Subject: Re: [SciPy-user] Efficient submatrix of a sparse matrix To: "SciPy Users List" Message-ID: <8b3894bc0609072252k6f5cdee5p367695c3dd96371f at mail.gmail.com> Content-Type: text/plain; charset=UTF-8; format=flowed Brendan; It occured to me after I sent the mail that you're probably not into solving sparse systems, but actually just slicing sparse matrices... If so, my previous post is obviously not of much meaning to you. -- WH On 07/09/06, Brendan Simons wrote: > Hi all, > > I have a neat little problem for which I think sparse matrices are > the answer. I need to store a bunch of overlapping "intervals", (a, > b pairs for which any a <= val < b crosses that interval) in a manner > which makes "stabbing inquiries" (which intervals cross a specified > value) easy. The canonical way to to this is with interval trees (a > good summary here: http://w3.jouy.inra.fr/unites/miaj/public/vigneron/ > cs4235/l5cs4235.pdf), however I think I can simplify things as follows: > > 1) perform an argsort on the interval a and b endpoints. This gives > the position of each interval in an "order" matrix M, where each row > and each column contains only one interval, and intervals are sorted > in rows by start point, and in columns by endpoint > > 2) for a "stab" point x, do a binary search to find the row i and > column j where x could be inserted into M and maintain its ordering. > The submatrix of M[:i, j:] would contains all those intervals which > start before x, and end after x. > > Since the order matrix M is mostly empty it would make sense to use a > sparse matrix storage scheme, however the only one in scipy.sparse > that allows 2d slicing is the dok_matrix, and the slicing algorithm > is O(n), which is much too expensive for my purposes (I have to do > thousands of stabbing inquiries on a space containing thousands of > intervals). 
> > Is there no more efficient algorithm for 2d slicing of a sparse matrix? > > -Brendan > -- > Brendan Simons > __________________________________________________ > Do You Yahoo!? > Tired of spam? Yahoo! Mail has the best spam protection around > http://mail.yahoo.com > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From rshepard at appl-ecosys.com Fri Sep 8 13:18:39 2006 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Fri, 8 Sep 2006 10:18:39 -0700 (PDT) Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array Message-ID: After decades of Fortran and C, I'm quite new to Python and all the great tools it provides. So, I'd like mentoring in how to most elegantly and effectively parse data read over a serial port from a scanner, and how to build a 2D array from it. After this, I'll probably need help in other NumPy array (matrix) manipulations. But, first things first: Data are entered on an optical mark readable (OMR) form, which is scanned and the data sent over a serial line. Each form read transmits 69 bytes; there are 2 bytes for each "column" (where there is a timing mark on the edge of the form) and their position within that column determines their value. The form has two blank lines (with timing marks) and three lines with labels; these transmit the equivalent of '0' and are not recorded. The final byte is a carriage return, '\r,' which is translated into a newline ('\n') by the method. For each form read I want to record the form number (programmatically sequentially assigned, starting with '1') and the data from the form as a row. The total forms read then fill a two-dimensional array. 
The two issues with which I would like help are: How to slice the list of data from the scanner stream into two-byte chunks (leaving the end-of-record byte to be dealt with separately), How to build the array so that each form read creates a new row in the array. While the 31 columns are fixed, the number of rows is indeterminate until all submitted forms have been read. I've attached the 95-line 'OnScan.py' to this message; it's a single function within the application. TIA, Rich -- Richard B. Shepard, Ph.D. | The Environmental Permitting Applied Ecosystem Services, Inc.(TM) | Accelerator Voice: 503-667-4517 Fax: 503-667-8863 -------------- next part --------------

def OnScan(self, event):
    """
    OMR form timing marks and their meaning:

        Col. #   Value
        -----------------
        1        Category labels
        2        Category mark
        3        Blank line
        4        Position labels
        5        Polition mark
        6        Row labels
        7-34     Pairwise comparison votes

    A total of 69 bytes are sent in a stream; the last is '\r' for end of record
    """
    # scanner row translations
    DATA_MAP_BLANK = { chr(32)+chr(32): 0 }
    DATA_MAP_2 = { chr(32)+chr(36): 'nat', chr(96)+chr(32): 'eco',
                   chr(36)+chr(32): 'soc' }
    DATA_MAP_5 = { chr(32)+chr(36): 'pro', chr(96)+chr(32): 'neu',
                   chr(36)+chr(32): 'con' }
    DATA_MAP_7 = { chr(32)+chr(16): 1.000, chr(32)+chr(8): 2.000,
                   chr(32)+chr(4): 3.000, chr(32)+chr(2): 4.000,
                   chr(32)+chr(1): 5.000, chr(64)+chr(32): 6.000,
                   chr(16)+chr(32): 7.000, chr(8)+chr(32): 8.000,
                   chr(4)+chr(32): 9.000, chr(34)+chr(8): 0.500,
                   chr(34)+chr(4): 0.333, chr(34)+chr(2): 0.025,
                   chr(34)+chr(1): 0.200, chr(66)+chr(32): 0.167,
                   chr(18)+chr(32): 0.143, chr(10)+chr(32): 0.125,
                   chr(6)+chr(32): 0.111 }
    # Open serial port
    ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)
    ser.open()
    if ser.isOpen() != True:
        serial.SerialException
    searchString = '\r'   # end of card read
    vote_id = 0           # card number incremented at start of read loop
    newStart = 0          # only > 0 if the run is aborted by the operator
    progressMax = 500
    omrDialog = wx.ProgressDialog("Scoping Form Input",
                                  "Processing ...",
                                  progressMax,
                                  style=wx.PD_CAN_ABORT | wx.PD_APP_MODAL |
                                        wx.PD_ELAPSED_TIME)
    keepGoing = True
    while True:
        vote_id += 1  # vote record number; heads list row & displayed for entry on form
        if newStart > 0:  # read was aborted (card mis-read? jammed?) and restarted
            vote_id = newStart
            omrDialog.Resume()
        # read the line sent by the OMR reader; record end is '\r'.
        line = ser.readline  # get all 69 bytes in one chunk
        """
        if value == DATA_MAP_BLANK:    # skip line if blank or labels
            continue
        if value == searchString:      # we're done with that form
            vrecord[:-1]               # slice off the terminal comma
            vrecord.append('\n')       # add a newline
            break
        vrecord.extend([line, ", "])   # add the pairwise comparison value, a comma, and a space

        Now I need to add that vote record to an array, and start a new row
        """
        while keepGoing and vote_id < progressMax:  # increment progress dialog
            if keepGoing == False:
                newStart = vote_id
                break
            keepGoing = omrDialog.Update(vote_id)
    # close the dialog box and serial port
    omrDialog.Destroy()
    ser.Close()

From a.u.r.e.l.i.a.n at gmx.net Fri Sep 8 16:33:26 2006 From: a.u.r.e.l.i.a.n at gmx.net (Johannes Loehnert) Date: Fri, 8 Sep 2006 22:33:26 +0200 Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array In-Reply-To: References: Message-ID: <200609082233.28422.a.u.r.e.l.i.a.n@gmx.net> Hi, I must admit I did not understand the data format correctly, however maybe I can help. > How to slice the list of data from the scanner stream into two-byte > chunks (leaving the end-of-record byte to be dealt with separately), Since you want it as array in the end, you could convert it straight away using fromstring:

#code
s = '\x00\x01\x00\x02\x00\x03'   # some binary data
a = fromstring(s, dtype='>u2')   # >: Big Endian, u: uint, 2: bytes/val
#endcode

You will have to strip the EOR byte. Choose appropriate byteorder. > > How to build the array so that each form read creates a new row in the > array.
While the 31 columns are fixed, the number of rows is indeterminate > until all submitted forms have been read. Since the row count is undetermined, the best way is to build a list (linked list, builtin type) containing all the columns and convert it to an array when finished with reading:

# code
rows = []    # create empty list
while still_rows_left:
    rows.append(row_that_was_just_read)
# now rows is [array(1,2,3,..), array(4,5,6,...), ...]
mydata = array(rows)    # make array out of list
#endcode

HTH, Johannes

From rshepard at appl-ecosys.com Fri Sep 8 17:48:42 2006 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Fri, 8 Sep 2006 14:48:42 -0700 (PDT) Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array In-Reply-To: <200609082233.28422.a.u.r.e.l.i.a.n@gmx.net> References: <200609082233.28422.a.u.r.e.l.i.a.n@gmx.net> Message-ID: On Fri, 8 Sep 2006, Johannes Loehnert wrote: > I must admit I did not understand the data format correctly, however maybe I > can help. Johannes, The data format is a stream of ASCII bytes. Each two-byte combination represents either a string (for two columns) or an integer (for the other 28 data columns). The conversion from position-specific bit values to the ASCII characters is accomplished using the data mapping dictionaries. I accept that I wasn't as clear as I could have been. > Since you want it as array in the end, you could convert it straight away
> using fromstring:
>
> #code
> s = '\x00\x01\x00\x02\x00\x03' # some binary data
> a = fromstring(s, dtype='>u2') # >: Big Endian, u: uint, 2: bytes/val
> #endcode

After sending the message it occurred to me that the string.Split() function is probably what I want. I'm not receiving Hex values from the scanner. I'll play with this over the weekend. You are correct, however, that I do want to take that incoming string and store it into an array in the fewest possible steps. I'll have to look at 'fromstring()' to see just what that does.
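The append-then-convert pattern Johannes sketches above, filled out into a runnable toy (the row values below are fabricated stand-ins for decoded form data; only the shape of the pattern matters):

```python
import numpy as np

rows = []                          # plain Python list; cheap to grow
for form_number in range(1, 4):    # pretend three forms were scanned
    # stand-in for the decoded values of one form (columns shortened here)
    row = [form_number, 0.5, 0.333, 9.0]
    rows.append(row)

data = np.array(rows)              # one conversion at the very end
print(data.shape)                  # -> (3, 4): one row per form
```

The list grows in O(1) per row, and the single `np.array(rows)` call at the end avoids repeatedly reallocating an array whose final row count is unknown in advance.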
I assumed that each row in the array was a list, is that incorrect? > You will have to strip the EOR byte. Choose appropriate byteorder. If the incoming data is stored as a list, I can slice off the last byte. On AMD processors (and Intel, too) I believe that the byte order has always been littleendian. > Since the row count is undetermined, the best way is to build a list (linked > list, builtin type) containing all the columns and convert it to an array > when finished with reading: > > # code > rows = [] # create empty list > while still_rows_left: > rows.append(row_that_was_just_read) > # now rows is [array(1,2,3,..), array(4,5,6,...), ...] > mydata = array(rows) # make array out of list > #endcode Ah! I think that's just what I need. I've started reading Travis' book, but haven't hit this part yet. Many thanks, Rich -- Richard B. Shepard, Ph.D. | The Environmental Permitting Applied Ecosystem Services, Inc.(TM) | Accelerator Voice: 503-667-4517 Fax: 503-667-8863 From a.u.r.e.l.i.a.n at gmx.net Fri Sep 8 19:06:40 2006 From: a.u.r.e.l.i.a.n at gmx.net (Johannes Loehnert) Date: Sat, 9 Sep 2006 01:06:40 +0200 Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array In-Reply-To: References: <200609082233.28422.a.u.r.e.l.i.a.n@gmx.net> Message-ID: <200609090106.41829.a.u.r.e.l.i.a.n@gmx.net> Hi, > The data format is a stream of ASCII bytes. Each two-byte combination > represents either a string (for two columns) or an integer (for the other > 28 data columns). The conversion from position-specific bit values to the > ASCII characters is accomplished using the data mapping dictionaries. I > accept that I wasn't as clear as I could have been. Understood now. But you cannot store strings and floats in an array simultaneously. > After sending the message it occurred to me that the string.Split() > function is probably what I want. I'm not receiving Hex values from the > scanner. I'll play with this over the weekend. 
Not sure about string.split(), I don't think this is useful (would split into 1-byte chars iirc). You can you use a list comprehension for the data part, like this: line = read_data() mapped_vals = [DATA_MAP_7[i:i+2] for i in range(2*7, 2*35)] although I admit it is no very pretty solution. Maybe somebody has a better idea. > You are correct, however, that I do want to take that incoming string > and store it into an array in the fewest possible steps. I'll have to look > at 'fromstring()' to see just what that does. I assumed that each row in > the array was a list, is that incorrect? well, since the data has to be mapped in a certain manner first, I'm not sure if fromstring() really helps here. I did not realize this at the first glance. BTW, while having a closer look to your code I noticed that at line 72 ("line = ser.readline") there is a pair of brackets missing. Here is one free of charge: () ;-) Johannes From rshepard at appl-ecosys.com Fri Sep 8 19:35:24 2006 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Fri, 8 Sep 2006 16:35:24 -0700 (PDT) Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array In-Reply-To: <200609090106.41829.a.u.r.e.l.i.a.n@gmx.net> References: <200609082233.28422.a.u.r.e.l.i.a.n@gmx.net> <200609090106.41829.a.u.r.e.l.i.a.n@gmx.net> Message-ID: On Sat, 9 Sep 2006, Johannes Loehnert wrote: > Understood now. But you cannot store strings and floats in an array > simultaneously. Johannes, I understand. I was thinking of lists, I guess. This is not a problem because I can store a numeric code (1, 2, 3) for each of the two strings and translate them later in the program. They're there for selecting rows. Originally I was going to put the incoming data into a SQLite3 database table, then extract the values using SQL 'select from ... where ...' statements, but I thought that direct application of python datatypes would be more efficient. 
> Not sure about string.split(), I don't think this is useful (would split > into 1-byte chars iirc). You can you use a list comprehension for the data > part, like this: One-byte chars are exactly what I want. That's what is coming from the scanner, in bit-mapped format. What the scanner folks call a column we see as a row on the form. Each 'row' (as we look at the form in portrait orientation) has 12 positions, each with a bit value. There are two bytes encoded there. I'm using the DATA_MAP_ dictionaries to translate from the transmitted bytes to values meaningful to the application.

> line = read_data()
> mapped_vals = [DATA_MAP_7[i:i+2] for i in range(2*7, 2*35)]

DATA_MAP is a dictionary; can I find keys by slicing? If so, it would be [i:i+1] from bytes 12-67. > although I admit it is no very pretty solution. Maybe somebody has a better > idea. You are helping me, but what I'm trying to do is beyond what all my introductory references cover, and I'm not finding clearly presented solutions using Google. > BTW, while having a closer look to your code I noticed that at line 72 > ("line = ser.readline") there is a pair of brackets missing. Here is one > free of charge: () > ;-) Thank you. I caught that earlier today and fixed it. Rich -- Richard B. Shepard, Ph.D. | The Environmental Permitting Applied Ecosystem Services, Inc.(TM) | Accelerator Voice: 503-667-4517 Fax: 503-667-8863 From oliphant at ee.byu.edu Fri Sep 8 21:12:35 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 08 Sep 2006 19:12:35 -0600 Subject: [SciPy-user] signal.lti In-Reply-To: References: Message-ID: <45021503.90307@ee.byu.edu> Ryan Krauss wrote: >I am having a rough time trying to use the lti portion of >scipy.signal. For starters, I just wanted to generate a step response >for an integrator (1/s). > It turns out your starting-point is a special-case not handled by the current code (I checked this on old SciPy and it raised errors as well).
The current code is "general-purpose" but does not handle lonely integrators. Thus, any time your denominator ends in a zero, you will have trouble. This case needs to be dealt with separately but so far isn't (the error could be more informative of course). You would be better off trying a simple first-order system as your starting point.

1/(s+1)

Which works fine for me with current SciPy.

from pylab import *
from scipy import *

plot(*signal.step((1,[1,1])))

-Travis -------------- next part -------------- A non-text attachment was scrubbed... Name: tmp.png Type: image/png Size: 12782 bytes Desc: not available URL: From a.u.r.e.l.i.a.n at gmx.net Sat Sep 9 02:59:53 2006 From: a.u.r.e.l.i.a.n at gmx.net (Johannes Loehnert) Date: Sat, 9 Sep 2006 08:59:53 +0200 Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array In-Reply-To: References: <200609090106.41829.a.u.r.e.l.i.a.n@gmx.net> Message-ID: <200609090859.54333.a.u.r.e.l.i.a.n@gmx.net> Hi again, > > line = read_data() > > mapped_vals = [DATA_MAP_7[i:i+2] for i in range(2*7, 2*35)] > > DATA_MAP is a dictionary; can I find keys by slicing? If so, it would be > [i:i+1] from bytes 12-67. I am sorry, it was late in the evening. :-) What I meant was:

mapped_vals = [DATA_MAP_7[line[i:i+2]] for i in range(2*7, 2*35)]

i.e. take a 2-byte slice of the string "line" (slicing strings is no problem) and then look up the resulting 2-byte chunk in a dictionary. What I had in mind was something like this:

for line in scanner.readline():
    header = [ DATA_MAP_2[line[2:4]], DATA_MAP_5[line[5:7]] ]
    mapped_vals = [DATA_MAP_7[line[i:i+2]] for i in range(2*7, 2*35)]
    col_array = array(header + mapped_vals)

("+" concatenates the lists). Thinking about it, you do not need to convert it to an array right here. You can build a list of lists and convert it into an array as the very last step.
Johannes From pgmdevlist at gmail.com Sat Sep 9 05:24:44 2006 From: pgmdevlist at gmail.com (PGM) Date: Sat, 9 Sep 2006 05:24:44 -0400 Subject: [SciPy-user] problem with scipy.stats.distributions ? Message-ID: <200609090524.44259.pgmdevlist@gmail.com> Hello, I recently update to numpy 1.0b5 and scipy 0.5.1, and I'm running into this message: #.................................................................... >>> scipy.stats.distributions.norm.ppf(0.95) --------------------------------------------------------------------------- /usr/lib64/python2.4/site-packages/scipy/stats/distributions.py in ppf(self, q, *args, **kwds) 551 place(output,cond2,self.b*scale + loc) 552 goodargs = argsreduce(cond, *((q,)+args+(scale,loc))) --> 553 scale, loc, goodargs = goodargs[-2], goodargs[-1], goodargs[:-2] 554 place(output,cond,self._ppf(*goodargs)*scale + loc) 555 return output /usr/lib64/python2.4/site-packages/numpy/lib/function_base.py in insert(arr, obj, values, axis) 1165 if (obj < 0): obj += N 1166 if (obj < 0 or obj >=N): -> 1167 raise ValueError, "index (%d) out of range (0<=index<=%d) "\ 1168 "in dimension %d" % (obj, N, axis) 1169 newshape[axis] += 1; ValueError: index (1) out of range (0<=index<=1) in dimension 0 #.................................................................. Would anybody have any idea of what it means ? Thx in advance P. From elcorto at gmx.net Sat Sep 9 06:05:48 2006 From: elcorto at gmx.net (Steve Schmerler) Date: Sat, 09 Sep 2006 12:05:48 +0200 Subject: [SciPy-user] problem with scipy.stats.distributions ? In-Reply-To: <200609090524.44259.pgmdevlist@gmail.com> References: <200609090524.44259.pgmdevlist@gmail.com> Message-ID: <450291FC.7050300@gmx.net> PGM wrote: > Hello, > I recently update to numpy 1.0b5 and scipy 0.5.1, and I'm running into this > message: > #.................................................................... 
>>>> scipy.stats.distributions.norm.ppf(0.95) Hmm works here: In [2]: scipy.stats.distributions.norm.ppf(0.95) Out[2]: array(1.6448536269514722) In [3]: import numpy In [4]: numpy.__version__ Out[4]: '1.0b5.dev3100' In [5]: scipy.__version__ Out[5]: '0.5.1.dev2185' -- cheers, steve Random number generation is the art of producing pure gibberish as quickly as possible. From oliphant.travis at ieee.org Sat Sep 9 09:21:08 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 09 Sep 2006 07:21:08 -0600 Subject: [SciPy-user] problem with scipy.stats.distributions ? In-Reply-To: <200609090524.44259.pgmdevlist@gmail.com> References: <200609090524.44259.pgmdevlist@gmail.com> Message-ID: <4502BFC4.3010706@ieee.org> PGM wrote: > Hello, > I recently update to numpy 1.0b5 and scipy 0.5.1, and I'm running into this > message: > #.................................................................... > >>>> scipy.stats.distributions.norm.ppf(0.95) >>>> > --------------------------------------------------------------------------- > /usr/lib64/python2.4/site-packages/scipy/stats/distributions.py in ppf(self, > q, *args, **kwds) > 551 place(output,cond2,self.b*scale + loc) > 552 goodargs = argsreduce(cond, *((q,)+args+(scale,loc))) > --> 553 scale, loc, goodargs = goodargs[-2], goodargs[-1], > goodargs[:-2] > 554 place(output,cond,self._ppf(*goodargs)*scale + loc) > 555 return output > > /usr/lib64/python2.4/site-packages/numpy/lib/function_base.py in insert(arr, > obj, values, axis) > 1165 if (obj < 0): obj += N > 1166 if (obj < 0 or obj >=N): > -> 1167 raise ValueError, "index (%d) out of range (0<=index<=%d) > "\ > 1168 "in dimension %d" % (obj, N, axis) > 1169 newshape[axis] += 1; > > It looks like you have an old .pyc file that is being picked up and confusing the interpreter. Try clearing out the site-packages/scipy and site-packages/numpy directories and installing again. 
-Travis From stefan at sun.ac.za Sat Sep 9 03:30:42 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Sat, 9 Sep 2006 09:30:42 +0200 Subject: [SciPy-user] signal.lti In-Reply-To: <45021503.90307@ee.byu.edu> References: <45021503.90307@ee.byu.edu> Message-ID: <20060909073042.GD13363@mentat.za.net> On Fri, Sep 08, 2006 at 07:12:35PM -0600, Travis Oliphant wrote: > Ryan Krauss wrote: > > >I am having a rough time trying to use the lti portion of > >scipy.signal. For starters, I just wanted to generate a step response > >for an integrator (1/s). > > > It turns out your starting-point is a special-case not handled by the > current code (I checked this on old SciPy and it raised errors as > well). The current code is "general-purpose" but does not handle > lonely integrators. Thus, any time your denominator ends in a zero, you > will have trouble. This case needs to be dealt with separately but so > far isn't (the error could be more informative of course). You would be > better of trying a simple first-order system as your starting point. Interestingly, he didn't get the IndexError: index out of bounds that I see, but rather TypeError: array cannot be safely cast to required type which is what lead me to ask about the 64-bit platform. Same problem? Regards St?fan From strawman at astraw.com Mon Sep 11 12:40:09 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 11 Sep 2006 09:40:09 -0700 Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array In-Reply-To: References: <200609082233.28422.a.u.r.e.l.i.a.n@gmx.net> <200609090106.41829.a.u.r.e.l.i.a.n@gmx.net> Message-ID: <45059169.9080305@astraw.com> Rich Shepard wrote: > On Sat, 9 Sep 2006, Johannes Loehnert wrote: > > >> Understood now. But you cannot store strings and floats in an array >> simultaneously. >> > > Johannes, > > I understand. I was thinking of lists, I guess. 
This is not a problem > because I can store a numeric code (1, 2, 3) for each of the two strings and > translate them later in the program. They're there for selecting rows. > Actually, with the sophisticated dtypes (datatypes) now in numpy, you can now store strings and floats in the same array. However, your solution sounds sensible and simpler than creating such an array, so you may prefer to carry on. From rshepard at appl-ecosys.com Mon Sep 11 12:48:03 2006 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Mon, 11 Sep 2006 09:48:03 -0700 (PDT) Subject: [SciPy-user] NumPy Newcomer Questions: Reading Data Stream & Building Array In-Reply-To: <45059169.9080305@astraw.com> References: <200609082233.28422.a.u.r.e.l.i.a.n@gmx.net> <200609090106.41829.a.u.r.e.l.i.a.n@gmx.net> <45059169.9080305@astraw.com> Message-ID: On Mon, 11 Sep 2006, Andrew Straw wrote: > Actually, with the sophisticated dtypes (datatypes) now in numpy, you can > now store strings and floats in the same array. However, your solution > sounds sensible and simpler than creating such an array, so you may prefer > to carry on. Thank you, Andrew. Thinking more about the process, I decided that writing the parsed and processed stream into a sqlite3 table is the better approach. This gives us permanent storage of the digital images of each scanned paper form so they can be compared if a data audit is required. Also, it is easier -- for me, at least -- to write the SQL select statements to calculate column averages grouped by the two string values. It is these average values that are to be inserted into a 2D array. Rich -- Richard B. Shepard, Ph.D.
| The Environmental Permitting Applied Ecosystem Services, Inc.(TM) | Accelerator Voice: 503-667-4517 Fax: 503-667-8863 From ryanlists at gmail.com Mon Sep 11 18:51:52 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 11 Sep 2006 17:51:52 -0500 Subject: [SciPy-user] signal.lti In-Reply-To: <45021503.90307@ee.byu.edu> References: <45021503.90307@ee.byu.edu> Message-ID: I thought about this over the weekend. I was wrong about the matrices not being correct. The A matrix really is just 0. So, passing in the numerator and denominator polynomials does correctly create a system, you just can't take its step response. I will play with other systems for now. Is anyone maintaining this code? I am going to be using it more in the systems modeling portion of my mechatronics class and I am trying to convert students from Matlab. I will also be teaching Dynamic System Modeling in the spring which is basically transfer functions and impulse and step responses. I would be willing to invest some time in this if no one is doing it. But if someone already knows the code and can easily modify it, that would be great. Ryan On 9/8/06, Travis Oliphant wrote: > Ryan Krauss wrote: > > >I am having a rough time trying to use the lti portion of > >scipy.signal. For starters, I just wanted to generate a step response > >for an integrator (1/s). > > > It turns out your starting-point is a special-case not handled by the > current code (I checked this on old SciPy and it raised errors as > well). The current code is "general-purpose" but does not handle > lonely integrators. Thus, any time your denominator ends in a zero, you > will have trouble. This case needs to be dealt with separately but so > far isn't (the error could be more informative of course). You would be > better off trying a simple first-order system as your starting point. > > 1/(s+1) > > Which works fine for me with current SciPy.
> > from pylab import * > from scipy import * > > plot(*signal.step((1,[1,1]))) > > > > > > -Travis > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > From oliphant at ee.byu.edu Mon Sep 11 19:25:32 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 11 Sep 2006 17:25:32 -0600 Subject: [SciPy-user] signal.lti In-Reply-To: References: <45021503.90307@ee.byu.edu> Message-ID: <4505F06C.1040703@ee.byu.edu> Ryan Krauss wrote: >I thought about this over the weekend. I was wrong about the matrices >not being correct. The A matrix really is just 0. So, passing in the >numerator and denominator polynomials does correctly create a system, >you just can't take its step response. > > > >I will play with other systems for now. > >Is anyone maintaining this code? > Yes. If it's in the SciPy space it's being maintained (by me at least). Some things were pulled out into the sandbox (ga, cow) because they were not being maintained. I use the lti stuff whenever I teach our signals class. -Travis From lfriedri at imtek.de Tue Sep 12 01:04:06 2006 From: lfriedri at imtek.de (Lars Friedrich) Date: Tue, 12 Sep 2006 07:04:06 +0200 Subject: [SciPy-user] signal.lti In-Reply-To: <4505F06C.1040703@ee.byu.edu> References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu> Message-ID: <1158037447.6722.4.camel@gdur.breisach> Hello, > Ryan Krauss wrote: > > >I thought about this over the weekend. I was wrong about the matrices > >not being correct. The A matrix really is just 0. So, passing in the > >numerator and denominator polynomials does correctly create a system, > >you just can't take its step response. shouldn't signal.lsim2 be able to handle such systems? 
I tried the following:

#*************** start shell session
from scipy import signal
from numpy import *

sys1=signal.lti((1),(1,0))

signal.lsim2(sys1, ones(10), arange(10))

Traceback (most recent call last):
  File "", line 1, in ?
  File "/usr/lib/python2.4/site-packages/scipy/signal/ltisys.py", line 333, in lsim2
    ufunc = interpolate.linear_1d(T, U, axis=0, bounds_error=0, fill_value=0)
AttributeError: 'module' object has no attribute 'linear_1d'
#***************** end shell session

What about this error? I get the same error with sys=signal.lti((1),(1,1)). Thanks Lars

From nwagner at iam.uni-stuttgart.de Tue Sep 12 05:17:47 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 12 Sep 2006 11:17:47 +0200 Subject: [SciPy-user] signal.lti In-Reply-To: <1158037447.6722.4.camel@gdur.breisach> References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu> <1158037447.6722.4.camel@gdur.breisach> Message-ID: <45067B3B.105@iam.uni-stuttgart.de> Lars Friedrich wrote: > Hello, > > >> Ryan Krauss wrote: >> >> >>> I thought about this over the weekend. I was wrong about the matrices >>> not being correct. The A matrix really is just 0. So, passing in the >>> numerator and denominator polynomials does correctly create a system, >>> you just can't take its step response. >>> > > shouldn't signal.lsim2 be able to handle such systems? I tried the > following:
>
> #*************** start shell session
> from scipy import signal
> from numpy import *
>
> sys1=signal.lti((1),(1,0))
>
> signal.lsim2(sys1, ones(10), arange(10))
>
> Traceback (most recent call last):
>   File "", line 1, in ?
>   File "/usr/lib/python2.4/site-packages/scipy/signal/ltisys.py", line 333, in lsim2
>     ufunc = interpolate.linear_1d(T, U, axis=0, bounds_error=0, fill_value=0)
> AttributeError: 'module' object has no attribute 'linear_1d'
> #***************** end shell session
>
> What about this error? I get the same error with > sys=signal.lti((1),(1,1)).
> > Thanks > Lars > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > BTW, an "entry" for lsim2 is missing help (signal) yields ... Linear Systems: lti -- linear time invariant system object. lsim -- continuous-time simulation of output to linear system. impulse -- impulse response of linear, time-invariant (LTI) system. step -- step response of continous-time LTI system. Nils From stefan at sun.ac.za Tue Sep 12 04:37:14 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 12 Sep 2006 10:37:14 +0200 Subject: [SciPy-user] signal.lti In-Reply-To: <1158037447.6722.4.camel@gdur.breisach> References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu> <1158037447.6722.4.camel@gdur.breisach> Message-ID: <20060912083714.GK13530@mentat.za.net> On Tue, Sep 12, 2006 at 07:04:06AM +0200, Lars Friedrich wrote: > > >I thought about this over the weekend. I was wrong about the matrices > > >not being correct. The A matrix really is just 0. So, passing in the > > >numerator and denominator polynomials does correctly create a system, > > >you just can't take its step response. > > shouldn't signal.lsim2 be able to handle such systems? I tried the > following: > > #*************** start shell session > from scipy import signal > from numpy import * > > sys1=signal.lti((1),(1,0)) > > signal.lsim2(sys1, ones(10), arange(10)) > > Traceback (most recent call last): > File "", line 1, in ? > File "/usr/lib/python2.4/site-packages/scipy/signal/ltisys.py", line > 333, in lsim2 > ufunc = interpolate.linear_1d(T, U, axis=0, bounds_error=0, > fill_value=0) > AttributeError: 'module' object has no attribute 'linear_1d' > #***************** end shell session Should be fixed in SVN now. 
Cheers
Stéfan

From nmarais at sun.ac.za Tue Sep 12 08:27:29 2006
From: nmarais at sun.ac.za (Neilen Marais)
Date: Tue, 12 Sep 2006 14:27:29 +0200
Subject: [SciPy-user] Polynomials in several variables
Message-ID:

Hi

Just wondering if a module similar to the older Scientific Python multivariate polynomial module (Scientific.Functions.Polynomial) is in scipy. It works similarly to the 1d polynomial functions except that an n-dimensional array of coefficients is passed in for a polynomial in n variables.

Thanks
Neilen

--
you know its kind of tragic
we live in the new world
but we've lost the magic
-- Battery 9 (www.battery9.co.za)

From jdhunter at ace.bsd.uchicago.edu Tue Sep 12 10:00:43 2006
From: jdhunter at ace.bsd.uchicago.edu (John Hunter)
Date: Tue, 12 Sep 2006 09:00:43 -0500
Subject: [SciPy-user] problem with interp1d
Message-ID: <87wt895gms.fsf@peds-pc311.bsd.uchicago.edu>

I thought the following code would resample the irregularly spaced 't' to the regularly spaced 'newt' but I am getting a traceback

import scipy
import scipy.interpolate
t = scipy.cumsum(scipy.randn(100)) # increasing time
s = scipy.sin(2*scipy.pi*t)
interp = scipy.interpolate.interp1d(t, s)
newt = scipy.arange(min(t), max(t), 0.1)
news = interp(newt)

> python tst.py
Traceback (most recent call last):
  File "tst.py", line 7, in ?
    news = interp(newt)
  File "/usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py", line 154, in __call__
    out_of_bounds = self._check_bounds(x_new)
  File "/usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py", line 208, in _check_bounds
    raise ValueError, " A value in x_new is below the"\
ValueError: A value in x_new is below the interpolation range.

In [2]: scipy.__version__
Out[2]: '0.5.2.dev2196'

Any ideas?
Thanks,
JDH

From jdhunter at ace.bsd.uchicago.edu Tue Sep 12 10:08:43 2006
From: jdhunter at ace.bsd.uchicago.edu (John Hunter)
Date: Tue, 12 Sep 2006 09:08:43 -0500
Subject: [SciPy-user] problem with interp1d
In-Reply-To: <87wt895gms.fsf@peds-pc311.bsd.uchicago.edu> (John Hunter's message of "Tue, 12 Sep 2006 09:00:43 -0500")
References: <87wt895gms.fsf@peds-pc311.bsd.uchicago.edu>
Message-ID: <873bax5g9g.fsf@peds-pc311.bsd.uchicago.edu>

>>>>> "John" == John Hunter writes:

    John> I thought the following code would resample the irregularly
    John> spaced 't' to the regularly spaced 'newt' but I am getting a
    John> traceback

OK, please ignore me. Because I was using randn to generate the intervals, I was getting non-monotonic x. Duh

import scipy
import scipy.interpolate
t = scipy.cumsum(scipy.rand(100)) # increasing time
s = scipy.sin(0.1*2*scipy.pi*t)
interp = scipy.interpolate.interp1d(t, s)
newt = scipy.arange(min(t), max(t), 0.1)
news = interp(newt)

from pylab import subplot, show
ax = subplot(111)
ax.plot(t, s, 'o', newt, news, '-')
show()

From jstrunk at enthought.com Tue Sep 12 10:26:51 2006
From: jstrunk at enthought.com (Jeff Strunk)
Date: Tue, 12 Sep 2006 09:26:51 -0500
Subject: [SciPy-user] REMINDER: ISP changeover today 9/12 7:00 PM Central
Message-ID: <200609120926.51762.jstrunk@enthought.com>

Good morning,

This is a reminder for this evening's network maintenance. Unfortunately, our recent change in internet service providers is not working out. We will be switching to a more reliable provider tonight at 7:00 PM Central. Please allow for up to two hours of downtime. I will send an email announcing the start and completion of this maintenance.

This will effect all Enthought servers as well as the SciPy server which hosts many Open Source projects. Please pass this message along to people that I have missed. If you have any questions, please direct them to me.

We apologize for the inconvenience.

Jeff Strunk
Enthought, Inc.

From hasslerjc at adelphia.net Tue Sep 12 10:43:48 2006
From: hasslerjc at adelphia.net (John Hassler)
Date: Tue, 12 Sep 2006 10:43:48 -0400
Subject: [SciPy-user] Scipy works on AMD now
Message-ID: <4506C7A4.3030004@adelphia.net>

On an AMD AthlonXP (Model 6) 1600+ with:
Python 2.4.3
numpy '1.0b5'
scipy '0.5.1'

numpy.test() runs without error. scipy.test() shows one error (see below). I'm not sure what this is, but all of my own numerical programs run without problems. Matplotlib didn't install properly, but when I added the missing files, it worked. All of my programs are now happily working.

Many thanks to the people who put this all together.
john ====================================================================== ERROR: check loadmat case cellnest ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\program files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 80, in cc self._check_case(name, expected) File "D:\program files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 75, in _check_case self._check_level(k_label, expected, matdict[k]) File "D:\program files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 33, in _check_level self._check_level(level_label, ev, actual[i]) File "D:\program files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 30, in _check_level assert len(expected) == len(actual), "Different list lengths at %s" % label TypeError: len() of unsized object ---------------------------------------------------------------------- Ran 1569 tests in 23.143s FAILED (errors=1) From ryanlists at gmail.com Tue Sep 12 18:21:32 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 12 Sep 2006 17:21:32 -0500 Subject: [SciPy-user] Scipy works on AMD now In-Reply-To: <4506C7A4.3030004@adelphia.net> References: <4506C7A4.3030004@adelphia.net> Message-ID: John, Are you running 32bit or 64bit? I have heard of people wrestling with 64bit, but never heard anyone fully declare victory. I am going to build a new computer soon and wonder if I should get a 64bit AMD chip. Ryan On 9/12/06, John Hassler wrote: > On an AMD AthlonXP (Model 6) 1600+ with: > Python 2.4.3 > numpy '1.0b5' > scipy '0.5.1' > > numpy.test() runs without error. scipy.test() shows one error (see > below). I'm not sure what this is, but all of my own numerical programs > run without problems. Matplotlib didn't install properly, but when I > added the missing files, it worked. All of my programs are now happily > working. > > Many thanks to the people who put this all together. 
> john > > > ====================================================================== > ERROR: check loadmat case cellnest > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "D:\program > files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 80, in cc > self._check_case(name, expected) > File "D:\program > files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 75, > in _check_case > self._check_level(k_label, expected, matdict[k]) > File "D:\program > files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 33, > in _check_level > self._check_level(level_label, ev, actual[i]) > File "D:\program > files\Python24\Lib\site-packages\scipy\io\tests\test_mio.py", line 30, > in _check_level > assert len(expected) == len(actual), "Different list lengths at %s" > % label > TypeError: len() of unsized object > > ---------------------------------------------------------------------- > Ran 1569 tests in 23.143s > > FAILED (errors=1) > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From hasslerjc at adelphia.net Tue Sep 12 19:04:29 2006 From: hasslerjc at adelphia.net (John Hassler) Date: Tue, 12 Sep 2006 19:04:29 -0400 Subject: [SciPy-user] Scipy works on AMD now In-Reply-To: References: <4506C7A4.3030004@adelphia.net> Message-ID: <45073CFD.3020001@adelphia.net> An HTML attachment was scrubbed... URL: From jstrunk at enthought.com Tue Sep 12 19:30:03 2006 From: jstrunk at enthought.com (Jeff Strunk) Date: Tue, 12 Sep 2006 18:30:03 -0500 Subject: [SciPy-user] ISP changeover beginning in 30 minutes (7:00 PM Central) Message-ID: <200609121830.03566.jstrunk@enthought.com> Good evening, We will begin switching our internet service over in about 30 minutes. Please allow for up to two hours of downtime. I will send an email announcing the completion of this maintenance. 
This will effect all Enthought servers as well as the SciPy server which hosts many Open Source projects. We apologize for the inconvenience. Jeff Strunk Enthought, Inc. From peridot.faceted at gmail.com Tue Sep 12 19:46:35 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Tue, 12 Sep 2006 19:46:35 -0400 Subject: [SciPy-user] Scipy works on AMD now In-Reply-To: References: <4506C7A4.3030004@adelphia.net> Message-ID: On 12/09/06, Ryan Krauss wrote: > John, > > Are you running 32bit or 64bit? I have heard of people wrestling with > 64bit, but never heard anyone fully declare victory. I am going to > build a new computer soon and wonder if I should get a 64bit AMD chip. > > Ryan On an AMD opteron 64-bit system, numpy tests fine but scipy has problems: Ran 1569 tests in 3.253s FAILED (failures=17, errors=1) scipy.version.version -> '0.5.1' numpy.version.version -> '1.0b5' I'll attach the output of "python -c 'import scipy; scipy.test()'". FWIW, I haven't run into anything I needed that didn't work (one of the several spline interpolation functions was broken because of a size mismatch, in the version I was using), it wasn't hard to write portable code (Fortran wrappers need int32, not int), and the machine is fast. A. M. Archibald -------------- next part -------------- A non-text attachment was scrubbed... 
Name: results Type: application/octet-stream Size: 24363 bytes Desc: not available URL: From stefan at sun.ac.za Tue Sep 12 20:00:58 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 13 Sep 2006 02:00:58 +0200 Subject: [SciPy-user] Scipy works on AMD now In-Reply-To: <4506C7A4.3030004@adelphia.net> References: <4506C7A4.3030004@adelphia.net> Message-ID: <20060913000057.GA24793@mentat.za.net> On Tue, Sep 12, 2006 at 10:43:48AM -0400, John Hassler wrote: > On an AMD AthlonXP (Model 6) 1600+ with: > Python 2.4.3 > numpy '1.0b5' > scipy '0.5.1' > ====================================================================== > ERROR: check loadmat case cellnest > ---------------------------------------------------------------------- Thanks, John. This was filed under ticket #258 at http://projects.scipy.org/scipy/scipy/ticket/258 Matthew Brett is busy rewriting some of those routines, so we should have it sorted out soon. Regards St?fan From jstrunk at enthought.com Tue Sep 12 21:11:58 2006 From: jstrunk at enthought.com (Jeff Strunk) Date: Tue, 12 Sep 2006 20:11:58 -0500 Subject: [SciPy-user] ISP Changeover complete Message-ID: <200609122011.58960.jstrunk@enthought.com> Good evening, We have completed our internet service switchover. Thank you for your patience. If you have any trouble, please let me know. Thank you, Jeff Strunk IT Administrator Enthought, Inc. From ryanlists at gmail.com Tue Sep 12 23:37:28 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 12 Sep 2006 22:37:28 -0500 Subject: [SciPy-user] signal.lti In-Reply-To: <1158037447.6722.4.camel@gdur.breisach> References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu> <1158037447.6722.4.camel@gdur.breisach> Message-ID: I think Lars is right. I made 1 change to line 333 of signal/ltisys.py: # ufunc = interpolate.linear_1d(T, U, axis=0, bounds_error=0, fill_value=0) ufunc = interpolate.interp1d(T, U, axis=0, bounds_error=0, fill_value=0) It seems like interp1d replaces linear_1d. 
With this change, I was able to generate this fabulous step response of an intergrator based on this code: T=2.0 fs=100 dt=1.0/fs t=arange(0.0,T+dt,dt) u=where(t>0.5,1,0) ylim([-0.1,1.1]) mysys=signal.lti(1,[1,0]) yo=signal.lsim2(mysys,u,t) cla() plot(t,u) plot(t,yo[1]) legend(['input','output'],2) xlabel('Time (sec)') ylabel('Amplitude') Ryan On 9/12/06, Lars Friedrich wrote: > Hello, > > > Ryan Krauss wrote: > > > > >I thought about this over the weekend. I was wrong about the matrices > > >not being correct. The A matrix really is just 0. So, passing in the > > >numerator and denominator polynomials does correctly create a system, > > >you just can't take its step response. > > shouldn't signal.lsim2 be able to handle such systems? I tried the > following: > > #*************** start shell session > from scipy import signal > from numpy import * > > sys1=signal.lti((1),(1,0)) > > signal.lsim2(sys1, ones(10), arange(10)) > > Traceback (most recent call last): > File "", line 1, in ? > File "/usr/lib/python2.4/site-packages/scipy/signal/ltisys.py", line > 333, in lsim2 > ufunc = interpolate.linear_1d(T, U, axis=0, bounds_error=0, > fill_value=0) > AttributeError: 'module' object has no attribute 'linear_1d' > #***************** end shell session > > What about this error? I get the same errror with > sys=signal.lti((1),(1,1)). > > Thanks > Lars > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- A non-text attachment was scrubbed... 
Name: integrator.png Type: image/png Size: 27393 bytes Desc: not available URL: From strawman at astraw.com Wed Sep 13 01:39:47 2006 From: strawman at astraw.com (Andrew Straw) Date: Tue, 12 Sep 2006 22:39:47 -0700 Subject: [SciPy-user] Scipy works on AMD now In-Reply-To: References: <4506C7A4.3030004@adelphia.net> Message-ID: <450799A3.1030600@astraw.com> Ryan Krauss wrote: > John, > > Are you running 32bit or 64bit? I have heard of people wrestling with > 64bit, but never heard anyone fully declare victory. numpy/scipy (and just about all other open source software I've tried) work great in full 64-bit mode linux (both Debian sarge and Ubuntu Dapper). In fact, the only reasons I keep a 32-bit chroot around are Google Earth, Skype, Flash, and to build x86 packages on a fast machine. (This machine is, by far, faster than others I have access too. This in fact is a problem -- I only want to work on it now!) I have not tried Windows in 64-bit mode, but I hear that driver support is often an issue. From stefan at sun.ac.za Wed Sep 13 05:33:37 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 13 Sep 2006 11:33:37 +0200 Subject: [SciPy-user] signal.lti In-Reply-To: References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu> <1158037447.6722.4.camel@gdur.breisach> Message-ID: <20060913093337.GB7168@mentat.za.net> On Tue, Sep 12, 2006 at 10:37:28PM -0500, Ryan Krauss wrote: > I think Lars is right. I made 1 change to line 333 of signal/ltisys.py: > # ufunc = interpolate.linear_1d(T, U, axis=0, bounds_error=0, > fill_value=0) > ufunc = interpolate.interp1d(T, U, axis=0, bounds_error=0, fill_value=0) > > It seems like interp1d replaces linear_1d. 
With this change, I was > able to generate this fabulous step response of an intergrator based > on this code: As mentioned yesterday: http://projects.scipy.org/scipy/scipy/changeset/2196 Regards St?fan From nwagner at iam.uni-stuttgart.de Wed Sep 13 05:50:58 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 13 Sep 2006 11:50:58 +0200 Subject: [SciPy-user] L-BFGS in scipy Message-ID: <4507D482.9040707@iam.uni-stuttgart.de> Hi all, Has someone implemented the limited memory BFGS method in scipy ? Nils http://citeseer.ist.psu.edu/liu89limited.html From jelle.feringa at ezct.net Wed Sep 13 07:50:01 2006 From: jelle.feringa at ezct.net (Jelle Feringa / EZCT Architecture & Design Research) Date: Wed, 13 Sep 2006 13:50:01 +0200 Subject: [SciPy-user] Scipy works on AMD now In-Reply-To: <45073CFD.3020001@adelphia.net> Message-ID: <007401c6d72a$bf23cab0$c000a8c0@JELLE> Many thanks to the people who put this all together. john +1 Great news! -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Wed Sep 13 08:34:21 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 13 Sep 2006 07:34:21 -0500 Subject: [SciPy-user] L-BFGS in scipy In-Reply-To: <4507D482.9040707@iam.uni-stuttgart.de> References: <4507D482.9040707@iam.uni-stuttgart.de> Message-ID: <4507FACD.9070902@gmail.com> Nils Wagner wrote: > Hi all, > > Has someone implemented the limited memory BFGS method in scipy ? Yes. scipy.optimize.fmin_l_bfgs_b(). Please grep for these things. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From nwagner at iam.uni-stuttgart.de Wed Sep 13 08:47:28 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 13 Sep 2006 14:47:28 +0200 Subject: [SciPy-user] L-BFGS in scipy In-Reply-To: <4507FACD.9070902@gmail.com> References: <4507D482.9040707@iam.uni-stuttgart.de> <4507FACD.9070902@gmail.com> Message-ID: <4507FDE0.1040702@iam.uni-stuttgart.de> Robert Kern wrote: > Nils Wagner wrote: > >> Hi all, >> >> Has someone implemented the limited memory BFGS method in scipy ? >> > > Yes. scipy.optimize.fmin_l_bfgs_b(). Please grep for these things. > > Thank you Robert. If bounds=None we have an unconstraint version. Thus fmin_l_bfgs_b is also an unconstrained optimizer. I missed that. Maybe fmin_l_bfgs_b should also be added to the list of general-purpose optimization routines help (optimize) yields A collection of general-purpose optimization routines. fmin -- Nelder-Mead Simplex algorithm (uses only function calls) fmin_powell -- Powell's (modified) level set method (uses only function calls) fmin_cg -- Non-linear (Polak-Ribiere) conjugate gradient algorithm (can use function and gradient). fmin_bfgs -- Quasi-Newton method (Broydon-Fletcher-Goldfarb-Shanno); (can use function and gradient) fmin_ncg -- Line-search Newton Conjugate Gradient (can use function, gradient and Hessian). leastsq -- Minimize the sum of squares of M equations in N unknowns given a starting estimate. Constrained Optimizers (multivariate) fmin_l_bfgs_b -- Zhu, Byrd, and Nocedal's L-BFGS-B constrained optimizer (if you use this please quote their papers -- see help) and I disregard fmin_l_bfgs_b because it is given in the section Constrained Optimizers. Sorry for the noise. 
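[For reference: the unconstrained use of fmin_l_bfgs_b discussed above — bounds=None with approx_grad=True so no gradient function is needed — can be sketched as follows. This is a minimal illustrative example, not code from the thread; the quadratic objective is a stand-in.]

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

def f(x):
    # simple quadratic with its minimum at (2, -1); illustrative only
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

# bounds=None (the default) makes this an unconstrained minimization;
# approx_grad=True estimates the gradient by finite differences
# instead of requiring an fprime argument
xopt, fopt, info = fmin_l_bfgs_b(f, x0=np.array([0.0, 0.0]),
                                 approx_grad=True)
print(xopt)  # close to [ 2. -1.]
```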
Nils From schofield at ftw.at Wed Sep 13 08:51:45 2006 From: schofield at ftw.at (Ed Schofield) Date: Wed, 13 Sep 2006 14:51:45 +0200 Subject: [SciPy-user] Efficient submatrix of a sparse matrix In-Reply-To: <88FE1B9D-B712-4201-860A-0E713C49A81E@yahoo.ca> References: <88FE1B9D-B712-4201-860A-0E713C49A81E@yahoo.ca> Message-ID: <4507FEE1.6010601@ftw.at> Brendan Simons wrote: > Hi all, > > I have a neat little problem for which I think sparse matrices are > the answer. I need to store a bunch of overlapping "intervals", (a, > b pairs for which any a <= val < b crosses that interval) in a manner > which makes "stabbing inquiries" (which intervals cross a specified > value) easy. The canonical way to to this is with interval trees (a > good summary here: http://w3.jouy.inra.fr/unites/miaj/public/vigneron/ > cs4235/l5cs4235.pdf), however I think I can simplify things as follows: > > 1) perform an argsort on the interval a and b endpoints. This gives > the position of each interval in an "order" matrix M, where each row > and each column contains only one interval, and intervals are sorted > in rows by start point, and in columns by endpoint > > 2) for a "stab" point x, do a binary search to find the row i and > column j where x could be inserted into M and maintain its ordering. > The submatrix of M[:i, j:] would contains all those intervals which > start before x, and end after x. > > Since the order matrix M is mostly empty it would make sense to use a > sparse matrix storage scheme, however the only one in scipy.sparse > that allows 2d slicing is the dok_matrix, and the slicing algorithm > is O(n), which is much too expensive for my purposes (I have to do > thousands of stabbing inquiries on a space containing thousands of > intervals). > > Is there no more efficient algorithm for 2d slicing of a sparse matrix? 
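[An aside, not from the post: the binary-search half of the scheme above already answers the *counting* form of a stabbing query without building the order matrix at all. Every interval [a, b) with b <= x also has a <= x, so the number of intervals containing x is (#starts <= x) - (#ends <= x) — two searchsorted calls over pre-sorted endpoint arrays. It yields only the count, not the interval identities that the submatrix approach recovers. The helper name below is illustrative.]

```python
import numpy as np

def stab_count(sorted_starts, sorted_ends, x):
    # number of half-open intervals [a, b) containing x, computed as
    # (#starts <= x) - (#ends <= x) via two binary searches
    i = np.searchsorted(sorted_starts, x, side='right')  # starts <= x
    j = np.searchsorted(sorted_ends, x, side='right')    # ends <= x
    return i - j

# intervals [0, 5), [2, 3), [4, 8); endpoints sorted independently
starts = np.sort(np.array([0.0, 2.0, 4.0]))
ends = np.sort(np.array([5.0, 3.0, 8.0]))
print(stab_count(starts, ends, 4.0))  # 2: intervals [0,5) and [4,8)
```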
> Hi Brendan, You should be able to obtain 2d slices of sparse matrices much more efficiently than O(n) using the csc_matrix, csr_matrix, and lil_matrix objects. Some of the code you'd need, such as the binary search, is there already, although it deserves to be cleaned up and streamlined. Extending it to allow 2d slices should be simple. I don't have much time to work on it myself right now, but I'd be happy to give you pointers and suggestions ... I'd gladly accept a patch! -- Ed From baum at stommel.tamu.edu Wed Sep 13 15:06:02 2006 From: baum at stommel.tamu.edu (Manbreaker Crag) Date: Wed, 13 Sep 2006 14:06:02 -0500 Subject: [SciPy-user] Compiling on a 64-bit Athlon Linux Machine Message-ID: <20060913190059.M49770@stommel.tamu.edu> I've successfully compiled and installed Numpy (latest SVN version), Scipy (latest SVN version), FFTW 3.1.2 and GotoBLAS 1.06 on a 64-bit dual-processor Athlon machine running Linux Fedora Core 3. Detailed directions on how this was done are available at: http://pong.tamu.edu/tiki/tiki-view_blog_post.php?blogId=6&postId=112 Corrections, suggestions, etc. are welcomed. Steve Baum From wangxj.uc at gmail.com Wed Sep 13 16:22:26 2006 From: wangxj.uc at gmail.com (Xiaojian Wang) Date: Wed, 13 Sep 2006 13:22:26 -0700 Subject: [SciPy-user] L-BFGS in scipy In-Reply-To: <4507FDE0.1040702@iam.uni-stuttgart.de> References: <4507D482.9040707@iam.uni-stuttgart.de> <4507FACD.9070902@gmail.com> <4507FDE0.1040702@iam.uni-stuttgart.de> Message-ID: Hi, Is anybody know which optimize module can handle general constrains? not like lower(i) < Xi < upper(i) in scipy.optimize.fmin_l_bfgs_b(). instead, I would like to include constraint: Gi = cos(X1) + X2**2 + X3*X4 <= 0.0 Xiaojian On 9/13/06, Nils Wagner wrote: > > Robert Kern wrote: > > Nils Wagner wrote: > > > >> Hi all, > >> > >> Has someone implemented the limited memory BFGS method in scipy ? > >> > > > > Yes. scipy.optimize.fmin_l_bfgs_b(). Please grep for these things. > > > > > Thank you Robert. 
> If bounds=None we have an unconstraint version. > Thus fmin_l_bfgs_b is also an unconstrained optimizer. I missed that. > Maybe fmin_l_bfgs_b should also be added to the list of general-purpose > optimization routines > > help (optimize) yields > > A collection of general-purpose optimization routines. > > fmin -- Nelder-Mead Simplex algorithm > (uses only function calls) > fmin_powell -- Powell's (modified) level set method (uses only > function calls) > fmin_cg -- Non-linear (Polak-Ribiere) conjugate gradient > algorithm > (can use function and gradient). > fmin_bfgs -- Quasi-Newton method > (Broydon-Fletcher-Goldfarb-Shanno); > (can use function and gradient) > fmin_ncg -- Line-search Newton Conjugate Gradient (can use > function, gradient and Hessian). > leastsq -- Minimize the sum of squares of M equations in > N unknowns given a starting estimate. > > > Constrained Optimizers (multivariate) > > fmin_l_bfgs_b -- Zhu, Byrd, and Nocedal's L-BFGS-B constrained > optimizer > (if you use this please quote their papers -- > see help) > > and I disregard fmin_l_bfgs_b because it is given in the section > Constrained Optimizers. > > Sorry for the noise. > > Nils > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nwagner at iam.uni-stuttgart.de Thu Sep 14 02:24:00 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 14 Sep 2006 08:24:00 +0200 Subject: [SciPy-user] L-BFGS in scipy In-Reply-To: References: <4507D482.9040707@iam.uni-stuttgart.de> <4507FACD.9070902@gmail.com> <4507FDE0.1040702@iam.uni-stuttgart.de> Message-ID: <4508F580.8070703@iam.uni-stuttgart.de> Xiaojian Wang wrote: > Hi, > Is anybody know which optimize module can handle general constrains? > not like > lower(i) < Xi < upper(i) in scipy.optimize.fmin_l_bfgs_b(). 
> instead, I would like to include constraint: > Gi = cos(X1) + X2**2 + X3*X4 <= 0.0 > > Xiaojian > > fmin_cobyla(func, x0, cons, args=(), consargs=None, rhobeg=1.0, rhoend=0.0001, iprint=1, maxfun=1000) Minimize a function using the Constrained Optimization BY Linear Approximation (COBYLA) method Arguments: func -- function to minimize. Called as func(x, *args) x0 -- initial guess to minimum cons -- a sequence of functions that all must be >=0 (a single function if only 1 constraint) args -- extra arguments to pass to function consargs -- extra arguments to pass to constraints (default of None means use same extra arguments as those passed to func). Use () for no extra arguments. rhobeg -- reasonable initial changes to the variables rhoend -- final accuracy in the optimization (not precisely guaranteed) iprint -- controls the frequency of output: 0 (no output),1,2,3 maxfun -- maximum number of function evaluations. Nils > > > > On 9/13/06, *Nils Wagner* > wrote: > > Robert Kern wrote: > > Nils Wagner wrote: > > > >> Hi all, > >> > >> Has someone implemented the limited memory BFGS method in scipy ? > >> > > > > Yes. scipy.optimize.fmin_l_bfgs_b(). Please grep for these things. > > > > > Thank you Robert. > If bounds=None we have an unconstraint version. > Thus fmin_l_bfgs_b is also an unconstrained optimizer. I missed that. > Maybe fmin_l_bfgs_b should also be added to the list of > general-purpose > optimization routines > > help (optimize) yields > > A collection of general-purpose optimization routines. > > fmin -- Nelder-Mead Simplex algorithm > (uses only function calls) > fmin_powell -- Powell's (modified) level set method (uses only > function calls) > fmin_cg -- Non-linear (Polak-Ribiere) conjugate gradient > algorithm > (can use function and gradient). > fmin_bfgs -- Quasi-Newton method > (Broydon-Fletcher-Goldfarb-Shanno); > (can use function and gradient) > fmin_ncg -- Line-search Newton Conjugate Gradient (can use > function, gradient and Hessian). 
> leastsq -- Minimize the sum of squares of M equations in > N unknowns given a starting estimate. > > > Constrained Optimizers (multivariate) > > fmin_l_bfgs_b -- Zhu, Byrd, and Nocedal's L-BFGS-B constrained > optimizer > (if you use this please quote their papers -- > see help) > > and I disregard fmin_l_bfgs_b because it is given in the section > Constrained Optimizers. > > Sorry for the noise. > > Nils > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From williams at astro.ox.ac.uk Thu Sep 14 10:55:38 2006 From: williams at astro.ox.ac.uk (Michael Williams) Date: Thu, 14 Sep 2006 15:55:38 +0100 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) Message-ID: <20060914145538.GB8453@astro.ox.ac.uk> Hi, I'm running scipy 0.8.5 and numpy 1.0b5 built from source using Debian's stable environment (gcc 3.3, Python 2.3.5) on an AMD64 machine. numpy.test() runs fine, but scipy.test does not. 
Here are the errors: ====================================================================== ERROR: generic 1d filter 1 ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/williams/usr/lib/python2.3/site-packages/scipy/ndimage/tests/test_ndimage.py", line 1141, in test_generic_filter1d01 extra_keywords = {'total': weights.sum()}) File "/home/williams/usr//lib/python2.3/site-packages/scipy/ndimage/filters.py", line 632, in generic_filter1d mode, cval, origin, extra_arguments, extra_keywords) MemoryError ====================================================================== ERROR: check loadmat case cellnest ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/williams/usr/lib/python2.3/site-packages/scipy/io/tests/test_mio.py", line 80, in cc self._check_case(name, expected) File "/home/williams/usr/lib/python2.3/site-packages/scipy/io/tests/test_mio.py", line 75, in _check_case self._check_level(k_label, expected, matdict[k]) File "/home/williams/usr/lib/python2.3/site-packages/scipy/io/tests/test_mio.py", line 33, in _check_level self._check_level(level_label, ev, actual[i]) File "/home/williams/usr/lib/python2.3/site-packages/scipy/io/tests/test_mio.py", line 30, in _check_level assert len(expected) == len(actual), "Different list lengths at %s" % label TypeError: len() of unsized object ---------------------------------------------------------------------- Ran 1569 tests in 2.105s FAILED (errors=2) I would be very grateful for any suggestions? Indeed, should I even worry about this? 
Thanks, -- Michael Williams Theoretical Physics, University of Oxford From williams at astro.ox.ac.uk Thu Sep 14 11:09:34 2006 From: williams at astro.ox.ac.uk (Michael Williams) Date: Thu, 14 Sep 2006 16:09:34 +0100 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: <20060914145538.GB8453@astro.ox.ac.uk> References: <20060914145538.GB8453@astro.ox.ac.uk> Message-ID: <20060914150934.GD8453@astro.ox.ac.uk> On Thu, Sep 14, 2006 at 03:55:38PM +0100, Michael Williams wrote: >====================================================================== >ERROR: check loadmat case cellnest >---------------------------------------------------------------------- I see this particular error on AMD64 is under discussion elsewhere on the list: http://thread.gmane.org/gmane.comp.python.scientific.user/8768/focus=8772. If this is the same problem, sorry for the list noise! I'm still interested in solving the ld filter error though! Cheers, -- Michael Williams Theoretical Physics, University of Oxford From stefan at sun.ac.za Thu Sep 14 11:16:47 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 14 Sep 2006 17:16:47 +0200 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: <20060914145538.GB8453@astro.ox.ac.uk> References: <20060914145538.GB8453@astro.ox.ac.uk> Message-ID: <20060914151647.GI963@mentat.za.net> On Thu, Sep 14, 2006 at 03:55:38PM +0100, Michael Williams wrote: > Hi, > > I'm running scipy 0.8.5 and numpy 1.0b5 built from source using Debian's > stable environment (gcc 3.3, Python 2.3.5) on an AMD64 machine. > numpy.test() runs fine, but scipy.test does not. Here are the errors: I tried 'from __future__ import scipy' but I can't import v0.8.5 or 0.5.8 :) Is this the latest SVN version (today's)? I committed some changes to the mio code this morning, which may have fixed the cellnest error. 
Regards Stéfan From williams at astro.ox.ac.uk Thu Sep 14 12:16:10 2006 From: williams at astro.ox.ac.uk (Michael Williams) Date: Thu, 14 Sep 2006 17:16:10 +0100 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: <20060914151647.GI963@mentat.za.net> References: <20060914145538.GB8453@astro.ox.ac.uk> <20060914151647.GI963@mentat.za.net> Message-ID: <20060914161610.GG8453@astro.ox.ac.uk> On Thu, Sep 14, 2006 at 05:16:47PM +0200, Stefan van der Walt wrote: >I tried 'from __future__ import scipy' but I can't import v0.8.5 >or 0.5.8 :) I have no idea where I got that version number from! It's been a long day. In fact I was running the 0.5.1 release. >Is this the latest SVN version (today's)? I committed some changes to >the mio code this morning, which may have fixed the cellnest error. I just checked out and built numpy and scipy from source (1.0rc1.dev3154 and 0.5.2.dev2198 respectively), and the cellnest error is gone. Thanks very much! (The 1d filter error remains, but has been discussed elsewhere.) Cheers, -- Mike From irbdavid at gmail.com Thu Sep 14 12:27:27 2006 From: irbdavid at gmail.com (David Andrews) Date: Thu, 14 Sep 2006 18:27:27 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel Message-ID: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> Hi, I'm a little lost here - i've d/led and installed everything in the scipy superpack, yet i am still unable to import certain components of scipy and matplotlib / pylab. The only thing that does appear to be fully installed is numpy, which i guess is a start. So here is what I get when i try 'from scipy import *': Traceback (most recent call last): File "", line 1, in ? File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/__init__.py", line 5, in ? from linsolve import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/linsolve.py", line 1, in ?
from scipy.sparse import isspmatrix_csc, isspmatrix_csr, isspmatrix, spdiags File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ? from sparse import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparse.py", line 12, in ? import sparsetools ImportError: Failure linking new module: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so: Library not loaded: /usr/local/lib/libgfortran.1.dylib Referenced from: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so Reason: image not found And with 'from pylab import *': Traceback (most recent call last): File "", line 1, in ? File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", line 1, in ? from matplotlib.pylab import * ImportError: No module named matplotlib.pylab So any suggestions? I previously had a working install of matplotlib/pylab and numpy, if that suggest anything to you? I think sticking a readme file in the superpack might be a good idea. Cheers, Dave From nwagner at iam.uni-stuttgart.de Thu Sep 14 13:31:19 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 14 Sep 2006 19:31:19 +0200 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: <20060914161610.GG8453@astro.ox.ac.uk> References: <20060914145538.GB8453@astro.ox.ac.uk> <20060914151647.GI963@mentat.za.net> <20060914161610.GG8453@astro.ox.ac.uk> Message-ID: On Thu, 14 Sep 2006 17:16:10 +0100 Michael Williams wrote: > On Thu, Sep 14, 2006 at 05:16:47PM +0200, Stefan van der >Walt wrote: >>I tried 'from __future__ import scipy' but I can't import >>v0.8.5 >>or 0.5.8 :) > > I have no idea where I got that version number from! >It's been a long > day. In fact I was running the 0.5.1 release. > >>Is this the latest SVN version (today's)? 
I committed >>some changes to >>the mio code this morning, which may have fixed the >>cellnest error. > > I just checked out and built numpy and scipy from source > (1.0rc1.dev3154 and 0.5.2.dev2198 respectively), and the >cellnest error > is gone. Thanks very much! > But a new bug is introduced in latest svn ... See http://projects.scipy.org/scipy/scipy/ticket/263 Nils > (The ld filter error remains, but has been discussed >elsewhere.) > I cannot reproduce that error... > Cheers, > -- Mike > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From stefan at sun.ac.za Thu Sep 14 15:55:35 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 14 Sep 2006 21:55:35 +0200 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: References: <20060914145538.GB8453@astro.ox.ac.uk> <20060914151647.GI963@mentat.za.net> <20060914161610.GG8453@astro.ox.ac.uk> Message-ID: <20060914195535.GL963@mentat.za.net> On Thu, Sep 14, 2006 at 07:31:19PM +0200, Nils Wagner wrote: > On Thu, 14 Sep 2006 17:16:10 +0100 > Michael Williams wrote: > > On Thu, Sep 14, 2006 at 05:16:47PM +0200, Stefan van der > >Walt wrote: > >>I tried 'from __future__ import scipy' but I can't import > >>v0.8.5 > >>or 0.5.8 :) > > > > I have no idea where I got that version number from! > >It's been a long > > day. In fact I was running the 0.5.1 release. > > > >>Is this the latest SVN version (today's)? I committed > >>some changes to > >>the mio code this morning, which may have fixed the > >>cellnest error. > > > > I just checked out and built numpy and scipy from source > > (1.0rc1.dev3154 and 0.5.2.dev2198 respectively), and the > >cellnest error > > is gone. Thanks very much! > > > But a new bug is introduced in latest svn ... > See > > http://projects.scipy.org/scipy/scipy/ticket/263 Are you sure that this is a bug? 
Please give feedback as requested in the ticket comment. Regards Stéfan From conor.robinson at gmail.com Thu Sep 14 16:01:49 2006 From: conor.robinson at gmail.com (Conor Robinson) Date: Thu, 14 Sep 2006 13:01:49 -0700 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> Message-ID: Strange, I used the superpack on a Mac Pro and everything installed fine. Did you install the pre-compiled gfortran for intel compilers on your machine first? Just a thought. I think they're included in the superpack. Conor On 9/14/06, David Andrews wrote: > Hi, > > I'm a little lost here - i've d/led and installed everything in the > scipy superpack, yet i am still unable to import certain components of > scipy and matplotlib / pylab. The only thing that does appear to be > fully installed is numpy, which i guess is a start. > > So here is what I get when i try 'from scipy import *': > > Traceback (most recent call last): > File "", line 1, in ? > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/__init__.py", > line 5, in ? > from linsolve import * > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/linsolve.py", > line 1, in ? > from scipy.sparse import isspmatrix_csc, isspmatrix_csr, isspmatrix, spdiags > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/__init__.py", > line 5, in ? > from sparse import * > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparse.py", > line 12, in ?
> import sparsetools > ImportError: Failure linking new module: > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so: > Library not loaded: /usr/local/lib/libgfortran.1.dylib > Referenced from: > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so > Reason: image not found > > And with 'from pylab import *': > > Traceback (most recent call last): > File "", line 1, in ? > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > line 1, in ? > from matplotlib.pylab import * > ImportError: No module named matplotlib.pylab > > So any suggestions? > > I previously had a working install of matplotlib/pylab and numpy, if > that suggest anything to you? > > I think sticking a readme file in the superpack might be a good idea. > > Cheers, > > Dave > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From irbdavid at gmail.com Thu Sep 14 16:30:17 2006 From: irbdavid at gmail.com (David Andrews) Date: Thu, 14 Sep 2006 22:30:17 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> Message-ID: <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> Part of the problem would appear to have been me borking the install of gfortran - reinstalling the binaries for it has fixed the scipy issue. The issue with pylab not working persists - should this superpack provide a full working install of matplotlib/pylab? I remember a while back having to fight with backends for it, though that doesnt appear to have a bearing on the current problem i'm having. Thanks for the quick reply Conor, Cheers, Dave On 14/09/06, Conor Robinson wrote: > Strange, I used the superpack on a Mac Pro and everything installed > fine. 
Did you install the pre-compiled gfortran for intel compilers > on your macine first? Just a thought. I think they're included in the > superpack. > > Conor > > On 9/14/06, David Andrews wrote: > > Hi, > > > > I'm a little lost here - i've d/led and installed everything in the > > scipy superpack, yet i am still unable to import certain components of > > scipy and matplotlib / pylab. The only thing that does appear to be > > fully installed is numpy, which i guess is a start. > > > > So here is what I get when i try 'from scipy import *': > > > > Traceback (most recent call last): > > File "", line 1, in ? > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/__init__.py", > > line 5, in ? > > from linsolve import * > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/linsolve.py", > > line 1, in ? > > from scipy.sparse import isspmatrix_csc, isspmatrix_csr, isspmatrix, spdiags > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/__init__.py", > > line 5, in ? > > from sparse import * > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparse.py", > > line 12, in ? > > import sparsetools > > ImportError: Failure linking new module: > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so: > > Library not loaded: /usr/local/lib/libgfortran.1.dylib > > Referenced from: > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so > > Reason: image not found > > > > And with 'from pylab import *': > > > > Traceback (most recent call last): > > File "", line 1, in ? > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > > line 1, in ? 
> > from matplotlib.pylab import * > > ImportError: No module named matplotlib.pylab > > > > So any suggestions? > > > > I previously had a working install of matplotlib/pylab and numpy, if > > that suggest anything to you? > > > > I think sticking a readme file in the superpack might be a good idea. > > > > Cheers, > > > > Dave > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From hasslerjc at adelphia.net Thu Sep 14 16:51:54 2006 From: hasslerjc at adelphia.net (John Hassler) Date: Thu, 14 Sep 2006 16:51:54 -0400 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> Message-ID: <4509C0EA.3020806@adelphia.net> I got the "No module named matplotlib.pylab" error. I'd installed from the windows installer, and after some considerable head-scratching, figured out that the __init__.py file was missing from the matplotlib module ... so Python didn't recognize it as a module. I found the file in the unzipped .egg, copied it to matplotlib, and everybody seems to be happy. john David Andrews wrote: > Part of the problem would appear to have been me borking the install > of gfortran - reinstalling the binaries for it has fixed the scipy > issue. The issue with pylab not working persists - should this > superpack provide a full working install of matplotlib/pylab? I > remember a while back having to fight with backends for it, though > that doesnt appear to have a bearing on the current problem i'm > having. 
> > Thanks for the quick reply Conor, > > Cheers, > > Dave > > > >>> "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", >>> line 1, in ? >>> from matplotlib.pylab import * >>> ImportError: No module named matplotlib.pylab >>> >>> So any suggestions? >>> >>> > From conor.robinson at gmail.com Thu Sep 14 17:02:00 2006 From: conor.robinson at gmail.com (Conor Robinson) Date: Thu, 14 Sep 2006 14:02:00 -0700 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> Message-ID: David, I just tried "from pylab import *" and got the same error. It looks to me that python is not recognising matplotlib as the dir for pylab.py. I tried copying the actual pylab into the python2.4 dir or site-packages dir (replacing the pylab.py init version) so it finds it first before the current pylab init, and then it can't find cm.py and, I'm assuming, the rest of the modules. I've currently been using gnuplot.py so I never bothered testing this, but it would be nice to make it work. Fairly sure this is a dir issue. Any ideas? Conor On 9/14/06, David Andrews wrote: > Part of the problem would appear to have been me borking the install > of gfortran - reinstalling the binaries for it has fixed the scipy > issue. The issue with pylab not working persists - should this > superpack provide a full working install of matplotlib/pylab? I > remember a while back having to fight with backends for it, though > that doesnt appear to have a bearing on the current problem i'm > having. > > Thanks for the quick reply Conor, > > Cheers, > > Dave > > On 14/09/06, Conor Robinson wrote: > > Strange, I used the superpack on a Mac Pro and everything installed > > fine. Did you install the pre-compiled gfortran for intel compilers > > on your macine first? Just a thought.
I think they're included in the > > superpack. > > > > Conor > > > > On 9/14/06, David Andrews wrote: > > > Hi, > > > > > > I'm a little lost here - i've d/led and installed everything in the > > > scipy superpack, yet i am still unable to import certain components of > > > scipy and matplotlib / pylab. The only thing that does appear to be > > > fully installed is numpy, which i guess is a start. > > > > > > So here is what I get when i try 'from scipy import *': > > > > > > Traceback (most recent call last): > > > File "", line 1, in ? > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/__init__.py", > > > line 5, in ? > > > from linsolve import * > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/linsolve.py", > > > line 1, in ? > > > from scipy.sparse import isspmatrix_csc, isspmatrix_csr, isspmatrix, spdiags > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/__init__.py", > > > line 5, in ? > > > from sparse import * > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparse.py", > > > line 12, in ? > > > import sparsetools > > > ImportError: Failure linking new module: > > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so: > > > Library not loaded: /usr/local/lib/libgfortran.1.dylib > > > Referenced from: > > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so > > > Reason: image not found > > > > > > And with 'from pylab import *': > > > > > > Traceback (most recent call last): > > > File "", line 1, in ? > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > > > line 1, in ? 
> > > from matplotlib.pylab import * > > > ImportError: No module named matplotlib.pylab > > > > > > So any suggestions? > > > > > > I previously had a working install of matplotlib/pylab and numpy, if > > > that suggest anything to you? > > > > > > I think sticking a readme file in the superpack might be a good idea. > > > > > > Cheers, > > > > > > Dave > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.org > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From matthew.brett at gmail.com Thu Sep 14 17:12:03 2006 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 14 Sep 2006 22:12:03 +0100 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: <20060914161610.GG8453@astro.ox.ac.uk> References: <20060914145538.GB8453@astro.ox.ac.uk> <20060914151647.GI963@mentat.za.net> <20060914161610.GG8453@astro.ox.ac.uk> Message-ID: <1e2af89e0609141412l2bef0413t1d03e7260ae5ca37@mail.gmail.com> Hi, > (The ld filter error remains, but has been discussed elsewhere.) I am getting that error too on an x86_64 machine. Has it in fact been discussed elsewhere? I think the thread you pointed to was just the error report wasn't it? 
Matthew From conor.robinson at gmail.com Thu Sep 14 17:13:10 2006 From: conor.robinson at gmail.com (Conor Robinson) Date: Thu, 14 Sep 2006 14:13:10 -0700 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> Message-ID: I looked in the unzipped egg and didn't see the init, is it left in the folder after the install? thanks On 9/14/06, Conor Robinson wrote: > David, > > I just tried "from pylab import *" and got the same error. It looks > to me that python is not understanding matplotlib. as the dir for > pylab.py I tried copying the actual pylab into the python2.4 dir or > site-packages dir (replace pylab.py int version) so it finds it first > before the current pylab init and then it can't find cm.py and im > assuming the rest of the modules. I've currently been unsing > gnuplot.py so I never bothered testing this, but it would be nice to > make it work. Fairly sure this is a dir issue. Any ideas? > > Conor > > On 9/14/06, David Andrews wrote: > > Part of the problem would appear to have been me borking the install > > of gfortran - reinstalling the binaries for it has fixed the scipy > > issue. The issue with pylab not working persists - should this > > superpack provide a full working install of matplotlib/pylab? I > > remember a while back having to fight with backends for it, though > > that doesnt appear to have a bearing on the current problem i'm > > having. > > > > Thanks for the quick reply Conor, > > > > Cheers, > > > > Dave > > > > On 14/09/06, Conor Robinson wrote: > > > Strange, I used the superpack on a Mac Pro and everything installed > > > fine. Did you install the pre-compiled gfortran for intel compilers > > > on your macine first? Just a thought. I think they're included in the > > > superpack. 
> > > > > > Conor > > > > > > On 9/14/06, David Andrews wrote: > > > > Hi, > > > > > > > > I'm a little lost here - i've d/led and installed everything in the > > > > scipy superpack, yet i am still unable to import certain components of > > > > scipy and matplotlib / pylab. The only thing that does appear to be > > > > fully installed is numpy, which i guess is a start. > > > > > > > > So here is what I get when i try 'from scipy import *': > > > > > > > > Traceback (most recent call last): > > > > File "", line 1, in ? > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/__init__.py", > > > > line 5, in ? > > > > from linsolve import * > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/linsolve.py", > > > > line 1, in ? > > > > from scipy.sparse import isspmatrix_csc, isspmatrix_csr, isspmatrix, spdiags > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/__init__.py", > > > > line 5, in ? > > > > from sparse import * > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparse.py", > > > > line 12, in ? > > > > import sparsetools > > > > ImportError: Failure linking new module: > > > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so: > > > > Library not loaded: /usr/local/lib/libgfortran.1.dylib > > > > Referenced from: > > > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so > > > > Reason: image not found > > > > > > > > And with 'from pylab import *': > > > > > > > > Traceback (most recent call last): > > > > File "", line 1, in ? > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > > > > line 1, in ? 
> > > > from matplotlib.pylab import * > > > > ImportError: No module named matplotlib.pylab > > > > > > > > So any suggestions? > > > > > > > > I previously had a working install of matplotlib/pylab and numpy, if > > > > that suggest anything to you? > > > > > > > > I think sticking a readme file in the superpack might be a good idea. > > > > > > > > Cheers, > > > > > > > > Dave > > > > _______________________________________________ > > > > SciPy-user mailing list > > > > SciPy-user at scipy.org > > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.org > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > From irbdavid at gmail.com Thu Sep 14 17:18:20 2006 From: irbdavid at gmail.com (David Andrews) Date: Thu, 14 Sep 2006 23:18:20 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <4509C0EA.3020806@adelphia.net> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> <4509C0EA.3020806@adelphia.net> Message-ID: <8edec5ec0609141418i12936a42vd5fd42ef47f48c0a@mail.gmail.com> Hi John, I'm looking at the contents of the installer archive, i don't seem to see an __init__.py in the matplotlib directory, though they do appear for sub directories of the matplotlib directory (numerix/ etc) - perhaps it's physically missing from the installer? Conor; as John has noted the lack of an __init__.py in the module would render it invisible to python, causing the problem we're seeing. Cheers, Dave On 14/09/06, John Hassler wrote: > I got the "No module named matplotlib.pylab" error.
I'd installed from > the windows installer, and after some considerable head-scratching, > figured out that the __init__.py file was missing from the matplotlib > module ... so Python didn't recognize it as a module. I found the file > in the unzipped .egg, copied it to matplotlib, and everybody seems to be > happy. > john > > > > David Andrews wrote: > > Part of the problem would appear to have been me borking the install > > of gfortran - reinstalling the binaries for it has fixed the scipy > > issue. The issue with pylab not working persists - should this > > superpack provide a full working install of matplotlib/pylab? I > > remember a while back having to fight with backends for it, though > > that doesnt appear to have a bearing on the current problem i'm > > having. > > > > Thanks for the quick reply Conor, > > > > Cheers, > > > > Dave > > > > > > > >>> "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > >>> line 1, in ? > >>> from matplotlib.pylab import * > >>> ImportError: No module named matplotlib.pylab > >>> > >>> So any suggestions? > >>> > >>> > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From irbdavid at gmail.com Thu Sep 14 17:30:59 2006 From: irbdavid at gmail.com (David Andrews) Date: Thu, 14 Sep 2006 23:30:59 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> Message-ID: <8edec5ec0609141430v4fc08ba4x9236fe44ff0cac6a@mail.gmail.com> The stuff in the .mpkg should remain after the install, yeah. If it's not there now, it never was :) Looks like it's been missed out somewhere along the way...
Cheers, Dave On 14/09/06, Conor Robinson wrote: > I looked in the unzipped egg and didn't see the init, is it left in > the folder after the install? > > thanks > > On 9/14/06, Conor Robinson wrote: > > David, > > > > I just tried "from pylab import *" and got the same error. It looks > > to me that python is not understanding matplotlib. as the dir for > > pylab.py I tried copying the actual pylab into the python2.4 dir or > > site-packages dir (replace pylab.py int version) so it finds it first > > before the current pylab init and then it can't find cm.py and im > > assuming the rest of the modules. I've currently been unsing > > gnuplot.py so I never bothered testing this, but it would be nice to > > make it work. Fairly sure this is a dir issue. Any ideas? > > > > Conor > > > > On 9/14/06, David Andrews wrote: > > > Part of the problem would appear to have been me borking the install > > > of gfortran - reinstalling the binaries for it has fixed the scipy > > > issue. The issue with pylab not working persists - should this > > > superpack provide a full working install of matplotlib/pylab? I > > > remember a while back having to fight with backends for it, though > > > that doesnt appear to have a bearing on the current problem i'm > > > having. > > > > > > Thanks for the quick reply Conor, > > > > > > Cheers, > > > > > > Dave > > > > > > On 14/09/06, Conor Robinson wrote: > > > > Strange, I used the superpack on a Mac Pro and everything installed > > > > fine. Did you install the pre-compiled gfortran for intel compilers > > > > on your macine first? Just a thought. I think they're included in the > > > > superpack. > > > > > > > > Conor > > > > > > > > On 9/14/06, David Andrews wrote: > > > > > Hi, > > > > > > > > > > I'm a little lost here - i've d/led and installed everything in the > > > > > scipy superpack, yet i am still unable to import certain components of > > > > > scipy and matplotlib / pylab. 
The only thing that does appear to be > > > > > fully installed is numpy, which i guess is a start. > > > > > > > > > > So here is what I get when i try 'from scipy import *': > > > > > > > > > > Traceback (most recent call last): > > > > > File "", line 1, in ? > > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/__init__.py", > > > > > line 5, in ? > > > > > from linsolve import * > > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linsolve/linsolve.py", > > > > > line 1, in ? > > > > > from scipy.sparse import isspmatrix_csc, isspmatrix_csr, isspmatrix, spdiags > > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/__init__.py", > > > > > line 5, in ? > > > > > from sparse import * > > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparse.py", > > > > > line 12, in ? > > > > > import sparsetools > > > > > ImportError: Failure linking new module: > > > > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so: > > > > > Library not loaded: /usr/local/lib/libgfortran.1.dylib > > > > > Referenced from: > > > > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/sparse/sparsetools.so > > > > > Reason: image not found > > > > > > > > > > And with 'from pylab import *': > > > > > > > > > > Traceback (most recent call last): > > > > > File "", line 1, in ? > > > > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > > > > > line 1, in ? > > > > > from matplotlib.pylab import * > > > > > ImportError: No module named matplotlib.pylab > > > > > > > > > > So any suggestions? > > > > > > > > > > I previously had a working install of matplotlib/pylab and numpy, if > > > > > that suggest anything to you? 
> > > > > > > > > > I think sticking a readme file in the superpack might be a good idea. > > > > > > > > > > Cheers, > > > > > > > > > > Dave > > > > > _______________________________________________ > > > > > SciPy-user mailing list > > > > > SciPy-user at scipy.org > > > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > > > > > _______________________________________________ > > > > SciPy-user mailing list > > > > SciPy-user at scipy.org > > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.org > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From matthew.brett at gmail.com Thu Sep 14 17:38:46 2006 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 14 Sep 2006 22:38:46 +0100 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: <1e2af89e0609141412l2bef0413t1d03e7260ae5ca37@mail.gmail.com> References: <20060914145538.GB8453@astro.ox.ac.uk> <20060914151647.GI963@mentat.za.net> <20060914161610.GG8453@astro.ox.ac.uk> <1e2af89e0609141412l2bef0413t1d03e7260ae5ca37@mail.gmail.com> Message-ID: <1e2af89e0609141438i1681e2lcbfbc0ab50bc341e@mail.gmail.com> Hi, > > (The ld filter error remains, but has been discussed elsewhere.) > > I am getting that error too on an x86_64 machine. Has it in fact been > discussed elsewhere? I think the thread you pointed to was just the > error report wasn't it? 
Actually I've appended a test script that reproduces it; the first call to ndimage.generic_filter1d works, but the second (in my script: identical) call generates a memory error, only on my x86_64 machine:

import numpy as N
import scipy.ndimage as ndimage

def _filter_func(input, output, fltr, total):
    fltr = fltr / total
    for ii in range(input.shape[0] - 2):
        output[ii] = input[ii] * fltr[0]
        output[ii] += input[ii + 1] * fltr[1]
        output[ii] += input[ii + 2] * fltr[2]

weights = N.array([1.1, 2.2, 3.3])
type = N.uint8
a = N.arange(12, dtype = type)
a.shape = (3,4)
total = weights.sum()
# No error first time, error second time
for i in range(2):
    print "Iteration %d" % i
    r2 = ndimage.generic_filter1d(a, _filter_func, 3, axis = 0,
                                  origin = -1,
                                  extra_arguments = (weights,),
                                  extra_keywords = {'total': total})

Gives:

[mb312 at foundmachine tmp]$ python nd_debug.py
Iteration 0
Iteration 1
Traceback (most recent call last):
  File "nd_debug.py", line 22, in ?
    extra_keywords = {'total': total})
  File "/home/mb312/lib64/python2.4/site-packages/scipy/ndimage/filters.py", line 632, in generic_filter1d
    mode, cval, origin, extra_arguments, extra_keywords)
MemoryError

Has anyone got any tips on how I might go about debugging this? Sorry, I am a novice at debugging extension code.
Best, Matthew From stefan at sun.ac.za Thu Sep 14 18:38:15 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Fri, 15 Sep 2006 00:38:15 +0200 Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest) In-Reply-To: <1e2af89e0609141438i1681e2lcbfbc0ab50bc341e@mail.gmail.com> References: <20060914145538.GB8453@astro.ox.ac.uk> <20060914151647.GI963@mentat.za.net> <20060914161610.GG8453@astro.ox.ac.uk> <1e2af89e0609141412l2bef0413t1d03e7260ae5ca37@mail.gmail.com> <1e2af89e0609141438i1681e2lcbfbc0ab50bc341e@mail.gmail.com> Message-ID: <20060914223815.GM963@mentat.za.net> Hi Matthew On Thu, Sep 14, 2006 at 10:38:46PM +0100, Matthew Brett wrote: > > > (The ld filter error remains, but has been discussed elsewhere.) > > > > I am getting that error too on an x86_64 machine. Has it in fact been > > discussed elsewhere? I think the thread you pointed to was just the > > error report wasn't it? > > Actually I've appended a test script that reproduces it; the first > call to ndimage.generic_filter1d works, but the second (in my script: > identical) call generates a memory error, only on my x86_64 machine: > Has anyone got any tips on how I might go about debugging this? > Sorry, I am a novice at debugging extension code. Could you please run this through valgrind? To use valgrind, I do the following:

1. Generate a suppression file. This file tells valgrind which memory errors are expected. Since the latest numpy testsuite seems to be free of those gremlins (let's hold thumbs), you can generate the suppressions by doing:

$ valgrind --gen-suppressions=all python -c 'import numpy; numpy.test()' 2>&1 | grep -v '==' > valgrind.supp

Just check valgrind.supp over to make sure it doesn't contain any python errors or other garbage. You only need to generate this file once.

2. Run valgrind on the test script.
I have the following in a file, valgrind-py, that I got from Albert:

valgrind \
  --tool=memcheck \
  --leak-check=yes \
  --error-limit=no \
  --suppressions=valgrind-python.supp \
  --num-callers=10 \
  -v \
  python $*

so that I can simply run

$ valgrind-py

The resulting errors should point us in the right direction! Thanks for your effort. Stéfan From lanceboyle at qwest.net Fri Sep 15 00:52:24 2006 From: lanceboyle at qwest.net (Jerry) Date: Thu, 14 Sep 2006 21:52:24 -0700 Subject: [SciPy-user] Scipy and fink Message-ID: Can Scipy be installed from Fink? Is there any reason not to do this, assuming the usual "fink" search path /sw is in place? Jerry From gael.varoquaux at normalesup.org Fri Sep 15 01:59:14 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 15 Sep 2006 07:59:14 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <8edec5ec0609141430v4fc08ba4x9236fe44ff0cac6a@mail.gmail.com> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> <8edec5ec0609141430v4fc08ba4x9236fe44ff0cac6a@mail.gmail.com> Message-ID: <20060915055914.GA16683@clipper.ens.fr> On Thu, Sep 14, 2006 at 11:30:59PM +0200, David Andrews wrote: > The stuff in the .mpkg should remain after the install, yeah. If its > not there now, it never was :) Looks like it's been missed out > somewhere along the way... On the matplotlib-dev list there have been some mentions of a forgotten __init__.py in the latest release. This has been fixed, but maybe not in the package you are using. You can grab this __init__.py from the .tar.gz from the matplotlib site, and this should fix your problem (I think).
Gaël From nwagner at iam.uni-stuttgart.de Fri Sep 15 03:30:38 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 15 Sep 2006 09:30:38 +0200 Subject: [SciPy-user] Usage of fmin_tnc and fmin_l_bfgs_b Message-ID: <450A569E.4080200@iam.uni-stuttgart.de> Hi all, I would like to solve a constrained optimization problem with scipy. As far as I understand it there exist two possible functions for my problem in scipy - fmin_tnc and fmin_l_bfgs_b. The problem is given by min f(x) subject to \theta_1 \le theta \le theta_2 and r_1 \le r \le r_2 where x is a vector \in \mathds{R}^{2n+1}. theta is the last entry in x. r = \| x[:2*n] \| = linalg.norm(x[:2*n]) How do I specify the bounds for my problem ? I mean it's easy to define the bounds for the l a s t parameter (\theta) but I am at a loss how to formulate the bounds for x[0],...,x[2n-1] s e p a r a t e l y. bounds -- a list of (min, max) pairs for each element in x, defining the bounds on that parameter. Use None for one of min or max when there is no bound in that direction Any hint would be appreciated. Nils From lfriedri at imtek.de Fri Sep 15 03:33:59 2006 From: lfriedri at imtek.de (Lars Friedrich) Date: Fri, 15 Sep 2006 09:33:59 +0200 Subject: [SciPy-user] python/scipy as matlab, using ipython In-Reply-To: References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu> <1158037447.6722.4.camel@gdur.breisach> Message-ID: <1158305639.6586.8.camel@localhost> Hello scipy users, for a short time now, I have been using python, numpy and scipy as a replacement for matlab. My setup at the moment is the "scite"-editor and the "ipython"-shell. I am happy using the "run -i" command to access the current workspace with my scripts written with scite. There is one feature that I know from matlab that I miss a little.
Maybe it's not crucial but it would be nice to have: When I made a change to my script in scite, I have to
*save it
*change scope to ipython
*recall the "run -i myScript.py" command
*press
In matlab it is possible to hit to get the same effect. How can I achieve this with scite/ipython? I think I need a way to tell ipython "from extern" to execute the "run"-command. I would put this in my scite-config-file to connect it with, say, the button and would be fine. Is this possible? Thanks for every comment Lars From nwagner at iam.uni-stuttgart.de Fri Sep 15 04:45:44 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 15 Sep 2006 10:45:44 +0200 Subject: [SciPy-user] Usage of fmin_tnc and fmin_l_bfgs_b In-Reply-To: <450A569E.4080200@iam.uni-stuttgart.de> References: <450A569E.4080200@iam.uni-stuttgart.de> Message-ID: <450A6838.9060209@iam.uni-stuttgart.de> Nils Wagner wrote: > Hi all, > > I would like to solve a constrained optimization problem with scipy. > As far as I understand it there exists two possible functions for my > problem in scipy - fmin_tnc and fmin_l_bfgs_b. > > The problem is given by > > min f(x) > > subjected to > > \theta_1 \le theta \le theta_2 > > and > > r_1 \le r \le r_2 > > where x is a vector \in \mathds{R}^{2n+1}. > > theta is the last entry in x. > > r = \| x[:2*n] \| = linalg.norm(x[:2*n]) > > > How do I specify the bounds for my problem ? I mean > it's easy to define the bounds for the l a s t parameter (\theta) but > I am at a loss how to formulate > the bounds for x[0],...,x[2n-1] s e p a r a t e l y. > > bounds -- a list of (min, max) pairs for each element in x, defining > the bounds on that parameter. Use None for one of min or max > when there is no bound in that direction > > Any hint would be appreciated. > > Nils > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > Sorry for replying to myself.
I guess I can use fmin_cobyla with the following constraints

cons1 = x[-1]-\theta_1
cons2 = \theta_2-x[-1]
cons3 = linalg.norm(x[:2*n])-r_1
cons4 = r_2-linalg.norm(x[:2*n])

Is that correct ? Is there a better way to implement the problem ? Nils From gael.varoquaux at normalesup.org Fri Sep 15 04:49:03 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 15 Sep 2006 10:49:03 +0200 Subject: [SciPy-user] python/scipy as matlab, using ipython In-Reply-To: <1158305639.6586.8.camel@localhost> References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu> <1158037447.6722.4.camel@gdur.breisach> <1158305639.6586.8.camel@localhost> Message-ID: <20060915084901.GB21871@clipper.ens.fr> Hello Lars, I don't think there is an obvious way of doing this with ipython (though you might get a different answer from an ipython wizard). I have a suggestion. I haven't tested this thoroughly, but you can have a look. Use SPE (Stany's Python Editor) as an editor. SPE has a python shell. To use pylab in the python shell you should use the WXAgg backend (edit your ~/.matplotlib/.matplotlibrc and set backend : WXAgg). Then in the shell you can use python interactively or you can run the scripts you are editing in it. One big caveat: you must create and display a figure before plotting to it. You should do

from numpy import *
from pylab import *
t = arange(1,10)
figure() # Important: create the figure before attempting to plot to it
show()   # Important: display the figure before attempting to plot to it
plot(t,cos(t))
show()

The python shell embedded in SPE isn't as nice as ipython (Robert, if you are listening, didn't I hear you say that you were working on a front-end independent ipython that could replace Wx's pyshell one day?). One big feature that you will miss is the interactivity of pylab that you have in ipython (not having to enter the show() command, and the ability to zoom). Maybe it is possible to get matplotlib to do this here too; you can try asking on the matplotlib list.
Using SPE is just an idea and needs to be explored a bit more, but maybe you can do something out of it. Gaël On Fri, Sep 15, 2006 at 09:33:59AM +0200, Lars Friedrich wrote: > Hello scipy users, > for a short time now, I am using python, numpy and scipy as a > replacement for matlab. My setup at the moment is the "scite"-editor and > the "ipython"-shell. I am happy using the "run -i" command to access the > current workspace with my scripts written with scite. > There is one feature that I know from matlab that I miss a little. Maybe > its not crucial but it would be nice to have: > When I made a change to my script in scite, I have to > *save it > *change scope to ipython > *recall the "run -i myScript.py" command > *press > In matlab it is possible to hit to get the same effect. How can I > achieve this with scite/ipython? I think, I need a way to tell ipython > "from extern" to execute the "run"-command. I would put this to my > scite-config-file to connect it with, say, the button and would be > fine. Is this possible? > Thanks for every comment > Lars > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From a.u.r.e.l.i.a.n at gmx.net Fri Sep 15 05:21:38 2006 From: a.u.r.e.l.i.a.n at gmx.net (Johannes Loehnert) Date: Fri, 15 Sep 2006 11:21:38 +0200 Subject: [SciPy-user] python/scipy as matlab, using ipython In-Reply-To: <1158305639.6586.8.camel@localhost> References: <1158305639.6586.8.camel@localhost> Message-ID: <200609151121.39096.a.u.r.e.l.i.a.n@gmx.net> On Friday 15 September 2006 09:33, Lars Friedrich wrote: > Hello scipy users, > > for a short time now, I am using python, numpy and scipy as a > replacement for matlab. My setup at the moment is the "scite"-editor and > the "ipython"-shell. I am happy using the "run -i" command to access the > current workspace with my scripts written with scite.
> > There is one feature that I know from matlab that I miss a little. Maybe > its not crucial but it would be nice to have: > > When I made a change to my script in scite, I have to > *save it > *change scope to ipython > *recall the "run -i myScript.py" command > *press > > In matlab it is possible to hit to get the same effect. How can I > achieve this with scite/ipython? I think, I need a way to tell ipython > "from extern" to execute the "run"-command. I would put this to my > scite-config-file to connect it with, say, the button and would be > fine. Is this possible? > > Thanks for every comment If you use KDE, maybe KHotKeys (under KControl -> Region and Accessibility) will be of use. It is quite powerful and can simulate arbitrary keyboard input. However I do not know if it is possible to activate a particular window using the keyboard. Johannes From edin.salkovic at gmail.com Fri Sep 15 06:22:56 2006 From: edin.salkovic at gmail.com (=?UTF-8?Q?Edin_Salkovi=C4=87?=) Date: Fri, 15 Sep 2006 12:22:56 +0200 Subject: [SciPy-user] python/scipy as matlab, using ipython In-Reply-To: <200609151121.39096.a.u.r.e.l.i.a.n@gmx.net> References: <1158305639.6586.8.camel@localhost> <200609151121.39096.a.u.r.e.l.i.a.n@gmx.net> Message-ID: <63eb7fa90609150322m7047d393o550caddebf5f6f3f@mail.gmail.com> Have you tried editing the file python.properties in the scite's installation dir? I'm referring to this part of the file:

if PLAT_WIN
	command.go.*.py=python -u "$(FileNameExt)"
	command.go.subsystem.*.py=0
	command.go.*.pyw=pythonw -u "$(FileNameExt)"
	command.go.subsystem.*.pyw=1
	command.build.SConscript=scons.bat --up .
	command.build.SConstruct=scons.bat .

if PLAT_GTK
	command.go.*.py=python -u $(FileNameExt)
	command.build.SConscript=scons --up .
	command.build.SConstruct=scons .
You can probably set: command.go.*.py=python -u $(FileNameExt) to command.go.*.py=ipython -i $(FileNameExt) HTH, Edin On 9/15/06, Johannes Loehnert wrote: > On Friday 15 September 2006 09:33, Lars Friedrich wrote: > > Hello scipy users, > > > > for a short time now, I am using python, numpy and scipy as a > > replacement for matlab. My setup at the moment is the "scite"-editor and > > the "ipython"-shell. I am happy using the "run -i" command to access the > > current workspace with my scripts written with scite. > > > > There is one feature that I know from matlab that I miss a little. Maybe > > its not crucial but it would be nice to have: > > > > When I made a change to my script in scite, I have to > > *save it > > *change scope to ipython > > *recall the "run -i myScript.py" command > > *press > > > > In matlab it is possible to hit to get the same effect. How can I > > achieve this with scite/ipython? I think, I need a way to tell ipython > > "from extern" to execute the "run"-command. I would put this to my > > scite-config-file to connect it with, say, the button and would be > > fine. Is this possible? > > > > Thanks for every comment > > If you use KDE, maybe KHotKeys (under KControl -> Region and Accessibility) > will be of use. It is quite powerful and can simulate arbitrary keyboard > input. However I do not know if it is possible to activate a particular > window using keyboard. > > Johannes > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From lorrmann at physik.uni-wuerzburg.de Fri Sep 15 06:56:39 2006 From: lorrmann at physik.uni-wuerzburg.de (Volker Lorrmann) Date: Fri, 15 Sep 2006 12:56:39 +0200 Subject: [SciPy-user] =?iso-8859-1?q?Can=B4t_build_numpy_from_svn?= Message-ID: <450A86E7.7060508@physik.uni-wuerzburg.de> Hi guys and girls, i?m trying to compile numpy from svn. 
So I checked out the current svn and typed "python setup.py install". Then this error appears:

Running from numpy source directory.
No module named __svn_version__
Traceback (most recent call last):
  File "setup.py", line 89, in ?
    setup_package()
  File "setup.py", line 82, in setup_package
    configuration=configuration )
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/core.py", line 144, in setup
    config = configuration()
  File "setup.py", line 48, in configuration
    config.add_subpackage('numpy')
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 753, in add_subpackage
    caller_level = 2)
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 736, in get_subpackage
    caller_level = caller_level + 1)
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 683, in _get_configuration_from_setup_py
    config = setup_module.configuration(*args)
  File "./numpy/setup.py", line 8, in configuration
    config.add_subpackage('f2py')
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 753, in add_subpackage
    caller_level = 2)
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 736, in get_subpackage
    caller_level = caller_level + 1)
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 683, in _get_configuration_from_setup_py
    config = setup_module.configuration(*args)
  File "numpy/f2py/setup.py", line 39, in configuration
    config.make_svn_version_py()
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 1298, in make_svn_version_py
    self.add_data_files(('', generate_svn_version_py()))
  File "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", line 1281, in generate_svn_version_py
    assert revision is not None,'hmm, why I am not inside SVN tree???'
AssertionError: hmm, why I am not inside SVN tree???

Hope you can help me. Thanks so far.
Volker -- ----------------------- Volker Lorrmann Universität Würzburg Experimentelle Physik 6 Am Hubland 97074 Wuerzburg Germany Tel: +49 931 888 5770 volker.lorrmann at physik.uni-wuerzburg.de From irbdavid at gmail.com Fri Sep 15 08:30:13 2006 From: irbdavid at gmail.com (David Andrews) Date: Fri, 15 Sep 2006 14:30:13 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <20060915055914.GA16683@clipper.ens.fr> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> <8edec5ec0609141430v4fc08ba4x9236fe44ff0cac6a@mail.gmail.com> <20060915055914.GA16683@clipper.ens.fr> Message-ID: <8edec5ec0609150530m7984c642mc693c5b6b727f54c@mail.gmail.com> This is getting messy now. Pulled down the latest version of matplotlib (0.87.5), and installed it. Now experiencing issues due to incompatibility between this latest version of matplotlib and the version of numpy in the superpack: >>> from pylab import * RuntimeError: module compiled against version 1000002 of C-API but this version of numpy is 1000000 The import of the numpy version of the nxutils module, _nsnxutils, failed. This is is either because numpy was unavailable when matplotlib was compiled, because a dependency of _nsnxutils could not be satisfied, or because the build flag for this module was turned off in setup.py. If it appears that _nsnxutils was not built, make sure you have a working copy of numpy and then re-install matplotlib. Otherwise, the following traceback gives more details: Traceback (most recent call last): File "", line 1, in ? File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", line 1, in ? from matplotlib.pylab import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py", line 198, in ? import mlab #so I can override hist, psd, etc...
File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/mlab.py", line 64, in ? import nxutils File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/nxutils.py", line 17, in ? from matplotlib._ns_nxutils import * ImportError: numpy.core.multiarray failed to import Scipy and numpy can be imported okay. This is sounding more like a matplotlib issue than a numpy/scipy issue isn't it? At any rate I think the packages in the superpack need to be updated and tested together. Cheers, Dave On 15/09/06, Gael Varoquaux wrote: > On Thu, Sep 14, 2006 at 11:30:59PM +0200, David Andrews wrote: > > The stuff in the .mpkg should remain after the install, yeah. If its > > not there now, it never was :) Looks like it's been missed out > > somewhere a long the way... > > On the matplotlib-dev list there has been some mentions of a > forgotten __init__.py in the latest release. This has been fixed, but > maybe not in the package you are using. You can grab this __init__.py > from the .tar.gz from the matplotlib site, and this should fix your > problem (I think). 
> > Gaël From gael.varoquaux at normalesup.org Fri Sep 15 08:48:26 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 15 Sep 2006 14:48:26 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <8edec5ec0609150530m7984c642mc693c5b6b727f54c@mail.gmail.com> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> <8edec5ec0609141430v4fc08ba4x9236fe44ff0cac6a@mail.gmail.com> <20060915055914.GA16683@clipper.ens.fr> <8edec5ec0609150530m7984c642mc693c5b6b727f54c@mail.gmail.com> Message-ID: <20060915124826.GD21871@clipper.ens.fr> Just grab the __init__.py out of this release and use the package you were using before. I think that's the simplest way to get this working. Gaël On Fri, Sep 15, 2006 at 02:30:13PM +0200, David Andrews wrote: > This is getting messy now. > Pulled down the latest version of matplotlib (0.87.5), and installed > it. Now experiencing issues due to incompatibility between this > latest version of matplotlib and the version of numpy in the > superpack: > >>> from pylab import * > RuntimeError: module compiled against version 1000002 of C-API but > this version of numpy is 1000000 > The import of the numpy version of the nxutils module, > _nsnxutils, failed. This is is either because numpy was > unavailable when matplotlib was compiled, because a dependency of > _nsnxutils could not be satisfied, or because the build flag for > this module was turned off in setup.py. If it appears that > _nsnxutils was not built, make sure you have a working copy of > numpy and then re-install matplotlib. Otherwise, the following > traceback gives more details: > Traceback (most recent call last): > File "", line 1, in ?
> File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > line 1, in ? > from matplotlib.pylab import * > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py", > line 198, in ? > import mlab #so I can override hist, psd, etc... > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/mlab.py", > line 64, in ? > import nxutils > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/nxutils.py", > line 17, in ? > from matplotlib._ns_nxutils import * > ImportError: numpy.core.multiarray failed to import > Scipy and numpy can be imported okay. This is sounding more like a > matplotlib issue than a numpy/scipy issue isn't it? At any rate I > think the packages in the superpack need to be updated and tested > together. > Cheers, > Dave > On 15/09/06, Gael Varoquaux wrote: > > On Thu, Sep 14, 2006 at 11:30:59PM +0200, David Andrews wrote: > > > The stuff in the .mpkg should remain after the install, yeah. If its > > > not there now, it never was :) Looks like it's been missed out > > > somewhere a long the way... > > On the matplotlib-dev list there has been some mentions of a > > forgotten __init__.py in the latest release. This has been fixed, but > > maybe not in the package you are using. You can grab this __init__.py > > from the .tar.gz from the matplotlib site, and this should fix your > > problem (I think). 
> > Ga?l > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From irbdavid at gmail.com Fri Sep 15 08:52:43 2006 From: irbdavid at gmail.com (David Andrews) Date: Fri, 15 Sep 2006 14:52:43 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <8edec5ec0609150530m7984c642mc693c5b6b727f54c@mail.gmail.com> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> <8edec5ec0609141430v4fc08ba4x9236fe44ff0cac6a@mail.gmail.com> <20060915055914.GA16683@clipper.ens.fr> <8edec5ec0609150530m7984c642mc693c5b6b727f54c@mail.gmail.com> Message-ID: <8edec5ec0609150552m5933f351k476e8bb4cb9f7f7d@mail.gmail.com> Currently the superpack contains numpy 1.1.2880, though the numpy sourceforge site lists only 0.9.8 or 1.0b5 as current releases of numpy - perhaps this is causing the compatibility issues? On 15/09/06, David Andrews wrote: > This is getting messy now. > Pulled down the latest version of matplotlib (0.87.5), and installed > it. Now experiencing issues due to incompatibility between this > latest version of matplotlib and the version of numpy in the > superpack: > > >>> from pylab import * > RuntimeError: module compiled against version 1000002 of C-API but > this version of numpy is 1000000 > > The import of the numpy version of the nxutils module, > _nsnxutils, failed. This is is either because numpy was > unavailable when matplotlib was compiled, because a dependency of > _nsnxutils could not be satisfied, or because the build flag for > this module was turned off in setup.py. If it appears that > _nsnxutils was not built, make sure you have a working copy of > numpy and then re-install matplotlib. 
Otherwise, the following > traceback gives more details: > > Traceback (most recent call last): > File "", line 1, in ? > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > line 1, in ? > from matplotlib.pylab import * > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py", > line 198, in ? > import mlab #so I can override hist, psd, etc... > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/mlab.py", > line 64, in ? > import nxutils > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/nxutils.py", > line 17, in ? > from matplotlib._ns_nxutils import * > ImportError: numpy.core.multiarray failed to import > > Scipy and numpy can be imported okay. This is sounding more like a > matplotlib issue than a numpy/scipy issue isn't it? At any rate I > think the packages in the superpack need to be updated and tested > together. > > Cheers, > > Dave > > On 15/09/06, Gael Varoquaux wrote: > > On Thu, Sep 14, 2006 at 11:30:59PM +0200, David Andrews wrote: > > > The stuff in the .mpkg should remain after the install, yeah. If its > > > not there now, it never was :) Looks like it's been missed out > > > somewhere a long the way... > > > > On the matplotlib-dev list there has been some mentions of a > > forgotten __init__.py in the latest release. This has been fixed, but > > maybe not in the package you are using. You can grab this __init__.py > > from the .tar.gz from the matplotlib site, and this should fix your > > problem (I think). 
> > > > Gaël From dd55 at cornell.edu Fri Sep 15 08:55:24 2006 From: dd55 at cornell.edu (Darren Dale) Date: Fri, 15 Sep 2006 08:55:24 -0400 Subject: [SciPy-user] =?iso-8859-1?q?Can=B4t_build_numpy_from_svn?= In-Reply-To: <450A86E7.7060508@physik.uni-wuerzburg.de> References: <450A86E7.7060508@physik.uni-wuerzburg.de> Message-ID: <200609150855.25102.dd55@cornell.edu> On Friday 15 September 2006 06:56, Volker Lorrmann wrote: > Hi guys and girls, > > i?m trying to compile numpy from svn. So i check out the current svn and > typed "python setup.py install" > Than this error appears. : [...] > "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py", > line 1281, in generate_svn_version_py > assert revision is not None,'hmm, why I am not inside SVN tree???' > AssertionError: hmm, why I am not inside SVN tree??? I posted about this on the numpy discussion list a few days ago. NumPy's distutils doesn't work with working copies created with subversion-1.4.
A ticket along with a proposed solution is on the trac website: http://projects.scipy.org/scipy/numpy/ticket/276 Darren From irbdavid at gmail.com Fri Sep 15 10:00:18 2006 From: irbdavid at gmail.com (David Andrews) Date: Fri, 15 Sep 2006 16:00:18 +0200 Subject: [SciPy-user] Trouble installing on OSX 10.4 Intel In-Reply-To: <20060915124826.GD21871@clipper.ens.fr> References: <8edec5ec0609140927k11faf5e0i936fe5ffbd2e4ed0@mail.gmail.com> <8edec5ec0609141330u5152e14ar14211696e6397299@mail.gmail.com> <8edec5ec0609141430v4fc08ba4x9236fe44ff0cac6a@mail.gmail.com> <20060915055914.GA16683@clipper.ens.fr> <8edec5ec0609150530m7984c642mc693c5b6b727f54c@mail.gmail.com> <20060915124826.GD21871@clipper.ens.fr> Message-ID: <8edec5ec0609150700v5b9c5c29vce90274bd3b72f2@mail.gmail.com> Hi Gaël, Doing as you suggested produces errors with regard to libfreetype as follows: >>> from pylab import * Traceback (most recent call last): File "", line 1, in ? File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", line 1, in ? from matplotlib.pylab import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py", line 200, in ? from axes import Axes, PolarAxes File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axes.py", line 15, in ? from axis import XAxis, YAxis File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axis.py", line 25, in ? from font_manager import FontProperties File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/font_manager.py", line 39, in ?
from matplotlib import ft2font ImportError: Failure linking new module: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so: Library not loaded: /usr/local/lib/libfreetype.6.dylib Referenced from: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so Reason: image not found I think that simply copying the __init__.py from a 'working' release of matplotlib isn't enough. As I understand it freetype is only a requirement in the compilation of matplotlib. On 15/09/06, Gael Varoquaux wrote: > Just grab the __init__.py out of this release and use the package > you where using before. I think that's the simplest way to get this > working. > > Ga?l > > On Fri, Sep 15, 2006 at 02:30:13PM +0200, David Andrews wrote: > > This is getting messy now. > > Pulled down the latest version of matplotlib (0.87.5), and installed > > it. Now experiencing issues due to incompatibility between this > > latest version of matplotlib and the version of numpy in the > > superpack: > > > >>> from pylab import * > > RuntimeError: module compiled against version 1000002 of C-API but > > this version of numpy is 1000000 > > > The import of the numpy version of the nxutils module, > > _nsnxutils, failed. This is is either because numpy was > > unavailable when matplotlib was compiled, because a dependency of > > _nsnxutils could not be satisfied, or because the build flag for > > this module was turned off in setup.py. If it appears that > > _nsnxutils was not built, make sure you have a working copy of > > numpy and then re-install matplotlib. Otherwise, the following > > traceback gives more details: > > > Traceback (most recent call last): > > File "", line 1, in ? > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > > line 1, in ? 
> > from matplotlib.pylab import *
> > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py",
> > line 198, in ?
> > import mlab #so I can override hist, psd, etc...
> > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/mlab.py",
> > line 64, in ?
> > import nxutils
> > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/nxutils.py",
> > line 17, in ?
> > from matplotlib._ns_nxutils import *
> > ImportError: numpy.core.multiarray failed to import
>
> > Scipy and numpy can be imported okay. This is sounding more like a
> > matplotlib issue than a numpy/scipy issue, isn't it? At any rate I
> > think the packages in the superpack need to be updated and tested
> > together.
>
> > Cheers,
>
> > Dave
>
> > On 15/09/06, Gael Varoquaux wrote:
> > > On Thu, Sep 14, 2006 at 11:30:59PM +0200, David Andrews wrote:
> > > > The stuff in the .mpkg should remain after the install, yeah. If it's
> > > > not there now, it never was :) Looks like it's been missed out
> > > > somewhere along the way...
>
> > > On the matplotlib-dev list there have been some mentions of a
> > > forgotten __init__.py in the latest release. This has been fixed, but
> > > maybe not in the package you are using. You can grab this __init__.py
> > > from the .tar.gz from the matplotlib site, and this should fix your
> > > problem (I think).
> > > > Gaël
>
> > > _______________________________________________
> > > SciPy-user mailing list
> > > SciPy-user at scipy.org
> > > http://projects.scipy.org/mailman/listinfo/scipy-user
>
> > _______________________________________________
> > SciPy-user mailing list
> > SciPy-user at scipy.org
> > http://projects.scipy.org/mailman/listinfo/scipy-user
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user
>

From barrywark at gmail.com  Fri Sep 15 10:33:55 2006
From: barrywark at gmail.com (Barry Wark)
Date: Fri, 15 Sep 2006 07:33:55 -0700
Subject: [SciPy-user] Scipy and fink
In-Reply-To: 
References: 
Message-ID: 

Jerry,

Scipy does install from Fink, at least on PowerPC (haven't tried it on
Intel). I've used it successfully. The Fink-installed scipy will not
work with the "Framework" version of python on OS X(1). If you are
willing to stick with Fink's python as well, you're all set.

Barry

(1) In case you don't know, python on OS X comes in two flavors, the
unix version and the "Framework" version. Fink installs the unix
version of python, compiling its shared libraries as dynamic link
libraries. The "Framework" version is like the one shipped with the OS
by Apple. All the python shared libraries are compiled as OS X
frameworks. Although the apple-supplied python is 2.3, 2.4 versions of
the framework build are available from macpython.org or ActiveState.
The only reason the distinction matters is that extensions compiled
against one type of macpython will not work with the other. Just FYI.

On 9/14/06, Jerry wrote:
> Can Scipy be installed from Fink? Is there any reason not to do this,
> assuming the usual "fink" search path /sw is in place?
>
> Jerry
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user
>

From jdhunter at ace.bsd.uchicago.edu  Fri Sep 15 10:38:38 2006
From: jdhunter at ace.bsd.uchicago.edu (John Hunter)
Date: Fri, 15 Sep 2006 09:38:38 -0500
Subject: [SciPy-user] python/scipy as matlab, using ipython
In-Reply-To: <20060915084901.GB21871@clipper.ens.fr> (Gael Varoquaux's
	message of "Fri, 15 Sep 2006 10:49:03 +0200")
References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu>
	<1158037447.6722.4.camel@gdur.breisach>
	<1158305639.6586.8.camel@localhost>
	<20060915084901.GB21871@clipper.ens.fr>
Message-ID: <87wt85jitt.fsf@peds-pc311.bsd.uchicago.edu>

>>>>> "Gael" == Gael Varoquaux writes:

    Gael> Hello Lars, I don't think there is an obvious way of
    Gael> doing this with ipython (though you might get a different
    Gael> answer from an ipython wizard). I have a suggestion. I
    Gael> haven't tested this thoroughly, but you can have a look.

    Gael> Use SPE (Stany's Python Editor) as an editor. Spe has a
    Gael> python shell. To use pylab in the python shell you should
    Gael> use the WXAgg backend (edit your ~/.matplotlib/.matplotlibrc
    Gael> and set

    Gael> backend : WXAgg

    Gael> Then in the shell you can use python interactively or you
    Gael> can run the scripts you are editing in it.
    Gael> One big caveat: you should create and display a figure
    Gael> before plotting to it. You should do
    Gael>
    Gael> from numpy import *
    Gael> from pylab import *
    Gael> t = arange(1,10)
    Gael> figure() # Important: create the figure before attempting to plot to it
    Gael> show()   # Important: display the figure before attempting to plot to it
    Gael> plot(t,cos(t))
    Gael> show()

This use of show should not be necessary (and is explicitly not
supported) if you follow the instructions at
http://matplotlib.sourceforge.net/interactive.html and
http://matplotlib.sourceforge.net/faq.html#SHOW

JDH

From wangxj.uc at gmail.com  Fri Sep 15 11:59:09 2006
From: wangxj.uc at gmail.com (Xiaojian Wang)
Date: Fri, 15 Sep 2006 08:59:09 -0700
Subject: [SciPy-user] Usage of fmin_tnc and fmin_l_bfgs_b
In-Reply-To: <450A6838.9060209@iam.uni-stuttgart.de>
References: <450A569E.4080200@iam.uni-stuttgart.de>
	<450A6838.9060209@iam.uni-stuttgart.de>
Message-ID: 

Thanks Nils,

yes, fmin_cobyla can handle the general constraints, however I also
still don't know how to write the constraints: "a list of callable
functions" cons, and there are no examples in this optimize module! I
also appreciate any help from any of you.

for fmin_l_bfgs_b, see examples in this module, I have run this module
without difficulty, including lower and upper boundaries for each
design variable, to solve my problems.

Xiaojian

On 9/15/06, Nils Wagner wrote:
>
> Nils Wagner wrote:
> > Hi all,
> >
> > I would like to solve a constrained optimization problem with scipy.
> > As far as I understand it there exist two possible functions for my
> > problem in scipy - fmin_tnc and fmin_l_bfgs_b.
> >
> > The problem is given by
> >
> > min f(x)
> >
> > subject to
> >
> > \theta_1 \le \theta \le \theta_2
> >
> > and
> >
> > r_1 \le r \le r_2
> >
> > where x is a vector \in \mathds{R}^{2n+1}.
> >
> > \theta is the last entry in x.
> >
> > r = \| x[:2*n] \| = linalg.norm(x[:2*n])
> >
> >
> > How do I specify the bounds for my problem ?
I mean
> > it's easy to define the bounds for the l a s t parameter (\theta) but
> > I am at a loss how to formulate
> > the bounds for x[0],...,x[2n-1] s e p a r a t e l y.
> >
> > bounds -- a list of (min, max) pairs for each element in x, defining
> >          the bounds on that parameter. Use None for one of min or max
> >          when there is no bound in that direction
> >
> > Any hint would be appreciated.
> >
> > Nils
> >
> > _______________________________________________
> > SciPy-user mailing list
> > SciPy-user at scipy.org
> > http://projects.scipy.org/mailman/listinfo/scipy-user
> >
> Sorry for replying to myself.
>
> I guess I can use fmin_cobyla with the following constraints
>
> cons1= x[-1]-\theta_1
> cons2=\theta_2-x[-1]
> cons3=linalg.norm(x[:2*n])-r_1
> cons4=r_2-linalg.norm(x[:2*n])
>
> Is that correct ? Is there a better way to implement the problem ?
>
> Nils
>
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From nwagner at iam.uni-stuttgart.de  Fri Sep 15 12:06:45 2006
From: nwagner at iam.uni-stuttgart.de (Nils Wagner)
Date: Fri, 15 Sep 2006 18:06:45 +0200
Subject: [SciPy-user] Usage of fmin_tnc and fmin_l_bfgs_b
In-Reply-To: 
References: <450A569E.4080200@iam.uni-stuttgart.de>
	<450A6838.9060209@iam.uni-stuttgart.de>
Message-ID: 

On Fri, 15 Sep 2006 08:59:09 -0700
 "Xiaojian Wang" wrote:
> Thanks Nils,
>
> yes, fmin_cobyla can handle the general constraints,
> however I also still don't know how to
> write the constraints: "a list of callable function" cons
> and there are no examples in this optimize module!, I also
> appreciate any helps from any of you.
>
> for fmin_l_bfgs_b, see examples in this module, I have run
> this module without difficulty including
> lower and upper boundary for each design variables to solve my problems.
>
> Xiaojian
>
>
> On 9/15/06, Nils Wagner wrote:
>>
>> Nils Wagner wrote:
>> > Hi all,
>> >
>> > I would like to solve a constrained optimization problem with scipy.
>> > As far as I understand it there exists two possible functions for my
>> > problem in scipy - fmin_tnc and fmin_l_bfgs_b.
>> >
>> > The problem is given by
>> >
>> > min f(x)
>> >
>> > subjected to
>> >
>> > \theta_1 \le theta \le theta_2
>> >
>> > and
>> >
>> > r_1 \le r \le r_2
>> >
>> > where x is a vector \in \mathds{R}^{2n+1}.
>> >
>> > theta is the last entry in x.
>> >
>> > r = \| x[:2*n] \| = linalg.norm(x[:2*n])
>> >
>> >
>> > How do I specify the bounds for my problem ? I mean
>> > it's easy to define the bounds for the l a s t parameter (\theta) but
>> > I am at a loss how to formulate
>> > the bounds for x[0],...,x[2n-1] s e p a r a t e l y.
>> >
>> > bounds -- a list of (min, max) pairs for each element in x, defining
>> >          the bounds on that parameter. Use None for one of min or max
>> >          when there is no bound in that direction
>> >
>> > Any hint would be appreciated.
>> >
>> > Nils
>> >
>> > _______________________________________________
>> > SciPy-user mailing list
>> > SciPy-user at scipy.org
>> > http://projects.scipy.org/mailman/listinfo/scipy-user
>> >
>> Sorry for replying to myself.
>>
>> I guess I can use fmin_cobyla with the following constraints
>>
>> cons1= x[-1]-\theta_1
>> cons2=\theta_2-x[-1]
>> cons3=linalg.norm(x[:2*n])-r_1
>> cons4=r_2-linalg.norm(x[:2*n])
>>
>> Is that correct ? Is there a better way to implement the problem ?
>>
>> Nils
>>
>>
>> _______________________________________________
>> SciPy-user mailing list
>> SciPy-user at scipy.org
>> http://projects.scipy.org/mailman/listinfo/scipy-user
>>

You can use something like this

def cons1(x):
    return x[-1]-theta_1
def cons2(x):
    return theta_2-x[-1]
def cons3(x):
    return linalg.norm(x[:2*n])-r_1
def cons4(x):
    return r_2-linalg.norm(x[:2*n])

xopt = optimize.fmin_cobyla(F_p,x_0,cons=(cons1,cons2,cons3,cons4,),rhobeg=1.0,rhoend=1.e-7,maxfun=9000)

Nils

From oliphant.travis at ieee.org  Fri Sep 15 13:19:19 2006
From: oliphant.travis at ieee.org (Travis Oliphant)
Date: Fri, 15 Sep 2006 11:19:19 -0600
Subject: [SciPy-user] Can't build numpy from svn
In-Reply-To: <200609150855.25102.dd55@cornell.edu>
References: <450A86E7.7060508@physik.uni-wuerzburg.de>
	<200609150855.25102.dd55@cornell.edu>
Message-ID: <450AE097.1000906@ieee.org>

Darren Dale wrote:
> On Friday 15 September 2006 06:56, Volker Lorrmann wrote:
>
>> Hi guys and girls,
>>
>> I'm trying to compile numpy from svn. So I check out the current svn and
>> typed "python setup.py install"
>> Then this error appears:
>>
> [...]
>
>> "/var/abs/local/python-numpy-svn/src/numpy/numpy/distutils/misc_util.py",
>> line 1281, in generate_svn_version_py
>> assert revision is not None,'hmm, why I am not inside SVN tree???'
>> AssertionError: hmm, why I am not inside SVN tree???
>>
>
> I posted about this on the numpy discussion list a few days ago. NumPy's
> distutils doesn't work with working copies created with subversion-1.4.
> A ticket along with a proposed solution is on the trac website:
> http://projects.scipy.org/scipy/numpy/ticket/276
>

Will running 'svn info' work for a user who has the Tortoise SVN client
on Windows?
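(For concreteness, the kind of fallback under discussion -- shelling out to `svn info` and parsing its output for the revision -- might look roughly like the sketch below. This is an illustrative example only, not the actual patch attached to ticket #276; the helper name and the sample output are made up.)

```python
import re

def revision_from_svn_info(text):
    """Return the working-copy revision parsed from `svn info` output,
    or None if no Revision line is present."""
    m = re.search(r'^Revision:\s*(\d+)\s*$', text, re.MULTILINE)
    if m:
        return int(m.group(1))
    return None

# made-up sample of what `svn info` prints in a checkout
sample = """Path: .
URL: http://svn.scipy.org/svn/numpy/trunk
Revision: 3158
Node Kind: directory
"""

print(revision_from_svn_info(sample))  # -> 3158
```

Parsing the command's output this way sidesteps the .svn/entries format change in subversion-1.4, at the cost of requiring a command-line `svn` -- which is exactly the question for TortoiseSVN users.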
-Travis

From fullung at gmail.com  Fri Sep 15 13:33:18 2006
From: fullung at gmail.com (Albert Strasheim)
Date: Fri, 15 Sep 2006 19:33:18 +0200
Subject: [SciPy-user] Can't build numpy from svn
In-Reply-To: <450AE097.1000906@ieee.org>
Message-ID: 

Hello all

> -----Original Message-----
> From: scipy-user-bounces at scipy.org [mailto:scipy-user-bounces at scipy.org]
> On Behalf Of Travis Oliphant
> Sent: 15 September 2006 19:19
> To: SciPy Users List
> Subject: Re: [SciPy-user] Can't build numpy from svn
>
> Darren Dale wrote:
> > On Friday 15 September 2006 06:56, Volker Lorrmann wrote:
>
> > I posted about this on the numpy discussion list a few days ago. NumPy's
> > distutils doesn't work with working copies created with subversion-1.4.
> > A ticket along with a proposed solution is on the trac website:
> > http://projects.scipy.org/scipy/numpy/ticket/276
>
> Will running 'svn info' work for a user who has the Tortoise SVN client
> on Windows?

Probably not. Another example: using Eclipse with Pydev and Subclipse
(Eclipse's SVN plugin).

Cheers,

Albert

From matthew.brett at gmail.com  Fri Sep 15 13:39:15 2006
From: matthew.brett at gmail.com (Matthew Brett)
Date: Fri, 15 Sep 2006 18:39:15 +0100
Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest)
In-Reply-To: <1e2af89e0609141438i1681e2lcbfbc0ab50bc341e@mail.gmail.com>
References: <20060914145538.GB8453@astro.ox.ac.uk>
	<20060914151647.GI963@mentat.za.net>
	<20060914161610.GG8453@astro.ox.ac.uk>
	<1e2af89e0609141412l2bef0413t1d03e7260ae5ca37@mail.gmail.com>
	<1e2af89e0609141438i1681e2lcbfbc0ab50bc341e@mail.gmail.com>
Message-ID: <1e2af89e0609151039r4eb886dr847fca12b36591af@mail.gmail.com>

Hi,

> > I am getting that error too on an x86_64 machine. Has it in fact been
> > discussed elsewhere? I think the thread you pointed to was just the
> > error report wasn't it?
I think I tracked this one down and fixed it on my system:

http://projects.scipy.org/scipy/scipy/ticket/264

Best,

Matthew

From fperez.net at gmail.com  Fri Sep 15 15:42:12 2006
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 15 Sep 2006 13:42:12 -0600
Subject: [SciPy-user] python/scipy as matlab, using ipython
In-Reply-To: <1158305639.6586.8.camel@localhost>
References: <45021503.90307@ee.byu.edu> <4505F06C.1040703@ee.byu.edu>
	<1158037447.6722.4.camel@gdur.breisach>
	<1158305639.6586.8.camel@localhost>
Message-ID: 

On 9/15/06, Lars Friedrich wrote:
> Hello scipy users,
>
> for a short time now, I am using python, numpy and scipy as a
> replacement for matlab. My setup at the moment is the "scite"-editor and
> the "ipython"-shell. I am happy using the "run -i" command to access the
> current workspace with my scripts written with scite.
>
> There is one feature that I know from matlab that I miss a little. Maybe
> its not crucial but it would be nice to have:
>
> When I made a change to my script in scite, I have to
> *save it
> *change scope to ipython
> *recall the "run -i myScript.py" command
> *press 
>
> In matlab it is possible to hit  to get the same effect. How can I
> achieve this with scite/ipython? I think, I need a way to tell ipython
> "from extern" to execute the "run"-command. I would put this to my
> scite-config-file to connect it with, say, the  button and would be
> fine. Is this possible?

Unfortunately not currently in an automatic way. Here are a few
comments that may help to some degree:

* keep in mind that ipython uses a partial-matching search on your
history, so if you type 'r', up-arrow, it will probably find the 'run'
command quickly without having to up-arrow too many times. This can
save some time (that's how I work: C-s, alt-tab, r-up, ).

* you can save a macro: macro m and then invoke it again by just
typing 'm' (or whatever you call it).
You can save your macros across sessions by typing store m. Macros are
a little-used but /extremely/ useful feature of ipython.

* (X)Emacs can sort of do what you want, and it can probably be
extended to do exactly what you want. If you configure ipython/emacs
to work together as explained in

http://ipython.scipy.org/doc/manual/node3.html#SECTION00034000000000000000

running your script is one keystroke away. Note that this does NOT do
'run -i', but rather a more pedestrian execfile. But it does support
ipdb source tracking, which can be extremely useful. And I'm sure a
tiny bit of hacking in the ipython.el file could provide options to
call run/run -i instead of just execfile().

* We could probably implement something like this using a named pipe
or a socket, but I've really stopped touching the trunk codebase as
the chainsaw branch is really far better suited to do this. I created
this ticket so we don't forget:

http://projects.scipy.org/ipython/ipython/ticket/88

Note that it /can/ be done in the trunk, but I don't have the
bandwidth to work there right now. If you cook up a patch, by all
means send it in.

Cheers,

f

From stefan at sun.ac.za  Fri Sep 15 16:50:46 2006
From: stefan at sun.ac.za (Stefan van der Walt)
Date: Fri, 15 Sep 2006 22:50:46 +0200
Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest)
In-Reply-To: <1e2af89e0609151039r4eb886dr847fca12b36591af@mail.gmail.com>
References: <20060914145538.GB8453@astro.ox.ac.uk>
	<20060914151647.GI963@mentat.za.net>
	<20060914161610.GG8453@astro.ox.ac.uk>
	<1e2af89e0609141412l2bef0413t1d03e7260ae5ca37@mail.gmail.com>
	<1e2af89e0609141438i1681e2lcbfbc0ab50bc341e@mail.gmail.com>
	<1e2af89e0609151039r4eb886dr847fca12b36591af@mail.gmail.com>
Message-ID: <20060915205046.GT963@mentat.za.net>

On Fri, Sep 15, 2006 at 06:39:15PM +0100, Matthew Brett wrote:
> Hi,
>
> > > I am getting that error too on an x86_64 machine. Has it in fact been
> > > discussed elsewhere?
I think the thread you pointed to was just the
> > > error report wasn't it?
>
> I think I tracked this one down and fixed it on my system:
>
> http://projects.scipy.org/scipy/scipy/ticket/264

Hi Matthew

I have to hand it to you; I quickly went over the code this morning
and couldn't figure out what the heck was going on. I applied your
patch. I'll file a numpy ticket for the remaining issues.

Cheers
Stéfan

From stefan at sun.ac.za  Fri Sep 15 17:03:22 2006
From: stefan at sun.ac.za (Stefan van der Walt)
Date: Fri, 15 Sep 2006 23:03:22 +0200
Subject: [SciPy-user] scipy.test errors (generic ld filter, loadmat case callnest)
In-Reply-To: <20060915205046.GT963@mentat.za.net>
References: <20060914145538.GB8453@astro.ox.ac.uk>
	<20060914151647.GI963@mentat.za.net>
	<20060914161610.GG8453@astro.ox.ac.uk>
	<1e2af89e0609141412l2bef0413t1d03e7260ae5ca37@mail.gmail.com>
	<1e2af89e0609141438i1681e2lcbfbc0ab50bc341e@mail.gmail.com>
	<1e2af89e0609151039r4eb886dr847fca12b36591af@mail.gmail.com>
	<20060915205046.GT963@mentat.za.net>
Message-ID: <20060915210322.GA13303@mentat.za.net>

On Fri, Sep 15, 2006 at 10:50:46PM +0200, Stefan van der Walt wrote:
> On Fri, Sep 15, 2006 at 06:39:15PM +0100, Matthew Brett wrote:
> > Hi,
> >
> > > > I am getting that error too on an x86_64 machine. Has it in fact been
> > > > discussed elsewhere? I think the thread you pointed to was just the
> > > > error report wasn't it?
> >
> > I think I tracked this one down and fixed it on my system:
> >
> > http://projects.scipy.org/scipy/scipy/ticket/264
>
> I have to hand it to you; I quickly went over the code this morning
> and couldn't figure out what the heck was going on. I applied your
> patch. I'll file a numpy ticket for the remaining issues.

Erm, yes, that's how we do it: foot in mouth and echo world-wide. Oh
well, I can just as well finish the job. No need to worry about the
"remaining issues", those ended up being spurious valgrind warnings.
Regards
Stéfan

From haase at msg.ucsf.edu  Fri Sep 15 17:10:37 2006
From: haase at msg.ucsf.edu (Sebastian Haase)
Date: Fri, 15 Sep 2006 14:10:37 -0700
Subject: [SciPy-user] scipy.optimize.leastsq with float32
Message-ID: <200609151410.37779.haase@msg.ucsf.edu>

Hi,

I have data in float32 dtype. leastsq() just returns the same parameter
values that I give as initial guess. The same happens for int16.
(float64 and int32 work.) Is this a fundamental limitation of the
underlying fortran lib !?

Thanks,
Sebastian Haase

From f.braennstroem at gmx.de  Sat Sep 16 03:23:13 2006
From: f.braennstroem at gmx.de (Fabian Braennstroem)
Date: Sat, 16 Sep 2006 09:23:13 +0200
Subject: [SciPy-user] 3d interpolation
Message-ID: 

Hi,

I would like to do some 3d interpolations. Is it possible
with scipy or should I use something different?

Greetings!
Fabian

From amcmorl at gmail.com  Sat Sep 16 06:37:54 2006
From: amcmorl at gmail.com (Angus McMorland)
Date: Sat, 16 Sep 2006 22:37:54 +1200
Subject: [SciPy-user] 3d interpolation
In-Reply-To: 
References: 
Message-ID: 

Hi Fabian,

On 16/09/06, Fabian Braennstroem wrote:
>
> Hi,
>
> I would like to do some 3d interpolations. Is it possible
> with scipy or should I use something different?
>

I haven't found any specifically 3-D interpolation routines in scipy,
but have used 1-D interpolations successively over each axis to
achieve something similar. Have a look at the congrid routine listed
in the cookbook (http://www.scipy.org/Cookbook/Rebinning) for an
example of how this can be done.

Disclaimers: I wrote this when I was just starting to understand numpy
so I'm afraid it's ugly and deserving of a rewrite. Also, as I've said
there, I'm not actually sure how technically appropriate it is to do
interpolation this way, but the results look about right. Can anyone
offer a more educated opinion?

In any case, I hope that helps,

Angus.
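(To make the successive-1-D idea concrete, here is a small self-contained sketch using plain numpy. The grid, the test function, and the `interp_axis` helper are invented for this example -- it is not the cookbook congrid code:)

```python
import numpy as np

def interp_axis(data, old, new, axis):
    # apply 1-D linear interpolation independently along one axis
    return np.apply_along_axis(lambda v: np.interp(new, old, v), axis, data)

# coarse samples of f(x,y,z) = x + 2*y + 3*z on a 5x5x5 grid
x = np.linspace(0.0, 1.0, 5)
f = x[:, None, None] + 2.0 * x[None, :, None] + 3.0 * x[None, None, :]

# refine one axis at a time onto a finer grid
fine = np.linspace(0.0, 1.0, 9)
g = interp_axis(f, x, fine, axis=0)
g = interp_axis(g, x, fine, axis=1)
g = interp_axis(g, x, fine, axis=2)

# f is linear in each coordinate, so successive linear
# interpolation reproduces it exactly on the fine grid
print(g.shape)  # -> (9, 9, 9)
```

For functions that are linear in each variable this axis-by-axis scheme is exactly trilinear interpolation, which is why the result above is exact; for other data it is the same approximation the congrid recipe makes.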
-- 
AJC McMorland, PhD Student
Physiology, University of Auckland
Armourer, Auckland University Fencing
Secretary, Fencing North Inc.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From lorrmann at physik.uni-wuerzburg.de  Sat Sep 16 07:14:10 2006
From: lorrmann at physik.uni-wuerzburg.de (Volker Lorrmann)
Date: Sat, 16 Sep 2006 13:14:10 +0200
Subject: [SciPy-user] SciPy-user Digest, Vol 37, Issue 18
In-Reply-To: 
References: 
Message-ID: <450BDC82.2060808@physik.uni-wuerzburg.de>

> Hello scipy users,
> for a short time now, I am using python, numpy and scipy as a
> replacement for matlab. My setup at the moment is the "scite"-editor and
> the "ipython"-shell. I am happy using the "run -i" command to access the
> current workspace with my scripts written with scite.
> There is one feature that I know from matlab that I miss a little. Maybe
> its not crucial but it would be nice to have:
> When I made a change to my script in scite, I have to
> *save it
> *change scope to ipython
> *recall the "run -i myScript.py" command
> *press 
> In matlab it is possible to hit  to get the same effect. How can I
> achieve this with scite/ipython? I think, I need a way to tell ipython
> "from extern" to execute the "run"-command. I would put this to my
> scite-config-file to connect it with, say, the  button and would be
> fine. Is this possible?
> Thanks for every comment
> Lars

Hi Lars,

I'm using pida (http://pida.berlios.de/), kate or scite. All can do
what you want.

Volker

From grante at visi.com  Sat Sep 16 10:22:08 2006
From: grante at visi.com (Grant Edwards)
Date: Sat, 16 Sep 2006 14:22:08 +0000 (UTC)
Subject: [SciPy-user] 3d interpolation
References: 
Message-ID: 

On 2006-09-16, Fabian Braennstroem wrote:

> I would like to do some 3d interpolations. Is it possible
> with scipy or should I use something different?

I find that Scientific.Functions.LeastSquares.leastSquaresFit
works well.

-- 
Grant Edwards                   grante             Yow!
Hold the MAYO & pass at the COSMIC AWARENESS... visi.com

From f.braennstroem at gmx.de  Sat Sep 16 13:05:01 2006
From: f.braennstroem at gmx.de (Fabian Braennstroem)
Date: Sat, 16 Sep 2006 19:05:01 +0200
Subject: [SciPy-user] 3d interpolation
References: 
Message-ID: 

Hi Grant,

* Grant Edwards wrote:
> On 2006-09-16, Fabian Braennstroem wrote:
>
>> I would like to do some 3d interpolations. Is it possible
>> with scipy or should I use something different?
>
> I find that Scientific.Functions.LeastSquares.leastSquaresFit
> works well.

Do you have a small working example? Would be nice!

Greetings!
Fabian

From steveire at gmail.com  Sat Sep 16 14:36:14 2006
From: steveire at gmail.com (Stephen Kelly)
Date: Sat, 16 Sep 2006 20:36:14 +0200
Subject: [SciPy-user] Ubuntu installation requirements, and wetting my feet.
Message-ID: <18fbbe5a0609161136o3e3ce156ic002b3f08e3b09e3@mail.gmail.com>

Hi.

I emailed this list a while ago about possible subject matter for a
final year project in engineering
(http://news.gmane.org/find-root.php?group=gmane.comp.python.scientific.user&article=8498).
There was no project resulting directly from the mailing list, but I
have now started a project concerning the correction of xray
diffraction data using python. It will mainly be concerned with
designing a good user interface, but could also involve interpolation
and plotting, so I want to see what I can do with scipy in a
reasonable time to do this.

The installation instructions on the wiki are a bit confusing. I'm
inclined to get the optimised packages, so:

> To build SciPy using optimized lapack and blas on the ubuntu system
> you should install atlas-3dnow-dev or atlas-sse2-dev or
> atlas-sse3-dev depending on your system.

But those packages are not available in the repos available to me.
There are others with similar names, but I'm not certain which to
choose. What characteristics of my system will affect this?
> atlas2            atlas2-sse-dev    atlas3-headers   atlas-dev
> atlas2-3dnow-dev  atlas3-3dnow      atlas3-sse       atlas-doc
> atlas2-base-dev   atlas3-3dnow-dev  atlas3-sse2      atlas-test
> atlas2-dev        atlas3-base       atlas3-sse2-dev
> atlas2-headers    atlas3-base-dev   atlas3-sse-dev
> atlas2-sse2-dev   atlas3-doc        atlas3-test

On an unrelated matter, I'm already using the numpy deb package
provided at astraw.com, and as a learning exercise started making a
mechanics of materials module for tensor calculations. I'm not sure if
numpy/scipy already provides such functions, so I don't know if this
is reinventing the wheel.

To calculate the second scalar invariant, I thought an easy way would
be to calculate the trace of the adjoint of the stress tensor I
define. When I went to do it though, I couldn't find any functions in
numpy to get an adjoint of a matrix, or get cofactors. Are these
functions available in numpy or scipy? I've got it working now by
multiplying the inverse of the matrix by the determinant, but if
there's a better way I'd like to know.

Also, I'd like to assess the validity of such a stress matrix by
checking if it's symmetric, but I haven't seen any way to do that
either. I thought there might be a foo.isSymmetric() bool, but there
doesn't seem to be, so I tried doing a comparison, if foo == foo.T:
etc., but comparison of matrices doesn't seem to work either. Is the
only way to compare them to loop over them using regular python
methods?

Thanks for help on this. I may be seen a bit more on this mailing list
once I get into the computation parts of this project. Is there any
interest in using #scipy on freenode?

Thanks,

Steve
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From strawman at astraw.com  Sat Sep 16 18:00:43 2006
From: strawman at astraw.com (Andrew Straw)
Date: Sat, 16 Sep 2006 15:00:43 -0700
Subject: [SciPy-user] Ubuntu installation requirements, and wetting my feet.
In-Reply-To: <18fbbe5a0609161136o3e3ce156ic002b3f08e3b09e3@mail.gmail.com>
References: <18fbbe5a0609161136o3e3ce156ic002b3f08e3b09e3@mail.gmail.com>
Message-ID: <450C740B.6060704@astraw.com>

Stephen Kelly wrote:
> The installation instructions on the wiki are a bit confusing. I'm
> inclined to get the optimised packages, so:
>
> > To build SciPy using optimized lapack and blas on the ubuntu system
> > you should install atlas-3dnow-dev or atlas-sse2-dev or
> > atlas-sse3-dev depending on your system.
>
> But those packages are not available in the repos available to me.
> There are others with similar names, but I'm not certain which to
> choose. What characteristics of my system will affect this?
>
> > atlas2            atlas2-sse-dev    atlas3-headers   atlas-dev
> > atlas2-3dnow-dev  atlas3-3dnow      atlas3-sse       atlas-doc
> > atlas2-base-dev   atlas3-3dnow-dev  atlas3-sse2      atlas-test
> > atlas2-dev        atlas3-base       atlas3-sse2-dev
> > atlas2-headers    atlas3-base-dev   atlas3-sse-dev
> > atlas2-sse2-dev   atlas3-doc        atlas3-test
>
> On an unrelated matter, I'm already using the numpy deb package
> provided at astraw.com,

The scipy packages at my site (http://debs.astraw.com) for Ubuntu
Dapper now use atlas and are, to my knowledge, configured
appropriately.

From streycats at gmail.com  Sat Sep 16 20:51:34 2006
From: streycats at gmail.com (Helmut Strey)
Date: Sat, 16 Sep 2006 20:51:34 -0400
Subject: [SciPy-user] Installation problems Mac OS X 10.4.7
Message-ID: 

I have been trying to install the SciPy Superpack for Python. I used
MacPython2.4. When I try to import pylab the following error message
appears:

helmut-streys-imac-g5:~ hstrey$ python
Python 2.4.3 (#1, Apr 7 2006, 10:54:33)
[GCC 4.0.1 (Apple Computer, Inc. build 5250)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from pylab import *
Traceback (most recent call last):
  File "", line 1, in ?
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", line 1, in ?
    from matplotlib.pylab import *
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py", line 200, in ?
    from axes import Axes, PolarAxes
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axes.py", line 15, in ?
    from axis import XAxis, YAxis
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axis.py", line 25, in ?
    from font_manager import FontProperties
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/font_manager.py", line 39, in ?
    from matplotlib import ft2font
ImportError: Failure linking new module: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so: Library not loaded: /usr/local/lib/libfreetype.6.dylib
  Referenced from: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so
  Reason: image not found
>>>

It seems that the package expects libfreetype6.dylib in /usr/local/lib.
I found it in /usr/X11R6/lib. I tried to create a symbolic link but
then it says that the compiler versions don't match. What should I do?

Everything else works.

Thanks

Helmut Strey

From grante at visi.com  Sat Sep 16 22:21:38 2006
From: grante at visi.com (Grant Edwards)
Date: Sun, 17 Sep 2006 02:21:38 +0000 (UTC)
Subject: [SciPy-user] 3d interpolation
References: 
Message-ID: 

On 2006-09-16, Fabian Braennstroem wrote:

>>> I would like to do some 3d interpolations. Is it possible
>>> with scipy or should I use something different?
>>
>> I find that Scientific.Functions.LeastSquares.leastSquaresFit
>> works well.
>
> Do you have a small working example?

Here's something I just tossed together...
---------------------------------demo.py---------------------------------
#!/usr/bin/python

import Scientific.Functions.LeastSquares

lsf = Scientific.Functions.LeastSquares.leastSquaresFit

def m(c,p):
    a,b,c,d,e = c
    x,y = p
    return a*x + b*x*x + c*y +d*y*y +e

# data for surface z = 1.2*x - 0.2*x*x -y + 0.1*y*y + 3
# [[x,y],z]
d = [[[0.000000,0.000000],3.000000],
     [[0.000000,1.000000],2.100000],
     [[0.000000,2.000000],1.400000],
     [[0.000000,3.000000],0.900000],
     [[1.000000,0.000000],4.000000],
     [[1.000000,1.000000],3.100000],
     [[1.000000,2.000000],2.400000],
     [[1.000000,3.000000],1.900000],
     [[2.000000,0.000000],4.600000],
     [[2.000000,1.000000],3.700000],
     [[2.000000,2.000000],3.000000],
     [[2.000000,3.000000],2.500000],
     [[3.000000,0.000000],4.800000],
     [[3.000000,1.000000],3.900000],
     [[3.000000,2.000000],3.200000],
     [[3.000000,3.000000],2.700000],
     [[4.000000,0.000000],4.600000],
     [[4.000000,1.000000],3.700000],
     [[4.000000,2.000000],3.000000],
     [[4.000000,3.000000],2.500000]]

r = lsf(model=m, parameters=[0,0,0,0,0], data=d)

print "coefficients", r[0]
print "chisquared", r[1]
---------------------------------demo.py---------------------------------

$ python demo.py
coefficients [1.1999998068357483, -0.19999995420210845, -0.99999983832361761, 0.099999948513050232, 3.0000000498833987]
chisquared 1.75763435818e-13

-- 
Grant Edwards                   grante at visi.com

From irbdavid at gmail.com  Sun Sep 17 05:01:26 2006
From: irbdavid at gmail.com (David Andrews)
Date: Sun, 17 Sep 2006 11:01:26 +0200
Subject: [SciPy-user] Installation problems Mac OS X 10.4.7
In-Reply-To: 
References: 
Message-ID: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com>

Hi Helmut,

I've been having precisely the same problem. As far as I can tell
there is no way to get the stuff in the superpack working together
properly. Numpy and Scipy install fine from it, but the matplotlib
mpkg in it is broken.
I had no real progress using the 'official' matplotlib packages from their site, as they expect a different version of numpy, despite the fact that it's supposed to be fine with all 1.0+ versions of numpy.

I'm not sure how to resolve this problem. It has been suggested that it's best to install packages from http://pythonmac.org/packages/py24-fat/index.html though scipy is currently not available there.

Let me know if you make any progress :)

Cheers,

Dave

On 17/09/06, Helmut Strey wrote:
> I have been trying to install the SciPy Superpack for Python. I used
> MacPython2.4. When I try to import pylab the following error message
> appears:
>
> helmut-streys-imac-g5:~ hstrey$ python
> Python 2.4.3 (#1, Apr 7 2006, 10:54:33)
> [GCC 4.0.1 (Apple Computer, Inc. build 5250)] on darwin
> Type "help", "copyright", "credits" or "license" for more information.
> >>> from pylab import *
> Traceback (most recent call last):
>   File "", line 1, in ?
>   File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py",
> line 1, in ?
>     from matplotlib.pylab import *
>   File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py",
> line 200, in ?
>     from axes import Axes, PolarAxes
>   File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axes.py",
> line 15, in ?
>     from axis import XAxis, YAxis
>   File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axis.py",
> line 25, in ?
>     from font_manager import FontProperties
>   File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/font_manager.py",
> line 39, in ?
> from matplotlib import ft2font > ImportError: Failure linking new module: > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so: > Library not loaded: /usr/local/lib/libfreetype.6.dylib > Referenced from: > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so > Reason: image not found > >>> > > It seems that the package expects libfreetype6.dylib in > /usr/local/lib. I found it in /usr/X11R6/lib. I tried to create a > symbolic link but then it says that the compiler versions dont match. > What should I do? > > Everything else works. > > Thanks > > Helmut Strey > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From lists.steve at arachnedesign.net Sun Sep 17 09:10:22 2006 From: lists.steve at arachnedesign.net (Steve Lianoglou) Date: Sun, 17 Sep 2006 09:10:22 -0400 Subject: [SciPy-user] Installation problems Mac OS X 10.4.7 In-Reply-To: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> Message-ID: <6A790CC8-D6ED-4532-8F61-C119F1E20FDF@arachnedesign.net> Hi folks, > I've been having precisely the same problem. As far as I can tell > there is no way to get the stuff in the superpack working together > properly. Numpy and Scipy install fine from it, but the matplotlib > mpkg in it is broken. I had no real progress using the 'official' > matplotlib packages from their site, as they expect a different > version of numpy, despite the fact that its supposed to be fine with > all 1.0+ versions of numpy. Did you guys try to install matplotlib via an svn checkout? That might do the trick. 
svn co https://svn.sourceforge.net/svnroot/matplotlib/trunk matplotlib

-steve

From streycats at gmail.com Sun Sep 17 09:26:53 2006
From: streycats at gmail.com (Helmut Strey)
Date: Sun, 17 Sep 2006 09:26:53 -0400
Subject: [SciPy-user] Installation problems Mac OS X 10.4.7
In-Reply-To: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com>
References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com>
Message-ID: 

Last night I tried to compile scipy myself, with mixed results. The main problem is that the superpack is compiled using gcc 3.3 and that the install packages from pythonmac.org are all built with gcc 4.0.1. So I used numpy and matplotlib from pythonmac.org and compiled everything according to the scipy website. I also installed gfortran from hpc.sourceforge.org because I read somewhere that this is the only Fortran 77 compiler that works with gcc 4.0.1.

There is only one problem: not all scipy tests pass.

ERROR: check loadmat case cellnest
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 80, in cc
    self._check_case(name, expected)
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 75, in _check_case
    self._check_level(k_label, expected, matdict[k])
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 33, in _check_level
    self._check_level(level_label, ev, actual[i])
  File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 30, in _check_level
    assert len(expected) == len(actual), "Different list lengths at %s" % label
TypeError: len() of unsized object

======================================================================
FAIL: check loadmat case vec
---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 80, in cc self._check_case(name, expected) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/tests/test_mio.py", line 74, in _check_case assert k in matdict, "Missing key at %s" % k_label AssertionError: Missing key at Test 'vec', file:/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/tests/./data/testvec_4_GLNX86.mat, variable fit_params ---------------------------------------------------------------------- Ran 1583 tests in 125.600s FAILED (failures=1, errors=1) On 9/17/06, David Andrews wrote: > Hi Helmut, > > I've been having precisely the same problem. As far as I can tell > there is no way to get the stuff in the superpack working together > properly. Numpy and Scipy install fine from it, but the matplotlib > mpkg in it is broken. I had no real progress using the 'official' > matplotlib packages from their site, as they expect a different > version of numpy, despite the fact that its supposed to be fine with > all 1.0+ versions of numpy. > > I'm not sure how to resolve this problem, It has been suggested that > its best to install packages from > http://pythonmac.org/packages/py24-fat/index.html though scipy is > currently not available there. > > Let me know if you make any progress :) > > Cheers, > > Dave > > On 17/09/06, Helmut Strey wrote: > > I have been trying to install the SciPy Superpack for Python. I used > > MacPython2.4. When I try to import pylab the following error message > > appears: > > > > helmut-streys-imac-g5:~ hstrey$ python > > Python 2.4.3 (#1, Apr 7 2006, 10:54:33) > > [GCC 4.0.1 (Apple Computer, Inc. build 5250)] on darwin > > Type "help", "copyright", "credits" or "license" for more information. 
> > >>> from pylab import * > > Traceback (most recent call last): > > File "", line 1, in ? > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/pylab.py", > > line 1, in ? > > from matplotlib.pylab import * > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/pylab.py", > > line 200, in ? > > from axes import Axes, PolarAxes > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axes.py", > > line 15, in ? > > from axis import XAxis, YAxis > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/axis.py", > > line 25, in ? > > from font_manager import FontProperties > > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/font_manager.py", > > line 39, in ? > > from matplotlib import ft2font > > ImportError: Failure linking new module: > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so: > > Library not loaded: /usr/local/lib/libfreetype.6.dylib > > Referenced from: > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/matplotlib/ft2font.so > > Reason: image not found > > >>> > > > > It seems that the package expects libfreetype6.dylib in > > /usr/local/lib. I found it in /usr/X11R6/lib. I tried to create a > > symbolic link but then it says that the compiler versions dont match. > > What should I do? > > > > Everything else works. 
> > > > Thanks > > > > Helmut Strey > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From brendansimons at yahoo.ca Sun Sep 17 11:50:46 2006 From: brendansimons at yahoo.ca (Brendan Simons) Date: Sun, 17 Sep 2006 11:50:46 -0400 Subject: [SciPy-user] Installation problems Mac OS X 10.4.7 In-Reply-To: References: Message-ID: <2EC7B788-0597-4F1A-ACF7-63A52E3E2A33@yahoo.ca> David, Helmut I also had problems with the Matplotlib included in the superpack. The solution was to download a new matplotlib binary (one compatible with numpy 1.0b5) from here: http://euclid.uits.iupui.edu/mplfiles/ Hopefully these kind of compatibility issues will subside once numpy reaches a stable 1.0. Brendan > ------------------------------ > > Message: 6 > Date: Sun, 17 Sep 2006 11:01:26 +0200 > From: "David Andrews" > Subject: Re: [SciPy-user] Installation problems Mac OS X 10.4.7 > To: "SciPy Users List" > Message-ID: > <8edec5ec0609170201p3ecb5021y4eda8506f71b9607 at mail.gmail.com> > Content-Type: text/plain; charset=ISO-8859-1; format=flowed > > Hi Helmut, > > I've been having precisely the same problem. As far as I can tell > there is no way to get the stuff in the superpack working together > properly. Numpy and Scipy install fine from it, but the matplotlib > mpkg in it is broken. I had no real progress using the 'official' > matplotlib packages from their site, as they expect a different > version of numpy, despite the fact that its supposed to be fine with > all 1.0+ versions of numpy. > > I'm not sure how to resolve this problem, It has been suggested that > its best to install packages from > http://pythonmac.org/packages/py24-fat/index.html though scipy is > currently not available there. 
> > Let me know if you make any progress :) > > Cheers, > > Dave -------------- next part -------------- An HTML attachment was scrubbed... URL: From elcorto at gmx.net Sun Sep 17 19:14:14 2006 From: elcorto at gmx.net (Steve Schmerler) Date: Mon, 18 Sep 2006 01:14:14 +0200 Subject: [SciPy-user] loadmat test fails (numpy 1.0rc1.dev3175, scipy 0.5.2.dev2199) In-Reply-To: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> Message-ID: <450DD6C6.7080500@gmx.net> Just a quick note: The test of the latest scipy build fails at several loadmat tests with permission issues. -- cheers, steve Random number generation is the art of producing pure gibberish as quickly as possible. -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: log.txt URL: From willemjagter at gmail.com Mon Sep 18 03:51:49 2006 From: willemjagter at gmail.com (William Hunter) Date: Mon, 18 Sep 2006 09:51:49 +0200 Subject: [SciPy-user] Ubuntu installation requirements, and wetting my feet. In-Reply-To: <18fbbe5a0609161136o3e3ce156ic002b3f08e3b09e3@mail.gmail.com> References: <18fbbe5a0609161136o3e3ce156ic002b3f08e3b09e3@mail.gmail.com> Message-ID: <8b3894bc0609180051o38e6fee4ic5fb2ba69af0bed8@mail.gmail.com> On 16/09/06, Stephen Kelly wrote: > Hi. I emailed this list a while ago about possible subject matter for a > final year project in engineering > (http://news.gmane.org/find-root.php?group=gmane.comp.python.scientific.user&article=8498 > ). There was no project resulting directly from the mailing list, but I have > now started a project concerning the correction of xray diffraction data > using python. It will mainly be concerned with designing a good user > interface, but could also involve interpolation and plotting, so I want to > see what I can do with scipy in a reasonable time to do this. > > The installation instructions on the wiki are a bit confusing. 
I'm inclined
> to get the optimised packages, so:
>
> > To build SciPy using optimized lapack and blas on the ubuntu system you
> > should install atlas-3dnow-dev or atlas-sse2-dev or atlas-sse3-dev depending
> > on your system.
>
> But those packages are not available in the repos available to me. There are
> others with similar names, but I'm not certain which to choose. What
> characteristics of my system will affect this?
>
> > atlas2            atlas2-sse-dev    atlas3-headers   atlas-dev
> > atlas2-3dnow-dev  atlas3-3dnow      atlas3-sse       atlas-doc
> > atlas2-base-dev   atlas3-3dnow-dev  atlas3-sse2      atlas-test
> > atlas2-dev        atlas3-base       atlas3-sse2-dev
> > atlas2-headers    atlas3-base-dev   atlas3-sse-dev
> > atlas2-sse2-dev   atlas3-doc        atlas3-test

On Dapper, I used atlas3-base-dev: scipy.test() and numpy.test() didn't report any errors after compilation. If I remember correctly it puts the libraries in /usr/lib/atlas/, and you have to specify this in the site.cfg file.

> On an unrelated matter, I'm already using the numpy deb package provided at
> astraw.com, and as a learning exercise started making a mechanics of
> materials module for tensor calculations. I'm not sure if numpy/scipy
> already provides such functions, so I don't know if this is reinventing the
> wheel. To calculate the second scalar invariant, I thought an easy way
> would be to calculate the trace of the adjoint of the stress tensor I
> define. When I went to do it though, I couldn't find any functions in numpy
> to get an adjoint of a matrix, or get cofactors. Are these functions
> available in numpy or scipy? I've got it working now by multiplying the
> inverse of the matrix by the determinant, but if there's a better way I'd
> like to know. Also, I'd like to assess the validity of such a stress matrix
> by checking if it's symmetric, but I haven't seen any way to do that either.
> I thought there might be a foo.isSymmetric() bool, but there doesn't seem to > be, so I tried doing a comparison, if foo == foo.T: etc., but comparison of > matrices doesn't seem to work either. Is the only way to compare them to > loop over them using regular python methods? How about this, which should == 0: abs(A-A.T).max() > Thanks for help on this. I may be seen a bit more on this mailing list once > I get into the computation parts of this project. Is there any interest in > using #scipy on freenode? > > Thanks, > > Steve > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From matthew.brett at gmail.com Tue Sep 19 00:24:51 2006 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 19 Sep 2006 05:24:51 +0100 Subject: [SciPy-user] loadmat test fails (numpy 1.0rc1.dev3175, scipy 0.5.2.dev2199) In-Reply-To: <450DD6C6.7080500@gmx.net> References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> <450DD6C6.7080500@gmx.net> Message-ID: <1e2af89e0609182124uf88225r6b39f6ddd2ff185d@mail.gmail.com> Hi, On 9/18/06, Steve Schmerler wrote: > Just a quick note: The test of the latest scipy build fails at several > loadmat tests with permission issues. ... > IOError: [Errno 13] Permission denied: '/usr/local/lib/python2.4/site-packages/scipy/io/tests/./data/testcell_6.5.1_GLNX86.mat' Thanks for this report. Just to ask - what are the permissions on your system for the file above? Best, Matthew From ahoover at eecs.berkeley.edu Tue Sep 19 00:44:15 2006 From: ahoover at eecs.berkeley.edu (Aaron Hoover) Date: Mon, 18 Sep 2006 21:44:15 -0700 Subject: [SciPy-user] sparsetools intel mac install problem and 2 failed tests Message-ID: <87ABE7A8-709B-48BE-AB40-999AA68BD518@eecs.berkeley.edu> Hi All, I just finished building 0.5.1 on my MacBook Pro with NumPy 1.0b5. 
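Returning briefly to the tensor thread above: the adjugate trick ("inverse times determinant") and the tolerance-based symmetry check William suggests can be sketched in a few lines of numpy. This is an illustration, not code from the thread, and the sample stress tensor below is made up.

```python
# Sketch (not from the thread): second scalar invariant of a 3x3 stress
# tensor via the adjugate, plus a tolerance-based symmetry check.
import numpy as np

sigma = np.array([[3.0, 1.0, 0.5],   # an illustrative symmetric stress tensor
                  [1.0, 2.0, 0.2],
                  [0.5, 0.2, 1.0]])

# For an invertible matrix, adjugate(A) = det(A) * inv(A); for a 3x3 tensor
# the second invariant is the trace of the adjugate.
adj = np.linalg.det(sigma) * np.linalg.inv(sigma)
I2 = np.trace(adj)

# Cross-check against the closed form I2 = ((tr A)^2 - tr(A^2)) / 2.
I2_direct = 0.5 * (np.trace(sigma)**2 - np.trace(sigma.dot(sigma)))
assert abs(I2 - I2_direct) < 1e-9

# Symmetry: compare with a tolerance rather than exact '==', which also
# sidesteps the elementwise truth-value problem with `if foo == foo.T:`.
assert np.allclose(sigma, sigma.T)
print(I2)
```

The closed-form version avoids the inverse entirely, so it also works for singular tensors; the `abs(A - A.T).max()` idiom from the thread is equivalent in spirit to the `allclose` check.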
I followed the instructions on the "Installing SciPy/Mac OS X" page on the wiki and everything seemed to come off without a hitch. That is, the build and installation completed without errors. However, a couple of things aren't working quite right. First off, whenever I do 'from scipy import *' I get the following error that comes from trying to import sparse: ImportError: Failure linking new module: /Library/Frameworks/ Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/ sparse/sparsetools.so: Symbol not found: ___dso_handle Referenced from: /Library/Frameworks/Python.framework/Versions/2.4/ lib/python2.4/site-packages/scipy/sparse/sparsetools.so Expected in: dynamic lookup For the time being, I can do without the sparse module by just commenting out its inclusion in the setup.py file, but I just thought I'd see if anyone else has figured out how to fix this issue. I saw earlier posts, but no definitive solution. Also, when I run scipy.test(10), I get the following 2 errors: ====================================================================== ERROR: check_integer (scipy.io.tests.test_array_import.test_read_array) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/io/tests/test_array_import.py", line 55, in check_integer from scipy import stats File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/stats/__init__.py", line 7, in ? from stats import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/stats/stats.py", line 191, in ? import scipy.special as special File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/special/__init__.py", line 8, in ? from basic import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/special/basic.py", line 8, in ? 
from _cephes import * ImportError: Failure linking new module: /Library/Frameworks/ Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/ special/_cephes.so: Symbol not found: ___dso_handle Referenced from: /Library/Frameworks/Python.framework/Versions/2.4/ lib/python2.4/site-packages/scipy/special/_cephes.so Expected in: dynamic lookup ====================================================================== ERROR: check_simple_todense (scipy.io.tests.test_mmio.test_mmio_coordinate) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/io/tests/test_mmio.py", line 151, in check_simple_todense b = mmread(fn).todense() AttributeError: 'numpy.ndarray' object has no attribute 'todense' ---------------------------------------------------------------------- Ran 413 tests in 1.450s FAILED (errors=2) However, none of the functionality I use regularly seems to be affected, so it's not really an urgent issue for me. Lastly, I just want to say thanks to everyone who has contributed to SciPy. It's really exciting for me to see a truly viable open source, free alternative to Matlab. Cheers, Aaron From f.braennstroem at gmx.de Tue Sep 19 01:49:00 2006 From: f.braennstroem at gmx.de (Fabian Braennstroem) Date: Tue, 19 Sep 2006 07:49:00 +0200 Subject: [SciPy-user] 3d interpolation References: Message-ID: Hi Grant, thanks for the small demo! * Grant Edwards wrote: > On 2006-09-16, Fabian Braennstroem wrote: > >>>> I would like to do some 3d interpolations. Is it possible >>>> with scipy or should I use something different? >>> >>> I find that Scientific.Functions.LeastSquares.leastSquaresFit >>> works well. >> >> Do you have a small working example? > > Here's something I just tossed together... 
> > ---------------------------------demo.py--------------------------------- > #!/usr/bin/python > import Scientific.Functions.LeastSquares > lsf = Scientific.Functions.LeastSquares.leastSquaresFit > > def m(c,p): > a,b,c,d,e = c > x,y = p > return a*x + b*x*x + c*y +d*y*y +e > > # data for surface z = 1.2*x - 0.2*x*x -y + 0.1*y*y + 3 > # [[x,y],z] > > d = [[[0.000000,0.000000],3.000000], > [[0.000000,1.000000],2.100000], > [[0.000000,2.000000],1.400000], > [[0.000000,3.000000],0.900000], > [[1.000000,0.000000],4.000000], > [[1.000000,1.000000],3.100000], > [[1.000000,2.000000],2.400000], > [[1.000000,3.000000],1.900000], > [[2.000000,0.000000],4.600000], > [[2.000000,1.000000],3.700000], > [[2.000000,2.000000],3.000000], > [[2.000000,3.000000],2.500000], > [[3.000000,0.000000],4.800000], > [[3.000000,1.000000],3.900000], > [[3.000000,2.000000],3.200000], > [[3.000000,3.000000],2.700000], > [[4.000000,0.000000],4.600000], > [[4.000000,1.000000],3.700000], > [[4.000000,2.000000],3.000000], > [[4.000000,3.000000],2.500000]] > > > r = lsf(model=m, parameters=[0,0,0,0,0], data=d) > > print "coefficients", r[0] > print "chisquared", r[1] > ---------------------------------demo.py--------------------------------- > > > $ python demo.py > coefficients [1.1999998068357483, -0.19999995420210845, -0.99999983832361761, 0.099999948513050232, 3.0000000498833987] > chisquared 1.75763435818e-13 Greetings! 
Fabian From nwagner at iam.uni-stuttgart.de Tue Sep 19 02:37:31 2006 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 19 Sep 2006 08:37:31 +0200 Subject: [SciPy-user] loadmat test fails (numpy 1.0rc1.dev3175, scipy 0.5.2.dev2199) In-Reply-To: <1e2af89e0609182124uf88225r6b39f6ddd2ff185d@mail.gmail.com> References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> <450DD6C6.7080500@gmx.net> <1e2af89e0609182124uf88225r6b39f6ddd2ff185d@mail.gmail.com> Message-ID: <450F902B.7090307@iam.uni-stuttgart.de> Matthew Brett wrote: > Hi, > > On 9/18/06, Steve Schmerler wrote: > >> Just a quick note: The test of the latest scipy build fails at several >> loadmat tests with permission issues. >> > ... > >> IOError: [Errno 13] Permission denied: '/usr/local/lib/python2.4/site-packages/scipy/io/tests/./data/testcell_6.5.1_GLNX86.mat' >> > > Thanks for this report. Just to ask - what are the permissions on > your system for the file above? > > Best, > > Matthew > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > I have filed a bug report. (including permissions) http://projects.scipy.org/scipy/scipy/ticket/263 Nils From meesters at uni-mainz.de Tue Sep 19 04:59:49 2006 From: meesters at uni-mainz.de (Christian Meesters) Date: Tue, 19 Sep 2006 10:59:49 +0200 Subject: [SciPy-user] "catching warnings" Message-ID: <200609191059.49651.meesters@uni-mainz.de> Hi, In some cases calling interpolate.splrep results in the following warning: Warning: The required storage space exceeds the available strorage space. Probably causes: nest to small or s is too small. (fp>s) That's a really nice information to prevent the user from making more serious mistakes. I'm now trying to use the function in combination with some GUI-code. In my particular case, smoothing some curves might make sense. 
For physical reasons it only makes sense if the user applies a small smoothing factor. Yet, I'd like to inform the user in case she / he is overdoing it, but I have no chance to calculate a good factor in advance, because this is dataset dependent. Anyway, I should certainly inform the user about this warning. Is there a way to "catch" this warning, almost like an exception?

TIA,
Christian

PS Suggestion: Perhaps one could change the lines above into:

Warning: The required storage space exceeds the available storage space.
Probable causes: nest too small or s is too small. (fp>s)

From elcorto at gmx.net Tue Sep 19 06:22:19 2006
From: elcorto at gmx.net (Steve Schmerler)
Date: Tue, 19 Sep 2006 12:22:19 +0200
Subject: [SciPy-user] loadmat test fails (numpy 1.0rc1.dev3175, scipy 0.5.2.dev2199)
In-Reply-To: <450F902B.7090307@iam.uni-stuttgart.de>
References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> <450DD6C6.7080500@gmx.net> <1e2af89e0609182124uf88225r6b39f6ddd2ff185d@mail.gmail.com> <450F902B.7090307@iam.uni-stuttgart.de>
Message-ID: <450FC4DB.70008@gmx.net>

Nils Wagner wrote:
> Matthew Brett wrote:
>> Hi,
>>
>> On 9/18/06, Steve Schmerler wrote:
>>
>>> Just a quick note: The test of the latest scipy build fails at several
>>> loadmat tests with permission issues.
>>>
>> ...
>>
>>> IOError: [Errno 13] Permission denied: '/usr/local/lib/python2.4/site-packages/scipy/io/tests/./data/testcell_6.5.1_GLNX86.mat'
>>>
>> Thanks for this report. Just to ask - what are the permissions on
>> your system for the file above?
>>
>> Best,
>>
>> Matthew
>> _______________________________________________
>> SciPy-user mailing list
>> SciPy-user at scipy.org
>> http://projects.scipy.org/mailman/listinfo/scipy-user
>>
> I have filed a bug report.
(including permissions) > > http://projects.scipy.org/scipy/scipy/ticket/263 > > Nils > > On my machine: elcorto at ramrod:~$ ll /usr/local/lib/python2.4/site-packages/scipy/io/tests/data total 248 -rw-r--r-- 1 root staff 270 Sep 15 16:40 japanese_utf8.txt -rw-r--r-- 1 root staff 232 Sep 15 16:40 test3dmatrix_6.1_SOL2.mat -rw-r--r-- 1 root staff 232 Sep 15 16:40 test3dmatrix_6.5.1_GLNX86.mat -rw-r--r-- 1 root staff 213 Sep 15 16:40 test3dmatrix_7.1_GLNX86.mat -rw-r--r-- 1 root staff 536 Sep 15 16:40 testcell_6.1_SOL2.mat [...] -- cheers, steve Random number generation is the art of producing pure gibberish as quickly as possible. From matthew.brett at gmail.com Tue Sep 19 06:56:38 2006 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 19 Sep 2006 11:56:38 +0100 Subject: [SciPy-user] loadmat test fails (numpy 1.0rc1.dev3175, scipy 0.5.2.dev2199) In-Reply-To: <450FC4DB.70008@gmx.net> References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> <450DD6C6.7080500@gmx.net> <1e2af89e0609182124uf88225r6b39f6ddd2ff185d@mail.gmail.com> <450F902B.7090307@iam.uni-stuttgart.de> <450FC4DB.70008@gmx.net> Message-ID: <1e2af89e0609190356p57b55bc8t382bbde386294561@mail.gmail.com> Hi, > >>> Just a quick note: The test of the latest scipy build fails at several > >>> loadmat tests with permission issues. > >> > >>> IOError: [Errno 13] Permission denied: '/usr/local/lib/python2.4/site-packages/scipy/io/tests/./data/testcell_6.5.1_GLNX86.mat' Thanks - I've tracked it down now, will submit to SVN today if I can. Matthew. From tom.denniston at alum.dartmouth.org Tue Sep 19 11:04:29 2006 From: tom.denniston at alum.dartmouth.org (Tom Denniston) Date: Tue, 19 Sep 2006 10:04:29 -0500 Subject: [SciPy-user] numpy 1.0b5 and scupy 0.5.1 build problems Message-ID: The web pages say they are but I have had a number of problems. I am using gcc version 3.2.3 and g77. 
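On Christian's earlier question about catching the splrep warning: if the message is issued through Python's `warnings` machinery, it can be escalated to a catchable exception. This is a generic sketch, not thread code — `noisy_fit` below is a made-up stand-in, and if scipy simply print-s the message rather than warning, one would have to capture sys.stdout instead.

```python
# Sketch: turning a warning into a catchable exception with the
# `warnings` module. `noisy_fit` is a hypothetical stand-in for a call
# like interpolate.splrep that emits the storage-space warning.
import warnings

def noisy_fit():
    warnings.warn("The required storage space exceeds the available storage space.")

with warnings.catch_warnings():
    warnings.simplefilter("error")  # escalate all warnings to exceptions
    try:
        noisy_fit()
        message = None
    except Warning as w:
        message = str(w)   # here the GUI could tell the user to lower s

print(message)
```

The filter is scoped to the `with` block, so the rest of the program keeps its normal warning behaviour.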
I had to patch numpy distutils fcompiler/gnu.py file with the following patch to get past an undefined symbol error: changing the line : < if cpu.has_sse2(): opt.append('-msse2') to: > #if cpu.has_sse2(): opt.append('-msse2') Then once I did that I am getting Fortran compile errors. Does anyone know what causes this? I have googled a lot and also look over the build instructions. Am I doing something stupid or using the wrong compiler or something obvious like that? build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xb65):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:507: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xb9e):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:511: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xc09):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:514: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xc94):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:517: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xcd0):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:524: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xcf8):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:534: undefined reference to `Py_BuildValue' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xd36):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:501: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xd56):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:502: undefined reference to 
`PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.o(.text+0xd92): In function `f2py_rout__fftpack_zfftnd': build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:593: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xe0e):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:613: undefined reference to `PyArg_ParseTupleAndKeywords' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xe98):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:638: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xf1b):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:653: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xf51):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:668: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xfe6):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:674: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0xfeb):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:676: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x1013):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:686: undefined reference to `Py_BuildValue' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x1149):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:662: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x1171):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:663: undefined reference to 
`PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x1178):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:648: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x1198):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:649: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x11b0):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:632: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.o(.text+0x1214): In function `f2py_rout__fftpack_destroy_zfft_cache': build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:741: undefined reference to `PyArg_ParseTupleAndKeywords' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x122b):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:752: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x1253):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:762: undefined reference to `Py_BuildValue' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.o(.text+0x12a4): In function `f2py_rout__fftpack_destroy_zfftnd_cache': build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:802: undefined reference to `PyArg_ParseTupleAndKeywords' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x12bb):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:813: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x12e3):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:823: undefined reference to `Py_BuildValue' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.o(.text+0x1334): In 
function `f2py_rout__fftpack_destroy_drfft_cache': build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:863: undefined reference to `PyArg_ParseTupleAndKeywords' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x134b):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:874: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /Lib/fftpack/_fftpackmodule.o(.text+0x1373):build/src.linux-i686-2.4/Lib/fftpack/_fftpackmodule.c:884: undefined reference to `Py_BuildValue' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x2d): In function `F2PyDict_SetItemString': build/src.linux-i686-2.4/fortranobject.c:26: undefined reference to `PyDict_SetItemString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x57):build/src.linux-i686-2.4/fortranobject.c:20: undefined reference to `PyErr_Occurred' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x67):build/src.linux-i686-2.4/fortranobject.c:21: undefined reference to `PyErr_Print' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x6c):build/src.linux-i686-2.4/fortranobject.c:22: undefined reference to `PyErr_Clear' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0xb1): In function `PyFortranObject_New': build/src.linux-i686-2.4/fortranobject.c:40: undefined reference to `_PyObject_New' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0xc2):build/src.linux-i686-2.4/fortranobject.c:41: undefined reference to `PyDict_New' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1d5):build/src.linux-i686-2.4/fortranobject.c:65: undefined reference to `PyDict_SetItemString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x301): In function `PyFortranObject_NewAsAttr': build/src.linux-i686-2.4/fortranobject.c:77: undefined reference to `_PyObject_New' 
build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x30e):build/src.linux-i686-2.4/fortranobject.c:79: undefined reference to `PyDict_New' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x38c): In function `array_from_pyobj': build/src.linux-i686-2.4/fortranobject.c:526: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x3a1):build/src.linux-i686-2.4/fortranobject.c:526: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x3c8):build/src.linux-i686-2.4/fortranobject.c:552: undefined reference to `PyType_IsSubtype' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x4da):build/src.linux-i686-2.4/fortranobject.c:608: undefined reference to `PyExc_ValueError' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x4e4):build/src.linux-i686-2.4/fortranobject.c:608: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x92d):build/src.linux-i686-2.4/fortranobject.c:643: undefined reference to `PyObject_Type' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x935):build/src.linux-i686-2.4/fortranobject.c:643: undefined reference to `PyObject_Str' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x93d):build/src.linux-i686-2.4/fortranobject.c:643: undefined reference to `PyString_AsString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x959):build/src.linux-i686-2.4/fortranobject.c:647: undefined reference to `PyExc_TypeError' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x110e): In function `fortran_dealloc': build/src.linux-i686-2.4/fortranobject.c:90: undefined reference to `PyObject_Free' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x1273): In function 
`fortran_getattr': build/src.linux-i686-2.4/fortranobject.c:186: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1366):build/src.linux-i686-2.4/fortranobject.c:208: undefined reference to `Py_FindMethod' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1384):build/src.linux-i686-2.4/fortranobject.c:203: undefined reference to `PyCObject_FromVoidPtr' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x139f):build/src.linux-i686-2.4/fortranobject.c:204: undefined reference to `PyDict_SetItemString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x13c1):build/src.linux-i686-2.4/fortranobject.c:195: undefined reference to `PyString_FromString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1409):build/src.linux-i686-2.4/fortranobject.c:197: undefined reference to `PyString_ConcatAndDel' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x143a):build/src.linux-i686-2.4/fortranobject.c:198: undefined reference to `PyDict_SetItemString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1468):build/src.linux-i686-2.4/fortranobject.c:160: undefined reference to `PyDict_GetItemString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x151d): In function `fortran_setattr': build/src.linux-i686-2.4/fortranobject.c:225: undefined reference to `_Py_NoneStruct' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x183f):build/src.linux-i686-2.4/fortranobject.c:218: undefined reference to `PyExc_AttributeError' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1853):build/src.linux-i686-2.4/fortranobject.c:218: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1887):build/src.linux-i686-2.4/fortranobject.c:271: undefined 
reference to `PyDict_SetItemString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x189d):build/src.linux-i686-2.4/fortranobject.c:265: undefined reference to `PyDict_DelItemString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x18b0):build/src.linux-i686-2.4/fortranobject.c:267: undefined reference to `PyExc_AttributeError' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x18c4):build/src.linux-i686-2.4/fortranobject.c:267: undefined reference to `PyErr_SetString' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x18d1):build/src.linux-i686-2.4/fortranobject.c:260: undefined reference to `PyDict_New' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x191e): In function `fortran_call': build/src.linux-i686-2.4/fortranobject.c:292: undefined reference to `PyExc_TypeError' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1932):build/src.linux-i686-2.4/fortranobject.c:292: undefined reference to `PyErr_Format' build/temp.linux-i686-2.4/build/src.linux-i686-2.4 /fortranobject.o(.text+0x1982):build/src.linux-i686-2.4/fortranobject.c:282: undefined reference to `PyExc_RuntimeError' build/temp.linux-i686-2.4/build/src.linux-i686-2.4/fortranobject.o(.text+0x1bb7): In function `fortran_doc': build/src.linux-i686-2.4/fortranobject.c:141: undefined reference to `PyString_FromString' ../gcc/3.2.3/bin/../lib/gcc-lib/i686-pc-linux-gnu/3.2.3/../../../libfrtbegin.a( frtbegin.o)(.text+0x32): In function `main': : undefined reference to `MAIN__' collect2: ld returned 1 exit status -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From stefan at sun.ac.za Tue Sep 19 08:40:50 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 19 Sep 2006 14:40:50 +0200 Subject: [SciPy-user] "catching warnings" In-Reply-To: <200609191059.49651.meesters@uni-mainz.de> References: <200609191059.49651.meesters@uni-mainz.de> Message-ID: <20060919124050.GE12696@mentat.za.net> On Tue, Sep 19, 2006 at 10:59:49AM +0200, Christian Meesters wrote: > Hi, > > In some cases calling interpolate.splrep results in the following warning: > > Warning: The required storage space exceeds the available storage space. > Probably causes: nest too small or s is too small. (fp>s) > > That's really nice information to prevent the user from making more serious > mistakes. > > I'm now trying to use the function in combination with some GUI code. In my > particular case, smoothing some curves might make sense. For physical reasons > it only makes sense if the user applies only a small smoothing factor. Yet, > I'd like to inform the user in case she/he is overdoing it, but have no > chance to calculate a good factor in advance, because this is dataset > dependent. Anyway, I should certainly inform the user about this warning. > > Is there a way to "catch" this warning, almost like an exception? You can change warnings into exceptions, afaik. Take a look at the Python documentation on warning filters at http://docs.python.org/lib/module-warnings.html Regards Stéfan From chanley at stsci.edu Tue Sep 19 12:29:22 2006 From: chanley at stsci.edu (Christopher Hanley) Date: Tue, 19 Sep 2006 12:29:22 -0400 Subject: [SciPy-user] PYFITS 1.1 "beta 3" RELEASE Message-ID: <45101AE2.3060207@stsci.edu> ------------------ | PYFITS Release | ------------------ Space Telescope Science Institute is pleased to announce the "beta 3" release of PyFITS 1.1. This release includes support for both the NUMPY and NUMARRAY array packages.
This software can be downloaded at: http://www.stsci.edu/resources/software_hardware/pyfits/Download The NUMPY support in PyFITS is not nearly as well tested as the NUMARRAY support. We expect that you will encounter bugs. Please send bug reports to "help at stsci.edu". We intend to support NUMARRAY and NUMPY simultaneously for a transition period of no less than 1 year. Eventually, however, support for NUMARRAY will disappear. During this period, it is likely that new features will appear only for NUMPY. The support for NUMARRAY will primarily be to fix serious bugs and handle platform updates. ----------- | Version | ----------- Version 1.1b3; September 19, 2006 ------------------------------- | Major Changes since v1.1b2 | ------------------------------- * Pyfits will now read fits files that have been compressed using either gzip or zip. Fits files compressed with gzip are expected to have the extension .gz. Files compressed using zip are expected to have the extension .zip. * A user can now provide a URL (either http or ftp) instead of a file specification as the file name argument to fitsopen, or any other pyfits function that opens a fits file (getdata(), getheader(), info()). * Modified the numpy version of pyfits to recognize the numpy data types. This allows the int32scalar object to be treated like a python int. The same goes for the numpy floating and complex types. * Removed dependencies on deprecated data type names from the numpy version of pyfits. * Replaced ".dtype.fields[-1]" with ".dtype.names" to reflect numpy interface change. * Created a fix for a bug in creating new tables with no rows and no input data. * Added the _gap attribute to the FITS_rec class. This missing attribute was causing problems when trying to write tables. * Created a fix for a bug that prevented the use of index arrays with FITS_rec objects. * Added the __setitem__() method to the FITS_rec and FITS_record classes to facilitate assignment between tables.
* Added a ".names" attribute to the numarray version of the FITS_rec class. ------------------------- | Software Requirements | ------------------------- PyFITS Version 1.1b3 REQUIRES: * Python 2.3 or later * NUMPY 1.0b5 (or later) or NUMARRAY --------------------- | Installing PyFITS | --------------------- PyFITS 1.1b is distributed as a Python distutils module. Installation simply involves unpacking the package and executing % python setup.py install to install it in Python's site-packages directory. Alternatively, the command % python setup.py install --local="/destination/directory/" will install PyFITS in an arbitrary directory which should be placed on PYTHONPATH. Once numarray or numpy has been installed, PyFITS should be available for use under Python. ----------------- | Download Site | ----------------- http://www.stsci.edu/resources/software_hardware/pyfits/Download ---------- | Usage | ---------- Users will issue an "import pyfits" command as in the past. However, the use of the NUMPY or NUMARRAY version of PyFITS will be controlled by an environment variable called NUMERIX. Set NUMERIX to 'numarray' for the NUMARRAY version of PyFITS. Set NUMERIX to 'numpy' for the NUMPY version of pyfits. If only one array package is installed, that package's version of PyFITS will be imported. If both packages are installed the NUMERIX value is used to decide which version to import. If no NUMERIX value is set then the NUMARRAY version of PyFITS will be imported. Anything else will raise an exception upon import.
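The import-time selection rule described above can be sketched as follows. This is a hypothetical helper written purely for illustration: PyFITS performs the equivalent check internally, and the name `choose_numerix` is not part of its API.

```python
# Hypothetical helper mirroring the NUMERIX selection rule described in the
# announcement; PyFITS does the equivalent internally at import time.
def choose_numerix(installed, numerix=None):
    """installed: set of available array-package names; numerix: $NUMERIX."""
    if len(installed) == 1:
        return next(iter(installed))      # only one package installed: use it
    if numerix in installed:
        return numerix                    # NUMERIX decides between the two
    if numerix is None and "numarray" in installed:
        return "numarray"                 # unset NUMERIX defaults to numarray
    raise ImportError("invalid NUMERIX setting: %r" % (numerix,))

print(choose_numerix({"numpy"}))                        # numpy
print(choose_numerix({"numpy", "numarray"}, "numpy"))   # numpy
print(choose_numerix({"numpy", "numarray"}))            # numarray
```

Setting the variable itself is just `os.environ["NUMERIX"] = "numpy"` (or `export NUMERIX=numpy` in the shell) before the first `import pyfits`.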
--------------- | Bug Reports | --------------- Please send all PyFITS bug reports to help at stsci.edu ------------------ | Advanced Users | ------------------ Users who would like the "bleeding" edge of PyFITS can retrieve the software from our SUBVERSION repository hosted at: http://astropy.scipy.org/svn/pyfits/trunk We also provide a Trac site at: http://projects.scipy.org/astropy/pyfits/wiki -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From massimo.sandal at unibo.it Tue Sep 19 12:21:29 2006 From: massimo.sandal at unibo.it (massimo sandal) Date: Tue, 19 Sep 2006 18:21:29 +0200 Subject: [SciPy-user] numpy 1.0b5 and scipy 0.5.1 build problems In-Reply-To: References: Message-ID: <45101909.1060401@unibo.it> Tom Denniston wrote: > The web pages say they are but I have had a number of problems. I am > using gcc version 3.2.3 and g77. I don't know if it's the problem, but isn't GCC 3.2.3 quite outdated? m. -- Massimo Sandal University of Bologna Department of Biochemistry "G.Moruzzi" snail mail: Via Irnerio 48, 40126 Bologna, Italy email: massimo.sandal at unibo.it tel: +39-051-2094388 fax: +39-051-2094387 -------------- next part -------------- A non-text attachment was scrubbed... Name: massimo.sandal.vcf Type: text/x-vcard Size: 274 bytes Desc: not available URL: From tom.denniston at alum.dartmouth.org Tue Sep 19 13:45:13 2006 From: tom.denniston at alum.dartmouth.org (Tom Denniston) Date: Tue, 19 Sep 2006 12:45:13 -0500 Subject: [SciPy-user] numpy 1.0b5 and scipy 0.5.1 build problems In-Reply-To: <45101909.1060401@unibo.it> References: <45101909.1060401@unibo.it> Message-ID: Problem is I'm stuck with that version of the compiler On 9/19/06, massimo sandal wrote: > > Tom Denniston wrote: > > The web pages say they are but I have had a number of problems. I am > > using gcc version 3.2.3 and g77.
> > I don't know if it's the problem, but isn't GCC 3.2.3 quite outdated? > > m. > > -- > Massimo Sandal > University of Bologna > Department of Biochemistry "G.Moruzzi" > > snail mail: > Via Irnerio 48, 40126 Bologna, Italy > > email: > massimo.sandal at unibo.it > > tel: +39-051-2094388 > fax: +39-051-2094387 > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ryanlists at gmail.com Tue Sep 19 17:17:43 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 19 Sep 2006 16:17:43 -0500 Subject: [SciPy-user] where is signal? Message-ID: I just upgraded to scipy svn and now I can't find the signal stuff. Is this hidden somewhere? Do I have to install it separately? Ryan From peridot.faceted at gmail.com Tue Sep 19 17:26:12 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Tue, 19 Sep 2006 17:26:12 -0400 Subject: [SciPy-user] "catching warnings" In-Reply-To: <20060919124050.GE12696@mentat.za.net> References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> Message-ID: On 19/09/06, Stefan van der Walt wrote: > You can change warnings into exceptions, afaik. Take a look at the > Python documentation on warning filters at Unfortunately, they're not warnings at all, they're just print statements - so not only can you not control them, they appear on stdout (and not stderr), contaminating the output of your program. This is far from the only module that does this, either (for example, odepack.py, quadrature.py, optimize.py... grepping for "print" should turn them up) It would be nice if scipy provided some consistent way to signal non-errors to the surrounding program. Would it work for all modules to use warnings? A. M.
Archibald From robert.kern at gmail.com Tue Sep 19 17:40:57 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 19 Sep 2006 16:40:57 -0500 Subject: [SciPy-user] where is signal? In-Reply-To: References: Message-ID: <451063E9.5000308@gmail.com> Ryan Krauss wrote: > I just upgraded to scipy svn and now I can't find the signal stuff. > Is this hidden somewhere? Do I have to install it seperately? Nope, still there. http://projects.scipy.org/scipy/scipy/browser/trunk/Lib/signal -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From robert.kern at gmail.com Tue Sep 19 17:41:53 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 19 Sep 2006 16:41:53 -0500 Subject: [SciPy-user] "catching warnings" In-Reply-To: References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> Message-ID: <45106421.60705@gmail.com> A. M. Archibald wrote: > It would be nice if scipy provided some consistent way to signal > non-errors to the surrounding program. Would it work for all modules > to use warnings? Yes. Patches are welcome. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From rclewley at cam.cornell.edu Tue Sep 19 17:49:01 2006 From: rclewley at cam.cornell.edu (Robert Clewley) Date: Tue, 19 Sep 2006 17:49:01 -0400 (EDT) Subject: [SciPy-user] "catching warnings" In-Reply-To: References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> Message-ID: The situation is worse than that. Some of the print statements are actually in legacy Fortran code that is wrapped as DLLs, e.g. the VODE integrator in odepack.py. 
This is quite unfortunate when it comes to using these modules as part of more sophisticated algorithms, for which more control over the output to the user or internal detection of problems is desired. The idea of legacy code in this context is not to have to change it and then worry about providing the edited source with your distribution, etc.... So I'm considering an exploration of whether I can suppress output to stdout or stderr in a simple and platform-independent way when I call DLLs like VODE, and then have my code reconstruct information about the warnings from the verbose output options that many of those codes have. Does anyone have any experience of trying this? -Rob On Tue, 19 Sep 2006, A. M. Archibald wrote: > On 19/09/06, Stefan van der Walt wrote: > >> You can change warnings into exceptions, afaik. Take a look at the >> Python documentation on warning filters at > > Unfortunately, they're not warnings at all, they're just print > statements - so not only can you not control them, they appear on > stdout (and not stderr), contaminating the output of your program. > This is far from the only module that does this, either (for example, > odepack.py. quadrature.py, optimize.py... grepping for "print" should > turn them up) > ----------------------------------- Rob Clewley Research Associate Department of Mathematics and Center for Applied Mathematics Cornell University Ithaca, NY 14853 www.cam.cornell.edu/~rclewley Tel: 607-255-7760 Fax: 607-255-9860 From ryanlists at gmail.com Tue Sep 19 17:56:35 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 19 Sep 2006 16:56:35 -0500 Subject: [SciPy-user] where is signal? In-Reply-To: <451063E9.5000308@gmail.com> References: <451063E9.5000308@gmail.com> Message-ID: I can see it in site-packages/scipy, but import scipy doesn't include scipy.signal >>> scipy.signal Traceback (most recent call last): File "", line 1, in ? 
AttributeError: 'module' object has no attribute 'signal' If I try to import scipy.signal I get: >>> import scipy.signal Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.4/site-packages/scipy/signal/__init__.py", line 9, in ? from bsplines import * File "/usr/lib/python2.4/site-packages/scipy/signal/bsplines.py", line 3, in ? import scipy.special File "/usr/lib/python2.4/site-packages/scipy/special/__init__.py", line 8, in ? from basic import * File "/usr/lib/python2.4/site-packages/scipy/special/basic.py", line 8, in ? from _cephes import * ImportError: /usr/lib/python2.4/site-packages/scipy/special/_cephes.so: undefined symbol: cephes_fabs >>> On 9/19/06, Robert Kern wrote: > Ryan Krauss wrote: > > I just upgraded to scipy svn and now I can't find the signal stuff. > > Is this hidden somewhere? Do I have to install it seperately? > > Nope, still there. > > http://projects.scipy.org/scipy/scipy/browser/trunk/Lib/signal > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless enigma > that is made terrible by our own mad attempt to interpret it as though it had > an underlying truth." > -- Umberto Eco > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From robert.kern at gmail.com Tue Sep 19 18:05:14 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 19 Sep 2006 17:05:14 -0500 Subject: [SciPy-user] where is signal? In-Reply-To: References: <451063E9.5000308@gmail.com> Message-ID: <4510699A.2040702@gmail.com> Ryan Krauss wrote: > I can see it in site-packages/scipy, but import scipy doesn't include > scipy.signal >>>> scipy.signal > Traceback (most recent call last): > File "", line 1, in ? > AttributeError: 'module' object has no attribute 'signal' That's right. It's not supposed to. 
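The behaviour Robert describes can be illustrated with the standard library's `xml` package, which likewise does not import its subpackages from `__init__` (used here as a stand-in for scipy, which may not be installed where you run this):

```python
import sys

# Ensure a fresh import of the xml package for this demonstration.
for name in [m for m in sys.modules if m == "xml" or m.startswith("xml.")]:
    del sys.modules[name]

import xml
print(hasattr(xml, "etree"))   # False: subpackages are not auto-imported

import xml.etree               # the explicit import binds the attribute
print(hasattr(xml, "etree"))   # True
```

The same applies to scipy: a bare `import scipy` does not make `scipy.signal` available; you need an explicit `import scipy.signal`.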
> If I try to import scipy.signal I get: > >>>> import scipy.signal > Traceback (most recent call last): > File "", line 1, in ? > File "/usr/lib/python2.4/site-packages/scipy/signal/__init__.py", > line 9, in ? from bsplines import * > File "/usr/lib/python2.4/site-packages/scipy/signal/bsplines.py", > line 3, in ? import scipy.special > File "/usr/lib/python2.4/site-packages/scipy/special/__init__.py", > line 8, in ? > from basic import * > File "/usr/lib/python2.4/site-packages/scipy/special/basic.py", line 8, in ? > from _cephes import * > ImportError: /usr/lib/python2.4/site-packages/scipy/special/_cephes.so: > undefined symbol: cephes_fabs Okay, it looks like you have a build problem. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From robert.kern at gmail.com Tue Sep 19 18:07:46 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 19 Sep 2006 17:07:46 -0500 Subject: [SciPy-user] "catching warnings" In-Reply-To: References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> Message-ID: <45106A32.2080503@gmail.com> Robert Clewley wrote: > The situation is worse than that. Some of the print statements are > actually in legacy Fortran code that is wrapped as DLLs, e.g. the VODE > integrator in odepack.py. > > This is quite unfortunate when it comes to using these modules as part of > more sophisticated algorithms, for which more control over the output to > the user or internal detection of problems is desired. The idea of legacy > code in this context is not to have to change it and then worry about > providing the edited source with your distribution, etc.... 
> > So I'm considering an exploration of whether I can suppress output to > stdout or stderr in a simple and platform-independent way when I call DLLs > like VODE, and then have my code reconstruct information about the > warnings from the verbose output options that many of those codes have. > Does anyone have any experience of trying this? Yes. http://projects.scipy.org/pipermail/ipython-dev/attachments/20050804/d5f5f9fc/attachment.pl -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From ryanlists at gmail.com Tue Sep 19 18:45:12 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 19 Sep 2006 17:45:12 -0500 Subject: [SciPy-user] where is signal? In-Reply-To: <4510699A.2040702@gmail.com> References: <451063E9.5000308@gmail.com> <4510699A.2040702@gmail.com> Message-ID: O.K. rebuilding made this go away. On 9/19/06, Robert Kern wrote: > Ryan Krauss wrote: > > I can see it in site-packages/scipy, but import scipy doesn't include > > scipy.signal > >>>> scipy.signal > > Traceback (most recent call last): > > File "", line 1, in ? > > AttributeError: 'module' object has no attribute 'signal' > > That's right. It's not supposed to. > > > If I try to import scipy.signal I get: > > > >>>> import scipy.signal > > Traceback (most recent call last): > > File "", line 1, in ? > > File "/usr/lib/python2.4/site-packages/scipy/signal/__init__.py", > > line 9, in ? from bsplines import * > > File "/usr/lib/python2.4/site-packages/scipy/signal/bsplines.py", > > line 3, in ? import scipy.special > > File "/usr/lib/python2.4/site-packages/scipy/special/__init__.py", > > line 8, in ? > > from basic import * > > File "/usr/lib/python2.4/site-packages/scipy/special/basic.py", line 8, in ? 
> > from _cephes import * > > ImportError: /usr/lib/python2.4/site-packages/scipy/special/_cephes.so: > > undefined symbol: cephes_fabs > > Okay, it looks like you have a build problem. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless enigma > that is made terrible by our own mad attempt to interpret it as though it had > an underlying truth." > -- Umberto Eco > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From ryanlists at gmail.com Tue Sep 19 18:56:19 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 19 Sep 2006 17:56:19 -0500 Subject: [SciPy-user] lsim changes shape of input u Message-ID: signal.lsim is changing my input vector u from 1D to 2D: In [5]: shape(u) Out[5]: (1000,) In [6]: Y=signal.lsim(sys3,u,t) In [7]: shape(u) Out[7]: (1000, 1) I tried adding a U=squeeze(U) just before the return of lsim, but this didn't seem to fix the problem. This is causing errors else wher in my code when I try to export matrices that are columns of data, so I have lots of u=squeeze(u) sprinkled through out my code. Is there a way to make lsim do this so that I don't have to do it all the time? Thanks, Ryan From stefan at sun.ac.za Tue Sep 19 19:41:47 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 20 Sep 2006 01:41:47 +0200 Subject: [SciPy-user] lsim changes shape of input u In-Reply-To: References: Message-ID: <20060919234147.GG29662@mentat.za.net> On Tue, Sep 19, 2006 at 05:56:19PM -0500, Ryan Krauss wrote: > signal.lsim is changing my input vector u from 1D to 2D: > In [5]: shape(u) > Out[5]: (1000,) > > In [6]: Y=signal.lsim(sys3,u,t) > > In [7]: shape(u) > Out[7]: (1000, 1) > > I tried adding a U=squeeze(U) just before the return of lsim, but this > didn't seem to fix the problem. 
This is causing errors elsewhere in > my code when I try to export matrices that are columns of data, so I > have lots of u=squeeze(u) sprinkled throughout my code. Is there a > way to make lsim do this so that I don't have to do it all the time? Fixed in SVN. Cheers Stéfan From ryanlists at gmail.com Tue Sep 19 19:50:20 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 19 Sep 2006 18:50:20 -0500 Subject: [SciPy-user] lsim changes shape of input u In-Reply-To: <20060919234147.GG29662@mentat.za.net> References: <20060919234147.GG29662@mentat.za.net> Message-ID: Thanks. I was hoping to learn from your fix, but I don't see what is different in the code. It looks like you fixed lsim2 as well. Thanks, Ryan In [2]: shape(u) Out[2]: (1000,) In [3]: Y=signal.lsim(sys3,u,t) In [4]: shape(u) Out[4]: (1000,) In [5]: Y=signal.lsim2(sys3,u,t) In [6]: shape(u) Out[6]: (1000,) On 9/19/06, Stefan van der Walt wrote: > On Tue, Sep 19, 2006 at 05:56:19PM -0500, Ryan Krauss wrote: > > signal.lsim is changing my input vector u from 1D to 2D: > > In [5]: shape(u) > > Out[5]: (1000,) > > > > In [6]: Y=signal.lsim(sys3,u,t) > > > > In [7]: shape(u) > > Out[7]: (1000, 1) > > > > I tried adding a U=squeeze(U) just before the return of lsim, but this > > didn't seem to fix the problem. This is causing errors elsewhere in > > my code when I try to export matrices that are columns of data, so I > > have lots of u=squeeze(u) sprinkled throughout my code. Is there a > > way to make lsim do this so that I don't have to do it all the time? > > Fixed in SVN.
> > Cheers > Stéfan > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From stefan at sun.ac.za Wed Sep 20 06:29:53 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 20 Sep 2006 12:29:53 +0200 Subject: [SciPy-user] lsim changes shape of input u In-Reply-To: References: <20060919234147.GG29662@mentat.za.net> Message-ID: <20060920102953.GB10514@mentat.za.net> Hi Ryan On Tue, Sep 19, 2006 at 06:50:20PM -0500, Ryan Krauss wrote: > Thanks. I was hoping to learn from your fix, but I don't see what is > different in the code. It looks like you fixed lsim2 as well. It used to be U.shape = ... instead of U = U.reshape(...) U.reshape does not modify the shape field of the original input argument, but instead returns a view on that array.
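The distinction between assigning to `.shape` and calling `.reshape` can be seen directly (a minimal numpy illustration, unrelated to lsim's internals):

```python
import numpy as np

u = np.zeros(4)

v = u.reshape(4, 1)   # returns a *view* with the new shape; u is untouched
print(u.shape)        # (4,)
print(v.shape)        # (4, 1)

u.shape = (4, 1)      # assigning to .shape mutates u in place --
print(u.shape)        # (4, 1)   this is what the old lsim code did
```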
> > Regards > Stéfan > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From meesters at uni-mainz.de Wed Sep 20 08:24:43 2006 From: meesters at uni-mainz.de (Christian Meesters) Date: Wed, 20 Sep 2006 14:24:43 +0200 Subject: [SciPy-user] "catching warnings" In-Reply-To: References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> Message-ID: <200609201424.43401.meesters@uni-mainz.de> On Tuesday 19 September 2006 23:26, A. M. Archibald wrote: > On 19/09/06, Stefan van der Walt wrote: > > You can change warnings into exceptions, afaik. Take a look at the > > Python documentation on warning filters at > > Unfortunately, they're not warnings at all, they're just print > statements - so not only can you not control them, they appear on > stdout (and not stderr), contaminating the output of your program. > This is far from the only module that does this, either (for example, > odepack.py, quadrature.py, optimize.py... grepping for "print" should > turn them up) I only now got into reading this. But, well, at least it explains my failed attempts to convert these "warnings" into exceptions, yesterday. Stefan, thanks anyway for pointing me to the forgotten warnings module. Unfortunately I'm unable to provide patches as discussed later in this thread. But I'm glad that I initiated this debate ;-). Cheers, Christian From eric at enthought.com Wed Sep 20 12:29:28 2006 From: eric at enthought.com (eric) Date: Wed, 20 Sep 2006 11:29:28 -0500 Subject: [SciPy-user] ANN: Job postings -- Enthought, Inc. Message-ID: <45116C68.1070706@enthought.com> Hey group, We are growing again and have multiple positions open here at Enthought. I'd love to recruit more people out of the Scipy/numpy community. I think many of you would find the work interesting.
You can look at our career page for more info: http://www.enthought.com/careers.htm thanks! eric From peridot.faceted at gmail.com Wed Sep 20 23:24:37 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Wed, 20 Sep 2006 23:24:37 -0400 Subject: [SciPy-user] "catching warnings" In-Reply-To: <45106421.60705@gmail.com> References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> <45106421.60705@gmail.com> Message-ID: On 19/09/06, Robert Kern wrote: > A. M. Archibald wrote: > > It would be nice if scipy provided some consistent way to signal > > non-errors to the surrounding program. Would it work for all modules > > to use warnings? > > Yes. Patches are welcome. The problem (that the OP had) appears to be with scipy.interpolate.fitpack. Can it be marked as deprecated (i.e., users get a rude^H^H^H^H warning telling them to use scipy.interpolate.fitpack2 instead)? As a user, I often find it difficult to locate the "correct", i.e., up-to-date, supported, interface to do various tasks (for example, interpolation and ODE solving). Would people find it helpful if I went through and included cross-references in the docstrings? A. M. Archibald From gael.varoquaux at normalesup.org Wed Sep 20 23:29:48 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 21 Sep 2006 05:29:48 +0200 Subject: [SciPy-user] "catching warnings" In-Reply-To: References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> <45106421.60705@gmail.com> Message-ID: <20060921032948.GA29561@clipper.ens.fr> On Wed, Sep 20, 2006 at 11:24:37PM -0400, A. M. Archibald wrote: > Would people find it helpful if I went > through and included cross-references in the docstrings? _Yes_ !! Very helpful. And not only to me, my colleagues have already mentioned this to me. 
Ga?l From robert.kern at gmail.com Thu Sep 21 00:04:00 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 20 Sep 2006 23:04:00 -0500 Subject: [SciPy-user] "catching warnings" In-Reply-To: References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> <45106421.60705@gmail.com> Message-ID: <45120F30.7090800@gmail.com> A. M. Archibald wrote: > Would people find it helpful if I went > through and included cross-references in the docstrings? Yes, that would be wonderful. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From peridot.faceted at gmail.com Thu Sep 21 00:10:12 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Thu, 21 Sep 2006 00:10:12 -0400 Subject: [SciPy-user] "catching warnings" In-Reply-To: <45106421.60705@gmail.com> References: <200609191059.49651.meesters@uni-mainz.de> <20060919124050.GE12696@mentat.za.net> <45106421.60705@gmail.com> Message-ID: On 19/09/06, Robert Kern wrote: > A. M. Archibald wrote: > > It would be nice if scipy provided some consistent way to signal > > non-errors to the surrounding program. Would it work for all modules > > to use warnings? > > Yes. Patches are welcome. Here's a simple initial patch, fixing only scipy.interpolate.fitpack, just to see what people think of this approach. On a first pass through the code, I'm just changing print str to warn(ScipyWarning(str)); by default, this just prints a warning on stderr, including line number information. Users can, using the package "warnings", control what is done with them - treat them as exceptions, add them to a big list, feed them to a logfile, put them in a text box... The reason for using the class ScipyWarning is so that users can trap specifically Scipy warnings. 
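The control the patch hands to users looks like this in practice (a self-contained sketch: `ScipyWarning`, `LossOfPrecision`, and `solver_step` are stand-ins defined locally, not imports from the patch):

```python
import warnings

class ScipyWarning(UserWarning):
    """Base class so callers can filter scipy warnings specifically."""

class LossOfPrecision(ScipyWarning):
    """Signals numerical difficulty rather than outright failure."""

def solver_step(x):
    # A library routine that used to `print` its complaint to stdout.
    if x < 0:
        warnings.warn(LossOfPrecision("negative input, result inaccurate"))
    return abs(x) ** 0.5

# Default behaviour: the warning goes to stderr and execution continues.
# To turn scipy warnings (and only those) into exceptions:
warnings.filterwarnings("error", category=ScipyWarning)
try:
    solver_step(-1.0)
    raised = False
except ScipyWarning:
    raised = True
```

The same `filterwarnings` call can instead log, collect, or silence the warnings, which is exactly the flexibility plain `print` statements lack.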
There is also a subclass, LossOfAccuracy, not currently used (and it should be named LossOfPrecision, I now realize), intended to indicate when a warning is signaling numerical difficulties. I decided against a scipy-specific wrapper function for warn() because I want to encourage people to pick the correct subclass of ScipyWarning (although perhaps defaulting to ScipyWarning instead of UserWarning might be a good idea). I have also included Robert Kern's Redirector; it is not used yet but (if he gives permission) I thought of including it for use in testing and for collecting output from talkative FORTRAN and C. A. M. Archibald -------------- next part -------------- A non-text attachment was scrubbed... Name: initial-warningization Type: application/octet-stream Size: 5752 bytes Desc: not available URL: From cimrman3 at ntc.zcu.cz Thu Sep 21 05:35:15 2006 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 21 Sep 2006 11:35:15 +0200 Subject: [SciPy-user] umfpack module update Message-ID: <45125CD3.9010203@ntc.zcu.cz> Thanks to Nathan Bell, happy UMFPACK users can now access the LU factors of their sparse matrices using the scipy.linsolve.umfpack module, as an alternative to scipy.linsolve.splu() based on SuperLU. r. From cimrman3 at ntc.zcu.cz Thu Sep 21 05:40:30 2006 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 21 Sep 2006 11:40:30 +0200 Subject: [SciPy-user] umfpack module update In-Reply-To: <45125CD3.9010203@ntc.zcu.cz> References: <45125CD3.9010203@ntc.zcu.cz> Message-ID: <45125E0E.3010107@ntc.zcu.cz> Robert Cimrman wrote: > Thanks to Nathan Bell, happy UMFPACK users can now access the LU factors > of their sparse matrices using the scipy.linsolve.umfpack module, as an > alternative to scipy.linsolve.splu() based on SuperLU. SVN version, of course... 
From cmaloney at kitp.ucsb.edu Thu Sep 21 14:17:41 2006 From: cmaloney at kitp.ucsb.edu (Craig Maloney) Date: Thu, 21 Sep 2006 14:17:41 -0400 Subject: [SciPy-user] cookbook example on irregularly spaced data fails Message-ID: Hi all. I'm trying to use the delaunay module in the sandbox to do natural neighbor interpolation. I'm trying to follow the cookbook example at: http://www.scipy.org/Cookbook/Matplotlib/ Gridding_irregularly_spaced_data The "zi's" I get at the end of the day are all NaN. If I play around with the parameters, I can get *some* of the zi's to be nan. Any ideas of what might be wrong? numpy-1.0rc1 scipy-0.5.1 Thanks, Craig ------------------------------------------------------ Craig Maloney http://www.pha.jhu.edu/~cmaloney/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Thu Sep 21 14:31:13 2006 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 21 Sep 2006 13:31:13 -0500 Subject: [SciPy-user] cookbook example on irregularly spaced data fails In-Reply-To: References: Message-ID: <4512DA71.8020002@gmail.com> Craig Maloney wrote: > Hi all. > > I'm trying to use the delaunay module in the sandbox to do natural > neighbor interpolation. I'm trying to follow the cookbook example at: > > http://www.scipy.org/Cookbook/Matplotlib/Gridding_irregularly_spaced_data > > The "zi's" I get at the end of the day are all NaN. If I play around > with the parameters, I can get *some* of the zi's to be nan. > > Any ideas of what might be wrong? Not really. It is working fine for me. Some of the values must be nan (or whatever you use for default_value when calling nn_interpolator()) because the requested points are outside of the convex hull. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From cmaloney at kitp.ucsb.edu Thu Sep 21 14:44:58 2006 From: cmaloney at kitp.ucsb.edu (Craig Maloney) Date: Thu, 21 Sep 2006 14:44:58 -0400 Subject: [SciPy-user] cookbook example on irregularly spaced data fails In-Reply-To: <4512DA71.8020002@gmail.com> References: <4512DA71.8020002@gmail.com> Message-ID: <889BA671-DBF6-44C5-8817-12A320A0545F@kitp.ucsb.edu> Yep. That was it. Smaller grid has no NaN's. (Apparently not *all* of them were NaN on the big grid... guess the boundary is the first thing which I looked at.). Thanks. On Sep 21, 2006, at 2:31 PM, Robert Kern wrote: > Craig Maloney wrote: >> Hi all. >> >> I'm trying to use the delaunay module in the sandbox to do natural >> neighbor interpolation. I'm trying to follow the cookbook example >> at: >> >> http://www.scipy.org/Cookbook/Matplotlib/ >> Gridding_irregularly_spaced_data >> >> The "zi's" I get at the end of the day are all NaN. If I play around >> with the parameters, I can get *some* of the zi's to be nan. >> >> Any ideas of what might be wrong? > > Not really. It is working fine for me. Some of the values must be > nan (or > whatever you use for default_value when calling nn_interpolator()) > because the > requested points are outside of the convex hull. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a > harmless enigma > that is made terrible by our own mad attempt to interpret it as > though it had > an underlying truth." > -- Umberto Eco > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From wolfgang.keller.nospam at gmx.de Thu Sep 21 14:44:08 2006 From: wolfgang.keller.nospam at gmx.de (Wolfgang Keller) Date: Thu, 21 Sep 2006 20:44:08 +0200 Subject: [SciPy-user] Installation problems Mac OS X 10.4.7 References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> Message-ID: <0001HW.C138AA1800CEBE7DF0305530@news.gmane.org> Hello, > I've been having precisely the same problem. As far as I can tell > there is no way to get the stuff in the superpack working together > properly. Numpy and Scipy install fine from it, but the matplotlib > mpkg in it is broken. Strange. As far as I remember, I installed the entire Superpack and it worked. At least I have everything here in my site-packages folder. On a Powermac G5 with 10.4.7, if that matters. But, it could be that I had the Matplotlib package from http://pythonmac.org/packages/py24-fat/mpkg/matplotlib-0.87.5-py2.4- macosx10.4.mpkg.zip installed before... Sincerely, Wolfgang Keller -- My email-address is correct. Do NOT remove ".nospam" to reply. From fredmfp at gmail.com Thu Sep 21 17:34:57 2006 From: fredmfp at gmail.com (fred) Date: Thu, 21 Sep 2006 23:34:57 +0200 Subject: [SciPy-user] matplotlib fails to run with latest scipy/numpy release Message-ID: <45130581.4020704@gmail.com> Hi all, I have just updated scipy (0.5.2.dev2209) and numpy (1.0.dev3204) and now, matplotlib (0.87-4) fails to run :-( ./> python Python 2.4.3 (#1, Aug 10 2006, 09:25:31) [GCC 3.3.5 (Debian 1:3.3.5-13)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import pylab Traceback (most recent call last): File "", line 1, in ? File "/usr/local/lib/python2.4/site-packages/pylab.py", line 1, in ? from matplotlib.pylab import * File "/usr/local/lib/python2.4/site-packages/matplotlib/pylab.py", line 196, in ? import cm File "/usr/local/lib/python2.4/site-packages/matplotlib/cm.py", line 5, in ? 
import colors File "/usr/local/lib/python2.4/site-packages/matplotlib/colors.py", line 33, in ? from numerix import array, arange, take, put, Float, Int, where, \ File "/usr/local/lib/python2.4/site-packages/matplotlib/numerix/__init__.py", line 74, in ? Matrix = matrix NameError: name 'matrix' is not defined What's wrong ? Cheers, -- ? Python, c'est un peu comme la ba?onnette en 14, faut toujours sortir avec, et pas h?siter d'en mettre un coup en cas de besoin. ? J.-Ph. D. From oliphant at ee.byu.edu Thu Sep 21 18:08:22 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 21 Sep 2006 16:08:22 -0600 Subject: [SciPy-user] matplotlib fails to run with latest scipy/numpy release In-Reply-To: <45130581.4020704@gmail.com> References: <45130581.4020704@gmail.com> Message-ID: <45130D56.1070507@ee.byu.edu> fred wrote: >Hi all, > >I have just updated scipy (0.5.2.dev2209) and numpy (1.0.dev3204) and >now, matplotlib (0.87-4) > > You need to upgrade matplotlib to 0.87-5 -Travis From zufus at zufus.org Thu Sep 21 18:10:00 2006 From: zufus at zufus.org (Marco Presi) Date: Fri, 22 Sep 2006 00:10:00 +0200 Subject: [SciPy-user] matplotlib fails to run with latest scipy/numpy release In-Reply-To: <45130581.4020704@gmail.com> (fredmfp@gmail.com's message of "Thu, 21 Sep 2006 23:34:57 +0200") References: <45130581.4020704@gmail.com> Message-ID: <87slik50sn.fsf@zufus.org> || On Thu, 21 Sep 2006 23:34:57 +0200 || fred wrote: f> Hi all, f> I have just updated scipy (0.5.2.dev2209) and numpy (1.0.dev3204) and f> now, matplotlib (0.87-4) f> fails to run :-( You have to use matplotlib-0.87.5 Regards Marco -- "I videogiochi non influenzano i bambini. Voglio dire, se Pac-Man avesse influenzato la nostra generazione, staremmo tutti saltando in sale scure, masticando pillole magiche e ascoltando musica elettronica ripetitiva." "Videogames do not influence kids. 
I mean, if Pac-Man influenced our generation, we were all jumping in dark rooms, chomping pills and listening to electronic repeating music." Kristian Wilson, Nintendo Inc. 1989 From fredmfp at gmail.com Thu Sep 21 18:50:19 2006 From: fredmfp at gmail.com (fred) Date: Fri, 22 Sep 2006 00:50:19 +0200 Subject: [SciPy-user] matplotlib fails to run with latest scipy/numpy release In-Reply-To: <45130D56.1070507@ee.byu.edu> References: <45130581.4020704@gmail.com> <45130D56.1070507@ee.byu.edu> Message-ID: <4513172B.9040704@gmail.com> Travis Oliphant a écrit : >fred wrote: > > > >>Hi all, >> >>I have just updated scipy (0.5.2.dev2209) and numpy (1.0.dev3204) and >>now, matplotlib (0.87-4) >> >> >> >> >You need to upgrade matplotlib to 0.87-5 > > Oh, thanks! Is it a FAQ? I see nothing about it on Google. -- « Python, c'est un peu comme la baïonnette en 14, faut toujours sortir avec, et pas hésiter d'en mettre un coup en cas de besoin. » J.-Ph. D. From johnlichtenstein at mac.com Thu Sep 21 20:44:23 2006 From: johnlichtenstein at mac.com (John Lichtenstein) Date: Thu, 21 Sep 2006 17:44:23 -0700 Subject: [SciPy-user] matplotlib and scipy.stats on Tiger In-Reply-To: <4513172B.9040704@gmail.com> References: <45130581.4020704@gmail.com> <45130D56.1070507@ee.byu.edu> <4513172B.9040704@gmail.com> Message-ID: Has anyone found a way to get both scipy.stats and matplotlib to work on Tiger? If I use the package from scipy.org, matplotlib (0.87.4_r2587) doesn't work. If I upgrade matplotlib to 0.87.5 from undefined.org, it complains that it was compiled against a newer version of numpy. Installing Numpy 1.0b5 from undefined.org fixes matplotlib but breaks scipy, which then complains it was built on an older version of numpy.
Installing scipy 0.5.1 from undefined.org lets me import scipy, but if i try to import scipy.stats I get Traceback (most recent call last): File "", line 1, in -toplevel- import scipy.stats File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/stats/__init__.py", line 7, in -toplevel- from stats import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/stats/stats.py", line 191, in -toplevel- import scipy.special as special File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/special/__init__.py", line 8, in - toplevel- from basic import * File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/special/basic.py", line 8, in -toplevel- from _cephes import * ImportError: Inappropriate file type for dynamic loading -------------- next part -------------- An HTML attachment was scrubbed... URL: From lists.steve at arachnedesign.net Thu Sep 21 21:56:31 2006 From: lists.steve at arachnedesign.net (Steve Lianoglou) Date: Thu, 21 Sep 2006 21:56:31 -0400 Subject: [SciPy-user] matplotlib and scipy.stats on Tiger In-Reply-To: References: <45130581.4020704@gmail.com> <45130D56.1070507@ee.byu.edu> <4513172B.9040704@gmail.com> Message-ID: <31AFAB95-7562-40F6-A3FB-FEB54B3D40C9@arachnedesign.net> Hi John, > Has anyone found a way to get both scipy.stats and matplot lib to > work on Tiger? I've been installing matplotlib/scipy/numpy from SVN and all seems to work fine -- numpy passes all tests, scipy does not (but this has been brought up in the list before). Honestly I haven't had much need to use scipy, as most of what I've been doing has been focused on numpy, so I can't say how bad those scipy test failures are (I believe there are 2). 
> Installing scipy 0.5.1 from undefined.org lets me import scipy, but > if i try to import scipy.stats I get > > Traceback (most recent call last): > File "", line 1, in -toplevel- > import scipy.stats ... Importing scipy.stats on my machine gives me no errors/warnings. I'm running OS X.4.7 on a MacBook Pro. You might think it's slightly more inconvenient working from svn, but I like it since numpy is constantly improving and I like to get those in my workspace asap. Usually I don't have to update matplotlib, but I do every now and again just because it gives me something to do to take a break from my 'real work' and still makes me feel like I'm being productive :-) HTH, -steve From josegomez at gmx.net Fri Sep 22 04:27:05 2006 From: josegomez at gmx.net (Jose Luis Gomez Dans) Date: Fri, 22 Sep 2006 10:27:05 +0200 Subject: [SciPy-user] Selecting array elements using other array Message-ID: <20060922082705.47290@gmx.net> Hi! For the time being, I am still trapped with Numeric, and an old-ish version of Enthought. i am aware that with numpy, this question is trivial, but I need to keep using this version of Enthought. Essentially, I have two arrays: one is X (shape Nxd), and the other is id (shape N). I want to get an array which give me the rows of X where id is equal to some value. I can do this with for loops, but it is quite slow. I am sure there must be a way to do this efficiently. Note that d can vary. I have tried to build a new array and use compress or where. The new array has a new column with the id appended: tmp=c_[X[:,0,NewAxis],X[:,1,NewAxis],X[:,2,NewAxis],id[:,NewAxis]] However, this is just as messy, and i need to hard-code the second dimension of X. There has to be a more elegant solution to this!!! Many thanks! Jos? -- Der GMX SmartSurfer hilft bis zu 70% Ihrer Onlinekosten zu sparen! 
Ideal f?r Modem und ISDN: http://www.gmx.net/de/go/smartsurfer From a.u.r.e.l.i.a.n at gmx.net Fri Sep 22 04:34:26 2006 From: a.u.r.e.l.i.a.n at gmx.net (Johannes Loehnert) Date: Fri, 22 Sep 2006 10:34:26 +0200 Subject: [SciPy-user] Selecting array elements using other array In-Reply-To: <20060922082705.47290@gmx.net> References: <20060922082705.47290@gmx.net> Message-ID: <200609221034.26976.a.u.r.e.l.i.a.n@gmx.net> On Friday 22 September 2006 10:27, Jose Luis Gomez Dans wrote: > Hi! > For the time being, I am still trapped with Numeric, and an old-ish version > of Enthought. i am aware that with numpy, this question is trivial, but I > need to keep using this version of Enthought. > > Essentially, I have two arrays: one is X (shape Nxd), and the other is id > (shape N). I want to get an array which give me the rows of X where id is > equal to some value. I can do this with for loops, but it is quite slow. I > am sure there must be a way to do this efficiently. Note that d can vary. result = take(X, nonzero(id==value)) might be what you are looking for. HTH, Johannes From josegomez at gmx.net Fri Sep 22 05:43:56 2006 From: josegomez at gmx.net (Jose Luis Gomez Dans) Date: Fri, 22 Sep 2006 11:43:56 +0200 Subject: [SciPy-user] Selecting array elements using other array In-Reply-To: <200609221034.26976.a.u.r.e.l.i.a.n@gmx.net> References: <20060922082705.47290@gmx.net> <200609221034.26976.a.u.r.e.l.i.a.n@gmx.net> Message-ID: <20060922094356.47320@gmx.net> Johannes, -------- Original-Nachricht -------- > > am sure there must be a way to do this efficiently. Note that d can > vary. > > result = take(X, nonzero(id==value)) See? I was sure there was a simple way of achieving this Many thanks! Jose -- Der GMX SmartSurfer hilft bis zu 70% Ihrer Onlinekosten zu sparen! 
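For readers on numpy rather than Numeric: the suggested one-liner selects whole rows because Numeric's `take` defaults to axis 0, but numpy's `nonzero` returns a tuple that must be unpacked, and plain boolean indexing is the idiomatic spelling (a sketch with made-up data):

```python
import numpy as np

X = np.array([[0.0, 1.0],
              [2.0, 3.0],
              [4.0, 5.0],
              [6.0, 7.0]])
ids = np.array([7, 3, 7, 1])

# numpy equivalent of Numeric's take(X, nonzero(ids == 7)):
# nonzero returns a tuple of index arrays, and take needs the axis.
rows = np.take(X, np.nonzero(ids == 7)[0], axis=0)

# Idiomatic numpy: index the rows directly with the boolean mask.
same = X[ids == 7]

assert np.array_equal(rows, same)
```

Either form works for any number of columns `d`, which avoids the hard-coded second dimension in the `c_[...]` workaround above.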
Ideal f?r Modem und ISDN: http://www.gmx.net/de/go/smartsurfer From jdhunter at ace.bsd.uchicago.edu Fri Sep 22 09:29:23 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Fri, 22 Sep 2006 08:29:23 -0500 Subject: [SciPy-user] matplotlib and scipy.stats on Tiger In-Reply-To: (John Lichtenstein's message of "Thu, 21 Sep 2006 17:44:23 -0700") References: <45130581.4020704@gmail.com> <45130D56.1070507@ee.byu.edu> <4513172B.9040704@gmail.com> Message-ID: <87slikrpvw.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "John" == John Lichtenstein writes: John> Has anyone found a way to get both scipy.stats and matplot John> lib to work on Tiger? If I use the package from scipy.org, John> matplot lib (0.87.4_r2587) doesn't work. If I install John> matplotlib to 0.87.5 from undefined.org, matplotlib it John> complains that it was compiled against a newer version of John> numpy. Installing Numpy 1.0b5 from undefined.org fixes John> matplotlib but breaks scipy, which then complains it was John> built on an older version of numpy. Installing scipy 0.5.1 John> from undefined.org lets me import scipy, but if i try to John> import scipy.stats I get If you are compiling from src, you might just get the latest svn of all three. They tend to work together. JDH From jallikattu at googlemail.com Tue Sep 5 02:56:25 2006 From: jallikattu at googlemail.com (babu) Date: Tue, 05 Sep 2006 06:56:25 -0000 Subject: [SciPy-user] global fitting of data sets with some parameters linked Message-ID: <72da94d60609042356h363db83k3a41ed3ca1644250@mail.gmail.com> Hello I have data sets and would like to fit them globally with common parameters. Is there a way to do this using existing scipy modules leastsq. Any pointers for algo and suggestions are welcome. Thanks Babu. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jallikattu at googlemail.com Thu Sep 21 09:15:20 2006 From: jallikattu at googlemail.com (morovia morovia) Date: Thu, 21 Sep 2006 15:15:20 +0200 Subject: [SciPy-user] global fitting - some common or linked parameters. Message-ID: <72da94d60609210615u2181d16bpe1a5e5109fedf88e@mail.gmail.com> Hello I have data sets and would like to fit them globally with common parameters. Is there a way to do this using existing scipy modules leastsq. Any pointers for algo and suggestions are welcome. Thanks Morovia -------------- next part -------------- An HTML attachment was scrubbed... URL: From jallikattu at googlemail.com Fri Sep 22 02:52:10 2006 From: jallikattu at googlemail.com (morovia morovia) Date: Fri, 22 Sep 2006 08:52:10 +0200 Subject: [SciPy-user] global fitting - some common or linked parameters. Message-ID: <72da94d60609212352y762cf3bfk9cd04860c9e6cd43@mail.gmail.com> Hello I have data sets and would like to fit them globally with common parameters. Is there a way to do this using existing scipy modules leastsq. Any pointers for algo and suggestions are welcome. Thanks Morovia -------------- next part -------------- An HTML attachment was scrubbed... URL: From jallikattu at googlemail.com Fri Sep 22 02:56:28 2006 From: jallikattu at googlemail.com (morovia morovia) Date: Fri, 22 Sep 2006 08:56:28 +0200 Subject: [SciPy-user] global fitting - some common or linked parameters. Message-ID: <72da94d60609212356j170e99adke4555b16b89fd73a@mail.gmail.com> Hello I have data sets and would like to fit them globally with common parameters. Is there a way to do this using existing scipy modules leastsq. Any pointers for algo and suggestions are welcome. Thanks Morovia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gael.varoquaux at normalesup.org Fri Sep 22 10:24:35 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 22 Sep 2006 16:24:35 +0200 Subject: [SciPy-user] global fitting of data sets with common parameters In-Reply-To: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> References: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> Message-ID: <20060922142433.GD24708@clipper.ens.fr> On Wed, Aug 09, 2006 at 10:15:04AM +0200, morovia morovia wrote: > I have data sets and would like to fit them globally with common > parameters. > Is there a way to do this using existing scipy modules leastsq. > Any pointers for algo and suggestions are welcome. Hi, I just uploaded a example to do this in the cookbook, on scipy.org . Does it suit your needs ? Ga?l From jdhunter at ace.bsd.uchicago.edu Fri Sep 22 10:11:46 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Fri, 22 Sep 2006 09:11:46 -0500 Subject: [SciPy-user] on global fitting for sets of data In-Reply-To: <72da94d60608080159x580f710n38b28fab3932e794@mail.gmail.com> ("morovia morovia"'s message of "Tue, 8 Aug 2006 10:59:07 +0200") References: <72da94d60608080159x580f710n38b28fab3932e794@mail.gmail.com> Message-ID: <87psdoxa71.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "morovia" == morovia morovia writes: morovia> Hello I have data sets and would like to fit them morovia> globally with common parameters. Is there a way to do morovia> this using existing scipy modules leastsq. morovia> Any pointers for algo and suggestions are welcome. Are you kidding? 9 emails on the same subject with two different subject headers and two different sender names within minutes? Either this is a novel form of spam or a very anxious poster. 
JDH From robert.kern at gmail.com Fri Sep 22 12:47:52 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 22 Sep 2006 11:47:52 -0500 Subject: [SciPy-user] on global fitting for sets of data In-Reply-To: <87psdoxa71.fsf@peds-pc311.bsd.uchicago.edu> References: <72da94d60608080159x580f710n38b28fab3932e794@mail.gmail.com> <87psdoxa71.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <451413B8.1050203@gmail.com> John Hunter wrote: >>>>>> "morovia" == morovia morovia writes: > > morovia> Hello I have data sets and would like to fit them > morovia> globally with common parameters. Is there a way to do > morovia> this using existing scipy modules leastsq. > > morovia> Any pointers for algo and suggestions are welcome. > > Are you kidding? 9 emails on the same subject with two different > subject headers and two different sender names within minutes? Either > this is a novel form of spam or a very anxious poster. He had trouble subscribing. Now that he is subscribed, Mailman delivered the messages he had previously tried to send. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From rowen at cesmail.net Fri Sep 22 15:08:12 2006 From: rowen at cesmail.net (Russell E. Owen) Date: Fri, 22 Sep 2006 12:08:12 -0700 Subject: [SciPy-user] Installation problems Mac OS X 10.4.7 References: Message-ID: In article , "Helmut Strey" wrote: > I have been trying to install the SciPy Superpack for Python. I used > MacPython2.4. When I try to import pylab the following error message > appears:... > > It seems that the package expects libfreetype6.dylib in > /usr/local/lib. I found it in /usr/X11R6/lib. I tried to create a > symbolic link but then it says that the compiler versions dont match. > What should I do? It sounds as if the matplotlib included with the SciPy SuperPack is dynamically linked to libfreetype. Oops. 
I suggest you install SciPy SuperPack first, then install matplotlib from . (Or, if you want to use TkAgg with a decent version of Tcl/Tk, I can send you an installer; the one presently at pythonmac only works with Apple's icky built-in Tcl/Tk). Or you could try installing libfreetype (and you'll probably need libpng as well). I just posted instructions for building matplotlib on the matplotlib mailing list, and it includes instructions for building both libraries. -- Russell From rowen at cesmail.net Fri Sep 22 15:10:47 2006 From: rowen at cesmail.net (Russell E. Owen) Date: Fri, 22 Sep 2006 12:10:47 -0700 Subject: [SciPy-user] Installation problems Mac OS X 10.4.7 References: <8edec5ec0609170201p3ecb5021y4eda8506f71b9607@mail.gmail.com> Message-ID: In article <8edec5ec0609170201p3ecb5021y4eda8506f71b9607 at mail.gmail.com>, "David Andrews" wrote: > Hi Helmut, > > I've been having precisely the same problem. As far as I can tell > there is no way to get the stuff in the superpack working together > properly. Numpy and Scipy install fine from it, but the matplotlib > mpkg in it is broken. I had no real progress using the 'official' > matplotlib packages from their site, as they expect a different > version of numpy, despite the fact that its supposed to be fine with > all 1.0+ versions of numpy. > > I'm not sure how to resolve this problem, It has been suggested that > its best to install packages from > http://pythonmac.org/packages/py24-fat/index.html though scipy is > currently not available there. The matplotlib 0.87.5 there certainly works with numpy 1.0b5. I've not tried it with 1.0rc1, but if matplotlib 0.87.5 itself is compatible with 1.0rc1 then I'd expect it to work. -- Russell From ryanlists at gmail.com Fri Sep 22 21:32:34 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 22 Sep 2006 20:32:34 -0500 Subject: [SciPy-user] additions to signal.ltisys.lti Message-ID: I don't know if this is a developer question or not. 
It isn't anything hard core developer-ish, so I will start with the regular list. I would like to propose the addition of the following methods to signal.ltisys.lti. The two lsim ones are just convenience functions, but I find the Bode method really useful. (I may have had to add an import statement for poly1d).

    def Bode(self, f):
        num = poly1d(self.num)
        den = poly1d(self.den)
        s = 2.0j*pi*f
        tf = num(s)/den(s)
        return tf

    def lsim(self, U, T, X0=None, interp=1):
        Y = signal.lsim(self, U, T, X0, interp)
        return Y

    def lsim2(self, U, T, X0=None):
        Y = signal.lsim2(self, U, T, X0)
        return Y

Basically, I am teaching a class with some controls/systems modeling and want Python to offer most of the same tools for lti's as Matlab's controls toolbox (eventually). Any thoughts? Ryan From robert.kern at gmail.com Fri Sep 22 22:12:16 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 22 Sep 2006 21:12:16 -0500 Subject: [SciPy-user] additions to signal.ltisys.lti In-Reply-To: References: Message-ID: <45149800.3020502@gmail.com> Ryan Krauss wrote: > Basically, I am teaching a class with some controls/systems modeling > and want Python to offer most of the same tools for lti's as Matlab's > controls toolbox (eventually). > > Any thoughts? I'm happy for them to go in if they have docstrings and tests. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From ryanlists at gmail.com Sat Sep 23 03:41:44 2006 From: ryanlists at gmail.com (Ryan Krauss) Date: Sat, 23 Sep 2006 02:41:44 -0500 Subject: [SciPy-user] additions to signal.ltisys.lti In-Reply-To: <45149800.3020502@gmail.com> References: <45149800.3020502@gmail.com> Message-ID: I was already thinking they needed docstrings and I can do that easily. How would I write good tests?
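One answer to the "good tests" question for the proposed Bode method: evaluate it at a few frequencies where the transfer function is known analytically. A standalone sketch (using numpy's poly1d directly rather than the lti class, with a simple first-order system chosen for this illustration):

```python
import numpy as np
from numpy import poly1d, pi

def bode_response(num, den, f):
    """Evaluate num(s)/den(s) along s = 2*pi*j*f, as the proposed Bode method does."""
    s = 2.0j * pi * np.asarray(f)
    return poly1d(num)(s) / poly1d(den)(s)

# First-order low-pass H(s) = 1 / (s + 1): magnitude and phase known in closed form.
f = np.array([0.0, 1.0 / (2.0 * pi)])  # DC and the corner frequency
tf = bode_response([1.0], [1.0, 1.0], f)

assert abs(tf[0] - 1.0) < 1e-12                        # unity gain at DC
assert abs(abs(tf[1]) - 1.0 / np.sqrt(2.0)) < 1e-12    # -3 dB at the corner
assert abs(np.angle(tf[1]) + pi / 4.0) < 1e-12         # -45 degrees of phase
```

Wrapped in a TestCase, checks like these pin down both gain and phase against hand-derived values instead of regression snapshots.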
On 9/22/06, Robert Kern wrote: > Ryan Krauss wrote: > > Basically, I am teaching a class with some controls/systems modeling > > and want Python to offer most of the same tools for lti's as Matlab's > > controls toolbox (eventually). > > > > Any thoughts? > > I'm happy for them to go in if they have docstrings and tests. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless enigma > that is made terrible by our own mad attempt to interpret it as though it had > an underlying truth." > -- Umberto Eco > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From peridot.faceted at gmail.com Sat Sep 23 16:07:51 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Sat, 23 Sep 2006 16:07:51 -0400 Subject: [SciPy-user] docstring cross-references [PATCH] Message-ID: Hi, I've quickly gone through and done some cross-referencing of docstrings in scipy. Only the modules interpolate, integrate, and optimize have cross-references so far. I looked at special, but there are so many special functions I'm not sure what to do (for example, should the docstring on Bessel functions list modified Bessel functions? spherical Bessel functions?). That's all the packages I've had occasion to use, so the others would take longer. I'm afraid the cross-referencing is not neccessarily very consistent in formatting, but then neither are the docstrings themselves. A. M. Archibald -------------- next part -------------- A non-text attachment was scrubbed... Name: crossrefs Type: application/octet-stream Size: 36066 bytes Desc: not available URL: From robert.kern at gmail.com Sat Sep 23 16:40:16 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 23 Sep 2006 15:40:16 -0500 Subject: [SciPy-user] docstring cross-references [PATCH] In-Reply-To: References: Message-ID: <45159BB0.6000003@gmail.com> A. M. 
Archibald wrote: > Hi, > > I've quickly gone through and done some cross-referencing of > docstrings in scipy. Thank you! For future patches, please create a ticket in our Trac. Due to spam, you will have to create an account, first. The option to attach a file is, confusingly, exposed after the ticket is created. http://projects.scipy.org/scipy/scipy http://projects.scipy.org/scipy/scipy/register -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From peridot.faceted at gmail.com Sat Sep 23 16:53:36 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Sat, 23 Sep 2006 16:53:36 -0400 Subject: [SciPy-user] docstring cross-references [PATCH] In-Reply-To: <45159BB0.6000003@gmail.com> References: <45159BB0.6000003@gmail.com> Message-ID: On 23/09/06, Robert Kern wrote: > A. M. Archibald wrote: > > Hi, > > > > I've quickly gone through and done some cross-referencing of > > docstrings in scipy. > > Thank you! For future patches, please create a ticket in our Trac. Due to spam, > you will have to create an account, first. The option to attach a file is, > confusingly, exposed after the ticket is created. Ah, thanks. I wasn't sure anybody read the trac. A. M. Archibald From robert.kern at gmail.com Sat Sep 23 16:56:35 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 23 Sep 2006 15:56:35 -0500 Subject: [SciPy-user] docstring cross-references [PATCH] In-Reply-To: References: <45159BB0.6000003@gmail.com> Message-ID: <45159F83.4090203@gmail.com> A. M. Archibald wrote: > On 23/09/06, Robert Kern wrote: >> A. M. Archibald wrote: >>> Hi, >>> >>> I've quickly gone through and done some cross-referencing of >>> docstrings in scipy. >> Thank you! For future patches, please create a ticket in our Trac. Due to spam, >> you will have to create an account, first. 
The option to attach a file is, >> confusingly, exposed after the ticket is created. > > Ah, thanks. I wasn't sure anybody read the trac. We do. We don't necessarily have time to act on the tickets, but posting patches to the list doesn't change that. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From robert.kern at gmail.com Sat Sep 23 17:54:31 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 23 Sep 2006 16:54:31 -0500 Subject: [SciPy-user] docstring cross-references [PATCH] In-Reply-To: References: Message-ID: <4515AD17.6040603@gmail.com> A. M. Archibald wrote: > Hi, > > I've quickly gone through and done some cross-referencing of > docstrings in scipy. Only the modules interpolate, integrate, and > optimize have cross-references so far. I looked at special, but there > are so many special functions I'm not sure what to do (for example, > should the docstring on Bessel functions list modified Bessel > functions? spherical Bessel functions?). I'd suggest just leaving most of it alone. The categorized info.py is probably more useful. > That's all the packages I've > had occasion to use, so the others would take longer. I've checked in most of your patch. I have local modifications to interpolate.py that I will check in later today with your cross-reference information. > I'm afraid the cross-referencing is not neccessarily very consistent > in formatting, but then neither are the docstrings themselves. Heh. You ain't kidding. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From robert.kern at gmail.com Sat Sep 23 20:18:03 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 23 Sep 2006 19:18:03 -0500 Subject: [SciPy-user] additions to signal.ltisys.lti In-Reply-To: References: <45149800.3020502@gmail.com> Message-ID: <4515CEBB.5000304@gmail.com> Ryan Krauss wrote: > How would I write good tests? Whole books have been written on this subject, so I will only try to distill some key points. A good, science- and Python-oriented resource about this is given by Greg Wilson's excellent Software Carpentry lectures available here: http://swc.scipy.org/ http://swc.scipy.org/lec/unit.html The first kind of tests you need to write are "Did I write my algorithm correctly?" tests (or as I like to call them, "bozo tests"). Test for everything that could possibly go wrong. For this wave, you can assume that setting an attribute works, for example. Check that the obvious boundary cases work. Check that simple cases that you can do in your head work. Check the general case output for consistency if it is untenable to give the precise value in the test case (e.g. if you were testing a sorting function, generate a large shuffled list, sort it, and check that all of the values are increasing). Check that input which is supposed to fail, fails in the way that you expect. If there are analytical solutions to simple inputs that the numerical solution should be able to match within a given accuracy, check that they do. One thing that you should never do at this point is run the code and copy its output. You are trying to verify that the code you wrote is correct. If you can't specify what the output for these tests should be without running the code, then you don't understand the algorithm. At this point, you should stop and take a look at your tests. Make sure they cover everything that needs to be covered. At the very bare minimum, every line of code under test should be exercised. 
There are several tools for helping you do this; coverage.py is fairly canonical: http://www.nedbatchelder.com/code/modules/coverage.html but I like the HTML reports and command-line options of trace2html better: http://cheeseshop.python.org/pypi/trace2html The algorithms for determining what lines are "executable" and which lines actually got executed are not perfect, so you might want to try both. If you find that there is some paragraph of code that should be tested separately, but it's in the middle of a method, then now would be a good time to refactor it out into a separate method. When writing new code, you should be asking yourself, "How can I test this?" and writing your code such that it is easy to test. This is called Designing for Testability. If you find that you rely on other components, you might want to stub them out with mock objects that return hard-coded values. For example, you might want to replace poly1d() in order to test just the bode() function. Mock objects can also be written to record operations that are performed on them in order to test that your method actually did certain things in the middle of the method. Tests written with this kind of focus in mind are called "unit tests" because each test case just tests the specific units involved. Tests which are written in order to test the interaction of several units are called "integration tests". Sadly, the term "unit test" often gets thrown around pretty freely to mean any kind of test. The Python unittest module is really a framework for doing all of these tests. The next kind of tests that you should consider are "regression tests." These tests are written to make sure that future changes to the code haven't affected anything important. If your bozo tests were thorough, you probably won't have to add much, if anything, here. Now is the time to fill in things which "couldn't possibly go wrong" but are part of the interface.
*Now* you can run your code and copy the output because you've already verified that the implementation is correct (right?). If you didn't get around to separating your tests into clean unit tests, now's the time to write tests that give you some protection (well, not protection, but perhaps diagnostics) against changes in other units. *Make sure that everything that is guaranteed by the public interface is correct.* *Any semantic changes to your code should cause one or more of your tests to fail.* As for the specifics of writing tests in scipy, there are several utilities in numpy.testing that you should use for making assertions: assert_equal -- assert equality assert_almost_equal -- assert equality with decimal tolerance assert_approx_equal -- assert equality with significant digits tolerance assert_array_compare -- assert arrays obey some comparison relationship assert_array_equal -- assert arrays equality assert_array_almost_equal -- assert arrays equality with decimal tolerance assert_array_less -- assert arrays less-ordering These functions will print out useful information when the assertions fail. # Bad! self.assert_((computed == known).all()) # Good! assert_array_equal(computed, known) You can look at the tests that I just added for interp1d for some examples. Of course, your bode() method is simple enough that your tests won't be nearly as extensive as those. I'd recommend moving the implementations of lsim() and lsim2() into the methods and make the functions simply call the method on the system object (or construct the system object and call the method). Whichever way you go, the one that calls the other only needs to test that it produces the same output as the "real" implementation. The interface guarantee that they are exposing is really "I call the real thing" rather than "I simulate the output of a continuous-time linear system". 
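[Editorial aside: the flavor of the "bozo tests" described above can be shown with only the standard-library unittest module. The function under test here is an invented stand-in, not scipy code; in scipy itself you would use the numpy.testing assertions listed above instead of TestCase's own.]

```python
import random
import unittest

def running_maximum(values):
    # Stand-in function under test: cumulative maximum of a sequence.
    out, best = [], float("-inf")
    for v in values:
        best = max(best, v)
        out.append(best)
    return out

class TestRunningMaximum(unittest.TestCase):
    # Boundary case you can verify in your head.
    def test_empty(self):
        self.assertEqual(running_maximum([]), [])

    # Simple case with a hand-computed answer.
    def test_simple(self):
        self.assertEqual(running_maximum([3, 1, 4, 1, 5]), [3, 3, 4, 4, 5])

    # Consistency check on general input rather than a hard-coded answer:
    # a cumulative maximum must be non-decreasing.
    def test_monotone(self):
        data = [random.random() for _ in range(1000)]
        result = running_maximum(data)
        self.assertTrue(all(a <= b for a, b in zip(result, result[1:])))

# Run the suite without exiting the interpreter.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestRunningMaximum)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Note that none of the expected values were produced by running the code first; each was worked out independently, which is the point of this first wave of tests.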
-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From keelert at gmail.com Sun Sep 24 01:03:10 2006 From: keelert at gmail.com (Todd Keeler) Date: Sat, 23 Sep 2006 22:03:10 -0700 Subject: [SciPy-user] installing numpy Message-ID: <2a5330b70609232203q52db101cp9fffc4046d84596d@mail.gmail.com> Hi, I'm attempting to install numpy/scipy on my computer. I've installed lapack / atlas and have the following libs on my system: atlas-lapack /usr/lib/libatlas.a atlas-lapack /usr/lib/libcblas.a atlas-lapack /usr/lib/libf77blas.a atlas-lapack /usr/lib/liblapack.a The site.cfg reads: [atlas] library_dirs = /usr/lib/ atlas_libs = lapack, blas I'm having serious problems getting the numpy install to find the atlas and blas libraries: Any tips would be greatly appreciated. Thanks Todd heres the output of $ python setup.py config_fc --fcompiler=gnu95 build Running from numpy source directory. F2PY Version 2_3198 blas_opt_info: blas_mkl_info: libraries mkl,vml,guide not found in /usr/local/lib libraries mkl,vml,guide not found in /usr/lib NOT AVAILABLE atlas_blas_threads_info: Setting PTATLAS=ATLAS libraries lapack,blas not found in /usr/local/lib libraries lapack,blas not found in /usr/lib NOT AVAILABLE atlas_blas_info: libraries lapack,blas not found in /usr/local/lib libraries lapack,blas not found in /usr/lib NOT AVAILABLE /home/todd/pkg/python-numpy/src/numpy-1.0rc1/numpy/distutils/system_info.py:1296: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. 
warnings.warn(AtlasNotFoundError.__doc__) blas_info: libraries blas not found in /usr/local/lib libraries blas not found in /usr/lib NOT AVAILABLE /home/todd/pkg/python-numpy/src/numpy-1.0rc1/numpy/distutils/system_info.py:1305: UserWarning: Blas (http://www.netlib.org/blas/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [blas]) or by setting the BLAS environment variable. warnings.warn(BlasNotFoundError.__doc__) blas_src_info: NOT AVAILABLE /home/todd/pkg/python-numpy/src/numpy-1.0rc1/numpy/distutils/system_info.py:1308: UserWarning: Blas (http://www.netlib.org/blas/) sources not found. Directories to search for the sources can be specified in the numpy/distutils/site.cfg file (section [blas_src]) or by setting the BLAS_SRC environment variable./home/todd/pkg/python-numpy/src/numpy- 1.0rc1/numpy/distutils/system_info.py:1205: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. warnings.warn(AtlasNotFoundError.__doc__) lapack_info: libraries lapack not found in /usr/local/lib FOUND: libraries = ['lapack'] library_dirs = ['/usr/lib'] language = f77 /home/todd/pkg/python-numpy/src/numpy-1.0rc1/numpy/distutils/system_info.py:1229: UserWarning: Blas (http://www.netlib.org/blas/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [blas]) or by setting the BLAS environment variable. warnings.warn(BlasNotFoundError.__doc__) /home/todd/pkg/python-numpy/src/numpy-1.0rc1/numpy/distutils/system_info.py:1232: UserWarning: Blas (http://www.netlib.org/blas/) sources not found. Directories to search for the sources can be specified in the numpy/distutils/site.cfg file (section [blas_src]) or by setting the BLAS_SRC environment variable. 
warnings.warn(BlasSrcNotFoundError.__doc__) NOT AVAILABLE warnings.warn(BlasSrcNotFoundError.__doc__) NOT AVAILABLE lapack_opt_info: lapack_mkl_info: mkl_info: libraries mkl,vml,guide not found in /usr/local/lib libraries mkl,vml,guide not found in /usr/lib NOT AVAILABLE NOT AVAILABLE atlas_threads_info: Setting PTATLAS=ATLAS libraries lapack,blas not found in /usr/local/lib libraries lapack_atlas not found in /usr/local/lib libraries lapack,blas not found in /usr/lib libraries lapack_atlas not found in /usr/lib numpy.distutils.system_info.atlas_threads_info NOT AVAILABLE atlas_info: libraries lapack,blas not found in /usr/local/lib libraries lapack_atlas not found in /usr/local/lib libraries lapack,blas not found in /usr/lib libraries lapack_atlas not found in /usr/lib numpy.distutils.system_info.atlas_info NOT AVAILABLE -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Sun Sep 24 01:11:10 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 24 Sep 2006 00:11:10 -0500 Subject: [SciPy-user] installing numpy In-Reply-To: <2a5330b70609232203q52db101cp9fffc4046d84596d@mail.gmail.com> References: <2a5330b70609232203q52db101cp9fffc4046d84596d@mail.gmail.com> Message-ID: <4516136E.1080302@gmail.com> Todd Keeler wrote: > Hi, > > I'm attempting to install numpy/scipy on my computer. I've installed > lapack / atlas and have the following libs on my system: > > atlas-lapack /usr/lib/libatlas.a > atlas-lapack /usr/lib/libcblas.a > atlas-lapack /usr/lib/libf77blas.a > atlas-lapack /usr/lib/liblapack.a > > The site.cfg reads: > > [atlas] > library_dirs = /usr/lib/ > atlas_libs = lapack, blas This is a standard configuration, so you do not need site.cfg; the libraries should be found just fine. However, I believe your problem is that you do not actually list all of the libraries, namely: lapack, cblas, f77blas, atlas. 
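[Editorial aside: spelled out, the site.cfg section Robert describes would look something like the fragment below for Todd's layout. This is a sketch only; exact section handling depends on the numpy.distutils version in use.]

```ini
[atlas]
library_dirs = /usr/lib
atlas_libs = lapack, f77blas, cblas, atlas
```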
-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From gael.varoquaux at normalesup.org Sun Sep 24 09:59:50 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 24 Sep 2006 15:59:50 +0200 Subject: [SciPy-user] additions to signal.ltisys.lti In-Reply-To: <4515CEBB.5000304@gmail.com> References: <45149800.3020502@gmail.com> <4515CEBB.5000304@gmail.com> Message-ID: <20060924135950.GA29878@clipper.ens.fr> Thank you Robert for such a lecture. Ga?l From wbaxter at gmail.com Sun Sep 24 10:03:44 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Sun, 24 Sep 2006 23:03:44 +0900 Subject: [SciPy-user] Summary of SciPy 2006 & Guido's keynote Message-ID: I found this decent blog summary of SciPy2006 that I thought other folks might be interested in: http://pyre.third-bit.com/blog/archives/612.html I think it's Greg Wilson's blog. Greg, if you're out there, hope you don't mind me posting this. --bill From sebastian_busch at gmx.net Mon Sep 25 04:39:51 2006 From: sebastian_busch at gmx.net (Sebastian Busch) Date: Mon, 25 Sep 2006 10:39:51 +0200 Subject: [SciPy-user] global fitting of data sets with common parameters In-Reply-To: <20060922142433.GD24708@clipper.ens.fr> References: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> <20060922142433.GD24708@clipper.ens.fr> Message-ID: Gael Varoquaux wrote: > On Wed, Aug 09, 2006 at 10:15:04AM +0200, morovia morovia wrote: >> ... fit ... globally with common parameters. ... > I just uploaded a example to do this in the cookbook, on scipy.org . > ... Hey Ga?l! That's absolutely great! I was about to write such a thing myself. I guess you simply add up the chi^2 of all data sets and get this minimized? Or does one have to use the reduced chi^2? Could you just please specify a bit more in detail /where/ you uploaded your example in the cookbook? 
Sorry for being that blind, Sebastian. From gael.varoquaux at normalesup.org Mon Sep 25 08:35:32 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 25 Sep 2006 14:35:32 +0200 Subject: [SciPy-user] global fitting of data sets with common parameters In-Reply-To: References: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> <20060922142433.GD24708@clipper.ens.fr> Message-ID: <20060925123517.GB1141@clipper.ens.fr> On Mon, Sep 25, 2006 at 10:39:51AM +0200, Sebastian Busch wrote: > That's absolutely great! I was about to write such a thing myself. I > guess you simply add up the chi^2 of all data sets and get this > minimized? Or does one have to use the reduced chi^2? > Could you just please specify a bit more in detail /where/ you uploaded > your example in the cookbook? http://scipy.org/Cookbook/FittingData , linked on the main cookbook page, near the top (I thought this was a really useful example). If you think you can add chi^2 without rendering the figure unreadable, please do. Ga?l From sransom at nrao.edu Mon Sep 25 15:41:14 2006 From: sransom at nrao.edu (Scott Ransom) Date: Mon, 25 Sep 2006 15:41:14 -0400 Subject: [SciPy-user] new SVN build issue? 
Message-ID: <200609251541.14905.sransom@nrao.edu> Hi All, So I just svn updated to revision 2233 and successfully installed on a 64-bit Debian installation using: rm -rf build ; python setup.py install --prefix=/usr/local However, on a 32-bit Debian machine, I'm getting the following error after doing: rm -rf build ; python setup.py install --prefix=/users/sransom ---------------------------------------- Warning: Inheriting attribute 'description'='image array convolution functions' from 'scipy.stsci.convolve' Warning: Inheriting attribute 'author'='Todd Miller' from 'scipy.stsci.convolve' Warning: Inheriting attribute 'author_email'='help at stsci.edu' from 'scipy.stsci.convolve' Warning: Inheriting attribute 'description'='image array convolution functions' from 'scipy.stsci' Warning: Inheriting attribute 'author'='Todd Miller' from 'scipy.stsci' Warning: Inheriting attribute 'author_email'='help at stsci.edu' from 'scipy.stsci' Traceback (most recent call last): File "setup.py", line 55, in ? 
setup_package() File "setup.py", line 47, in setup_package configuration=configuration ) File "/users/sransom/lib/python2.3/site-packages/numpy/distutils/core.py", line 144, in setup config = configuration() File "setup.py", line 19, in configuration config.add_subpackage('Lib') File "/users/sransom/lib/python2.3/site-packages/numpy/distutils/misc_util.py", line 753, in add_subpackage caller_level = 2) File "/users/sransom/lib/python2.3/site-packages/numpy/distutils/misc_util.py", line 736, in get_subpackage caller_level = caller_level + 1) File "/users/sransom/lib/python2.3/site-packages/numpy/distutils/misc_util.py", line 683, in _get_configuration_from_setup_py config = setup_module.configuration(*args) File "./Lib/setup.py", line 25, in configuration config.make_svn_version_py() # installs __svn_version__.py File "/users/sransom/lib/python2.3/site-packages/numpy/distutils/misc_util.py", line 1298, in make_svn_version_py self.add_data_files(('', generate_svn_version_py())) File "/users/sransom/lib/python2.3/site-packages/numpy/distutils/misc_util.py", line 1281, in generate_svn_version_py assert revision is not None,'hmm, why I am not inside SVN tree???' AssertionError: hmm, why I am not inside SVN tree??? ---------------------------------------- Am I doing something stupid? Thanks, Scott -- Scott M. Ransom Address: NRAO Phone: (434) 296-0320 520 Edgemont Rd. 
email: sransom at nrao.edu Charlottesville, VA 22903 USA GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From sebastian_busch at gmx.net Mon Sep 25 16:16:19 2006 From: sebastian_busch at gmx.net (Sebastian Busch) Date: Mon, 25 Sep 2006 22:16:19 +0200 Subject: [SciPy-user] global fitting of data sets with common parameters In-Reply-To: <20060925123517.GB1141@clipper.ens.fr> References: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> <20060922142433.GD24708@clipper.ens.fr> <20060925123517.GB1141@clipper.ens.fr> Message-ID: Gael Varoquaux wrote: > On Mon, Sep 25, 2006 at 10:39:51AM +0200, Sebastian Busch wrote: >> ... please specify ... /where/ you uploaded your example ... > > http://scipy.org/Cookbook/FittingData ... Hi Ga?l! Thanks for your reply. I think this is not quite what the OP wanted. In any case, it is not what I am looking for ;) You fit two data sets with two different functions. Completely independent. You could do the same in two runs of your program working on the different data sets. What makes your example similar to what I want is the frequency: it is nearly the same. Now let's assume these are two motions of which you know that the frequencies should be the same. Any differences are due to measurement errors. So you sit down and want to fit the data. Not in two runs, then averaging the two frequencies you get, but in one big fit. This is better because one measurement was more precise than the other and it's hard to account for that after the fit finished. I tried to modify your example in that way. As I am pretty much a bloody beginner in python, I failed. I post my try anyway to show you more clearly what I am thinking of. And, I admit, in the hope you can do better ;) Thanks for your time, Sebastian. +++++++++++++++++++ from pylab import * from scipy import * # Generate data points with noise num_points = 150 # as i got confused with xT, Xt, ... it's now data set 1 and 2, each set having x/y coordinates. 
x1 = linspace(5., 8., num_points)
x2 = x1
xexp = array([x1, x2]) # i put them together. not sure whether this is the right way...
y1 = 11.86*cos(2*pi/0.81*x1-1.32) + 6.*rand(num_points)
y2 = -32.14*cos(2*pi/0.8*x2-1.94) + 3.*rand(num_points)
yexp = [y1, y2] # same thing

# Fit setup
def ytheo(p, x):
    # yth = zeros((x1,x1)) how to initialize?
    for j in range(len(xexp)): # count through data sets
        for k in range(len(x1)): # count through the data points within a (the first) set
            yth[j,k] = p[(2*j+1)]*cos(2*pi/p[0]*x1[k]+p[1*(2*j+2)])
            # Target functions. note the same p[0] for all sets. it is a matrix:
            # first index denoting the set, second index denoting a point in a set.
    return yth

errfunc = lambda p, x, y: ytheo(p,x) - yexp # Distance to the target function

# fit
p0 = c_[0.8, -15., 0., -15., 0.] # Initial guess for the parameters
# 0: common "frequency"; 1: 1st set ampl; 2: 1st set phase; 3: 2nd set ampl; 4: 2nd set phase
p1,success = optimize.leastsq(errfunc, p0.copy(), args = (xexp, yexp))

time = linspace(x1.min(),x1.max(),100)
plot(x1,y1,"ro",time,fitfunc(p1,time),"r-") # Plot of the data and the fit
plot(x2,y2,"b^",time,fitfunc(p2,time),"b-")

# Legend the plot
title("Oscillations in the compressed trap")
xlabel("time [ms]")
ylabel("displacement [um]")
legend(('1 position', '1 fit', '2 position', '2 fit'))
ax = axes()
text(0.8, 0.07,'x freq : %.3f kHz \n y freq : %.3f kHz' %(1/p1[1],1/p2[1]),
     fontsize = 16, horizontalalignment='center', verticalalignment='center',
     transform = ax.transAxes)
show()

From oliphant at ee.byu.edu Mon Sep 25 16:25:28 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 25 Sep 2006 14:25:28 -0600 Subject: [SciPy-user] new SVN build issue?
In-Reply-To: <200609251541.14905.sransom@nrao.edu> References: <200609251541.14905.sransom@nrao.edu> Message-ID: <45183B38.6030905@ee.byu.edu> Scott Ransom wrote: >Hi All, > >So I just svn updated to revision 2233 and successfully installed on a >64-bit Debian installation using: > >rm -rf build ; python setup.py install --prefix=/usr/local > >However, on a 32-bit Debian machine, I'm getting the following error after >doing: > >rm -rf build ; python setup.py install --prefix=/users/sransom > > What version of SVN are you running on both systems. What version of NumPy? There was a change to SVN that required some fiddling with how NumPy generates the svn revision number. It looks like that is where your problem is. -Travis From sransom at nrao.edu Mon Sep 25 17:25:51 2006 From: sransom at nrao.edu (Scott Ransom) Date: Mon, 25 Sep 2006 17:25:51 -0400 Subject: [SciPy-user] new SVN build issue? In-Reply-To: <45183B38.6030905@ee.byu.edu> References: <200609251541.14905.sransom@nrao.edu> <45183B38.6030905@ee.byu.edu> Message-ID: <200609251725.51931.sransom@nrao.edu> On Monday 25 September 2006 16:25, Travis Oliphant wrote: > Scott Ransom wrote: > >Hi All, > > > >So I just svn updated to revision 2233 and successfully installed on a > >64-bit Debian installation using: > > > >rm -rf build ; python setup.py install --prefix=/usr/local > > > >However, on a 32-bit Debian machine, I'm getting the following error > > after doing: > > > >rm -rf build ; python setup.py install --prefix=/users/sransom > > What version of SVN are you running on both systems. What version of > NumPy? 32-bit machine with the problem: In [1]: numpy.__version__ Out[1]: '1.0rc1.dev3140' tamino:~$ svn --version svn, version 1.4.0 (r21228) compiled Sep 12 2006, 03:40:52 64-bit machine without the problem: In [1]: numpy.__version__ Out[1]: '1.0.dev3216' uller:/usr/local/src/scipy$ svn --version svn, version 1.4.0 (r21228) compiled Sep 13 2006, 09:04:00 Hmmm. 
I did numpy svn --updates within only a few minutes of one another. I'm kind of surprised they are different. Actually, how is it that those are not the same versions of numpy? Here is what I get with svn info: 32-bit machine: tamino:/export/DATA_1/sransom/src/numpy$ svn info Path: . URL: http://svn.scipy.org/svn/numpy/trunk Repository Root: http://svn.scipy.org/svn/numpy Repository UUID: 94b884b6-d6fd-0310-90d3-974f1d3f35e1 Revision: 3216 Node Kind: directory Schedule: normal Last Changed Author: oliphant Last Changed Rev: 3216 Last Changed Date: 2006-09-25 13:51:52 -0400 (Mon, 25 Sep 2006) 64-bit machine: uller:/usr/local/src/numpy$ svn info Path: . URL: http://svn.scipy.org/svn/numpy/trunk Repository Root: http://svn.scipy.org/svn/numpy Repository UUID: 94b884b6-d6fd-0310-90d3-974f1d3f35e1 Revision: 3216 Node Kind: directory Schedule: normal Last Changed Author: oliphant Last Changed Rev: 3216 Last Changed Date: 2006-09-25 13:51:52 -0400 (Mon, 25 Sep 2006) Now I'm very confused... I'll poke around to see if some old version of numpy is getting pulled in somehow... Scott -- Scott M. Ransom Address: NRAO Phone: (434) 296-0320 520 Edgemont Rd. email: sransom at nrao.edu Charlottesville, VA 22903 USA GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From oliphant at ee.byu.edu Mon Sep 25 17:33:55 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 25 Sep 2006 15:33:55 -0600 Subject: [SciPy-user] new SVN build issue? 
In-Reply-To: <200609251725.51931.sransom@nrao.edu> References: <200609251541.14905.sransom@nrao.edu> <45183B38.6030905@ee.byu.edu> <200609251725.51931.sransom@nrao.edu> Message-ID: <45184B43.4020607@ee.byu.edu> Scott Ransom wrote: >On Monday 25 September 2006 16:25, Travis Oliphant wrote: > > >>Scott Ransom wrote: >> >> >>>Hi All, >>> >>>So I just svn updated to revision 2233 and successfully installed on a >>>64-bit Debian installation using: >>> >>>rm -rf build ; python setup.py install --prefix=/usr/local >>> >>>However, on a 32-bit Debian machine, I'm getting the following error >>>after doing: >>> >>>rm -rf build ; python setup.py install --prefix=/users/sransom >>> >>> >>What version of SVN are you running on both systems. What version of >>NumPy? >> >> > >32-bit machine with the problem: > In [1]: numpy.__version__ > Out[1]: '1.0rc1.dev3140' > tamino:~$ svn --version > svn, version 1.4.0 (r21228) > compiled Sep 12 2006, 03:40:52 > > > That makes sense because the svn 1.4 fixes did not get checked in until r3164 and you are running numpy version r3140 on the 32-bit machine. >Here is what I get with svn info: > >32-bit machine: >tamino:/export/DATA_1/sransom/src/numpy$ svn info >Path: . >URL: http://svn.scipy.org/svn/numpy/trunk >Repository Root: http://svn.scipy.org/svn/numpy >Repository UUID: 94b884b6-d6fd-0310-90d3-974f1d3f35e1 >Revision: 3216 >Node Kind: directory >Schedule: normal >Last Changed Author: oliphant >Last Changed Rev: 3216 >Last Changed Date: 2006-09-25 13:51:52 -0400 (Mon, 25 Sep 2006) > >64-bit machine: >uller:/usr/local/src/numpy$ svn info >Path: . >URL: http://svn.scipy.org/svn/numpy/trunk >Repository Root: http://svn.scipy.org/svn/numpy >Repository UUID: 94b884b6-d6fd-0310-90d3-974f1d3f35e1 >Revision: 3216 >Node Kind: directory >Schedule: normal >Last Changed Author: oliphant >Last Changed Rev: 3216 >Last Changed Date: 2006-09-25 13:51:52 -0400 (Mon, 25 Sep 2006) > >Now I'm very confused... 
I'll poke around to see if some old version of >numpy is getting pulled in somehow... > > Hmm. That is confusing. Also, you should remove the build directory and install again from the same directory you ran svn info. Good luck. -Travis From sransom at nrao.edu Mon Sep 25 18:07:46 2006 From: sransom at nrao.edu (Scott Ransom) Date: Mon, 25 Sep 2006 18:07:46 -0400 Subject: [SciPy-user] new SVN build issue? In-Reply-To: <45184B43.4020607@ee.byu.edu> References: <200609251541.14905.sransom@nrao.edu> <45183B38.6030905@ee.byu.edu> <200609251725.51931.sransom@nrao.edu> <45184B43.4020607@ee.byu.edu> Message-ID: <20060925220746.GA1492@ssh.cv.nrao.edu> > >Now I'm very confused... I'll poke around to see if some old version of > >numpy is getting pulled in somehow... > > > > > > Hmm. That is confusing. Also, you should remove the build > directory and install again from the same directory you ran svn info. Yup, I always remove the build directory. My problem this time was a bad PYTHONPATH which was pulling in an older numpy for python 2.3, even though the python I'm using is 2.4. Sorry for the stupidity, Scott -- Scott M. Ransom Address: NRAO Phone: (434) 296-0320 520 Edgemont Rd. email: sransom at nrao.edu Charlottesville, VA 22903 USA GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From wjdandreta at att.net Mon Sep 25 18:11:02 2006 From: wjdandreta at att.net (Bill Dandreta) Date: Mon, 25 Sep 2006 18:11:02 -0400 Subject: [SciPy-user] Does scipy have a function to...? Message-ID: <451853F6.3060507@att.net> What I want is a smooth curve that serves as an upper limit for a data plot. I would like it to be close to the data at the local maxima. Does such a function exist? 
Bill From gael.varoquaux at normalesup.org Mon Sep 25 18:16:17 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 26 Sep 2006 00:16:17 +0200 Subject: [SciPy-user] global fitting of data sets with common parameters In-Reply-To: References: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> <20060922142433.GD24708@clipper.ens.fr> <20060925123517.GB1141@clipper.ens.fr> Message-ID: <20060925221617.GC18281@clipper.ens.fr> > What makes your example similar to what I want is the frequency: it is > nearly the same. Now let's assume these are two motions of which you > know that the frequencies should be the same. Any differences are due to > measurement errors. OK, I'll first comment your code (I shaped it to be able to read it better. Your line are to long, especially in comments, and your comments should be more often on separate lines, indented the same way as the code). My comments are prefixed by ### +++++++++++++++++ from pylab import * from scipy import * # Generate data points with noise num_points = 150 # as i got confused with xT, Xt, ... it's now data set 1 and 2, each set # having x/y coordinates. x1 = linspace(5., 8., num_points) x2 = x1 xexp = array([x1, x2]) # i put them together. not sure whether this is the right way... ### You probably want to you use c_[x1,x2] y1 = 11.86*cos(2*pi/0.81*x1-1.32) + 6.*rand(num_points) y2 = -32.14*cos(2*pi/0.8*x2-1.94) + 3.*rand(num_points) yexp = [y1, y2] # same thing ### !!!! I thing you misstyped here you meant yexp = c_[y1,y2] # Fit setup def ytheo (p,x) : # yth = zeros((x1,x1)) how to initialize? for j in range(len(xexp)): # count through data sets for k in range(len(x1)): # count through the data points within a (the first) set yth[j,k] = p[(2*j+1)]*cos(2*pi/p[0]*x1[k]+p[1*(2*j+2)]) # Target functions. note the same p[0] for all sets. it is a matrix: # first index denoting the set, second index denoting a point in a set. return yth ### A for loop !! Your are kidding. 
You should try to avoid this as much ### as possible, and work only on arrays. For loops cost a lot of time. errfunc = lambda p, x, y: ytheo(p,x) - yexp # Distance to the target function # fit p0 = c_[0.8, -15., 0., -15., 0.] # Initial guess for the parameters # 0: common "frequency"; 1: 1st set ampl; 2: 1st set phase; # 3: 2nd set ampl; 4: 2nd set phase p1,success = optimize.leastsq(errfunc, p0.copy(), args = (xexp, yexp)) time = linspace(x1.min(),x1.max(),100) plot(x1,y1,"ro",time,fitfunc(p1,time),"r-") # Plot of the data and the fit plot(x2,y2,"b^",time,fitfunc(p2,time),"b-") # Legend the plot title("Oscillations in the compressed trap") xlabel("time [ms]") ylabel("displacement [um]") legend(('1 position', '1 fit', '2 position', '2 fit')) ax = axes() text(0.8, 0.07,'x freq : %.3f kHz \n y freq : %.3f kHz' %(1/p1[1],1/p2[1]), fontsize = 16, horizontalalignment='center', verticalalignment='center', transform = ax.transAxes) show() +++++++++++++++++ I'll upload an example to show you how I would do what you are trying to do. The wiki is just so much more readable than mail :->. I don't exactly see what you are trying to do, but it does not seem too good to me. Gaël From robert.kern at gmail.com Mon Sep 25 18:32:25 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 25 Sep 2006 17:32:25 -0500 Subject: [SciPy-user] Does scipy have a function to...? In-Reply-To: <451853F6.3060507@att.net> References: <451853F6.3060507@att.net> Message-ID: <451858F9.30809@gmail.com> Bill Dandreta wrote: > What I want is a smooth curve that serves as an upper limit for a data > plot. I would like it to be close to the data at the local maxima. > > Does such a function exist? No, not really. It would be quite tricky to construct such a beast. At the one end would be a complete interpolating curve that goes through every point; however, this will almost certainly not be as "smooth" as you want it. You would need to choose the length scale at which variations in the data are ignored.
E.g. you need to find some means of determining why the curve should skip over A but try to get close to B.

          ________
        /*        *\
       /    A       \
      *              \_______
                            *B

Possibly limiting the order of an interpolating polynomial or spline will be sufficient. You might be able to formulate this as a constrained minimization problem using scipy.optimize.fmin_cobyla(). Take some curve f(x), minimize (f(x) - y) under the constraint (f(x) - y >= 0). -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From clintonwallen at gmail.com Mon Sep 25 18:57:44 2006 From: clintonwallen at gmail.com (Clinton Allen) Date: Mon, 25 Sep 2006 15:57:44 -0700 Subject: [SciPy-user] Does scipy have a function to...? In-Reply-To: <451858F9.30809@gmail.com> References: <451853F6.3060507@att.net> <451858F9.30809@gmail.com> Message-ID: You might look at Chebyshev polynomials. They have a "min-max of the error" property. One place to start is http://mathworld.wolfram.com/ChebyshevApproximationFormula.html Don't know if there's any python code for using these. --Clinton On 9/25/06, Robert Kern wrote: > > Bill Dandreta wrote: > > What I want is a smooth curve that serves as an upper limit for a data > > plot. I would like it to be close to the data at the local maxima. > > > > Does such a function exist? > > No, not really. It would be quite tricky to construct such a beast. At the > one > end would be a complete interpolating curve that goes through every point; > however, this will almost certainly not be as "smooth" as you want it. You > would > need to choose the length scale at which variations in the data are > ignored. > E.g. you need to find some means of determining why the curve should skip > over A > but try to get close to B.
> > ________ > /* *\ > / A \ > * \_______ > *B > > Possibly limiting the order of an interpolating polynomial or spline will > be > sufficient. > > You might be able to formulate this as a constrained minimization problem > using > scipy.optimize.fmin_cobyla(). Take some curve f(x), minimize (f(x) - y) > under > the constraint (f(x) - y >= 0). > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma > that is made terrible by our own mad attempt to interpret it as though > it had > an underlying truth." > -- Umberto Eco > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gael.varoquaux at normalesup.org Mon Sep 25 19:09:03 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 26 Sep 2006 01:09:03 +0200 Subject: [SciPy-user] global fitting of data sets with common parameters In-Reply-To: <20060925221617.GC18281@clipper.ens.fr> References: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> <20060922142433.GD24708@clipper.ens.fr> <20060925123517.GB1141@clipper.ens.fr> <20060925221617.GC18281@clipper.ens.fr> Message-ID: <20060925230903.GD18281@clipper.ens.fr> OK, I completed the cookbook page. This may not be the most efficient way to do things, but I find it quite readable. Gaël On Tue, Sep 26, 2006 at 12:16:17AM +0200, Gael Varoquaux wrote: > > What makes your example similar to what I want is the frequency: it is > > nearly the same. Now let's assume these are two motions of which you > > know that the frequencies should be the same. Any differences are due to > > measurement errors. > OK, I'll first comment your code (I shaped it to be able to read it > better.
Your line are to long, especially in comments, and your comments > should be more often on separate lines, indented the same way as the > code). My comments are prefixed by ### > +++++++++++++++++ > from pylab import * > from scipy import * > # Generate data points with noise > num_points = 150 > # as i got confused with xT, Xt, ... it's now data set 1 and 2, each set > # having x/y coordinates. > x1 = linspace(5., 8., num_points) > x2 = x1 > xexp = array([x1, x2]) > # i put them together. not sure whether this is the right way... > ### You probably want to you use c_[x1,x2] > y1 = 11.86*cos(2*pi/0.81*x1-1.32) + 6.*rand(num_points) > y2 = -32.14*cos(2*pi/0.8*x2-1.94) + 3.*rand(num_points) > yexp = [y1, y2] # same thing > ### !!!! I thing you misstyped here you meant yexp = c_[y1,y2] > # Fit setup > def ytheo (p,x) : > # yth = zeros((x1,x1)) how to initialize? > for j in range(len(xexp)): # count through data sets > for k in range(len(x1)): > # count through the data points within a (the first) set > yth[j,k] = p[(2*j+1)]*cos(2*pi/p[0]*x1[k]+p[1*(2*j+2)]) > # Target functions. note the same p[0] for all sets. it is a matrix: > # first index denoting the set, second index denoting a point in a set. > return yth > ### A for loop !! Your are kidding. You should try to avoid this as much > ### as possible, and work only on arrays. For loops cost a lot of time. > errfunc = lambda p, x, y: ytheo(p,x) - yexp # Distance to the target function > # fit > p0 = c_[0.8, -15., 0., -15., 0.] 
# Initial guess for the parameters > # 0: common "frequency"; 1: 1st set ampl; 2: 1st set phase; > # 3: 2nd set ampl; 4: 2nd set phase > p1,success = optimize.leastsq(errfunc, p0.copy(), args = (xexp, yexp)) > time = linspace(x1.min(),x1.max(),100) > plot(x1,y1,"ro",time,fitfunc(p1,time),"r-") # Plot of the data and the fit > plot(x2,y2,"b^",time,fitfunc(p2,time),"b-") > # Legend the plot > title("Oscillations in the compressed trap") > xlabel("time [ms]") > ylabel("displacement [um]") > legend(('1 position', '1 fit', '2 position', '2 fit')) > ax = axes() > text(0.8, 0.07,'x freq : %.3f kHz \n y freq : %.3f kHz' > %(1/p1[1],1/p2[1]), fontsize = 16, > horizontalalignment='center', verticalalignment='center', > transform = ax.transAxes) > show() > +++++++++++++++++ > I'll upload an example show you how I would what you are trying to do. > The wiki is just so much more readable than mail :->. I don't exactly > see what your are trying to do, but it does not seems to good to me. > Ga?l > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From robert.kern at gmail.com Mon Sep 25 19:10:50 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 25 Sep 2006 18:10:50 -0500 Subject: [SciPy-user] Does scipy have a function to...? In-Reply-To: References: <451853F6.3060507@att.net> <451858F9.30809@gmail.com> Message-ID: <451861FA.8090405@gmail.com> Clinton Allen wrote: > You might look at Chebyshev polynomials. They have a "min-max of the error" > property. One place to start is > http://mathworld.wolfram.com/ChebyshevApproximationFormula.html > Unfortunately, that's two-sided error. It won't do what he wants. > Don't know if there's any python code for using these. 
scipy.special.cheby{t,u,c,s} -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From haase at msg.ucsf.edu Tue Sep 26 00:05:36 2006 From: haase at msg.ucsf.edu (Sebastian Haase) Date: Mon, 25 Sep 2006 21:05:36 -0700 Subject: [SciPy-user] Does scipy have a function to...? In-Reply-To: <451861FA.8090405@gmail.com> References: <451853F6.3060507@att.net> <451858F9.30809@gmail.com> <451861FA.8090405@gmail.com> Message-ID: <4518A710.8080200@msg.ucsf.edu> Hi, we have used scipy cobyla minimization successfully to minimize square error of a parameterized function to data points while obeying the constraint that "function - datapoint" is strictly non-negative. "function" could be anything that can be given by a python function (you might need scipy.vectorize) !! It worked great. Just used from scipy.optimize import cobyla (this is from memory - but should be close) and follow the example in the source code. Sebastian Haase Robert Kern wrote: > Clinton Allen wrote: >> You might look at Chebyshev polynomials. They have a "min-max of the error" >> property. One place to start is >> http://mathworld.wolfram.com/ChebyshevApproximationFormula.html >> > > Unfortunately, that's two-sided error. It won't do what he wants. > >> Don't know if there's any python code for using these.
> > scipy.special.cheby{t,u,c,s} > From sebastian_busch at gmx.net Tue Sep 26 04:16:29 2006 From: sebastian_busch at gmx.net (Sebastian Busch) Date: Tue, 26 Sep 2006 10:16:29 +0200 Subject: [SciPy-user] global fitting of data sets with common parameters In-Reply-To: <20060925230903.GD18281@clipper.ens.fr> References: <72da94d60608090115j371a6c46odfd1a8c384adba2c@mail.gmail.com> <20060922142433.GD24708@clipper.ens.fr> <20060925123517.GB1141@clipper.ens.fr> <20060925221617.GC18281@clipper.ens.fr> <20060925230903.GD18281@clipper.ens.fr> Message-ID: Gael Varoquaux wrote: > Gael Varoquaux wrote: >> I'll upload an example show you how I would what you are trying to do. >> The wiki is just so much more readable than mail :->. I don't exactly >> see what your are trying to do, but it does not seems to good to me. hey there! Thanks a lot! This is great! The only thing you might want to change (I am not allowed to) is the leading space in line 18. I am not sure what does not seem good to you -- the for-loops etc (I totally agree) or the common fitting (in this case, I would be happy if you would like to discuss this in more detail). Thanks again, Sebastian. From ewald.zietsman at gmail.com Tue Sep 26 08:27:42 2006 From: ewald.zietsman at gmail.com (Ewald Zietsman) Date: Tue, 26 Sep 2006 14:27:42 +0200 Subject: [SciPy-user] Interpolation using splines Message-ID: Hi all, I'm trying to fit a cubic spline to a data set. I can't seem to get the functions in scipy.interpolate (splrep and splev) to work correctly ( I'm probably being stupid). When I do the following, #! /usr/bin/python from scipy import interpolate from pylab import * x = arange(0.0,2.0*pi+pi/4.0,2.0*pi/8.0) y = sin(x) tck = interpolate.splrep(x,y,s=0.0) xnew = arange(0.0,2.0*pi+pi/4.0,pi/50.0) ynew = interpolate.splev(xnew,tck,der=0.0) plot(x,y,'x') plot(xnew,ynew,'r-') show() I should get a plot showing a sin curve made of blue x's and a smooth red line which is the spline. 
However, the spline seems to be zero except for one place where it spikes. I'm using scipy 0.5.0 and matplotlib 0.87.3 with python 2.4.3 on Fedora Core 5. Any help on this will be appreciated. -Ewald -------------- next part -------------- An HTML attachment was scrubbed... URL: From rmay at ou.edu Tue Sep 26 12:19:37 2006 From: rmay at ou.edu (Ryan May) Date: Tue, 26 Sep 2006 11:19:37 -0500 Subject: [SciPy-user] Bilinear Interpolation Message-ID: <45195319.5010405@ou.edu> Hi, Is there a function for doing a simple bilinear interpolation in scipy? I tried the interpolate.interp2d routines, but it appears that it uses B-splines even when kind='linear' is specified. (I wouldn't otherwise care except that my class assignment explicitly says to use bilinear.) I know I can do this using a series of interp1d calls, but I wanted to see if there was a simpler way I was missing. Thanks Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma (405)325-3791 rmay at rossby.ou.edu From peridot.faceted at gmail.com Tue Sep 26 12:31:44 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Tue, 26 Sep 2006 12:31:44 -0400 Subject: [SciPy-user] Interpolation using splines In-Reply-To: References: Message-ID: On 26/09/06, Ewald Zietsman wrote: > Hi all, > > I'm trying to fit a cubic spline to a data set. I can't seem to get the > functions in scipy.interpolate (splrep and splev) to work correctly ( I'm > probably being stupid). Your code, cut-n-pasted, works fine for me, using scipy-0.5.2 and numpy-1.0b5. Are you on a 64-bit platform? There was a bug (although these were not its symptoms). A. M. Archibald From mforbes at phys.washington.edu Tue Sep 26 12:16:16 2006 From: mforbes at phys.washington.edu (Michael McNeil Forbes) Date: Tue, 26 Sep 2006 09:16:16 -0700 Subject: [SciPy-user] Interpolation using splines References: Message-ID: Works for me using scipy 0.5.1.dev2185 on OS X. Michael. In article , "Ewald Zietsman" wrote: ... > #!
/usr/bin/python > from scipy import interpolate > from pylab import * > x = arange(0.0,2.0*pi+pi/4.0,2.0*pi/8.0) > y = sin(x) > tck = interpolate.splrep(x,y,s=0.0) > xnew = arange(0.0,2.0*pi+pi/4.0,pi/50.0) > ynew = interpolate.splev(xnew,tck,der=0.0) > > plot(x,y,'x') > plot(xnew,ynew,'r-') > show() > > > I should get a plot showing a sin curve made of blue x's and a smooth red > line which is the spline... From stefan at sun.ac.za Tue Sep 26 10:21:05 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 26 Sep 2006 16:21:05 +0200 Subject: [SciPy-user] Interpolation using splines In-Reply-To: References: Message-ID: <20060926142105.GB4835@mentat.za.net> On Tue, Sep 26, 2006 at 02:27:42PM +0200, Ewald Zietsman wrote: > Hi all, > > I'm trying to fit a cubic spline to a data set. I can't seem to get the > functions in scipy.interpolate (splrep and splev) to work correctly ( I'm > probably being stupid). > > When I do the following, > > #! /usr/bin/python > from scipy import interpolate > from pylab import * > x = arange(0.0,2.0*pi+pi/4.0,2.0*pi/8.0) > y = sin(x) > tck = interpolate.splrep(x,y,s=0.0) > xnew = arange(0.0,2.0*pi+pi/4.0,pi/50.0) > ynew = interpolate.splev(xnew,tck,der=0.0) > > plot(x,y,'x') > plot(xnew,ynew,'r-') > show() > > > I should get a plot showing a sin curve made of blue x's and a smooth red line > which is the spline. However, the spline seems to be zero except for one place > where it spikes. > I'm using scipy 0.5.0 and matplotlib 0.87.3 with python 2.4.3 on Fedora Core 5. > Any help on this will be appreciated. 
Your code gives the correct result on my machine: numpy 1.0.dev3204 scipy 0.5.2.dev2205 Regards Stéfan From stefan at sun.ac.za Tue Sep 26 13:23:52 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 26 Sep 2006 19:23:52 +0200 Subject: [SciPy-user] Bilinear Interpolation In-Reply-To: <45195319.5010405@ou.edu> References: <45195319.5010405@ou.edu> Message-ID: <20060926172352.GA20771@mentat.za.net> On Tue, Sep 26, 2006 at 11:19:37AM -0500, Ryan May wrote: > Is there function for doing a simple bilinear interpolation in scipy? I > tried the interpolate.interp2d routines, but it appears that it uses > B-splines even when kind='linear' is specified. (I wouldn't otherwise > care except that my class assignment explicitly says to use > bilinear.) I'm speaking under correction, but if you choose 'linear', b-splines of degree 1 (i.e. straight lines) are used and you are doing linear interpolation. Regards Stéfan From rmay at ou.edu Tue Sep 26 14:05:37 2006 From: rmay at ou.edu (Ryan May) Date: Tue, 26 Sep 2006 13:05:37 -0500 Subject: [SciPy-user] Bilinear Interpolation In-Reply-To: <20060926172352.GA20771@mentat.za.net> References: <45195319.5010405@ou.edu> <20060926172352.GA20771@mentat.za.net> Message-ID: <45196BF1.9050601@ou.edu> Stefan van der Walt wrote: > On Tue, Sep 26, 2006 at 11:19:37AM -0500, Ryan May wrote: >> Is there function for doing a simple bilinear interpolation in scipy? I >> tried the interpolate.interp2d routines, but it appears that it uses >> B-splines even when kind='linear' is specified. (I wouldn't otherwise >> care except that my class assignment explicitly says to use >> bilinear.) > > I'm speaking under correction, but if you choose 'linear', b-splines > of degree 1 (i.e. straight lines) are used and you are doing linear > interpolation. That's what I kinda thought, but using 'linear' for interp2d didn't give me the same answer as when I performed the calculation manually. I'll have to see what I can find on this...
Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma rmay at rossby.ou.edu From ewald.zietsman at gmail.com Wed Sep 27 06:38:32 2006 From: ewald.zietsman at gmail.com (Ewald Zietsman) Date: Wed, 27 Sep 2006 12:38:32 +0200 Subject: [SciPy-user] Interpolation using splines -> possible solution Message-ID: Hi, Thank you for trying this out on your computers. The code I posted works on my computer at home but not on my office computer. I noticed that at home matplotlib uses numpy ( Numerix = numpy ) and at the office it is numarray I think. I recompiled matplotlib with numpy as numerix and the problem disappeared. Hope this can help someone. Cheers -Ewald -------------- next part -------------- An HTML attachment was scrubbed... URL: From rmay at ou.edu Wed Sep 27 09:09:18 2006 From: rmay at ou.edu (Ryan May) Date: Wed, 27 Sep 2006 08:09:18 -0500 Subject: [SciPy-user] Bilinear Interpolation In-Reply-To: <45196BF1.9050601@ou.edu> References: <45195319.5010405@ou.edu> <20060926172352.GA20771@mentat.za.net> <45196BF1.9050601@ou.edu> Message-ID: <451A77FE.5050207@ou.edu> Ryan May wrote: > Stefan van der Walt wrote: >> On Tue, Sep 26, 2006 at 11:19:37AM -0500, Ryan May wrote: >>> Is there function for doing a simple bilinear interpolation in scipy? I >>> tried the interpolate.interp2d routines, but it appears that it uses >>> B-splines even when kind='linear' is specified. (I wouldn't otherwise >>> care except that my class assignment explicitly says to use >>> bilinear.) >> I'm speaking under correction, but if you choose 'linear', b-splines >> of degree 1 (i.e. straight lines) are used and you are doing linear >> interpolation. > > That's what I kinda thought, but using 'linear' for interp2d didn't give > me the same answer as when I performed the calculation manually. I'll > have to see what I can find on this... 
Ok, if I select the 4 points surrounding the location of interest, interp2d gives me the value I get with manually calculating a bilinear interpolation. However if I use the whole field (or even 9 points instead of 4), I get a different answer. I _know_ I wouldn't expect this for bilinear interpolation, and it would seem to imply that even _linear_ B-splines uses the information from additional points. Am I missing something here, or are the methods just not truly equivalent? Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma rmay at rossby.ou.edu From sinclaird at ukzn.ac.za Wed Sep 27 10:45:43 2006 From: sinclaird at ukzn.ac.za (Scott Sinclair) Date: Wed, 27 Sep 2006 16:45:43 +0200 Subject: [SciPy-user] Bilinear Interpolation In-Reply-To: <451A77FE.5050207@ou.edu> References: <45195319.5010405@ou.edu> <20060926172352.GA20771@mentat.za.net> <45196BF1.9050601@ou.edu> <451A77FE.5050207@ou.edu> Message-ID: <451A8E97.8020407@ukzn.ac.za> Ryan May wrote: > Ok, if I select the 4 points surrounding the location of interest, > interp2d gives me the value I get with manually calculating a bilinear > interpolation. However if I use the whole field (or even 9 points > instead of 4), I get a different answer. I _know_ I wouldn't expect > this for bilinear interpolation, and it would seem to imply that even > _linear_ B-splines uses the information from additional points. Am I > missing something here, or are the methods just not truly equivalent? > The two are equivalent when there are only 4 surrounding points (as you've found). Bilinear interpolation is simply linear interpolation, first in one and then in a second (perpendicular) direction, from what I understand the method uses 4 known points by definition. Spline interpolation is a different animal. 
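That two-pass description can be sketched as a small standalone helper (a hypothetical illustration, not code posted in the thread): interpolate linearly in x along the bottom and top edges of the cell, then linearly in y between the two results.

```python
def bilinear(x, y, x0, x1, y0, y1, q00, q10, q01, q11):
    """Bilinear interpolation of f inside one grid cell.

    q00 = f(x0, y0), q10 = f(x1, y0), q01 = f(x0, y1), q11 = f(x1, y1).
    """
    tx = (x - x0) / (x1 - x0)            # first pass: linear in x on each edge
    bottom = (1 - tx) * q00 + tx * q10
    top = (1 - tx) * q01 + tx * q11
    ty = (y - y0) / (y1 - y0)            # second pass: linear in y between edges
    return (1 - ty) * bottom + ty * top

# same answer as the weighted form dot(yws, dot(vals, xws))
# with xws = [1-xw, xw], yws = [1-yw, yw]
print(bilinear(0.4, 0.7, 0.0, 1.0, 0.0, 1.0,
               1.0, 2.0, 3.0, 5.0))     # -> approximately 3.08
```

Only the four corner values ever enter the result, which is why a spline fit over a larger set of points need not agree with it.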
Here you are trying to find the "smoothest" (actually minimum bending energy) function that can be constructed to fit *exactly* through *all* of the known data using a set of basis functions which may be linear (or cubic etc...). Then you evaluate the value of that function at your unknown location. Cheers, Scott From stefan at sun.ac.za Wed Sep 27 11:07:35 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 27 Sep 2006 17:07:35 +0200 Subject: [SciPy-user] Bilinear Interpolation In-Reply-To: <451A77FE.5050207@ou.edu> References: <45195319.5010405@ou.edu> <20060926172352.GA20771@mentat.za.net> <45196BF1.9050601@ou.edu> <451A77FE.5050207@ou.edu> Message-ID: <20060927150735.GA15343@mentat.za.net> Hi Ryan On Wed, Sep 27, 2006 at 08:09:18AM -0500, Ryan May wrote: > > That's what I kinda thought, but using 'linear' for interp2d didn't give > > me the same answer as when I performed the calculation manually. I'll > > have to see what I can find on this... > > Ok, if I select the 4 points surrounding the location of interest, > interp2d gives me the value I get with manually calculating a bilinear > interpolation. However if I use the whole field (or even 9 points > instead of 4), I get a different answer. I _know_ I wouldn't expect > this for bilinear interpolation, and it would seem to imply that even > _linear_ B-splines uses the information from additional points. Am I > missing something here, or are the methods just not truly > equivalent? Please send some code so we can see what you are doing. Without that, we are just discussing things in the air. 
Cheers Stéfan From rmay at ou.edu Wed Sep 27 11:41:21 2006 From: rmay at ou.edu (Ryan May) Date: Wed, 27 Sep 2006 10:41:21 -0500 Subject: [SciPy-user] Bilinear Interpolation In-Reply-To: <20060927150735.GA15343@mentat.za.net> References: <45195319.5010405@ou.edu> <20060926172352.GA20771@mentat.za.net> <45196BF1.9050601@ou.edu> <451A77FE.5050207@ou.edu> <20060927150735.GA15343@mentat.za.net> Message-ID: <451A9BA1.2070702@ou.edu> Stefan van der Walt wrote: > Hi Ryan > > On Wed, Sep 27, 2006 at 08:09:18AM -0500, Ryan May wrote: >>> That's what I kinda thought, but using 'linear' for interp2d didn't give >>> me the same answer as when I performed the calculation manually. I'll >>> have to see what I can find on this... >> Ok, if I select the 4 points surrounding the location of interest, >> interp2d gives me the value I get with manually calculating a bilinear >> interpolation. However if I use the whole field (or even 9 points >> instead of 4), I get a different answer. I _know_ I wouldn't expect >> this for bilinear interpolation, and it would seem to imply that even >> _linear_ B-splines uses the information from additional points. Am I >> missing something here, or are the methods just not truly >> equivalent? > > Please send some code so we can see what you are doing. Without that, > we are just discussing things in the air. > Here's some example code that demonstrates what I mean, but I think Scott Sinclair hit the nail on the head in another reply. The methods are just different except in the case of 4 points, since the splines are performing a global fit and the bilinear interpolation is only local.
from scipy.interpolate import interpolate from numpy.random import randn from numpy import * data = randn(16).reshape(4,4) x = y = arange(4) X,Y = meshgrid(x,y) x_loc = 1.4 y_loc = 2.7 #Do the interpolation with all points, a partial set, and only 4 points, #making sure that the desired location is within the domain full_bilin = interpolate.interp2d(X,Y,data,kind='linear') partial_bilin = interpolate.interp2d(X[1:,1:],Y[1:,1:],data[1:,1:],kind='linear') single_bilin = interpolate.interp2d(X[2:4,1:3],Y[2:4,1:3],data[2:4,1:3],kind='linear') #If interp2d was truly doing bilinear interpolation, these should be equal print full_bilin(x_loc,y_loc) print partial_bilin(x_loc,y_loc) print single_bilin(x_loc,y_loc) #This manual calculation of the interpolation is only equal to the last one xw = 0.4 yw = 0.7 xws = array([1-xw, xw]) yws = array([1-yw, yw]) print dot(yws,dot(data[2:4,1:3],xws)) Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma rmay at rossby.ou.edu From stefan at sun.ac.za Wed Sep 27 22:48:07 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 28 Sep 2006 04:48:07 +0200 Subject: [SciPy-user] Bilinear Interpolation In-Reply-To: <451A9BA1.2070702@ou.edu> References: <45195319.5010405@ou.edu> <20060926172352.GA20771@mentat.za.net> <45196BF1.9050601@ou.edu> <451A77FE.5050207@ou.edu> <20060927150735.GA15343@mentat.za.net> <451A9BA1.2070702@ou.edu> Message-ID: <20060928024806.GA31994@mentat.za.net> Hi Ryan On Wed, Sep 27, 2006 at 10:41:21AM -0500, Ryan May wrote: > Here's some example code that demonstrates what I mean, but I think > Scott Sinclair hit the nail on the head in another reply. The methods > are just different except in the case of 4 points, since the splines are > performing a global fit and the bilinear interpolation is only > local. Thanks for the code (and Scott for the explanation). I also attach a script I whipped up to demonstrate this effect on images.
If you change the spline order from 2 to 1, you will notice weird things happening. For example, compare http://mentat.za.net/refer/spline_order2.png and http://mentat.za.net/refer/spline_order1.png Notice how dark the spline interpolated image of order 1 is. Can anyone explain this? Possibly due to pre-filtering? Cheers Stéfan -------------- next part -------------- A non-text attachment was scrubbed... Name: splint.py Type: text/x-python Size: 1373 bytes Desc: not available URL: From david.douard at logilab.fr Thu Sep 28 03:23:33 2006 From: david.douard at logilab.fr (David Douard) Date: Thu, 28 Sep 2006 09:23:33 +0200 Subject: [SciPy-user] Does scipy have a function to...? In-Reply-To: <451853F6.3060507@att.net> References: <451853F6.3060507@att.net> Message-ID: <20060928072333.GA3375@crater.logilab.fr> On Mon, Sep 25, 2006 at 06:11:02PM -0400, Bill Dandreta wrote: > What I want is a smooth curve that serves as an upper limit for a data > plot. I would like it to be close to the data at the local maxima. > > Does such a function exist? I'm pretty sure such a need is quite common, even for people not very skilled in maths (for which the tips given in replies to your message are not necessarily meaningful). So I would recommend that you (or someone else that already did this) add a page to the cookbook explaining a simple way of doing this. David > > Bill > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user -- David Douard LOGILAB, Paris (France) Formations Python, Zope, Plone, Debian : http://www.logilab.fr/formations Développement logiciel sur mesure : http://www.logilab.fr/services Informatique scientifique : http://www.logilab.fr/science -------------- next part -------------- A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 189 bytes Desc: Digital signature URL: From wjdandreta at att.net Thu Sep 28 09:07:34 2006 From: wjdandreta at att.net (Bill Dandreta) Date: Thu, 28 Sep 2006 09:07:34 -0400 Subject: [SciPy-user] Does scipy have a function to...?
In-Reply-To: <451858F9.30809@gmail.com> References: <451853F6.3060507@att.net> <451858F9.30809@gmail.com> Message-ID: <451BC916.3040305@att.net> Robert Kern wrote: > > You would > > need to choose the length scale at which variations in the data are ignored. > > E.g. you need to find some means of determining why the curve should skip over A > > but try to get close to B. > > > That is exactly the problem. I can look at the data curve and know where the boundary curve should go but trying to figure out an algorithm that works in general is non trivial. If I have to do it by hand, that will severely limit the number of datasets I can analyze. > > You might be able to formulate this as a constrained minimization problem using > > scipy.optimize.fmin_cobyla(). Take some curve f(x), minimize (f(x) - y) under > > the constraint (f(x) - y >= 0). > Once I have something to optimize, I was thinking that genetic algorithms might yield the fastest solution because I only need an approximate solution. Attached is an example. The green curve is boundary and the blue is data. Bill -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: example1.pdf Type: application/pdf Size: 38001 bytes Desc: not available URL: From ckkart at hoc.net Thu Sep 28 10:00:25 2006 From: ckkart at hoc.net (Christian Kristukat) Date: Thu, 28 Sep 2006 23:00:25 +0900 Subject: [SciPy-user] Does scipy have a function to...? In-Reply-To: <451BC916.3040305@att.net> References: <451853F6.3060507@att.net> <451858F9.30809@gmail.com> <451BC916.3040305@att.net> Message-ID: <451BD579.3030709@hoc.net> Bill Dandreta wrote: > Once I have something to optimize, I was thinking that genetic > algorithms might yield the fastest solution because I only need an > approximate solution. > > Attached is an example. The green curve is boundary and the blue is data. 
For data like yours the attached script works quite well. As Robert suggested it uses fmin_cobyla to fit a 4pt spline to the data subject to one constraint per data point. Therefore it will be quite slow if applied to thousands of points. Initially I tried to use it to determine the background of spectroscopic data but this seems to be more difficult as the data is usually not as *smooth* as yours. Regards, Christian -------------- next part -------------- A non-text attachment was scrubbed... Name: baseline.py Type: text/x-python Size: 1127 bytes Desc: not available URL: From oliphant.travis at ieee.org Thu Sep 28 12:44:07 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 28 Sep 2006 10:44:07 -0600 Subject: [SciPy-user] Pros and Cons of Python verses other array environments Message-ID: <451BFBD7.4080701@ieee.org> Hi all, I've started a possibly controversial but hopefully informative page that tries to list some of the advantages of using Python+NumPy+Scipy+Matplotlib+IPython (I'm calling that combination PyLab) versus other array environments. The purpose is not to go into detail about semantic differences, but document higher-level differences that might help somebody decide whether or not they could use NumPy instead of some other environment. I've started with a comparison to MATLAB, based on an email response I sent to a friend earlier today. Additions and corrections welcome. -Travis O. From oliphant.travis at ieee.org Thu Sep 28 12:44:47 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 28 Sep 2006 10:44:47 -0600 Subject: [SciPy-user] Webpage Message-ID: <451BFBFF.8040009@ieee.org> Oh yeah!
Here's the link: http://www.scipy.org/NumPyProConPage From fperez.net at gmail.com Thu Sep 28 13:09:45 2006 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 28 Sep 2006 11:09:45 -0600 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451BFBD7.4080701@ieee.org> References: <451BFBD7.4080701@ieee.org> Message-ID: On 9/28/06, Travis Oliphant wrote: > > Hi all, > > I've started a possibly controversial but hopefully informative page > that tries to list some of the advantages of using > Python+NumPy+Scipy+Matplotlib+IPython (I'm calling that combination > PyLab) versus other array environments. Great, I think this is important to have. For reference, the link is: http://www.scipy.org/NumPyProConPage Cheers, f
From djf at pdx.edu Thu Sep 28 13:37:32 2006 From: djf at pdx.edu (djf at pdx.edu) Date: Thu, 28 Sep 2006 10:37:32 -0700 Subject: [SciPy-user] python crash in leastsq Message-ID: <1159465052.451c085ca711b@webmail.pdx.edu> I receive a segmentation fault from the following code: from scipy.optimize import * from scipy import * # Fit Mean of A*x to Mean of B---------- #-------------------------------------- def fit(x): d=mean(x*A)-mean(B) return d #-------------------------------------- A=[1,2] B=[3,4] x0=[1] (x,msg)=leastsq(fit,x0) print x Any advice? thanks, djoefish From oliphant.travis at ieee.org Thu Sep 28 14:05:18 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 28 Sep 2006 12:05:18 -0600 Subject: [SciPy-user] python crash in leastsq In-Reply-To: <1159465052.451c085ca711b@webmail.pdx.edu> References: <1159465052.451c085ca711b@webmail.pdx.edu> Message-ID: <451C0EDE.3020309@ieee.org> djf at pdx.edu wrote: > I receive a segmentation fault from the following code: > > from scipy.optimize import * > from scipy import * > # Fit Mean of A*x to Mean of B---------- > #-------------------------------------- > def fit(x): > d=mean(x*A)-mean(B) > return d > #-------------------------------------- > A=[1,2] > B=[3,4] > x0=[1] > (x,msg)=leastsq(fit,x0) > print x > > Any advice? > Please give a more detailed report. What versions of scipy and numpy are you using? And if you can provide a trace-back that is even better.
-Travis From stefan at sun.ac.za Thu Sep 28 14:09:21 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 28 Sep 2006 20:09:21 +0200 Subject: [SciPy-user] python crash in leastsq In-Reply-To: <1159465052.451c085ca711b@webmail.pdx.edu> References: <1159465052.451c085ca711b@webmail.pdx.edu> Message-ID: <20060928180921.GC17644@mentat.za.net> On Thu, Sep 28, 2006 at 10:37:32AM -0700, djf at pdx.edu wrote: > > I receive a segmentation fault from the following code: > > from scipy.optimize import * > from scipy import * > # Fit Mean of A*x to Mean of B---------- > #-------------------------------------- > def fit(x): > d=mean(x*A)-mean(B) > return d > #-------------------------------------- > A=[1,2] > B=[3,4] > x0=[1] > (x,msg)=leastsq(fit,x0) > print x > > Any advice? I don't see any memory errors running your code with numpy 1.0.dev3230 scipy 0.5.2.dev2235 It produces the correct answer, 2.3333333. Which versions of numpy and scipy are you running, and on which platform? Regards Stéfan From hetland at tamu.edu Thu Sep 28 14:13:41 2006 From: hetland at tamu.edu (Rob Hetland) Date: Thu, 28 Sep 2006 13:13:41 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> Message-ID: I think everybody on this list is pretty familiar with the pros of PyLab (although a good set of arguments in one place is a Good Idea). Perhaps it would be productive for us to start a discussion of cons here on the list, and decide how to mitigate them. Here I give some of my impressions on how the novice might see PyLab. I am teaching a class on data assimilation this semester, and I have decided to use PyLab. I had used MATLAB in the past, and I had many students who weren't familiar with how it worked and there were licensing issues. Most students could eventually figure out how to deal with the language; vectorization is always a tough thing to get.
Also, they could purchase a student version of MATLAB for about $50 -- not free, but cheaper than many college textbooks. Using PyLab, I have a different set of issues. First is installation. I recommended Enthon python to the PC users, and I helped the Mac users deal with the various distributions. Even with a number of clickable installer packages, putting python on their computers was not straightforward. Then there is the issue with Python itself. Python is a more powerful language than MATLAB's core programming language, and students have a slightly steeper learning curve figuring all of that out. They suddenly have to deal with importing packages, zero-based indexing, many different kinds of sequences, methods, and a host of other issues. Additionally, numpy is a more powerful array tool, in my opinion, but it is also harder to learn. In the final analysis, students had similar issues working with MATLAB, but my impression is that they find MATLAB slightly easier. Also, many of them come to the class with MATLAB experience -- none of them have used python, let alone any of the scientific packages, before. In the long run, python is clearly better, but in the short term, I think MATLAB might be simpler. Finally, I think that one of the reasons MATLAB is successful is that it includes everything together, in one place. It does some things very poorly, but I was always willing to put up with MATLAB's weaknesses to remain within a single environment. PyLab is starting to feel like that, but there are still some pretty clear boundaries between the packages, and an almost overwhelming array of choices if you want to look for them. This is good if you are a geek and want plotting package X or numeric package Y and you still want everything else to work, but bad if you are a novice and just want things to work simply and smoothly together.
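The stumbling blocks listed above — imports, zero-based indexing, and the zoo of sequence types — fit in a few lines. A generic illustration (not from any class material):

```python
import numpy as np  # MATLAB needs no imports; in Python even arrays are one

tup = (1, 2, 3)            # tuple: fixed-size, immutable
lst = [1, 2, 3]            # list: '+' concatenates rather than adds
arr = np.array([1, 2, 3])  # ndarray: '+' is element-wise, as in MATLAB

concat = lst + lst   # [1, 2, 3, 1, 2, 3] -- surprises MATLAB users
summed = arr + arr   # array([2, 4, 6])
first = arr[0]       # indexing starts at 0, where MATLAB starts at 1
```

Three objects that all print like `1, 2, 3`, with three different behaviors, is exactly the kind of thing a first-week student trips over.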
Thus, the first thing that would improve PyLab usage the most, for the majority of people migrating from MATLAB, would be a cohesive, easy to install package. Like Enthon for a wide variety of platforms. I think this is a goal that could be achieved in a year or so, given how things are going now. Also, a good tutorial would be essential. This would be less complete and less formal than actual NumPy documentation, and would include information on using all of the PyLab packages together. There is a good start to this on the scipy wiki. Third, I think it is important for all of the packages to work together. This has certainly happened in terms of compilation -- mpl and numpy have been released in lockstep so they work together despite major changes in numpy. The Python+NumPy+Scipy+Matplotlib+IPython suite seems like a great place to start, and I think more could be done to make these tools seem like one seamless superpackage. -Rob On Sep 28, 2006, at 12:09 PM, Fernando Perez wrote: > On 9/28/06, Travis Oliphant wrote: >> >> Hi all, >> >> I've started a possibly controversial but hopefully informative page >> that tries to list some of the advantages of using >> Python+NumPy+Scipy+Matplotlib+IPython (I'm calling that combination >> PyLab) versus other array environments. > > Great, I think this is important to have. For reference, the link is: > > http://www.scipy.org/NumPyProConPage > > Cheers, > > f > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user ---- Rob Hetland, Associate Professor Dept.
of Oceanography, Texas A&M University http://pong.tamu.edu/~rob phone: 979-458-0096, fax: 979-845-6331 From djf at pdx.edu Thu Sep 28 14:17:01 2006 From: djf at pdx.edu (Daniel Fish) Date: Thu, 28 Sep 2006 11:17:01 -0700 Subject: [SciPy-user] python crash in leastsq References: <1159465052.451c085ca711b@webmail.pdx.edu> <451C0EDE.3020309@ieee.org> Message-ID: <005a01c6e32a$4b51ee30$6401a8c0@Fish> OS: Ubuntu Dapper Drake Python info: Python 2.4, Scipy 0.3.2, Numpy 1.0b5 Output================================================= /usr/lib/python2.4/site-packages/scipy_base/ppimport.py:273: UserWarning: The pstats module is not available. Install the python2.4-profiler Debian package if you need it module = __import__(name,None,None,['*']) -2.0 -2.0 Segmentation fault ======================================================== ----- Original Message ----- From: "Travis Oliphant" To: "SciPy Users List" Sent: Thursday, September 28, 2006 11:05 AM Subject: Re: [SciPy-user] python crash in leastsq > djf at pdx.edu wrote: >> I receive a segmentation fault from the following code: >> >> from scipy.optimize import * >> from scipy import * >> # Fit Mean of A*x to Mean of B---------- >> #-------------------------------------- >> def fit(x): >> d=mean(x*A)-mean(B) >> return d >> #-------------------------------------- >> A=[1,2] >> B=[3,4] >> x0=[1] >> (x,msg)=leastsq(fit,x0) >> print x >> >> Any advice? >> > > Please give a more detailed report. What versions of scipy and numpy > are you using? And if you can provide a trace-back that is even better. 
> > -Travis > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From gael.varoquaux at normalesup.org Thu Sep 28 14:38:55 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 28 Sep 2006 20:38:55 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> Message-ID: <20060928183853.GC28690@clipper.ens.fr> I agree with Rob that python is slightly harder to figure out for complete beginners. And I agree that it lacks integration. I would like to have an application, with an icon on the Desktop, or in the menus, that you can start, and start right away typing calculations in, without importing packages or figuring out shell usage; it might even have an editor. It would have proper branding, look pretty, have menus (including a help menu that would give help on python, scipy, and all the other packages)... I am (like everybody) lacking time to do this but I see enthought's envisage as a good starting point for this. It seems possible to integrate pylab into it, in order to have dockable pylab windows. It already has an editor and a shell. The shell is not as nice as ipython: it is pycrust, but I hope one day ipython will be easy to integrate in wxpython applications, and that it will have syntax highlighting and docstring popups like pycrust (beginners really love that). I think developing such an application would definitely help our community get more exposure. I know this will not interest the people who are currently investing a lot of time on scipy/ipython, as they are aiming for the other end of the spectrum: difficult tasks where no good answers are available, like distributed computing. I think that we should still keep this in mind, and as pycrust, envisage, and other integration tools make progress see if and when we can put together such an application.
Maybe we should put up a wiki page to throw down some ideas about this. Gaël From hasslerjc at adelphia.net Thu Sep 28 15:37:49 2006 From: hasslerjc at adelphia.net (John Hassler) Date: Thu, 28 Sep 2006 15:37:49 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060928183853.GC28690@clipper.ens.fr> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> Message-ID: <451C248D.3010406@adelphia.net> I really don't agree that "python is slightly harder to figure out for complete beginners" (Gael). (You knew SOMEBODY would be disagreeable). I've used/taught Matlab, Mathcad, Scilab, and Python ... not to mention Fortran, QBasic, VBasic, etc. In Python, I click on Idle and start calculating. Maybe I have to "import math," but that's it. Sure, there are some points I have to know, but believe me, it's a lot easier than starting a beginner on Mathcad. Scilab is very similar to Matlab (except that Scilab is free, and has a nicer syntax for functions). If I want array calculations, it's one more "import," but otherwise no more difficult than Matlab and friends, and much easier than Mathcad. Now, suppose I want to do something a little more complicated, and I want a function. In Python (and Scilab), I can define the function on the fly, and then use it. In Matlab, I have to save the function in an "m-file" before I can use it, which brings up all kinds of problems of where to put the file, what to name it, etc. Easy enough for us, maybe, but not for our prototypical "complete beginner." (It's also not esthetically pleasing ... but that's a different problem.) In Mathcad, simple functions are pretty easy; complex ones are pretty not easy, but there's a fair bit to learn before you can make any of them work. I wouldn't expect a student to write functions in any of these without at least some background, and the required background in Python is certainly no more than that in any of the others.
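The define-it-on-the-fly point can be seen in a session like this (illustrative only; any scipy routine would do in place of brentq). The whole thing is typed at the prompt — no file, no naming, no path questions:

```python
from scipy.optimize import brentq

# Define the function right where you need it and use it immediately;
# in MATLAB this function would have to live in its own m-file first.
def f(x):
    return x**2 - 2.0

root = brentq(f, 0.0, 2.0)  # bracketing root-finder; converges to sqrt(2)
```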
The biggest difference, for me, is that Python can "keep going." Matlab/Scilab/Mathcad all hit the wall fairly quickly, in terms of program size and/or complexity. But my real peeve is that Matlab is incomplete (batteries NOT included). I'm "visiting faculty" (hired help - I've retired, but I'm teaching a course). There came a point where we needed to solve a small set of simultaneous nonlinear equations. The students here are required to have Matlab, so I said, "Just call up fsolve." Oops. In Matlab, that's not in the student version. They'd have to buy the "optimization toolbox" to get it. EVERY other math system has it ... but it costs extra in Matlab. I wasn't fond of Matlab before that, but that little incident really soured me on it. (So I taught them flowsheet tearing. If you know what that is, you're a Chemical Engineer, and you're old.) I've never taught Python to "complete beginners," but I _have_ taught Matlab and Mathcad, and I can't imagine that Python would be any more difficult to teach. Maybe a beginners tutorial aimed at scientific calculation would help (although they exist on the web), but I think that the problem is really more of perception than reality. john Gael Varoquaux wrote: > I agree with Rob that python is slightly harder to figure out for > complete beginners. And I agree that it lacks integration. I would like > to have a application, with an icon on the Desktop, or in the menus, > that you can start, and start right away typing calculations in, without > importing packages, figuring out how with shell use, it might even have > an editor. It would have proper branding, look pretty, have menus > (including a help menu that would give help on python, scipy, and all > the other packages)... > > I am (as everybody) lacking time to do this but I see enthought's > envisage a good starting point for this. It seems possible to integrate > pylab to it, in order to have dockable pylab windows. It already has an > editor and a shell. 
The shell is not as nice as ipython: it is pycrust, > but I hope one day ipython will be easy to integrate in wxpython > applications, and that it will have syntax highlighting and docstrings > popup like pycrust (beginners really love that). > > I think developing such an application would definitely help our > community get more exposure. I know this will not interest the people > who are currently investing a lot of time on scipy/ipython, as they are > aiming for the other end of the spectrum: difficult tasks where no good > answers are available, like distributed computing. I think that we > should still keep this in mind, and as pycrust, envisage, and other > inegration tools make progress see if and when we can put together such > application. Maybe we should put up a wiki page to throw down some ideas > about this. > > Gaël > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From gael.varoquaux at normalesup.org Thu Sep 28 15:46:25 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 28 Sep 2006 21:46:25 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451C248D.3010406@adelphia.net> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> Message-ID: <20060928194625.GA1454@clipper.ens.fr> John, You have good arguments and I certainly won't fight with you over that. I prefer python and find it easy to use. But I use object oriented programming, importing modules, functional programming, regexp, operator overloading, and some of my colleagues have difficulties with this. Maybe python can seem harder because it can be used at higher levels. I still think that it is lacking an IDE where you can "click and start typing". -- Gaël From peridot.faceted at gmail.com Thu Sep 28 16:06:23 2006 From: peridot.faceted at gmail.com (A. M.
Archibald) Date: Thu, 28 Sep 2006 16:06:23 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060928194625.GA1454@clipper.ens.fr> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> Message-ID: On 28/09/06, Gael Varoquaux wrote: > John, > > You have good argument and I certainly won't fight over you about that. > I prefer python and find it easy to use. But I use object oriented > programming, importing modules, functional programming, regexp, operator > overloading, and some of my colleagues have difficulties with this. This is a question of what *you* use - python is fairly good about keeping its fancy features out of the way of those who don't want to use them. The fact that you *can* write object-oriented programs doesn't affect the syntax you use to define a function (for example). > Maybe python can seem harder because it can be used at higher levels. I > still thing that it is lacking an IDE where you can "click and start > typing". This is a question of user interface, and it's a good one. I use ipython and vim, mostly, and am still getting the hang of making them work together well (I had been using reload(module) constantly, and I just found %run; I expect there are other things I still need, like a good way to use PDB). This is not a terribly convenient user interface, although it could perhaps be made into one with a bit of integration work (and a good tutorial). IDLE seems to be an attempt to provide a convenient user interface integrating debugger, interactive session, editor, and help, but I pretty consistently crash it after half an hour of use, losing all my state in all the aforementioned components.
One advantage of a single unified package is that people can start filing integration and UI bugs - "I start PyLab but my plots appear under all the other windows", "the editor has a different default directory than the interpreter", and so on. As for missing features, I'd like to see even very basic symbolic calculation tools, so that (for example) I could just call a symbolic derivative function to supply a Jacobian to the minimizers. From a bit of web research, swiginac seems like the most promising alternative. (This should naturally be integrated with the orthogonal polynomials.) A. M. Archibald From p.jensen at virgin.net Thu Sep 28 16:06:50 2006 From: p.jensen at virgin.net (Peter Jensen) Date: Thu, 28 Sep 2006 21:06:50 +0100 Subject: [SciPy-user] Webpage In-Reply-To: <451BFBFF.8040009@ieee.org> References: <451BFBFF.8040009@ieee.org> Message-ID: <1159474011.4973.13.camel@localhost> Travis, I was looking at your list of disadvantages of PyLab, and the following point caught my attention. "This is due to not having two in-fix operators to represent array multiplication and element-by-element multiplication." I assume that you have had discussions with Guido van Rossum about introducing a new operator for element-by-element multiplication (i.e., .*), and that that somehow would cause problems with the parser. Is that correct? I can't myself identify the problem. Peter On Thu, 2006-09-28 at 10:44 -0600, Travis Oliphant wrote: > Oh yeah! Here's the link: > > http://www.scipy.org/NumPyProConPage > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From R.Springuel at umit.maine.edu Thu Sep 28 16:17:57 2006 From: R.Springuel at umit.maine.edu (R.
Padraic Springuel) Date: Thu, 28 Sep 2006 16:17:57 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: Message-ID: <451C2DF5.1020004@umit.maine.edu> For those of you objecting to the "difficulty" in importing several different packages to get all the different capabilities here going at once, there is an easy fix here suggested by Travis's use of Pylab as the title for the grouping. If you look at matplotlib's pylab.py file that it places in your site-packages directory, it just redirects the import statements to find what's needed. Add a few lines to this file and it can pull in numpy, scipy, and IPython at the same time. What might make things easier for the beginner would be to have all of the packages Travis mentioned available in a bundled download that is easily installable (like Enthought Python, but without the extra bells & whistles) and with a modified version of the pylab.py file that contains all the necessary import statements. It might also be nice if this kind of package would automatically configure the interpreter to issue the necessary import statements automatically when it starts. In this way, the beginner need not learn about import statements right away. All the basics are right there for them. -- R.
Padraic Springuel Teaching Assistant Department of Physics and Astronomy University of Maine Bennett 309 Office Hours: Wednesday 2-3pm From stefan at sun.ac.za Thu Sep 28 16:07:45 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 28 Sep 2006 22:07:45 +0200 Subject: [SciPy-user] python crash in leastsq In-Reply-To: <005a01c6e32a$4b51ee30$6401a8c0@Fish> References: <1159465052.451c085ca711b@webmail.pdx.edu> <451C0EDE.3020309@ieee.org> <005a01c6e32a$4b51ee30$6401a8c0@Fish> Message-ID: <20060928200745.GA21281@mentat.za.net> On Thu, Sep 28, 2006 at 11:17:01AM -0700, Daniel Fish wrote: > OS: Ubuntu Dapper Drake > Python info: Python 2.4, Scipy 0.3.2, Numpy 1.0b5 Your version of scipy is very old. Using a newer version solves the problem. Regards Stéfan From fperez.net at gmail.com Thu Sep 28 16:37:24 2006 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 28 Sep 2006 14:37:24 -0600 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451C2DF5.1020004@umit.maine.edu> References: <451C2DF5.1020004@umit.maine.edu> Message-ID: On 9/28/06, R. Padraic Springuel wrote: > For those of you objecting to the "difficulty" in importing several > different packages to get all the different capabilities here going at > once, there is an easy fix here suggested by Travis's use of Pylab as > the title for the grouping. If you look at matplotlib's pylab.py file > that it places in your site-packages directory, its just redirects the > import statement to find what's needed. Add a few lines to this file > and it can pull in numpy, scipy, and IPython at the same time. What > might make things easier for the beginner would be to have all of the > packages Travis mentioned available in a bundled download that is easily > installable (like Enthought Python, but without the extra bells & > whistles) and with a modified version of the pylab.py file that contains
> > It might also be nice if this kind of package would automatically > configure the interpreter to issue the necessary import statements > automatically when it starts. In this way, the beginner need not learn > about import statements right away. All the basics are right there for > them. That's pretty much what ipython -pylab does already, modulo importing scipy, which it does NOT assume is installed. But doing ipython -pylab -p scipy does that. The next version of ipython will add the latter as a shortcut in the Windows start menu upon installation, at Ryan Krauss' suggestion. *nix users are perfectly capable of making an alias if they so desire. Cheers, f From gael.varoquaux at normalesup.org Thu Sep 28 16:40:49 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 28 Sep 2006 22:40:49 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> Message-ID: <20060928204048.GA4529@clipper.ens.fr> On Thu, Sep 28, 2006 at 04:06:23PM -0400, A. M. Archibald wrote: > As for missing features, I'd like to see even very basic symbolic > calculation tools, so that (for example) I could just call a symbolic > derivative function to supply a Jacobian to the minimizers. From a bit > of web research, swiginac seems like the most promising alternative. You can have a look at sympy too. It is very young, though. 
Gaël From hasslerjc at adelphia.net Thu Sep 28 16:56:15 2006 From: hasslerjc at adelphia.net (John Hassler) Date: Thu, 28 Sep 2006 16:56:15 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060928194625.GA1454@clipper.ens.fr> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> Message-ID: <451C36EF.6010408@adelphia.net> Gael Varoquaux wrote: > John, > > [snippage] > > Maybe python can seem harder because it can be used at higher levels. I > still thing that it is lacking an IDE where you can "click and start > typing". > But that's exactly my point. For the most common sort of calculations one does as an undergraduate, using Python is exactly like using Matlab, with trivial differences. Here's a little homework problem - calculating the flow velocity in a packed column. This requires the solution of a quadratic equation: Click on Idle. Then start typing: Python 2.5 (r25:51908, Sep 19 2006, 09:52:17) [MSC v.1310 32 bit (Intel)] on win32 Type "copyright", "credits" or "license()" for more information. **************************************************************** Personal firewall software may warn about the connection IDLE makes to its subprocess using this computer's internal loopback interface. This connection is not visible on any external interface and no data is sent to or received from the Internet. **************************************************************** IDLE 1.2 >>> from numpy import * # Yes, I know, but ... >>> rho = 62.4 >>> rhop = 160. # Use decimal point, or use from __future ... >>> psi = 0.95 >>> Dp = 0.02/12 >>> mu = 1*6.72e-4 >>> eps = 0.42 >>> g = 32.174 >>> a = 1.75*rho/(psi*Dp*eps**3) >>> b = 150.*mu*(1-eps)/((psi*Dp)**2*eps**3) >>> c = g*(rhop - rho) >>> print a,b,c 14918.248001 314771.892136 3140.1824 >>> roots([a,b,-c]) array([-0.34783557, 0.00969792]) >>>
This is how the students use Matlab - the problems are longer, but the basic idea is the same. You also say: Gael: "But I use object oriented programming, importing modules, functional programming, regexp, operator overloading, and some of my colleagues have difficulties with this." Do they also have difficulty doing these in Matlab??????? (You can't even DO most of these in Matlab. Most Matlab users wouldn't even recognize the words.) (Ok, ok, ad hominem towards Matlab, I apologize.) john From hetland at tamu.edu Thu Sep 28 16:57:14 2006 From: hetland at tamu.edu (Rob Hetland) Date: Thu, 28 Sep 2006 15:57:14 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> Message-ID: On Sep 28, 2006, at 3:06 PM, A. M. Archibald wrote: > This is a question of what *you* use - python is fairly good about > keeping its fancy features out of the way of those who don't want to > use them. The fact that you *can* write object-oriented programs > doesn't affect the syntax you use to define a function (for example). Yes, you can keep it simple, and not use classes out of the box. However, you are faced, immediately, with a bunch of different sequence objects -- at least tuples, lists, and arrays. Then there is the issue of methods, which most science students I know have not encountered. These things are not insurmountable, and the argument made below that a function in matlab must be a file is really powerful (and one I must have blotted from my memory). All of the arguments made *for* PyLab are true -- you think so too, or you wouldn't be reading this. I have been a huge proponent of PyLab, and have taught seminars on it here at Texas A&M and Woods Hole to people who primarily use MATLAB. I have heard a number of objections or excuses that it all looks good, but..... 
- it's hard to install
- I already know how to use MATLAB, and it works fine for me
- when do I find a week (or month or semester) to learn a new programming language
- I already have so many m-files that I would need to rewrite
Then there are the issues of bugs and beta quality software (not true anymore for numpy, but still true for mpl), small user community in your own research community (e.g., Oceanographers all use MATLAB), etc. We need to think about the objections of the people who are *not* already here, and make sure we have an easy way for them to join us on the true path... -Rob ---- Rob Hetland, Associate Professor Dept. of Oceanography, Texas A&M University http://pong.tamu.edu/~rob phone: 979-458-0096, fax: 979-845-6331 From hasslerjc at adelphia.net Thu Sep 28 17:27:55 2006 From: hasslerjc at adelphia.net (John Hassler) Date: Thu, 28 Sep 2006 17:27:55 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451BFBD7.4080701@ieee.org> References: <451BFBD7.4080701@ieee.org> Message-ID: <451C3E5B.1090403@adelphia.net> I went back to the beginning of the thread, to find out what I was actually talking about. I interpreted the question to mean using Python vs Matlab the way Matlab is commonly used by students. I now see that this is too restrictive, but still, I think it's representative of a large class of users. So what, exactly, is the question? What sort of user do we mean? Somebody who has used other "array environments" would have no difficulty switching to Python. Someone who is completely new to computer computation would seem to me to be unlikely to use any advanced features of the language. Matlab has some specialized (as in "expensive") toolboxes for special problems; do we mean these? I'm familiar with the controls toolbox, and by omission, with the optimization toolbox. Neither has anything that an undergraduate student would use that isn't also in SciPy.
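The fsolve from the classroom anecdote earlier in this thread — paywalled in MATLAB's optimization toolbox — is scipy.optimize.fsolve. A minimal sketch, with a two-equation system invented here for illustration:

```python
import numpy as np
from scipy.optimize import fsolve

# A small nonlinear system of the sort assigned in class (made up here):
#   x^2 + y^2 = 4
#   x * y     = 1
def equations(v):
    x, y = v
    return [x**2 + y**2 - 4.0, x * y - 1.0]

solution = fsolve(equations, [2.0, 0.5])  # initial guess near one of the roots
```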
I don't know anything about any of the other toolboxes. As an aside, I use Python with Jedit. It serves as a perfectly usable combination, at least as convenient as Matlab with its built-in editor. (I've used Scite and PSPad, too, but I personally like Jedit better.) I've got it set up so that I save and hit F5 to run ... as in Matlab. john Travis Oliphant wrote: > Hi all, > > I've started a possibly controversial but hopefully informative page > that tries to list some of the advantages of using > Python+NumPy+Scipy+Matplotlib+IPython (I'm calling that combination > PyLab) versus other array environments. > > The purpose is not to go into detail about semantic differences, but > document higher-level differences that might help somebody decide > whether or not they could use NumPy instead of some other environment. > I've started with a comparison to MATLAB, based on an email response I > sent to a friend earlier today. > > Additions and corrections welcome. > > -Travis O. > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From oliphant at ee.byu.edu Thu Sep 28 18:09:05 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 28 Sep 2006 16:09:05 -0600 Subject: [SciPy-user] Webpage In-Reply-To: <1159474011.4973.13.camel@localhost> References: <451BFBFF.8040009@ieee.org> <1159474011.4973.13.camel@localhost> Message-ID: <451C4801.5090801@ee.byu.edu> Peter Jensen wrote: >Travis, > >I was looking at your list of disadvantages of >PyLab, and the following point caught my attention. > >"This is due to not having two in-fix operators to >represent array multiplication and element-by-element >multiplication." > >I assume that you have had discussion with >Guido van Rossum about introducing a new >operator for element-by-element multiplication >( i.e .* ), and that that somehow would cause >problems with the parser. Is that correct?
>I can't myself identify the problem. > > No, nobody's really tried to push it with Guido, as previous arguments for it have not met with warm reception. Perhaps if several of us make an argument for a few "special" operators it would help. -Travis From rclewley at cam.cornell.edu Thu Sep 28 19:34:37 2006 From: rclewley at cam.cornell.edu (Robert Clewley) Date: Thu, 28 Sep 2006 19:34:37 -0400 (EDT) Subject: [SciPy-user] Calculating symbolic Jacobians, was [OT] Re: Pros and Cons of Python verses other array environments In-Reply-To: <20060928204048.GA4529@clipper.ens.fr> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <20060928204048.GA4529@clipper.ens.fr> Message-ID: Hi, AFAIK sympy does not support multivariable expressions or user-defined functions. These are useful in constructing Jacobians in the most syntactically neat way. But if you need something simple and in pure python right away, our PyDSTool project contains a modestly-featured symbolic toolbox in pure python, including functions to create Jacobians very easily. You don't have to keep the whole baggage of our project to use the module Symbolic.py. Although it currently only works with "old" SciPy/Numeric, we are working on migrating to NumPy, etc. right now. Also, it's not as professionally developed as swiginac/ginac and undoubtedly not as fast, but it has worked well for me on moderately-sized problems... e.g. 
for a vector field defined in a test example of SciPy's VODE integrator:

>>> y0=Var('y0')
>>> y1=Var('y1')
>>> y2=Var('y2')
>>> ydot0=Fun(-0.04*y0 + 1e4*y1*y2, [y0, y1, y2], 'ydot0')
>>> ydot2=Fun(3e7*y1*y1, [y0, y1, y2], 'ydot2')
>>> ydot1=Fun(-ydot0(y0,y1,y2)-ydot2(y0,y1,y2), [y0, y1, y2], 'ydot1')
>>> F = Fun([ydot0(y0,y1,y2),ydot1(y0,y1,y2),ydot2(y0,y1,y2)], [y0,y1,y2], 'F')

# Diff returns a symbolic object, so we'd like to turn it back into a
# function of y0, y1, y2
>>> jac=Fun(Diff(F,[y0,y1,y2]), [y0, y1, y2], 'Jacobian')

# Calls to this function return a symbolic object by default
>>> jac(0.1, 0.3, 0.5)
QuantSpec Jacobian (ExpFuncSpec)

# so use
>>> print jac(0.1, 0.3, 0.5)
[[-0.040000000000000001,5000.0,3000.0],[0.040000000000000001,-18005000.0,-3000.0],[0,18000000.0,0]]

# or use .tonumeric() because there are no free names left in the
# resulting expression
>>> jac(0.1, 0.3, 0.5).tonumeric()
array([[ -4.00000000e-02, 5.00000000e+03, 3.00000000e+03],
       [ 4.00000000e-02, -1.80050000e+07, -3.00000000e+03],
       [ 0.00000000e+00, 1.80000000e+07, 0.00000000e+00]])

# The beauty of this is being able to use other symbols in the call,
# do substitutions, etc.
>>> x=Var('x')
>>> j = jac(x, x, 0.5)
>>> print j
[[-0.04,10000*0.5,10000*x],[0.040000000000000001,(-10000*0.5)-30000000*2*x,-10000*x],[0,60000000*x,0]]
>>> jsubs = j.eval(x=10)
>>> jsubs.tonumeric()
array([[ -4.00000000e-02, 5.00000000e+03, 1.00000000e+05],
       [ 4.00000000e-02, -6.00005000e+08, -1.00000000e+05],
       [ 0.00000000e+00, 6.00000000e+08, 0.00000000e+00]])

Automatic simplification of the resulting float-only sub-expressions (like "10000*0.5") isn't fully working yet for these "symbol arrays", but it does work in the scalar case. Anyway, check out the wiki page Symbolic at pydstool.sourceforge.net/ if you want more examples and documentation. Hope this is of interest! -Rob

On Thu, 28 Sep 2006, Gael Varoquaux wrote: > On Thu, Sep 28, 2006 at 04:06:23PM -0400, A. M.
Archibald wrote: >> As for missing features, I'd like to see even very basic symbolic >> calculation tools, so that (for example) I could just call a symbolic >> derivative function to supply a Jacobian to the minimizers. From a bit >> of web research, swiginac seems like the most promising alternative. > > You can have a look at sympy too. It is very young, though. > > Gaël From stephenemslie at gmail.com Thu Sep 28 20:04:22 2006 From: stephenemslie at gmail.com (stephen emslie) Date: Fri, 29 Sep 2006 01:04:22 +0100 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451BFBD7.4080701@ieee.org> References: <451BFBD7.4080701@ieee.org> Message-ID: <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> On 9/28/06, Travis Oliphant wrote: > The purpose is not to go into detail about semantic differences, but > document higher-level differences that might help somebody decide > whether or not they could use NumPy instead of some other environment. > I've started with a comparison to MATLAB, based on an email response I > sent to a friend earlier today. I think this is an important resource to have around. In the context of choosing an array environment for a project where you need speed, it's important that python's advantages are made clear and easy to communicate. That way we all get to use python :) -------------- next part -------------- An HTML attachment was scrubbed...
URL: From gael.varoquaux at normalesup.org Thu Sep 28 20:22:58 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 29 Sep 2006 02:22:58 +0200 Subject: [SciPy-user] Calculating symbolic Jacobians, was [OT] Re: Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <20060928204048.GA4529@clipper.ens.fr> Message-ID: <20060929002258.GC7753@clipper.ens.fr> On Thu, Sep 28, 2006 at 07:34:37PM -0400, Robert Clewley wrote: > But if you need something simple and in pure python right away, our > PyDSTool project contains a modestly-featured symbolic toolbox in pure > python, including functions to create Jacobians very easily. Nice, if I find some time I'll have a look. Gaël From ahoover at eecs.berkeley.edu Thu Sep 28 21:03:58 2006 From: ahoover at eecs.berkeley.edu (Aaron Hoover) Date: Thu, 28 Sep 2006 18:03:58 -0700 Subject: [SciPy-user] malloc incorrect checksum for freed object error using delaunay module Message-ID: Hi all, I'm using SciPy 0.5.1 with the delaunay module built from source on OS X (Intel Mac). I'm also using iPython 0.7.2, NumPy 1.0.5b, and matplotlib 0.87.5 (all built from source). I'm having the strangest problem - I've been able to use the delaunay function in delaunay.triangulate just fine up until recently when I triangulated a fairly large data set (2M points). The triangulation succeeded (but took a while, of course), but now whenever I attempt to use the delaunay function again, I get the following error: Python(288,0xa000cf60) malloc: *** error for object 0x1b47e00: incorrect checksum for freed object - object was probably modified after being freed, break at szone_error to debug Python(288,0xa000cf60) malloc: *** set a breakpoint in szone_error to debug Then, iPython dies with a "Bus error." Rebooting does not help. My question is twofold.
Does anyone know what causes this error? (Google turned up an enormous variety of sources for this error in just about all software that uses malloc.h). And, is it possible to fix this without rebuilding SciPy. Rebuilding isn't really *that* big of a deal, but I'd like to know if there's an easier solution in case this thing rears its ugly head in the future. Thanks, Aaron From wbaxter at gmail.com Thu Sep 28 21:05:24 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 10:05:24 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: The numpy for matlab users page ( http://www.scipy.org/NumPy_for_Matlab_Users ) also list a number of pros and cons. So far, I find the biggest cons to numpy to be 1) integration of plotting is not as good as matlab. You have to be careful about calling "show()" in matplotlib because of event-loop integration issues. Also no good 3D plotting solution. MayaVi is supposed to be good, but it would be better if it were all just built into matplotlib. 2) integration of debugging is not as good as matlab. In matlab when you stop at a breakpoint in your code, you get an interactive console where you can probe current values in your program, or create new ones etc. The Wing IDE has this, but I couldn't find any open source IDEs that did this. --bb On 9/29/06, stephen emslie wrote: > > On 9/28/06, Travis Oliphant wrote: > > The purpose is not to go into detail about semantic differences, but > > document higher-level differences that might help somebody decide > > whether or not they could use NumPy instead of some other environment. > > I've started with a comparison to MATLAB, based on an email response I > > sent to a friend earlier today. 
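[Editor's note: Bill's second con above — no open-source tool offering a console that evaluates ordinary Python expressions against the paused program's variables — comes down to a small mechanism: evaluating a string in another frame's namespace. A minimal stdlib sketch of that mechanism (the function names `probe` and `simulate_breakpoint` are made up for illustration):]

```python
import sys

def probe(expr):
    """Evaluate expr as plain Python in the caller's frame, the way a
    Matlab-style debug console evaluates expressions at a breakpoint."""
    frame = sys._getframe(1)  # the frame of whoever called probe()
    return eval(expr, frame.f_globals, frame.f_locals)

def simulate_breakpoint():
    x = [1, 2, 3]
    scale = 10
    # Imagine the debugger has stopped here: a console wired to this
    # frame can evaluate arbitrary expressions against x and scale.
    return probe("scale * sum(x)")

print(simulate_breakpoint())  # prints 60
```

This is essentially what pdb's `p` command does under the hood; a GUI debug console just loops reading expressions from a widget and printing the results.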
From robert.kern at gmail.com Thu Sep 28 21:24:30 2006 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 28 Sep 2006 20:24:30 -0500 Subject: [SciPy-user] malloc incorrect checksum for freed object error using delaunay module In-Reply-To: References: Message-ID: <451C75CE.5090607@gmail.com> Aaron Hoover wrote: > Hi all, > > I'm using SciPy 0.5.1 with the delaunay module built from source on > OS X (Intel Mac). I'm also using iPython 0.7.2, NumPy 1.0.5b, and > matplotlib 0.87.5 (all built from source). > > I'm having the strangest problem - I've been able to use the delaunay > function in delaunay.triangulate just fine up until recently when I > triangulated a fairly large data set (2M points). > The triangulation succeeded (but took a while, of course), but now > whenever I attempt to use the delaunay function again, I get the > following error: > > Python(288,0xa000cf60) malloc: *** error for object 0x1b47e00: > incorrect checksum for freed object - object was probably modified > after being freed, break at szone_error to debug > Python(288,0xa000cf60) malloc: *** set a breakpoint in szone_error to > debug > > Then, iPython dies with a "Bus error." Rebooting does not help > > My question is twofold. Does anyone know what causes this error? > (Google turned up an enormous variety of sources for this error in > just about all software that uses malloc.h). I'll look into it. The Delaunay triangulation code is a great big ball of mud when it comes to memory handling. The core algorithm derives from an old C codebase that semi-manually manages a memory pool acquired using malloc. This is wrapped in some C++ upon which I added my own C++ from before I was terribly comfortable with the C++ and STL memory models. On top of *that*, there is the Python memory management. Fun. Please attach the script (or preferably, a minimal example that demonstrates the problem) to a ticket on the Trac, I would appreciate it. 
Due to spam, I'm afraid you will have to register an account. http://projects.scipy.org/scipy/scipy/register After that: http://projects.scipy.org/scipy/scipy/newticket The button to attach a file will show up after you initially create the ticket. I might not be able to reproduce your problem on the machines available to me (PPC OS X and AMD64 Linux), so you might want to follow the advice of the error message that you got and run your program under the gdb debugger. Something like the following should work:

[~]$ gdb python
GNU gdb 6.3.50-20050815 (Apple version gdb-477) (Sun Apr 30 20:06:22 GMT 2006)
Copyright 2004 Free Software Foundation, Inc.
GDB is free software, covered by the GNU General Public License, and you are
welcome to change it and/or distribute copies of it under certain conditions.
Type "show copying" to see the conditions.
There is absolutely no warranty for GDB. Type "show warranty" for details.
This GDB was configured as "powerpc-apple-darwin"...Reading symbols for shared libraries ... done
(gdb) break szone_error
Breakpoint 1 at 0x901143c4
(gdb) run my_script.py

> And, is it possible to fix this without rebuilding SciPy. Rebuilding > isn't really *that* big of a deal, but I'd like to know if there's an > easier solution in case this thing rears its ugly head in the > future. You won't be able to get around rebuilding the _delaunay extension module, but the rest of scipy you can leave alone. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
-- Umberto Eco From rmay at ou.edu Thu Sep 28 21:27:06 2006 From: rmay at ou.edu (Ryan May) Date: Thu, 28 Sep 2006 20:27:06 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: <451C766A.6070900@ou.edu> Bill Baxter wrote: > The numpy for matlab users page > ( http://www.scipy.org/NumPy_for_Matlab_Users ) > also list a number of pros and cons. > > 2) integration of debugging is not as good as matlab. In matlab when > you stop at a breakpoint in your code, you get an interactive console > where you can probe current values in your program, or create new ones > etc. The Wing IDE has this, but I couldn't find any open source IDEs > that did this. > > --bb > You might try the Eric IDE: http://www.die-offenbachs.de/detlev/eric3.html The GUI is a little complicated/full, but now that I've started to get the hang of it, I really like it. Instead of a console, it gives you a list of locals and globals while in debug mode. Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma rmay at rossby.ou.edu From fperez.net at gmail.com Thu Sep 28 21:37:50 2006 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 28 Sep 2006 19:37:50 -0600 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: On 9/28/06, Bill Baxter wrote: > The numpy for matlab users page > ( http://www.scipy.org/NumPy_for_Matlab_Users ) > also list a number of pros and cons. > > So far, I find the biggest cons to numpy to be > 1) integration of plotting is not as good as matlab. You have to be > careful about calling "show()" in matplotlib because of event-loop > integration issues. Also no good 3D plotting solution. 
MayaVi is > supposed to be good, but it would be better if it were all just built > into matplotlib. > 2) integration of debugging is not as good as matlab. In matlab when > you stop at a breakpoint in your code, you get an interactive console > where you can probe current values in your program, or create new ones > etc. The Wing IDE has this, but I couldn't find any open source IDEs > that did this. You may want to try ipython. It's a console program, not an IDE, but it does both of the above (no 3d plotting, just integrating 'intelligently' with mpl). I'll be happy to provide you with further details if you have questions. Cheers, f From wbaxter at gmail.com Thu Sep 28 21:42:28 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 10:42:28 +0900 Subject: [SciPy-user] Debug console - Re: Pros and Cons of Python verses other array environments Message-ID: On 9/29/06, Ryan May wrote: > Bill Baxter wrote: > > 2) integration of debugging is not as good as matlab. In matlab when > > you stop at a breakpoint in your code, you get an interactive console > > where you can probe current values in your program, or create new ones > > etc. The Wing IDE has this, but I couldn't find any open source IDEs > > that did this. > > > You might try the Eric IDE: > http://www.die-offenbachs.de/detlev/eric3.html > > The GUI is a little complicated/full, but now that I've started to get > the hang of it, I really like it. Instead of a console, it gives you a > list of locals and globals while in debug mode. Thanks. A number of free IDEs provide a list of watch variables or locals&globals (Wing included). But a list of variables isn't quite the same as being able to interactively test out functions and expressions on the values in question. That's what the debug console is great for. Oops! Got an error. Bad syntax, huh? Hmm, is this right? No, what about this? Nope. This? yeh that's it! Ok great. Fix code. Rerun.
I think the PyDev eclipse plugin actually has this now, but didn't when I was shopping around. Either way, Eclipse has a pretty steep learning curve. Anyway, I went ahead and bought a license for Wing. I think it was well worth it. Far cheaper than a license for Matlab, anyway. :-) The only thing it seems to lack that I would expect from a commercial IDE is tooltips to show the values of variables. --bb From perry at stsci.edu Thu Sep 28 21:44:43 2006 From: perry at stsci.edu (Perry Greenfield) Date: Thu, 28 Sep 2006 21:44:43 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> Message-ID: I pretty much agree with Rob's points. Ease of installation and having all the basic tools packaged together (note, this doesn't mean everything) is a big deal to many users. Despite others' comments to the contrary, that Python is more general than matlab will make it less seamless. Beginners can get confused over simple things like why math.add doesn't work on an array. The fact is that most of the rest of the Python world isn't array aware (and don't care). Matlab or IDL pretty much ensure that everything integrates with arrays. I don't think this issue will ever go away. We can minimize it, but we will never (imho) beat matlab in this respect. On the other hand, we can sell people on the fact that Python is much more useful for other things so they don't have to learn some other tool for those. Perry On Sep 28, 2006, at 2:13 PM, Rob Hetland wrote: > > I think everybody on this list is pretty familiar with the pros of > PyLab (although a good set of arguments in one place is a Good Idea). > > Perhaps it would be productive for us to start a discussion of cons > here on the list, and decide how to mitigate them. Here I give some > of my impressions on how the novice might see PyLab. > > I am teaching a class on data assimilation this semester, and I have > decided to use PyLab. 
I had used MATLAB in the past, and I had many > students who weren't familiar with how it worked and there were > licensing issues. Most students could eventually figure out how to > deal with the language; vectorization is always a tough thing to > get. Also, they could purchase a student version of MATLAB for about > $50 -- not free, but cheaper than many college textbooks. > > Using PyLab, I have a different set of issues. First is > installation. I recommended Enthon python to the PC users, and I > helped the Mac users deal with the various distributions. Even with > a number of clickable installer packages, putting python on their > computers was not straightforward. > > Then there is the issue with Python itself. Python is a more > powerful language than MATLAB's core programming language, and > students have a slightly steeper learning curve figuring all of that > out. They suddenly have to deal with importing packages, zero-based > indexing, many different kinds of sequences, methods, and a host of > other issues. Additionally, numpy is a more powerful array tool, in > my opinion, but it is also harder to learn. In the final analysis, > students had similar issues working with MATLAB, but my impression is > that they find MATLAB slightly easier. Also, many of them come to > the class with MATLAB experience -- none of them have used python, > let alone any of the scientific packages, before. In the long run, > python is clearly better, but in the short term, I think MATLAB might > be simpler. > > Finally, I think that one of the reasons MATLAB is successful is that > it includes everything together, in one place. It does some things > very poorly, but I was always willing to put up with MATLAB's > weaknesses to remain within a single environment. PyLab is starting > to feel like that, but there are still some pretty clear boundaries > between the packages, and an almost overwhelming array of choices if > you want to look for them. 
This is good if you are a geek and want > plotting package X or numeric package Y and you still want everything > else to work, but bad if you are a novice and just want things to > work simply and smoothly together. > > Thus, the first thing that would improve PyLab usage the most, for > the majority of people migrating from MATLAB, would be a cohesive, > easy to install package. Like Enthon for a wide variety of > platforms. I think this is a goal that could be achieved in a > year or so, given how things are going now. > > Also, a good tutorial would be essential. This would be less > complete and less formal than actual NumPy documentation, and would > include information on using all of the PyLab packages together. > There is a good start to this on the scipy wiki. > > Third, I think it is important for all of the packages to work > together. This has certainly happened in terms of compilation -- mpl > and numpy have been released in lockstep so they work together > despite major changes in numpy. The Python+NumPy+Scipy+Matplotlib > +IPython suite seems like a great place to start, and I think more > could be done to make these tools seem like one seamless superpackage. > > -Rob > > > On Sep 28, 2006, at 12:09 PM, Fernando Perez wrote: >> On 9/28/06, Travis Oliphant wrote: >>> >>> Hi all, >>> >>> I've started a possibly controversial but hopefully informative page >>> that tries to list some of the advantages of using >>> Python+NumPy+Scipy+Matplotlib+IPython (I'm calling that combination >>> PyLab) versus other array environments. >> >> Great, I think this is important to have. For reference, the link >> is: >> >> http://www.scipy.org/NumPyProConPage >> >> Cheers, >> >> f >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-user > > ---- > Rob Hetland, Associate Professor > Dept.
of Oceanography, Texas A&M University > http://pong.tamu.edu/~rob > phone: 979-458-0096, fax: 979-845-6331 > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From wbaxter at gmail.com Thu Sep 28 21:46:17 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 10:46:17 +0900 Subject: [SciPy-user] Debug console - Re: Pros and Cons of Python verses other array environments In-Reply-To: References: Message-ID: > Anyway, I went ahead and bought a license for Wing. I think it was > well worth it. Far cheaper than a license for Matlab, anyway. :-) > The only thing it seems to lack that I would expect from a commercial > IDE is tooltips to show the values of variables. Oh yeh, and autocompletion in the console. It doesn't do autocompletion in the console (but does in the source code windows). --bb From rshepard at appl-ecosys.com Thu Sep 28 21:55:06 2006 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Thu, 28 Sep 2006 18:55:06 -0700 (PDT) Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451C766A.6070900@ou.edu> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451C766A.6070900@ou.edu> Message-ID: On Thu, 28 Sep 2006, Ryan May wrote: >> 2) integration of debugging is not as good as matlab. In matlab when >> you stop at a breakpoint in your code, you get an interactive console >> where you can probe current values in your program, or create new ones >> etc. The Wing IDE has this, but I couldn't find any open source IDEs >> that did this. While winpdb is not a full-blown IDE, it is a gui front end to pdb. It does a very good job of showing you what's going on, and you can examine the value of variables and do other explorations at breakpoints or while stepping over (or through) your code. Rich -- Richard B. Shepard, Ph.D. 
| The Environmental Permitting Applied Ecosystem Services, Inc.(TM) | Accelerator Voice: 503-667-4517 Fax: 503-667-8863 From wbaxter at gmail.com Thu Sep 28 21:57:19 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 10:57:19 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: On 9/29/06, Fernando Perez wrote: > On 9/28/06, Bill Baxter wrote: > > The numpy for matlab users page > > ( http://www.scipy.org/NumPy_for_Matlab_Users ) > > also list a number of pros and cons. > > > > So far, I find the biggest cons to numpy to be > > 1) integration of plotting is not as good as matlab. You have to be > > careful about calling "show()" in matplotlib because of event-loop > > integration issues. Also no good 3D plotting solution. MayaVi is > > supposed to be good, but it would be better if it were all just built > > into matplotlib. > > 2) integration of debugging is not as good as matlab. In matlab when > > you stop at a breakpoint in your code, you get an interactive console > > where you can probe current values in your program, or create new ones > > etc. The Wing IDE has this, but I couldn't find any open source IDEs > > that did this. > > You may want to try ipython. It's a console program, not an IDE, but > it does both of the above (no 3d plotting, just integrating > 'intelligently' with mpl). I'll be happy to provide you with further > details if you have questions. Hey Fernando. I actually broke down and started using ipython instead of pyCrust recently, despite my dislike for being stuck in the lame Windows console. It is a great shell, (love the ? and "func arg" --> "func(arg)" features). Just sad that it's locked into text mode. Hopefully the ipython1 project will keep moving along so we can have ipython in a GUI before long. As for its debug console ability, I wasn't aware of that. 
I knew you could have it trigger the debugger at a breakpoint, but then you drop into a prompt with debugger syntax rather than normal python syntax, right? So you have to prefix every command with something. Is there some other mode that I'm not aware of that gives you a regular console? No gui also means setting breakpoints by line number or function name, no? --bb From wbaxter at gmail.com Thu Sep 28 22:11:09 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 11:11:09 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451C766A.6070900@ou.edu> Message-ID: I checked out winpdb, but I didn't see any debug console the likes of Matlab or Wing IDE. I recall that it also integrates pretty well with Stani's Python Editor (SPE) to make something very close to an IDE, but I couldn't find the equivalent of the debug console I was looking for in that combo. Looking here: http://www.digitalpeers.com/pythondebugger/images/screenshot_winpdb.png there is a big 'console' window, but it looks like that 'console' is a console for the debugger, rather than a regular python shell, like a GUI version of what ipython gives you. That means you issue _debugger_ commands at the prompt, rather than plain old python expressions. IIRC, one of those debugger commands is indeed "eval a python expression", but that's an unnecessary layer of inconvenience. If there is a way to get a real python shell executing in the context of the program being currently debugged with ipython or with winpdb, I would certainly like to know about it, and would like to advertise that ability on the numpy for matlab users page. I was really quite surprised to not find any debuggers or IDEs that had this very useful feature of matlab.
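[Editor's note: the "real python shell in the context of the program being debugged" that Bill asks for can be approximated with the stdlib `code` module, which is the same building block PyCrust-style shells use. A sketch under those assumptions — the names `console_for` and `paused_function` are hypothetical, and the console is driven programmatically here just to show that plain Python statements run against the paused frame's names; a terminal session would call `.interact()` instead:]

```python
import code

def console_for(namespace):
    # A genuine Python console whose namespace is the paused program's
    # variables; a GUI debugger would wire its input/output to a widget.
    return code.InteractiveConsole(locals=namespace)

def paused_function():
    a, b = 3, 4
    ns = dict(globals(), **locals())   # snapshot of this frame's names
    shell = console_for(ns)
    # Plain Python, no debugger-command prefix needed:
    shell.push("c = (a**2 + b**2) ** 0.5")
    return ns["c"]

print(paused_function())  # prints 5.0
```

The missing piece in 2006-era tools was not the console itself but hooking one up to the debugger's current frame, which is exactly what `.interact()` on such a console provides.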
--bb On 9/29/06, Rich Shepard wrote: > On Thu, 28 Sep 2006, Ryan May wrote: > > >> 2) integration of debugging is not as good as matlab. In matlab when > >> you stop at a breakpoint in your code, you get an interactive console > >> where you can probe current values in your program, or create new ones > >> etc. The Wing IDE has this, but I couldn't find any open source IDEs > >> that did this. > > While winpdb is not a full-blown IDE, it is a gui front end to pdb. It > does a very good job of showing you what's going on, and you can examine the > value of variables and do other explorations at breakpoints or while > stepping over (or through) your code. > > Rich > From fperez.net at gmail.com Thu Sep 28 22:49:59 2006 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 28 Sep 2006 20:49:59 -0600 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: On 9/28/06, Bill Baxter wrote: > Hey Fernando. I actually broke down and started using ipython > instead of pyCrust recently, despite my dislike for being stuck in the > lame Windows console. It is a great shell, (love the ? and "func arg" > --> "func(arg)" features). Just sad that it's locked into text mode. > Hopefully the ipython1 project will keep moving along so we can have > ipython in a GUI before long. It's moving along... > As for its debug console ability, I wasn't aware of that. I knew you > could have it trigger the debugger at a breakpoint, but then you drop > into a prompt with debugger syntax rather than normal python syntax, > right? So you have to prefix every command with something. Is there > some other mode that I'm not aware of that gives you a regular > console? Well, the ipdb console is a primitive python console, but any single-line expression which is valid python will be directly evaluated. 
In addition, it has a few extra commands (type help to see them). I happen to find it quite satisfactory for most of my needs, but I'm sure better could be done. > No gui also means setting breakpoints by line number or function name, no? Yes, that's certainly true. pdb is fairly gdb-like in that respect. I'm certainly /not/ claiming ipython to be an IDE, simply that it does have some useful features. For a certain class of users, probably those who prefer the emacs/vi/favorite editor + terminal combination to an IDE, it seems to do the trick quite nicely. And yes, things are moving along to make sure that it's even better in the future, with GUI integration, notebook-type environments and lots more. It's slowly but surely coming together. Cheers, f From jdhunter at ace.bsd.uchicago.edu Thu Sep 28 22:48:22 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 28 Sep 2006 21:48:22 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: (Rob Hetland's message of "Thu, 28 Sep 2006 15:57:14 -0500") References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> Message-ID: <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Rob" == Rob Hetland writes: Rob> All of the arguments made *for* PyLab are true -- you think Rob> so too, or you wouldn't be reading this. I have been a huge Rob> proponent of PyLab, and have taught seminars on it here at Rob> Texas A&M and Woods Hole to people who primarily use MATLAB. Rob> I have heard a number of objections or excuses that it all Rob> looks good, but..... 
- it's hard to install - I already know Rob> how to use MATLAB, and it works fine for me - when do I find Rob> a week (or month or semester) to learn a new programming Rob> language - I already have so many m-files that I would need Rob> to rewrite

The first thing this thread makes me think is: why does wikipedia work but wikis for scientific python not. If we followed Travis' lead and aggregated the collective wisdom on this thread into the wiki page, we would have something enduring for the masses. As it is, only geeks like us who read mailing lists or archives will benefit from it. Maybe this points to the problem: the primary users and developers of scientific computing in python are sufficiently technologically literate that they not only overcome the additional complexity, they need it and crave it.

I was a huge matlab user for almost a decade; I tried to write a book about matlab (see http://matplotlib.sf.net/matlab_cookbook.pdf, unfortunately as incomplete as the mpl cookbook and other documentation). At some point I "hit the wall" and could no longer be productive in matlab. The extra overhead of managing complex data structures, developing complex GUIs, and working with networked data and databases was consuming most of my programming energy. Yes, matlab provides you a simple, comprehensive interface and a fairly complete set of numerical libs, but when you want to work with complex data in a realistic networked environment, you hit the limits of the language and environment pretty hard. Then you rewrite what you like about matlab in python and get on with it.

matlab is a great tool for beginners and intermediates. For experts, it has limitations which are hard to overcome. My advice to students: if you aspire to be an expert, bite the bullet now and build a set of tools that can scale with you on your ascent. Also, realize that The Mathworks is like the crack dealer on the street: the first hit is free; once you are addicted it becomes quite expensive.
An academic license or a student version sells for under $100. If you are a business and need the important toolkits, you are looking at 50K per year. If you are an entrepreneurial student and dream of starting your own business once you graduate, ask yourself what you could do with the extra cash saved from a single site license. If your fledgling business grows, ask yourself what you can do with the cash saved from 50 site licenses (hint, that is 2.5 million dollars a year). If you are ready to spend the 2.5 million dollars, fine, but first try the following exercises in matlab and python * download and parse a CSV file from a web server, eg http://ichart.finance.yahoo.com/table.csv?s=INTC&d=8&e=29&f=2006&g=d&a=6&b=9&c=1986&ignore=.csv (for a python implementation, see the matplotlib.finance module) * fill out a web CGI form in matlab (hint: you can do it with the embedded JVM, a virtual machine running in a virtual machine) * query a mysql database on linux, win32, and OS X with the same script and populate an array with the results Now how much would you pay? PS: it's been a while since I looked at that matlab cookbook I was working on. I find the following sections of the matlab PDF linked above fun in a historical light:: Alternatives to matlab I am a devotee of open source software. I (almost exclusively) use linux as an operating system, emacs as an integrated development environment, python for small and large scale programming, C++ for numerics, and so on. Matlab is the only commercial piece of software I use regularly. I really don't want to use it, mainly because it is so expensive. I work in an academic environment, where site licenses go for the incredibly cheap price of $75 per year, toolboxes included. Check out the commercial price list to get an idea of just how expensive it is outside of academia. I'll give you a hint. About as much as a new Lexus sport utility vehicle. 
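The first exercise above (download and parse a CSV file) takes only a few lines of standard-library Python. Here is a minimal sketch; since the yahoo URL above is long dead, it parses an inline string with the same column layout (the numbers are illustrative, not real quotes):

```python
import csv
import io

# A couple of rows in the Yahoo Finance table.csv layout (made-up data)
raw = """Date,Open,High,Low,Close,Volume,Adj Close
2006-09-29,20.10,20.30,19.90,20.25,1000000,20.25
2006-09-28,19.80,20.15,19.75,20.10,900000,20.10
"""

# For a live file, one would read from urllib.request.urlopen(url) instead
rows = list(csv.DictReader(io.StringIO(raw)))
closes = [float(r["Close"]) for r in rows]
print(closes)  # [20.25, 20.1]
```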
So aside from my support for GNU and linux and open source software, I don't want to wake up some day outside the folds of academia having to pay for matlab. Every day I use matlab is another set of plotting and analysis functions that I come to rely on, which makes it increasingly hard to go cold turkey. Every once in a while I make an aborted attempt to give it up (I know it's not good for me) but I always find myself coming back. The main reason is the graphics -- the ease with which I can make publication quality figures that I just haven't found in competing, open source, free as in Richard Stallman (http://www.gnu.org/philosophy/free-sw.html), solutions. Free alternatives * python -- python is the one true language. I have written extensively in perl, C++, FORTRAN, BASIC, and yes matlab, and in python I have found the one true language. I say that with tongue in cheek -- there is no one true language, because the strengths of a language often imply its weaknesses. The classic trade offs between user friendliness and power, expressiveness and readability, development time and execution time. python solves all these problems for me because it is so clear syntactically, has so many great libraries built in, and so many great external libraries. In the final category, relevant to this discussion, is numpy (http://www.pfdubois.com/numpy) and its recent successor scipy (http://www.scipy.org). These libraries provide efficient C/C++/FORTRAN libraries, all wrapped in python, that give you a huge array of highly tested, optimized, numerical libraries, for free. And you can read and modify the source code at will, in large part obviating the classic problem of closed source (matlab) libraries. That in a few years, when another platform is dominant, your solution of today is no longer supported. With open source, your solution is supported as long as users continue to use it and support it. 
SGI was the proprietary platform of choice for high performance graphics software 5 years ago. Today, support and maintenance have become increasingly difficult and expensive. And while numerous graphics packages for scipy exist, none compare to the breadth, ease of use, generality and quality of the matlab libraries. Yet. As a general rule, open source solutions follow excellent closed-source solutions with a short time lag (witness the gimp, an excellent drop-in replacement for Photoshop). So keep your eye on python for standardized, excellent graphics solutions in the near future. If you want to split the difference, python does support an interface to matlab called pymat (http://claymore.engineer.gvsu.edu/~steriana/Python/pymat.html), so you can do your number crunching in numpy, and pass the results off to matlab for plotting, thus minimizing your dependence on matlab until the final step of producing graphical output.

* octave (http://www.octave.org) Octave is an open source clone of matlab. Many m-files will run in octave without changes. But when you start to make plots, you'll hit incompatibilities. octave uses gnuplot for plotting, and the support, particularly for handle graphics, is limited, as is the quality of the graphics produced.

JDH

From peridot.faceted at gmail.com Thu Sep 28 23:17:23 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Thu, 28 Sep 2006 23:17:23 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: On 28/09/06, John Hunter wrote: > >>>>> "Rob" == Rob Hetland writes: > > The first thing this thread makes me think is: why does wikipedia work > but wikis for scientific python not.
If we followed Travis' lead and > aggregated the collective wisdom on this thread into the wiki page, we > would have something enduring for the masses. As it is, only geeks > like us who read mailing lists or archives will benefit from it. > Maybe this points to the problem: the primary users and developers of > scientific computing in python are sufficiently technologically > literate that they not only overcome the additional complexity, they > need it and crave it. As an ex-Wikipedia addict, my first thought was to go and start hacking on the page. But there's no discussion page! You have to just go in and start mangling it and hope nobody minds (because if they do, all they can do is start changing it back, with acerbic comments...) A. M. Archibald From jdhunter at ace.bsd.uchicago.edu Thu Sep 28 23:08:18 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 28 Sep 2006 22:08:18 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: ("A. M. Archibald"'s message of "Thu, 28 Sep 2006 23:17:23 -0400") References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <8764f7id3x.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "A" == A M Archibald writes: A> As an ex-Wikipedia addict, my first thought was to go and start A> hacking on the page. But there's no discussion page! You have A> to just go in and start mangling it and hope nobody minds A> (because if they do, all they can do is start changing it back, A> with acerbic comments...) I could be wrong, but I think it safe to say that unless you are a psychopath, we would all be much obliged if you just jumped in and started hacking on the page. 
JDH

From gael.varoquaux at normalesup.org Thu Sep 28 23:29:37 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 29 Sep 2006 05:29:37 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <20060929032937.GA20056@clipper.ens.fr>

On Thu, Sep 28, 2006 at 11:17:23PM -0400, A. M. Archibald wrote: > As an ex-Wikipedia addict, my first thought was to go and start > hacking on the page. But there's no discussion page! You have to just > go in and start mangling it and hope nobody minds (because if they do, > all they can do is start changing it back, with acerbic comments...)

I think you should just go ahead and do it, people don't fight over the wiki at scipy.org. If you have controversial changes to do, discuss them on the mailing list, but elsewhere just edit the page.

Speaking of which, John, can we quote you on the wiki ? And a question to everybody, where should we put that quote ? On a dedicated page with a link to it from the ProConPage ?

Gaël

From peridot.faceted at gmail.com Thu Sep 28 23:45:29 2006 From: peridot.faceted at gmail.com (A. M. Archibald) Date: Thu, 28 Sep 2006 23:45:29 -0400 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060929032937.GA20056@clipper.ens.fr> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> <20060929032937.GA20056@clipper.ens.fr> Message-ID: On 28/09/06, Gael Varoquaux wrote: > On Thu, Sep 28, 2006 at 11:17:23PM -0400, A. M. Archibald wrote: > > As an ex-Wikipedia addict, my first thought was to go and start > > hacking on the page. But there's no discussion page!
You have to just > > go in and start mangling it and hope nobody minds (because if they do, > > all they can do is start changing it back, with acerbic comments...) > > I think you should just go ahead and do it, people don't fight over the > wiki at scipy.org. If you have controversial changes to do, discuss them > on the mailing list, but elsewhere just edit the page. > > Speaking of which, John, can we quote you on the wiki ? And a > question to everybody, where should we put that quote ? On a dedicated > page with a link to it from the ProConPage ? Well, I'll let you judge about the psychopathy, but I created a discussion page and put some discussion there (all mine, so far, but please do add your own). A. M. Archibald From jdhunter at ace.bsd.uchicago.edu Thu Sep 28 23:40:37 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 28 Sep 2006 22:40:37 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060929032937.GA20056@clipper.ens.fr> (Gael Varoquaux's message of "Fri, 29 Sep 2006 05:29:37 +0200") References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> <20060929032937.GA20056@clipper.ens.fr> Message-ID: <87odszbaru.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Gael" == Gael Varoquaux writes: Gael> Speaking of which, John, can we quote you on the wiki ? And Gael> a question to everybody, where should we put that quote ? On Gael> a dedicated page with a link to it from the ProConPage ? My general policy on people quoting me is "anywhere and everywhere" :-) I think we should concentrate on making the main page as comprehensive and useful as possible. So put as much there as you can. If you want to summarize arguments on the main page and provide links to supporting material do so. It's a wiki! 
JDH

From otto at tronarp.se Fri Sep 29 03:24:42 2006 From: otto at tronarp.se (Otto Tronarp) Date: Fri, 29 Sep 2006 09:24:42 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> Message-ID: <20060929092442.bw9xmhiuns4484kw@mathcore.kicks-ass.org>

Quoting Perry Greenfield : > Despite others' comments to the contrary, that Python is more general > than matlab will make it less seamless. Beginners can get confused > over simple things like why math.add doesn't work on an array. The > fact is that most of the rest of the Python world isn't array aware > (and don't care). Matlab or IDL pretty much ensure that everything > integrates with arrays.

This touches the subject of inconsistency, with math.add as one example. One of my "favorite" peeves is how the size (shape) is given to functions that generate matrices. In some functions the size is given as a tuple, in others the sizes in the different dimensions are given as separate arguments. Here are some examples. To create an array with shape M, N you do:

    zeros((M, N))
    ones((M, N))
    rand(M, N)
    eye(M, N)

I'm sure more examples exist. Is it only me that finds that incredibly irritating?

Otto

From david at ar.media.kyoto-u.ac.jp Fri Sep 29 03:44:01 2006 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 29 Sep 2006 16:44:01 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: <451CCEC1.6060608@ar.media.kyoto-u.ac.jp>

Bill Baxter wrote: > > Hey Fernando. I actually broke down and started using ipython > instead of pyCrust recently, despite my dislike for being stuck in the > lame Windows console. It is a great shell, (love the ? and "func arg" > --> "func(arg)" features).

Every time I am stuck on windows, I have the same problem.
I don't know if it is doable, but you may want to see if ipython can work inside console (I don't know how is the console application separated from the shell, and if pyreadline can be used inside something else than cmd.exe): http://sourceforge.net/projects/console/ At least, you can change the font, have tab consoles, and change the size of the window. David From fperez.net at gmail.com Fri Sep 29 03:49:31 2006 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 29 Sep 2006 01:49:31 -0600 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451CCEC1.6060608@ar.media.kyoto-u.ac.jp> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451CCEC1.6060608@ar.media.kyoto-u.ac.jp> Message-ID: On 9/29/06, David Cournapeau wrote: > Bill Baxter wrote: > > > > Hey Fernando. I actually broke down and started using ipython > > instead of pyCrust recently, despite my dislike for being stuck in the > > lame Windows console. It is a great shell, (love the ? and "func arg" > > --> "func(arg)" features). > everytime I am stuck on windows, I have the same problem. I don't know > if it is doable, but you may want to see if ipython can work inside > console (I don't know how is the console application separated from the > shell, and if pyreadline can be used inside something else than cmd.exe): > > > http://sourceforge.net/projects/console/ > > At least, you can change the font, have tab consoles, and change the > size of the window. Just yesterday it was reported on the ipython list that it works like a charm. Ville Vainio (the trunk maintainer) was very pleased with it, after another user mentioned it on the list. It sounds like a good alternative for those who are forced to use Windows for one reason or another, and would like a sane terminal to work in. 
Cheers, f From david at ar.media.kyoto-u.ac.jp Fri Sep 29 03:58:48 2006 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 29 Sep 2006 16:58:48 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: <451CD238.3040509@ar.media.kyoto-u.ac.jp> Bill Baxter wrote: > The numpy for matlab users page > ( http://www.scipy.org/NumPy_for_Matlab_Users ) > also list a number of pros and cons. > > So far, I find the biggest cons to numpy to be > 1) integration of plotting is not as good as matlab. You have to be > careful about calling "show()" in matplotlib because of event-loop > integration issues. Also no good 3D plotting solution. MayaVi is > supposed to be good, but it would be better if it were all just built > into matplotlib. > 2) integration of debugging is not as good as matlab. In matlab when > you stop at a breakpoint in your code, you get an interactive console > where you can probe current values in your program, or create new ones > etc. The Wing IDE has this, but I couldn't find any open source IDEs > that did this. > > Concerning point 1, matplotlib is better than matlab for some things, but much worse for other (it can also be that I am just clueless). For example, interactive plot is not great with matplotlib (zoom and so), and much slower for redrawing (but I think this is a consequence of the flexibility of matplotlib). As Someone said it before, matlab makes it easier for beginners; but once you hit the wall, you hit it very hard :) I am using scipy for a few months now, after several years of matlab which I consider myself at least moderately knowledgeable about, and there is no coming back for me. 
After 2 weeks, I was more or less as efficient in numpy as I was in matlab. The things I consider much better in matlab are:

- interactive plots
- profiling -> this is the thing I am missing the most. If one of my modules is slow, I find it really hard to find where the problems are compared to matlab.
- size of community: in machine learning and signal processing, matlab is kind of pervasive. In machine learning, particularly, you have tons of software available on the internet for free.

Something nobody has mentioned before for pros for numpy is integration with C code: this is really clumsy in matlab, and it works great with python (ctypes, swig, boost, pyrex are all great tools for that, depending on what you want to do).

David

From wbaxter at gmail.com Fri Sep 29 04:11:32 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 17:11:32 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451CD238.3040509@ar.media.kyoto-u.ac.jp> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451CD238.3040509@ar.media.kyoto-u.ac.jp> Message-ID:

> Something nobody has mentioned before for pros for numpy is integration > with C code: this is really clumsy in matlab, and it works great with > python (ctypes, swig, boost, pyrex are all great tools for that, > depending on what you want to do).

It is on the web page that started the discussion: "very easy to extend in C/C++ or Fortran" But I agree. It's nicer with python/numpy. Both extending with C/C++ and embedding in C/C++. I think I would say "easy" rather than "very easy", but that's quibbling...
:-) --bb From wbaxter at gmail.com Fri Sep 29 04:14:45 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 17:14:45 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451CD238.3040509@ar.media.kyoto-u.ac.jp> Message-ID: Given the amount of discussion here, I'm starting to think maybe each of the lines on the comparison page (http://www.scipy.org/NumPyProConPage) should be a link to a whole page of discussion just about that particular issue. --bb On 9/29/06, Bill Baxter wrote: > > Something nobody has mentioned before for pros for numpy is integration > > with C code: this is really clumsy in matlab, and it works great with > > python (ctypes, swig, boost, pyrex are all great tools for that, > > depending on what you want to do). > > It is on the web page that started the discussion: > "very easy to extend in C/C++ or Fortran" > > But I agree. It's nicer with python/numpy. Both extending with C/C++ > and embedding in C/C++. I think I would say "easy" rather than "very > easy", but that quibbling... :-) > > --bb > From f.braennstroem at gmx.de Fri Sep 29 04:17:50 2006 From: f.braennstroem at gmx.de (Fabian Braennstroem) Date: Fri, 29 Sep 2006 10:17:50 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: Hi John, * John Hunter wrote: >>>>>> "Rob" == Rob Hetland writes: > > Rob> All of the arguments made *for* PyLab are true -- you think > Rob> so too, or you wouldn't be reading this. I have been a huge > Rob> proponent of PyLab, and have taught seminars on it here at > Rob> Texas A&M and Woods Hole to people who primarily use MATLAB. 
> Rob> I have heard a number of objections or excuses that it all > Rob> looks good, but..... - it's hard to install - I already know > Rob> how to use MATLAB, and it works fine for me - when do I find > Rob> a week (or month or semester) to learn a new programming > Rob> language - I already have so many m-files that I would need > Rob> to rewrite > [...] > So aside from my support for GNU and linux and open source software, I > don't want to wake up some day outside the folds of academia having to > pay for matlab. Every day I use matlab is another set of plotting and > analysis functions that I come to rely on, which makes it increasingly > hard to go cold turkey. Every once in a while I make an aborted attempt > to give it up (I know it's not good for me) but I always find > myself coming back. The main reason is the graphics -- the ease with > which I can make publication quality figures that I just haven't found > in competing, open source, free as in Richard Stallman > (http://www.gnu.org/philosophy/free-sw.html), solutions.

Pretty interesting! I hope I can convince my Prof to go for python using your arguments ... but I do not understand that there is a problem with publication quality graphics. For 2D matplotlib should be enough; for 3D mayavi2/tvtk and python-vtk should be able to produce quality graphics too!?

Greetings! Fabian

From wbaxter at gmail.com Fri Sep 29 04:26:22 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 17:26:22 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060929092442.bw9xmhiuns4484kw@mathcore.kicks-ass.org> References: <451BFBD7.4080701@ieee.org> <20060929092442.bw9xmhiuns4484kw@mathcore.kicks-ass.org> Message-ID: On 9/29/06, Otto Tronarp wrote: > This touches the subject of inconsistency with math.add as one > example. One of my "favorite" peeves is how the size (shape) is given > to functions that generate matrices.
In some functions the size is > given as a tuple, in other the size in different dimensions are given > as seperate arguments. Here are some examples: > > To to create an array with shape M, N you do: > zeros((M, N)) > ones((M, N)) > rand(M, N) > eye(M, N) > > I'm sure more examples exists. Yep, repmat(A,M,N) is another one. http://projects.scipy.org/scipy/numpy/ticket/292 But I think you'll find that rand at least is no longer there. Instead you're supposed to use random.random((M,N)) eye(M,N) is considered to be ok, because 1-D eye is pretty useless, and it's not clear what you'd want out of N-D identity matrix. There has been a lot of discussion about trying to unify on shape args always being tuples. --bb From willemjagter at gmail.com Fri Sep 29 04:30:54 2006 From: willemjagter at gmail.com (William Hunter) Date: Fri, 29 Sep 2006 10:30:54 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <87odszbaru.fsf@peds-pc311.bsd.uchicago.edu> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> <20060929032937.GA20056@clipper.ens.fr> <87odszbaru.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <8b3894bc0609290130m1652612r1b0436ccd96f9de3@mail.gmail.com> As a newbie and non-programmer... If I could alter my knowledge/experiences in a flash, I would add some (not exactly sure what though) C programming after doing a thorough Python course. Then I would read a few good articles on open source and what it is. Get up to speed with the terminology used in mailing lists and what they refer to, and how to use a mailing list. Get people excited about open source. I think many first years (and non-programmers like myself) will then better understand what PyLab is about. Also, I realised that I need to take my MATLAB cap off when working with PyLab. 
When I do that, I tend to get to solutions quicker. I think one should come to the party with an attitude of "This is a different tool to solve engineering/science problems with." My biggest gripe is the inconsistent look of the wikis, but I suppose I could change that. Lot of work for nothing really... As for easy installation: Enthought's a beauty (bit big though). A Windows PyLab installer and a Linux PyLab deb or rpm would be great... Is this realistic though, taking into account all the Linux flavours out there?

-- WH

From david at ar.media.kyoto-u.ac.jp Fri Sep 29 04:43:21 2006 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 29 Sep 2006 17:43:21 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <451CDCA9.3030608@ar.media.kyoto-u.ac.jp>

Fabian Braennstroem wrote: > > Pretty interesting! I hope I can convince my Prof to go for > python using your arguments ... but I do not > understand that there is a problem with publication quality > graphics. For 2D matplotlib should be enough; for 3D > mayavi2/tvtk and python-vtk should be able to produce > quality graphics too!? > >

I don't really understand either. As I said earlier, I have some gripes about matplotlib, but the export is not one of them; I would say it is a strong point of matplotlib, on the contrary. I always find matlab's way of exporting clumsy, and LaTeX support in matplotlib is easier to use, too. The only problem I ever had with matplotlib when exporting to eps for articles was wasting half an hour to find a way to get a transparent background.
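For the record, that transparent background comes down to a single keyword argument to savefig. A minimal sketch, assuming a reasonably recent matplotlib (the Agg backend and the file name are arbitrary choices):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no GUI needed
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])
# transparent=True suppresses the opaque figure/axes background in the output
fig.savefig("figure.eps", transparent=True)
```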
David

From otto at tronarp.se Fri Sep 29 04:47:32 2006 From: otto at tronarp.se (Otto Tronarp) Date: Fri, 29 Sep 2006 10:47:32 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <20060929092442.bw9xmhiuns4484kw@mathcore.kicks-ass.org> Message-ID: <20060929104732.rj97r8wi680sgccw@mathcore.kicks-ass.org>

Quoting Bill Baxter : > On 9/29/06, Otto Tronarp wrote: >> This touches the subject of inconsistency with math.add as one >> example. One of my "favorite" peeves is how the size (shape) is given >> to functions that generate matrices. In some functions the size is >> given as a tuple, in others the size in different dimensions are given >> as separate arguments. Here are some examples: >> >> To create an array with shape M, N you do: >> zeros((M, N)) >> ones((M, N)) >> rand(M, N) >> eye(M, N) >> >> I'm sure more examples exist. > > Yep, > repmat(A,M,N) > is another one. > http://projects.scipy.org/scipy/numpy/ticket/292 > > But I think you'll find that rand at least is no longer there. > Instead you're supposed to use random.random((M,N)) > > eye(M,N) is considered to be ok, because 1-D eye is pretty useless, > and it's not clear what you'd want out of N-D identity matrix.

That is a valid point, but I would still argue for unified use of shape args. Assume that I have a function that does some calculations on a matrix and adds another, like this:

    def foo(A, mxCreateFunc):
        # do some stuff with A
        return A + mxCreateFunc(A.shape)

If we had unified shape args I could do:

    foo(A, random)  # Add some noise
    foo(A, zeros)   # Don't add noise
    foo(A, eye)     # ??

I must admit that I don't have any real world use cases for eye at hand, but I'm sure they are out there...

Otto
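The foo idea above can be made runnable with a small wrapper that accepts either calling style, a tuple or separate ints. A sketch assuming numpy; the names as_shape, noise, and zeros here are illustrative wrappers, not numpy API:

```python
import numpy as np

def as_shape(args):
    # Accept either a single tuple (zeros-style) or separate ints (rand-style)
    if len(args) == 1 and isinstance(args[0], tuple):
        return args[0]
    return args

def zeros(*args):
    return np.zeros(as_shape(args))

def noise(*args):
    return np.random.random(as_shape(args))

def foo(A, mx_create_func):
    # do some stuff with A, then add a generated matrix of the same shape
    return A + mx_create_func(A.shape)

A = np.eye(3)
B = foo(A, zeros)   # adds a zero matrix, so B equals A
C = foo(A, noise)   # adds uniform noise
print(zeros(2, 3).shape == zeros((2, 3)).shape)  # True: both styles work
```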
> > --bb > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From wbaxter at gmail.com Fri Sep 29 04:51:56 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 17:51:56 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451CDCA9.3030608@ar.media.kyoto-u.ac.jp> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> <451CDCA9.3030608@ar.media.kyoto-u.ac.jp> Message-ID: On 9/29/06, David Cournapeau wrote: > Fabian Braennstroem wrote: > > > >... but I do not > > understand that there is a problem with publication quality > > graphics. For 2D matplotlib should be enough; for 3D > > mayavi2/tvtk and python-vtk should be able to produce > > quality graphics too!? > > > > > I don't really understand either. As I said earlier, I have some grips > about matplotlib, but the export is not one of them, I think John's gripe about plotting was from a day gone by: John Hunter said: "PS: it's been a while since I looked at that matlab cookbook I was working on. 
I find the following sections of the matlab PDF linked above fun in a historical light::" --bb

From otto at tronarp.se Fri Sep 29 04:53:35 2006 From: otto at tronarp.se (Otto Tronarp) Date: Fri, 29 Sep 2006 10:53:35 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451CDCA9.3030608@ar.media.kyoto-u.ac.jp> References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> <451CDCA9.3030608@ar.media.kyoto-u.ac.jp> Message-ID: <20060929105335.hfnshytxws448w8g@mathcore.kicks-ass.org>

I don't know the matplotlib history that well, but an educated guess is that matplotlib wasn't written at that time (1 October 2003), or at least it was in a very early stage.

Otto

Quoting David Cournapeau : > Fabian Braennstroem wrote: >> >> Pretty interesting! I hope I can convince my Prof to go for >> python using your arguments ... but I do not >> understand that there is a problem with publication quality >> graphics. For 2D matplotlib should be enough; for 3D >> mayavi2/tvtk and python-vtk should be able to produce >> quality graphics too!? >> >> > I don't really understand either. As I said earlier, I have some gripes > about matplotlib, but the export is not one of them, I would say it is a > strong point of matplotlib, on the contrary; I always find matlab's way > of exporting clumsy, and LaTeX support in matplotlib is easier to use, > too. The only problem I ever had with matplotlib for exporting to eps > for articles is to have wasted one half hour to find a way to get > transparent background.
> > David > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From wbaxter at gmail.com Fri Sep 29 05:01:02 2006 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 29 Sep 2006 18:01:02 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060929104732.rj97r8wi680sgccw@mathcore.kicks-ass.org> References: <451BFBD7.4080701@ieee.org> <20060929092442.bw9xmhiuns4484kw@mathcore.kicks-ass.org> <20060929104732.rj97r8wi680sgccw@mathcore.kicks-ass.org> Message-ID: On 9/29/06, Otto Tronarp wrote: > Quoting Bill Baxter : > > > On 9/29/06, Otto Tronarp wrote: > >> This touches the subject of inconsistency with math.add as one > >> example. One of my "favorite" peeve's is how the size (shape) is given > >> to functions that generate matrices. In some functions the size is > >> given as a tuple, in other the size in different dimensions are given > >> as seperate arguments. Here are some examples: > > eye(M,N) is considered to be ok, because 1-D eye is pretty useless, > > and it's not clear what you'd want out of N-D identity matrix. > > That is valid point, but I would still argue for unified use of shape > args. Assume that I have a function that does some calculations on a > matrix and adds another like this: Yeh, I don't disagree with you. The other argument is that eye((3,3)) is more typing for interactive sessions, and seems weird to people coming from matlab. Why the extra parens? Matlab generally accepts either form for functions like that. Personally I wrote my own versions of eye(), rand(), zeros(), ones(), and empty() that work either with tuples or separate args. I think that's the best solution, but some folks here cringe at the thought of overloading Python functions in that way. It's true that overloading is not very pretty in Python, but the users don't have to look at the code in the library. 
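A wrapper of the kind Bill describes, accepting either separate dimension arguments or a single shape tuple, takes only a few lines. A hypothetical sketch (not Bill's actual code), shown here for ones():

```python
import numpy

def ones(*shape, **kwargs):
    """Accept both ones(2, 3) and ones((2, 3))."""
    # If the single positional argument is already a sequence,
    # treat it as the shape tuple itself.
    if len(shape) == 1 and hasattr(shape[0], '__len__'):
        shape = tuple(shape[0])
    return numpy.ones(shape, **kwargs)

print(ones(2, 3).shape)    # -> (2, 3)
print(ones((2, 3)).shape)  # -> (2, 3)
```

The same pattern works for zeros(), empty(), rand(), and eye().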
:-) As long as it looks simple from the outside. (At least that's the C++ motto). --bb From skokot at po.opole.pl Fri Sep 29 05:01:27 2006 From: skokot at po.opole.pl (Seweryn Kokot) Date: Fri, 29 Sep 2006 11:01:27 +0200 Subject: [SciPy-user] Debian Sarge and Python Message-ID: <87r6xvgi6w.fsf@poczta.po.opole.pl> Hello, I've been reading this group for 5 days and I find the discussions really interesting. Some time ago I also started with Python, and I'm looking forward to replacing Matlab with SciPy/Matplotlib and friends. The problem is that I'm using Debian Sarge and the packages are so outdated, especially in the case of SciPy and Matplotlib, which are in continuous and fast progress. Of course I could compile all the packages myself, but I'm wondering if there are some other Debian Sarge users and how they cope with the problem? Any suggestions? Regards, Seweryn Kokot From sgarcia at olfac.univ-lyon1.fr Fri Sep 29 05:11:24 2006 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Fri, 29 Sep 2006 11:11:24 +0200 Subject: [SciPy-user] Debian Sarge and Python In-Reply-To: <87r6xvgi6w.fsf@poczta.po.opole.pl> References: <87r6xvgi6w.fsf@poczta.po.opole.pl> Message-ID: <451CE33C.9030505@olfac.univ-lyon1.fr> Hello, my advice: use Debian etch. It is stable enough, and in a few months it will be the official stable release. And scipy is up to date, so it is installable in 5 seconds. sam Seweryn Kokot wrote: > Hello, > > I've been reading this group for 5 days and I find the discussions > really interesting. Some time ago I also started with Python, and I'm > looking forward to replacing Matlab with SciPy/Matplotlib and > friends. > The problem is that I'm using Debian Sarge and the packages are so > outdated, especially in the case of SciPy and Matplotlib, which are in > continuous and fast progress. > Of course I could compile all the packages myself, but I'm wondering if > there are some other Debian Sarge users and how they cope with the > problem? > > Any suggestions? 
> Regards, > Seweryn Kokot > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Samuel Garcia Universite Claude Bernard LYON 1 CNRS - UMR5020, Laboratoire des Neurosciences et Systemes Sensoriels 50, avenue Tony Garnier 69366 LYON Cedex 07 FRANCE Tél : 04 37 28 74 64 Fax : 04 37 28 76 01 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From edin.salkovic at gmail.com Fri Sep 29 05:28:46 2006 From: edin.salkovic at gmail.com (Edin Salkovic) Date: Fri, 29 Sep 2006 11:28:46 +0200 Subject: [SciPy-user] Debian Sarge and Python In-Reply-To: <87r6xvgi6w.fsf@poczta.po.opole.pl> References: <87r6xvgi6w.fsf@poczta.po.opole.pl> Message-ID: <63eb7fa90609290228kaefe7a2gfcced77506b450be@mail.gmail.com> You can check "Andrew Straw's Apt Repository": http://debs.astraw.com/ HTH, Edin On 9/29/06, Seweryn Kokot wrote: > Hello, > > I've been reading this group for 5 days and I find the discussions > really interesting. Some time ago I also started with Python, and I'm > looking forward to replacing Matlab with SciPy/Matplotlib and > friends. > The problem is that I'm using Debian Sarge and the packages are so > outdated, especially in the case of SciPy and Matplotlib, which are in > continuous and fast progress. > Of course I could compile all the packages myself, but I'm wondering if > there are some other Debian Sarge users and how they cope with the > problem? > > Any suggestions? 
> Regards, > Seweryn Kokot > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From david at ar.media.kyoto-u.ac.jp Fri Sep 29 05:30:43 2006 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 29 Sep 2006 18:30:43 +0900 Subject: [SciPy-user] Debian Sarge and Python In-Reply-To: <87r6xvgi6w.fsf@poczta.po.opole.pl> References: <87r6xvgi6w.fsf@poczta.po.opole.pl> Message-ID: <451CE7C3.5000503@ar.media.kyoto-u.ac.jp> Seweryn Kokot wrote: > Hello, > > I've been reading this group for 5 days and I find the discussions > really interesting. Some time ago I also started with Python, and I'm > looking forward to replacing Matlab with SciPy/Matplotlib and > friends. > The problem is that I'm using Debian Sarge and the packages are so > outdated, especially in the case of SciPy and Matplotlib, which are in > continuous and fast progress. > Of course I could compile all the packages myself, but I'm wondering if > there are some other Debian Sarge users and how they cope with the > problem? > > There are "beta" versions of the packages here, which I think are the incoming official ones for etch: http://deb-scipy.alioth.debian.org/apt/ They also worked more or less on ubuntu edgy if you are willing to make trivial changes in the sources of the package (I cannot remember which ones off the top of my head); that's what I am using personally, David From stephenemslie at gmail.com Fri Sep 29 08:22:30 2006 From: stephenemslie at gmail.com (stephen emslie) Date: Fri, 29 Sep 2006 13:22:30 +0100 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451BFBD7.4080701@ieee.org> References: <451BFBD7.4080701@ieee.org> Message-ID: <51f97e530609290522j2373c9d4mca60cfa87e23d961@mail.gmail.com> How about a comparison of pylab and opencv? Would this be appropriate? 
On 9/28/06, Travis Oliphant wrote: > > > Hi all, > > I've started a possibly controversial but hopefully informative page > that tries to list some of the advantages of using > Python+NumPy+Scipy+Matplotlib+IPython (I'm calling that combination > PyLab) versus other array environments. > > The purpose is not to go into detail about semantic differences, but > document higher-level differences that might help somebody decide > whether or not they could use NumPy instead of some other environment. > I've started with a comparison to MATLAB, based on an email response I > sent to a friend earlier today. > > Additions and corrections welcome. > > -Travis O. > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zelakiew at crd.ge.com Fri Sep 29 08:36:21 2006 From: zelakiew at crd.ge.com (Scott Zelakiewicz) Date: Fri, 29 Sep 2006 12:36:21 +0000 (UTC) Subject: [SciPy-user] =?utf-8?q?Pros_and_Cons_of_Python_verses_other_array?= =?utf-8?q?=09environments?= References: <451C2DF5.1020004@umit.maine.edu> Message-ID: Fernando Perez gmail.com> writes: > > On 9/28/06, R. Padraic Springuel umit.maine.edu> wrote: > > For those of you objecting to the "difficulty" in importing several > > different packages to get all the different capabilities here going at > > once, there is an easy fix here suggested by Travis's use of Pylab as > > the title for the grouping. If you look at matplotlib's pylab.py file > > that it places in your site-packages directory, its just redirects the > > import statement to find what's needed. Add a few lines to this file > > and it can pull in numpy, scipy, and IPython at the same time. 
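The modified pylab.py described above amounts to little more than a module of star imports. A hypothetical sketch, here called mylab.py (an invented name, not matplotlib's actual pylab.py); the guarded matplotlib import keeps the module usable where matplotlib is not installed:

```python
# mylab.py -- hypothetical one-stop module, so a beginner's session
# can start with nothing more than:  from mylab import *
from numpy import *              # arrays, linalg, fft, random, ...

try:
    from matplotlib.pylab import *   # plot(), show(), ... when available
except ImportError:
    pass                             # plotting simply unavailable

# quick sanity check that the numpy names are in scope
print(zeros((2, 2)).sum())   # -> 0.0
```

Adding scipy would be one more star import in the same file.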
What > > might make things easier for the beginner would be to have all of the > > packages Travis mentioned available in a bundled download that is easily > > installable (like Enthought Python, but without the extra bells & > > whistles) and with a modified version of the pylab.py file that contains > > all the necessary import statements. > > > > It might also be nice if this kind of package would automatically > > configure the interpreter to issue the necessary import statements > > automatically when it starts. In this way, the beginner need not learn > > about import statements right away. All the basics are right there for > > them. > > That's pretty much what > > ipython -pylab > > does already, modulo importing scipy, which it does NOT assume is > installed. But doing > > ipython -pylab -p scipy > > does that. The next version of ipython will add the latter as a > shortcut in the Windows start menu upon installation, at Ryan Krauss' > suggestion. *nix users are perfectly capable of making an alias if > they so desire. > > Cheers, > > f > When it comes to the distribution, I like Enthought's Python package, but it is only for Windows. What I have done for Linux is set up a gar make system (also used by konstruct and gargnome) to build everything anyone needs to get a python system up and running. The download is quite small (1 makefile per package to be built). The makefile tells the system the name and location of the download. When make is run it automatically goes to the web, gets the source, builds and installs it. All the user has to do is type 'make install' and they have a full scientific python environment waiting for them. Right now I have it set up to build blas, lapack, numpy, scipy, pygtk, matplotlib, ipython and python2.4 (if it is not already there). I also add a little script in their path that can be run to set the correct environment variables, launch ipython and import scipy and mpl. 
Perhaps this is a way to easily get Linux users up and running. s. From jelle.feringa at ezct.net Fri Sep 29 09:00:14 2006 From: jelle.feringa at ezct.net (Jelle Feringa / EZCT Architecture & Design Research) Date: Fri, 29 Sep 2006 15:00:14 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other arrayenvironments In-Reply-To: <51f97e530609290522j2373c9d4mca60cfa87e23d961@mail.gmail.com> Message-ID: <017001c6e3c7$352bfc70$c000a8c0@JELLE> Not really, opencv is really quite specific for image analysis isn't it? Actually, if anyone has successfully compiled it on win32/python24, would you please consider sharing it ;') -jelle -------------- next part -------------- An HTML attachment was scrubbed... URL: From thyreau at shfj.cea.fr Fri Sep 29 08:58:34 2006 From: thyreau at shfj.cea.fr (Benjamin THYREAU) Date: Fri, 29 Sep 2006 14:58:34 +0200 Subject: [SciPy-user] =?iso-8859-1?q?Pros_and_Cons_of_Python_verses_other_?= =?iso-8859-1?q?array=09environments?= In-Reply-To: References: Message-ID: <200609291458.34486.thyreau@shfj.cea.fr> (...) > > When it comes to the distribution, I like Enthought's Python package, but > it is only for Windows. What I have done for Linux is set up a gar make > system (also used by konstruct and gargnome) to build everything anyone > needs to get a python system up and running. The download is quite small > (1 makefile per package to be built). The makefile tells the system the > name and location of the download. When make is run it automatically goes > to the web, gets the source, builds and installs it. All the user has to > do is type 'make install' and they have a full scientific python > environment waiting for them. Right now I have it set up to build blas, > lapack, numpy, scipy, pygtk, matplotlib, ipython and python2.4 (if it is > not already there). I also add a little script in their path that can be > run to set the correct environment variables, launch ipython and import > scipy and mpl. 
> > Perhaps this is a way to easily get Linux users up and running. > > s. > > Sounds interesting. I remember having mostly good experience with Konstruct. Is there some publicly available location where I could find your script? -- Benjamin Thyreau From david at ar.media.kyoto-u.ac.jp Fri Sep 29 09:40:10 2006 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 29 Sep 2006 22:40:10 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451C2DF5.1020004@umit.maine.edu> Message-ID: <451D223A.4030402@ar.media.kyoto-u.ac.jp> Scott Zelakiewicz wrote: > > When it comes to the distribution, I like Enthought's Python package, but it is > only for Windows. What I have done for Linux is set up a gar make system (also > used by konstruct and gargnome) to build everything anyone needs to get a python > system up and running. This is a really good idea, so good that I am surprised nobody thought about it before :) I didn't know that gar was usable outside gnome. I have used this system to test some gnome betas, and it was a good experience overall (especially compared to the nightmare of building gnome by yourself). 
David From jdhunter at ace.bsd.uchicago.edu Fri Sep 29 09:37:35 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Fri, 29 Sep 2006 08:37:35 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060929105335.hfnshytxws448w8g@mathcore.kicks-ass.org> (Otto Tronarp's message of "Fri, 29 Sep 2006 10:53:35 +0200") References: <451BFBD7.4080701@ieee.org> <20060928183853.GC28690@clipper.ens.fr> <451C248D.3010406@adelphia.net> <20060928194625.GA1454@clipper.ens.fr> <877izn8k21.fsf@peds-pc311.bsd.uchicago.edu> <451CDCA9.3030608@ar.media.kyoto-u.ac.jp> <20060929105335.hfnshytxws448w8g@mathcore.kicks-ass.org> Message-ID: <87hcyqg5eo.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Otto" == Otto Tronarp writes: Otto> I don't now the matplotlib history that well but an educated Otto> guess is that matplotib wasn't written at that time (1 Otto> october 2003), or at least it was in a very early stage. Yes, exactly, I abandoned my matlab cookbook project to solve the problems addressed in that note and wrote matplotlib. JDH From david at ar.media.kyoto-u.ac.jp Fri Sep 29 11:07:51 2006 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sat, 30 Sep 2006 00:07:51 +0900 Subject: [SciPy-user] Is anybody working on a toolbox for Kalman filtering and derivative ? Message-ID: <451D36C7.6080507@ar.media.kyoto-u.ac.jp> Hi, I wanted to know if anyone is developing, or knows of, a toolbox for Kalman filtering and derivatives (particle filter, nonlinear Kalman, etc.)? I am working right now on (simple, linear, Gaussian) Kalman filtering, would like to extend it, and I thought it would be a great addition to scipy, and I would like to avoid duplicating work, David From jelle.feringa at ezct.net Fri Sep 29 12:30:43 2006 From: jelle.feringa at ezct.net (Jelle Feringa / EZCT Architecture & Design Research) Date: Fri, 29 Sep 2006 18:30:43 +0200 Subject: [SciPy-user] Is anybody working on a toolbox for Kalman filtering and derivative ? 
In-Reply-To: <451D36C7.6080507@ar.media.kyoto-u.ac.jp> Message-ID: <01b801c6e3e4$9cc124b0$c000a8c0@JELLE> http://sourceforge.net/projects/pykf/ From bhendrix at enthought.com Fri Sep 29 12:33:16 2006 From: bhendrix at enthought.com (Bryce Hendrix) Date: Fri, 29 Sep 2006 11:33:16 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451C2DF5.1020004@umit.maine.edu> Message-ID: <451D4ACC.2000903@enthought.com> Scott Zelakiewicz wrote: > When it comes to the distribution, I like Enthought's Python package, but it is > only for Windows. What I have done for Linux is set up a gar make system (also > used by konstruct and gargnome) to build everything anyone needs to get a python > system up and running. The download is quite small (1 makefile per package to > be built). The makefile tells the system the name and location of the download. > When make is run it automatically goes to the web, gets the source, builds and > installs it. All the user has to do is type 'make install' and they have a full > scientific python environment waiting for them. Right now I have it set up to > build blas, lapack, numpy, scipy, pygtk, matplotlib, ipython and python2.4 (if > it is not already there). I also add a little script in their path that can be > run to set the correct environment variables, launch ipython and import scipy > and mpl. > I am working on getting Python Enthought Edition ported to Linux for our next release. I know which distro's our customers use, but which distro's (and processor) is everyone else using? And before you answer, as much as I like Gentoo, I don't think its something we can support at this time. Please just send me personal email, no need to flood the list... 
Bryce From ahoover at eecs.berkeley.edu Fri Sep 29 12:37:08 2006 From: ahoover at eecs.berkeley.edu (Aaron Hoover) Date: Fri, 29 Sep 2006 09:37:08 -0700 Subject: [SciPy-user] Is anybody working on a toolbox for Kalman filtering and derivative ? In-Reply-To: <451D36C7.6080507@ar.media.kyoto-u.ac.jp> References: <451D36C7.6080507@ar.media.kyoto-u.ac.jp> Message-ID: Hi David, Though it's not just a Kalman filtering toolbox, the open source computer vision library, OpenCV contains implementations of Kalman filtering. I've had good success using OpenCV's Python wrappers to do a lot of 2D vision and image processing very quickly with Python. If nothing else, you could use their source as a point of comparison for your implementation. Cheers, Aaron On Sep 29, 2006, at 8:07 AM, David Cournapeau wrote: > Hi, > > I wanted to know if anyone is developing, or know a toolbox for > kalman filtering and derivative (particle filter, non linear kalman, > etc...) ? I am working right now on (simple, linear, Gaussian) Kalman > filtering, would like to extend it, and I thought it would be a great > addition to scipy, and I would like to avoid duplicating works, > > David > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From Glen.Mabey at swri.org Fri Sep 29 15:48:28 2006 From: Glen.Mabey at swri.org (Glen W. Mabey) Date: Fri, 29 Sep 2006 14:48:28 -0500 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ Message-ID: <20060929194828.GB13399@swri16wm.electro.swri.edu> Hello, I don't know of a better place to post than this for a question regarding pywt. 
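Picking up David's Kalman question from the thread above: the simple linear, Gaussian case fits in a dozen lines of NumPy. A generic textbook sketch, not code from pykf or OpenCV:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear, Gaussian Kalman filter.

    x, P : prior state estimate and covariance
    z    : new measurement
    F, H : state-transition and observation matrices
    Q, R : process- and measurement-noise covariances
    """
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# estimate a constant scalar state from repeated measurements of 1.0
x, P = np.zeros(1), 10.0 * np.eye(1)
F = H = np.eye(1)
Q, R = 1e-4 * np.eye(1), 0.1 * np.eye(1)
for _ in range(20):
    x, P = kalman_step(x, P, np.array([1.0]), F, H, Q, R)
print(round(float(x[0]), 2))   # -> 1.0
```

The nonlinear variants David mentions (extended Kalman, particle filters) replace the predict/update algebra but keep this same two-phase structure.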
Following the instructions found at http://wavelets.scipy.org/index.html for checking out the development version: $$ svn co http://wavelets.scipy.org/svn/multiresolution/wavelets/trunk wavelets svn: REPORT request failed on '/svn/multiresolution/!svn/vcc/default' svn: REPORT of '/svn/multiresolution/!svn/vcc/default': 400 Bad Request (http://wavelets.scipy.org) and the indicated path in http://wavelets.scipy.org/svn/multiresolution/pywt/trunk/README.txt yields a similar error. However, trying: $$ svn co https://wavelets.scipy.org/svn/multiresolution/pywt/trunk/ at least asks for a password. However, shouldn't there be (or, could there please be) anonymous access to this tree? Thank you, Glen Mabey From robert.kern at gmail.com Fri Sep 29 16:35:56 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 29 Sep 2006 15:35:56 -0500 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: <20060929194828.GB13399@swri16wm.electro.swri.edu> References: <20060929194828.GB13399@swri16wm.electro.swri.edu> Message-ID: <451D83AC.9080703@gmail.com> Glen W. Mabey wrote: > Hello, > > I don't know of a better place to post than this for a question > regarding pywt. > > Following the instructions found at http://wavelets.scipy.org/index.html for > checking out the development version: > > $$ svn co http://wavelets.scipy.org/svn/multiresolution/wavelets/trunk wavelets > $$ svn co https://wavelets.scipy.org/svn/multiresolution/pywt/trunk/ Both of these URLs work for me. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From davidgrant at gmail.com Fri Sep 29 16:35:50 2006 From: davidgrant at gmail.com (David Grant) Date: Fri, 29 Sep 2006 13:35:50 -0700 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451D4ACC.2000903@enthought.com> References: <451C2DF5.1020004@umit.maine.edu> <451D4ACC.2000903@enthought.com> Message-ID: On 9/29/06, Bryce Hendrix wrote: > > Scott Zelakiewicz wrote: > > When it comes to the distribution, I like Enthought's Python package, > but it is > > only for Windows. What I have done for Linux is set up a gar make > system (also > > used by konstruct and gargnome) to build everything anyone needs to get > a python > > system up and running. The download is quite small (1 makefile per > package to > > be built). The makefile tells the system the name and location of the > download. > > When make is run it automatically goes to the web, gets the source, > builds and > > installs it. All the user has to do is type 'make install' and they > have a full > > scientific python environment waiting for them. Right now I have it set > up to > > build blas, lapack, numpy, scipy, pygtk, matplotlib, ipython and > python2.4 (if > > it is not already there). I also add a little script in their path that > can be > > run to set the correct environment variables, launch ipython and import > scipy > > and mpl. > > > > I am working on getting Python Enthought Edition ported to Linux for our > next release. I know which distro's our customers use, but which > distro's (and processor) is everyone else using? And before you answer, > as much as I like Gentoo, I don't think its something we can support at > this time. Please just send me personal email, no need to flood the > list... Bryce, you shouldn't support any specific distro or processor. If it is a source package that can be built with make or distutils or scons, then any distro can easily incorporate it. 
-- David Grant http://www.davidgrant.ca -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Fri Sep 29 16:39:39 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 29 Sep 2006 15:39:39 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451C2DF5.1020004@umit.maine.edu> <451D4ACC.2000903@enthought.com> Message-ID: <451D848B.7030404@gmail.com> David Grant wrote: > > On 9/29/06, *Bryce Hendrix* > wrote: > I am working on getting Python Enthought Edition ported to Linux for > our > next release. I know which distro's our customers use, but which > distro's (and processor) is everyone else using? And before you answer, > as much as I like Gentoo, I don't think its something we can support at > this time. Please just send me personal email, no need to flood the > list... > > Bryce, you shouldn't support any specific distro or processor. If it is > a source package that can be built with make or distutils or scons, then > any distro can easily incorporate it. Well those already exist. That's not what people are asking for here. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From Glen.Mabey at swri.org Fri Sep 29 16:49:20 2006 From: Glen.Mabey at swri.org (Glen W. Mabey) Date: Fri, 29 Sep 2006 15:49:20 -0500 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: <451D83AC.9080703@gmail.com> References: <20060929194828.GB13399@swri16wm.electro.swri.edu> <451D83AC.9080703@gmail.com> Message-ID: <20060929204920.GA31188@swri16wm.electro.swri.edu> On Fri, Sep 29, 2006 at 03:35:56PM -0500, Robert Kern wrote: > Glen W. Mabey wrote: > > Hello, > > > > I don't know of a better place to post than this for a question > > regarding pywt. 
> > > > Following the instructions found at http://wavelets.scipy.org/index.html for > > checking out the development version: > > > > $$ svn co http://wavelets.scipy.org/svn/multiresolution/wavelets/trunk wavelets > > > $$ svn co https://wavelets.scipy.org/svn/multiresolution/pywt/trunk/ > > Both of these URLs work for me. > > If you mean browsing to the URL via a web browser, that also works for > me. However, running the svn client gives the error I included. I can check out svn versions for both numpy and scipy with no problem, too. Glen From gael.varoquaux at normalesup.org Fri Sep 29 16:53:01 2006 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 29 Sep 2006 22:53:01 +0200 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451D848B.7030404@gmail.com> References: <451C2DF5.1020004@umit.maine.edu> <451D4ACC.2000903@enthought.com> <451D848B.7030404@gmail.com> Message-ID: <20060929205301.GA5384@clipper.ens.fr> On Fri, Sep 29, 2006 at 03:39:39PM -0500, Robert Kern wrote: > > Bryce, you shouldn't support any specific distro or processor. If it is > > a source package that can be built with make or distutils or scons, then > > any distro can easily incorporate it. > Well those already exist. That's not what people are asking for here. I think there needs to be some small, well-separated source packages, for instance one for Traits, one for Envisage, one for Mayavi, and a demand from users to distro maintainers to have these packages added to the distros. Once there starts to be a user base of, say, Traits, then distros will package it. And the user base is lagging behind because the whole ETS is just too big to be shipped as a dependency. But I know you guys are working on that. 
Gaël From bhendrix at enthought.com Fri Sep 29 17:05:54 2006 From: bhendrix at enthought.com (Bryce Hendrix) Date: Fri, 29 Sep 2006 16:05:54 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060929205301.GA5384@clipper.ens.fr> References: <451C2DF5.1020004@umit.maine.edu> <451D4ACC.2000903@enthought.com> <451D848B.7030404@gmail.com> <20060929205301.GA5384@clipper.ens.fr> Message-ID: <451D8AB2.1080009@enthought.com> Yes, we're working on splitting ETS up and getting the dependencies right. We've stripped it down to 14 MB; scipy is 12.7 MB, so it's getting in the same ballpark. I've currently got the entire Python Enthought Edition down to about 90 MB spread across about 50 eggs. The next version will be (optionally) much leaner. Bryce Gael Varoquaux wrote: > I think there needs to be some small, well-separated source packages, > for instance one for Traits, one for Envisage, one for Mayavi, and a > demand from users to distro maintainers to have these packages added to > the distros. Once there starts to be a user base of, say, Traits, then > distros will package it. And the user base is lagging behind because > the whole ETS is just too big to be shipped as a dependency. But I know > you guys are working on that. > > Gaël > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From robert.kern at gmail.com Fri Sep 29 17:08:25 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 29 Sep 2006 16:08:25 -0500 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: <20060929204920.GA31188@swri16wm.electro.swri.edu> References: <20060929194828.GB13399@swri16wm.electro.swri.edu> <451D83AC.9080703@gmail.com> <20060929204920.GA31188@swri16wm.electro.swri.edu> Message-ID: <451D8B49.5010805@gmail.com> Glen W. Mabey wrote: > On Fri, Sep 29, 2006 at 03:35:56PM -0500, Robert Kern wrote: >> Glen W. 
Mabey wrote: >>> Hello, >>> >>> I don't know of a better place to post than this for a question >>> regarding pywt. >>> >>> Following the instructions found at http://wavelets.scipy.org/index.html for >>> checking out the development version: >>> >>> $$ svn co http://wavelets.scipy.org/svn/multiresolution/wavelets/trunk wavelets >>> $$ svn co https://wavelets.scipy.org/svn/multiresolution/pywt/trunk/ >> Both of these URLs work for me. > > If you mean browsing to the URL via a web browser, that also works for > me. However, running the svn client gives the error I included. No, I checked both of them out using svn. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From robert.kern at gmail.com Fri Sep 29 17:18:08 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 29 Sep 2006 16:18:08 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <20060929205301.GA5384@clipper.ens.fr> References: <451C2DF5.1020004@umit.maine.edu> <451D4ACC.2000903@enthought.com> <451D848B.7030404@gmail.com> <20060929205301.GA5384@clipper.ens.fr> Message-ID: <451D8D90.6080809@gmail.com> Gael Varoquaux wrote: > On Fri, Sep 29, 2006 at 03:39:39PM -0500, Robert Kern wrote: >>> Bryce, you shouldn't support any specific distro or processor. If it is >>> a source package that can be built with make or distutils or scons, then >>> any distro can easily incorporate it. > >> Well those already exist. That's not what people are asking for here. > > I think there needs to be some small, well-separated source packages, > for instance one for Traits, one for Envisage, one for Mayavi, and a > demand from users to distro maintainers to have these packages added to > the distros. 
But we're not talking about the enthought package (http://code.enthought.com/ets/) in this thread, but rather Enthon (http://code.enthought.com/enthon/). In this thread, people definitely *are* asking for Linux binaries for the packages that are included in Enthon. All of these packages (perhaps aside from the enthought package) have very nice source distributions that will build from source on any Linux distribution you care to name. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From Glen.Mabey at swri.org Fri Sep 29 17:21:24 2006 From: Glen.Mabey at swri.org (Glen W. Mabey) Date: Fri, 29 Sep 2006 16:21:24 -0500 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: <451D8B49.5010805@gmail.com> References: <20060929194828.GB13399@swri16wm.electro.swri.edu> <451D83AC.9080703@gmail.com> <20060929204920.GA31188@swri16wm.electro.swri.edu> <451D8B49.5010805@gmail.com> Message-ID: <20060929212124.GE31188@swri16wm.electro.swri.edu> On Fri, Sep 29, 2006 at 04:08:25PM -0500, Robert Kern wrote: > Glen W. Mabey wrote: > > On Fri, Sep 29, 2006 at 03:35:56PM -0500, Robert Kern wrote: > >> Glen W. Mabey wrote: > >>> Hello, > >>> > >>> I don't know of a better place to post than this for a question > >>> regarding pywt. > >>> > >>> Following the instructions found at http://wavelets.scipy.org/index.html for > >>> checking out the development version: > >>> > >>> $$ svn co http://wavelets.scipy.org/svn/multiresolution/wavelets/trunk wavelets > >>> $$ svn co https://wavelets.scipy.org/svn/multiresolution/pywt/trunk/ > >> Both of these URLs work for me. > > > > If you mean browsing to the URL via a web browser, that also works for > > me. However, running the svn client gives the error I included. > > No, I checked both of them out using svn. Ah, yes. 
I remember getting this error when trying to check out numpy and scipy, before a change was made to allow access via https ... I think. I can only assume it has something to do with the corporate firewall -- I'm not well-versed in svn. Glen From millman at berkeley.edu Fri Sep 29 17:40:39 2006 From: millman at berkeley.edu (Jarrod Millman) Date: Fri, 29 Sep 2006 14:40:39 -0700 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: <20060929212124.GE31188@swri16wm.electro.swri.edu> References: <20060929194828.GB13399@swri16wm.electro.swri.edu> <451D83AC.9080703@gmail.com> <20060929204920.GA31188@swri16wm.electro.swri.edu> <451D8B49.5010805@gmail.com> <20060929212124.GE31188@swri16wm.electro.swri.edu> Message-ID: Hey Glen, The wavelets.scipy.org code was originally going to be wrappers around Francois Meyer's C code, but it turned out there was another project further along: http://www.pybytes.com/pywavelets/ So after approaching the author Filip Wasilewski, he agreed to move his project over to scipy. The wavelets.scipy.org front page is a little out of date with regard to the change. You should check out the code anonymously using: svn co http://wavelets.scipy.org/svn/multiresolution/pywt/trunk/ pywt-trunk Fernando Perez has access to the wavelets.scipy.org site, but he is on vacation. When he returns I will get the site updated with more recent information. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From Glen.Mabey at swri.org Fri Sep 29 17:48:34 2006 From: Glen.Mabey at swri.org (Glen W.
Mabey) Date: Fri, 29 Sep 2006 16:48:34 -0500 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: References: <20060929194828.GB13399@swri16wm.electro.swri.edu> <451D83AC.9080703@gmail.com> <20060929204920.GA31188@swri16wm.electro.swri.edu> <451D8B49.5010805@gmail.com> <20060929212124.GE31188@swri16wm.electro.swri.edu> Message-ID: <20060929214834.GF31188@swri16wm.electro.swri.edu> Hi Jarrod, On Fri, Sep 29, 2006 at 02:40:39PM -0700, Jarrod Millman wrote: > The wavelets.scipy.org code was originally going to be wrappers around > Francois Meyer's C code, but it turned out there was another project > further along: > http://www.pybytes.com/pywavelets/ > So after approaching the author Filip Wasilewski, he agreed to move > his project over to scipy. > > The wavelets.scipy.org front page is a little out of date with regard to > the change. You should check out the code anonymously using: > svn co http://wavelets.scipy.org/svn/multiresolution/pywt/trunk/ pywt-trunk > > Fernando Perez has access to the wavelets.scipy.org site, but he is on > vacation. When he returns I will get the site updated with more > recent information. Thank you for the explanation -- it was a bit confusing. Unfortunately, the svn command you provided still does not work for me, and I'll have to pursue that issue a different day. Best Regards, Glen From millman at berkeley.edu Fri Sep 29 17:56:21 2006 From: millman at berkeley.edu (Jarrod Millman) Date: Fri, 29 Sep 2006 14:56:21 -0700 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: <20060929214834.GF31188@swri16wm.electro.swri.edu> References: <20060929194828.GB13399@swri16wm.electro.swri.edu> <451D83AC.9080703@gmail.com> <20060929204920.GA31188@swri16wm.electro.swri.edu> <451D8B49.5010805@gmail.com> <20060929212124.GE31188@swri16wm.electro.swri.edu> <20060929214834.GF31188@swri16wm.electro.swri.edu> Message-ID: On 9/29/06, Glen W.
Mabey wrote: > Unfortunately, the svn command you provided still does not work for me, > and I'll have to pursue that issue a different day. Hmmm ... it works for me, so hopefully it is just something with your firewall. Please let me know if you have any more questions or if after talking to your network administrator the problem still persists. Good luck, Jarrod From fredmfp at gmail.com Fri Sep 29 19:52:56 2006 From: fredmfp at gmail.com (fred) Date: Sat, 30 Sep 2006 01:52:56 +0200 Subject: [SciPy-user] error building pyopengl... Message-ID: <451DB1D8.8000707@gmail.com> Hi the list, Yes, I know it's a little bit OT, but I saw mat3d.py on Wilna DuToit's SciPy webpage, and thus I need to build PyOpenGL (2.0.2.01). I can't build it because gcc failed with exit status 1... What do I need to build it? PyOpenGL on SourceForge seems to be quite old and no longer maintained. Cheers, PS: python2.4, swig 1.3.23 -- "Python is a bit like the bayonet in '14: always go out with it, and don't hesitate to strike a blow with it when needed." J.-Ph. D. From gruben at bigpond.net.au Fri Sep 29 20:10:13 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Sat, 30 Sep 2006 10:10:13 +1000 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451D4ACC.2000903@enthought.com> References: <451C2DF5.1020004@umit.maine.edu> <451D4ACC.2000903@enthought.com> Message-ID: <451DB5E5.8090000@bigpond.net.au> Hi Bryce, just for your survey: Usually I use Windows, but when I'm using Linux I use Ubuntu on a 32-bit system. A guy I share an office with uses 64-bit Ubuntu. regards, Gary Bryce Hendrix wrote: > I am working on getting Python Enthought Edition ported to Linux for our > next release. I know which distros our customers use, but which > distros (and processors) is everyone else using? And before you answer, > as much as I like Gentoo, I don't think it's something we can support at > this time.
Please just send me personal email, no need to flood the list... From gruben at bigpond.net.au Fri Sep 29 20:13:32 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Sat, 30 Sep 2006 10:13:32 +1000 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451DB5E5.8090000@bigpond.net.au> References: <451C2DF5.1020004@umit.maine.edu> <451D4ACC.2000903@enthought.com> <451DB5E5.8090000@bigpond.net.au> Message-ID: <451DB6AC.60903@bigpond.net.au> Doh!! Yes, there had to be one careless idiot who sent his reply to the list and it had to be me, didn't it. Sorry. Gary R. Gary Ruben wrote: > Hi Bryce, > > just for your survey: > Usually I use Windows, but when I'm using Linux I use Ubuntu on a 32-bit > system. A guy I share an office with uses 64-bit Ubuntu. > > regards, > Gary From gruben at bigpond.net.au Fri Sep 29 23:44:07 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Sat, 30 Sep 2006 13:44:07 +1000 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451CD238.3040509@ar.media.kyoto-u.ac.jp> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451CD238.3040509@ar.media.kyoto-u.ac.jp> Message-ID: <451DE807.1010602@bigpond.net.au> David Cournapeau wrote: > The things I consider much better in matlab are: > > - profiling -> this is the thing I am missing the most. If one of my > modules is slow, I find it really hard to find where the problems are > compared to matlab. Hi David, I've never found profiling a problem, especially using prun in ipython. Have you tried this?
Gary From gruben at bigpond.net.au Sat Sep 30 00:13:18 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Sat, 30 Sep 2006 14:13:18 +1000 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> Message-ID: <451DEEDE.2060203@bigpond.net.au> Hi Bill, If you're using Windows, the FOSS PyScripter IDE does this. The debugger gives you watches, conditional breakpoints, tooltip hover values etc. Unfortunately it's not available for Linux. The latest release version is here: The latest beta is available here: I use SciTE, ipython and PyScripter, but use PyScripter for most of my serious debugging. Gary R. Bill Baxter wrote: > 2) integration of debugging is not as good as matlab. In matlab when > you stop at a breakpoint in your code, you get an interactive console > where you can probe current values in your program, or create new ones > etc. The Wing IDE has this, but I couldn't find any open source IDEs > that did this. > > --bb From david.p.finlayson at gmail.com Sat Sep 30 03:05:49 2006 From: david.p.finlayson at gmail.com (David Finlayson) Date: Sat, 30 Sep 2006 00:05:49 -0700 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451DEEDE.2060203@bigpond.net.au> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451DEEDE.2060203@bigpond.net.au> Message-ID: I'm a lurker on this list, but this thread piqued my interest. I am a hydrographer (coastal mapping) not a computer scientist and I don't have much training in numerical computation (I did take the token applied math class in Matlab of course). So, my perspective on numeric Python is as an end user in a production environment. To me (and most of the people I support), data analysis environments like Matlab are black boxes. Maybe I am not the numeric Python target audience.
We depend extensively on Matlab to do data analysis and plotting on our team. The vast majority of the scientists I work with struggle with programming, and hand-coding an FFT in FORTRAN would be impossible (for example). Matlab, or something like it, is a necessary tool. Why not numeric Python? In a nutshell, it still looks like alpha software compared to Matlab. Documentation is not ready for end users (and not professionally published). Some of the numeric libraries have been around for ages, but that only adds to the confusion because there are numerous packages spread all over the Internet with a chain of dependencies that adds still more confusion. It still looks like a patchwork quilt rather than an organized system. Finally, most production environments outside of academia use Windows or maybe OSX. That means that (a) there is no compiler, it's batteries included or it's dead; and (b) there is a real need for an integrated environment because Emacs doesn't come installed on Windows. Python itself is making great headway on Windows at least. In my field (mapping), the big commercial vendor included Python as its macro language, so there has been an explosion of interest in python scripting. The recent publication of IronPython for .NET was fully supported by Microsoft and there is every reason to believe that it will be a popular way to script and control the .NET framework. So, I think that average engineers and scientists are aware of Python the language and would be receptive to a data analysis package in Python so long as it was polished and well done. For example, the R statistical language has done a good job of packaging up R for Windows (I wish it were as well integrated into Gnome). I am not trying to take pot shots at numeric python here or FOSS or Linux. I use all of these personally. I just can't convince myself that this is a safe recommendation for folks I support.
-- David Finlayson -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefan at sun.ac.za Sat Sep 30 03:22:18 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Sat, 30 Sep 2006 09:22:18 +0200 Subject: [SciPy-user] access to wavelets.scipy.org/svn/ In-Reply-To: <20060929194828.GB13399@swri16wm.electro.swri.edu> References: <20060929194828.GB13399@swri16wm.electro.swri.edu> Message-ID: <20060930072218.GA1607@mentat.za.net> On Fri, Sep 29, 2006 at 02:48:28PM -0500, Glen W. Mabey wrote: > Hello, > > I don't know of a better place to post than this for a question > regarding pywt. > > Following the instructions found at http://wavelets.scipy.org/index.html for > checking out the development version: > > $$ svn co http://wavelets.scipy.org/svn/multiresolution/wavelets/trunk wavelets > svn: REPORT request failed on '/svn/multiresolution/!svn/vcc/default' > svn: REPORT of '/svn/multiresolution/!svn/vcc/default': 400 Bad Request (http://wavelets.scipy.org) > This is a well-known SVN issue. Are you running through a proxy (transparent or otherwise)? One way to get around it is to use secure http, the other is to get your system administrator to fix the proxy. See http://subversion.tigris.org/faq.html#proxy Regards Stéfan From david.p.finlayson at gmail.com Sat Sep 30 03:53:58 2006 From: david.p.finlayson at gmail.com (David Finlayson) Date: Sat, 30 Sep 2006 00:53:58 -0700 Subject: [SciPy-user] Running mean: numpy or pyrex? Message-ID: I need to run a series of convolution filters over swath bathymetry data. The data is spaced in track lines approximately 4048 samples across and several million lines long. Currently, I have the data in a python 2D list and then send several variations of a running mean down the track. I was fairly impressed with the filter performance using psyco until I met some truly huge files.
Now I am wondering if I would get any speed benefit from using numpy (for the 2D array access) or would I be better off trying pyrex for the running mean loops? One of the issues I am facing is the possibility that very large files may need to be read from the disk as needed instead of holding the whole thing in RAM. I also need to distribute this code to my team, so minimizing the pain of installing numeric and other packages would be an issue. Should I expect significant speed benefits from a numpy version of this type of code? Note that I have already made the simple optimization to calculate the full mean only on the first iteration and just update the mean with the new data for the remaining iterations. Thanks for any advice you might have. -- David Finlayson -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at ar.media.kyoto-u.ac.jp Sat Sep 30 07:54:06 2006 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sat, 30 Sep 2006 20:54:06 +0900 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451DE807.1010602@bigpond.net.au> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451CD238.3040509@ar.media.kyoto-u.ac.jp> <451DE807.1010602@bigpond.net.au> Message-ID: <451E5ADE.3050404@ar.media.kyoto-u.ac.jp> Gary Ruben wrote: > Hi David, > I've never found profiling a problem, especially using prun in > ipython. Have you tried this? I didn't know prun, but it looks like it is doing some profiling with hotshot.
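Back in the running-mean thread: a NumPy version of that loop should indeed be dramatically faster than a pure-Python list version, since the per-sample work moves into C. A minimal sketch of a running mean built from a cumulative sum (the window length below is a made-up example, not taken from David's data):

```python
import numpy as np

def running_mean(x, window):
    """Running mean of a 1-D array via a cumulative-sum trick.

    Returns len(x) - window + 1 values; each output element is the
    mean of `window` consecutive input samples.
    """
    c = np.cumsum(np.concatenate(([0.0], x)))
    return (c[window:] - c[:-window]) / window

# Tiny worked example:
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(running_mean(x, 3))   # -> [2. 3. 4.]
```

For files too large to hold in RAM, numpy.memmap exposes the same array interface over data paged in from disk, so the identical code can run over multi-gigabyte track files.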
For example, on one of my packages, I get:

300 48.510 0.162 48.510 0.162 densities.py:109(_diag_gauss_den)
10 7.500 0.750 59.930 5.993 gmm_em.py:122(sufficient_statistics)
10 6.470 0.647 13.820 1.382 gmm_em.py:143(update_em)
1050 6.310 0.006 6.310 0.006 :0(dot)
10 3.150 0.315 52.070 5.207 gmm_em.py:233(multiple_gauss_den)
51 1.740 0.034 1.740 0.034 :0(sum)
600 1.510 0.003 1.510 0.003 :0(where)
5 0.810 0.162 0.810 0.162 :0(double_vq)
1 0.650 0.650 2.030 2.030 kmean.py:47(kmean)
1 0.580 0.580 3.780 3.780 gmm_em.py:62(init_kmean)

etc... Basically, I know that sufficient_statistics is the culprit, and I know it is because of _diag_gauss_densities (this last point can only be known if you read the code, though, but I know this code quite well, as it is mine :) ). But now, how can I optimize this function? The dot, sum are called everywhere through the code, so I don't know which calls are expensive where (all calls are not done with the same args, for example). So in my experience, this is not enough. In matlab, once you do profiling, you can generate a really nice report in the form of one html file, and it gives you the time taken by all your code, per line (for the lines which matter). For people not familiar with matlab, I put an example here: http://www.ar.media.kyoto-u.ac.jp/members/david/profile_results/ (this is the new version of matlab we have at the lab; I am not familiar with it, it is much more fancy than what I need and what used to be on older matlab versions, but this should give you an idea) You first have an index of all top-level functions, and you can dig through it as deep as you want. Notice how you know for a given function which calls are made when and how often. I have no idea how difficult this would be to implement in python.
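One standard-library route toward the per-call attribution David is missing is to combine the profiler with pstats' caller/callee views, which split a hot function's time by who invoked it. A hedged sketch with toy stand-in functions (not the real densities.py/gmm_em.py code):

```python
import cProfile
import io
import pstats

# Toy stand-ins for the real hot spots (e.g. _diag_gauss_den in densities.py).
def gauss_den(n):
    return sum(i * i for i in range(n))

def sufficient_statistics():
    total = 0
    for _ in range(50):
        total += gauss_den(2000)   # the expensive call we want attributed
    return total

prof = cProfile.Profile()
prof.enable()
sufficient_statistics()
prof.disable()

out = io.StringIO()
stats = pstats.Stats(prof, stream=out)
stats.sort_stats("cumulative")
# Show which functions called gauss_den and how often -- i.e. where its
# time comes from, which a flat listing like the one above does not say.
stats.print_callers("gauss_den")
print(out.getvalue())
```

pstats.Stats.print_callees works the same way in the other direction; together they give much of the drill-down a Matlab HTML report provides, though not true per-line timing.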
I was told some months ago on the main python list that hotshot can give a per line profiling of python code, but this is not documented; also, it looks like it is possible to get the source code at runtime without too much difficulty in python. I would be really surprised if nobody tried to do something similar for python in general, because this is really useful. I have never found anything for python, but it may be just because I don't know the name for this kind of tools (I tried googling with terms such as "source profiling", without much success). David From steve at shrogers.com Sat Sep 30 10:50:28 2006 From: steve at shrogers.com (Steven H. Rogers) Date: Sat, 30 Sep 2006 08:50:28 -0600 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451DEEDE.2060203@bigpond.net.au> Message-ID: <451E8434.4010901@shrogers.com> G'day David: I believe the weaknesses you list are well understood and are being addressed. I don't think it fair to say that SciPy looks like alpha software. It may not have a slick interface suitable for the most naive users, but it is quite powerful. You'll have to decide about the "safety" of recommending Python/NumPy/SciPy for your organization. I introduced Python into an engineering organization about six years ago as a scripting language for a piece of equipment we were developing and am just beginning to introduce NumPy. It's been a bumpy road, but Python has gained wider adoption than I expected. Many of our engineers can write an FFT in C or assembly, but balked at Python. Once they really used it, most grew to like it and only drop down to lower level code when necessary. I'd suggest looking for a niche requirement that SciPy can fill as a starting point. If it does well there, you can expand to other niches. If not, the risk should be minimal. 
Regards, Steve David Finlayson wrote: > I'm a lurker on this list, but this thread piqued my interest. I am a > hydrographer (coastal mapping) not a computer scientist and I don't have > much training in numerical computation (I did take the token applied > math class in Matlab of course). So, my perspective on numeric Python is > as an end user in a production environment. To me (and most of the > people I support), data analysis environments like Matlab are black > boxes. Maybe I am not the numeric Python target audience. > > We depend extensively on Matlab to do data analysis and plotting on our > team. The vast majority of the scientists I work with struggle with > programming and hand-coding an FFT in FORTRAN would be impossible (for > example). Matlab, or something like it, is a necessary tool. Why not > numeric Python? > > In a nutshell, it still looks like alpha software compared to Matlab. > Documentation is not ready for end users (and not professionally > published). Some of the numeric libraries have been around for ages, but > that only adds to the confusion because there are numerous packages > spread all over the Internet with a chain of dependencies that adds > still more confusion. It still looks like a patchwork quilt rather than > an organized system. Finally, most production environments outside of > academia use Windows or maybe OSX. That means that (a) there is no > compiler, it's batteries included or it's dead; and (b) there is a real > need for an integrated environment because Emacs doesn't come installed > on Windows. > > Python itself is making great headway on Windows at least. In my field > (mapping), the big commercial vendor included Python as its macro > language, so there has been an explosion of interest in python > scripting. The recent publication of IronPython for .NET was fully supported > by Microsoft and there is every reason to believe that it will be a > popular way to script and control the .NET framework.
So, I think that > average engineers and scientists are aware of Python the language and > would be receptive to a data analysis package in Python so long as it > was polished and well done. For example, the R statistical language has > done a good job of packaging up R for Windows (I wish it were as well > integrated into Gnome). > > I am not trying to take pot shots at numeric python here or FOSS or > Linux. I use all of these personally. I just can't convince myself that > this is a safe recommendation for folks I support. > > -- > David Finlayson -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From robert.kern at gmail.com Sat Sep 30 15:59:45 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 30 Sep 2006 14:59:45 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451E5ADE.3050404@ar.media.kyoto-u.ac.jp> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451CD238.3040509@ar.media.kyoto-u.ac.jp> <451DE807.1010602@bigpond.net.au> <451E5ADE.3050404@ar.media.kyoto-u.ac.jp> Message-ID: <451ECCB1.4070301@gmail.com> David Cournapeau wrote: > You have first an index on all top level functions, and you can dig it > through as deep as you want. Notice how you know for a given function > which call are called when and how often. I have no idea how difficult > this would be to implement in python. I was told some months ago on the > main python list that hotshot can give a per line profiling of python > code, but this is not documented; also, it looks like it is possible to > get the source code at runtime without too much difficulty in python. I > would be really surprised if nobody tried to do something similar for > python in general, because this is really useful. 
I have never found > anything for python, but it may be just because I don't know the name > for this kind of tools (I tried googling with terms such as "source > profiling", without much success). One excellent tool for drilling through these results is a KDE application called kcachegrind. It was written to visualize valgrind profiling results, but the file format is generic enough that someone wrote a script hotshot2calltree that converts hotshot results to it. I believe it comes with kcachegrind. http://kcachegrind.sourceforge.net/cgi-bin/show.cgi There is a new profiler that comes with 2.5 (but which I believe is compatible with at least 2.4) called cProfile (available separately as lsprof). It too has a converter for kcachegrind. http://codespeak.net/svn/user/arigo/hack/misc/lsprof/ http://www.gnome.org/~johan/lsprofcalltree.py -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From robert.kern at gmail.com Sat Sep 30 16:01:54 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 30 Sep 2006 15:01:54 -0500 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451DEEDE.2060203@bigpond.net.au> Message-ID: <451ECD32.3030208@gmail.com> David Finlayson wrote: > In a nutshell, it still looks like alpha software compared to Matlab. > Documentation is not ready for end users (and not professionally > published). Some of the numeric libraries have been around for ages, but > that only adds to the confusion because there are numerous packages > spread all over the Internet with a chain of dependencies that adds > still more confusion. It still looks like a patchwork quilt rather than > an organized system.
Finally, most production environments outside of > academia use Windows or maybe OSX. That means that (a) there is no > compiler, it's batteries included or it's dead; http://code.enthought.com/enthon/ > and (b) there is a real > need for an integrated environment because Emacs doesn't come installed > on Windows. The next release of Enthon will include the SPE IDE. http://pythonide.stani.be/ -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From malleite at gmail.com Sat Sep 30 16:50:54 2006 From: malleite at gmail.com (Marco Leite) Date: Sat, 30 Sep 2006 17:50:54 -0300 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: <451E8434.4010901@shrogers.com> References: <451BFBD7.4080701@ieee.org> <51f97e530609281704n522fa5c8ge00571acec53162e@mail.gmail.com> <451DEEDE.2060203@bigpond.net.au> <451E8434.4010901@shrogers.com> Message-ID: <38c32cda0609301350q38494eb2te09f8689ba9ee631@mail.gmail.com> Hi, Although I am addicted to Python and a frequent user of Numpy/Scipy, I don't want to go into the advantages of Python/Scipy/Numpy, as this has been very well addressed in many previous posts. But I would like to stress some issues that I feel and see/hear more and more from people, and that I believe may scare some potential users away from Scipy/Numpy. I think we can draw parallels with many other situations, like LaTeX/Word for example. I've got used to the "not so clear and direct" way to write a 10-line letter in LaTeX, but I also got used to the very high quality of the output produced by it. But there was a time when getting all the TeX packages to work together, setting up the outputs and so on was a long-term project, and required some head scratching. I might never have endured that if all I ever had to do was write 10-line letters.
I did that because what was available did not meet the requirements of the project. I am not saying that Numpy/Scipy is at that same stage, but, for example, only recently we saw an effort to converge on one array package (not to mention that an array type also exists in the standard Python distribution, although a different one). There are other small quirks that are sometimes a distribution's fault, like matplotlib failing to compile on Suse 10. These are minor problems once one realizes all the potential of Python/Numpy/Scipy, but overcoming, understanding and solving them may require some big motivation from the first-time user, and can scare the "prospector" type of user, the one thinking "let me try this to see what I can do with it". The problem is that this creates a first impression of broken or alpha software that will not go away easily ... It's good to see that most of this is being addressed, but as someone pointed out, it still requires some travel to different sites and maybe some fiddling to get all working together. Having one big package may not be feasible due to distribution restrictions, but a good improvement may be to work with the distributions to have RPMs, debs and the like ready to use, even during the development stage. Cheers, Marco Leite -------------- next part -------------- An HTML attachment was scrubbed... URL: From jstano at interchange.ubc.ca Sat Sep 30 17:01:02 2006 From: jstano at interchange.ubc.ca (joe stano) Date: Sat, 30 Sep 2006 14:01:02 -0700 Subject: [SciPy-user] cobyla help Message-ID: <0J6F00F2HBPOPE@smtp.interchange.ubc.ca> Hi there, I am very new to python but have found myself in way over my head. I am trying to perform a constrained optimization using fmin_cobyla and am running into some problems. I was wondering if you may have some examples that I could review. Especially examples that use a list of constraint functions that take inputs from the function that is being optimized.
I am a novice who is just trying to learn about python and programming in general. I am working with numpy arrays, and when I run my program without the optimization everything seems to run ok. However, when I include the optimization and multiply 2 arrays together (element-wise, not using dot()), I get a (ValueError: shape mismatch: objects cannot be broadcast to a single shape) error message that confuses me, as I don't get it when I simply run the program and print out the results. When I exclude the optimization, and print just the array that I pass to the function, it looks like how I define it (N x N array). But when I include the optimization and print what I pass from the function before the code crashes, it no longer looks like how I define it (i.e., 1 x more than N array). Which is why I think I get the shape error. I know that I am calling the optimization wrong so I figured I would give you a snapshot of what I am trying to do, and maybe you can tell what I am doing wrong in calling the optimization. I have attached some of the code with a description of the arrays. Any advice or help you could give would be greatly appreciated. Thanks either way, joe -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: try_optimize_help.py Type: application/octet-stream Size: 2532 bytes Desc: not available URL: From coatimundi at cox.net Sat Sep 30 18:33:41 2006 From: coatimundi at cox.net (Kyle Ferrio) Date: Sat, 30 Sep 2006 15:33:41 -0700 Subject: [SciPy-user] Q: Building SciPy for Windows without Cygwin dependency? Message-ID: <451EF0C5.1010308@cox.net> I want to build a full-up SciPy for MS Windows in line with the Python and SciPy licenses. I noticed the following text in the excellent installation instructions, regarding the choice of compiler: "Cygwin is required if you want to build ATLAS yourself." Well, I often build ATLAS myself.
And now (implicitly) I want to build ATLAS in a way that is available to SciPy. So I have two questions: 1. Must ATLAS -- to be used with SciPy -- be built with a dependence on the GPL'ed Cygwin runtime library? (I can't imagine why this would be the case, but I want to test my understanding.) 2. Is the "need" for Cygwin actually just a way to get (extremely convenient) tools like 'ar' which do not infect the ATLAS lib with the GPL? Thanks! From ckkart at hoc.net Sat Sep 30 21:42:31 2006 From: ckkart at hoc.net (Christian Kristukat) Date: Sun, 01 Oct 2006 10:42:31 +0900 Subject: [SciPy-user] cobyla help In-Reply-To: <0J6F00F2HBPOPE@smtp.interchange.ubc.ca> References: <0J6F00F2HBPOPE@smtp.interchange.ubc.ca> Message-ID: <451F1D07.6070905@hoc.net> joe stano wrote: > I am a novice who is just trying to learn about python and programming in > general. I am working with numpy arrays, and when I run my program without > the optimization everything seems to run ok. However, when I include the > optimization and multiply 2 arrays together (element-wise, not using > dot()), I get a (ValueError: shape mismatch: objects cannot be broadcast to > a single shape) error message that confuses me, as I don't get it when I > simply run the program and print out the results. > > When I exclude the optimization, and print just the array that I pass to the > function, it looks like how I define it (N x N array). But when I include > the optimization and print what I pass from the function before the code > crashes, it no longer looks like how I define it (i.e., 1 x more than N > array). Which is why I think I get the shape error. I haven't run your code but the failure is probably due to passing the wrong arguments to fmin_cobyla. Usually the scalar minimizers take a 1-D array as input and expect the function to be minimized to return a single scalar value. The constraint functions too have to return a single value > 0 if the constraint is fulfilled, < 0 if not.
You get a flattened version of your arrays by calling numpy.ravel(). Later you may want to reshape it to its original shape using numpy.reshape(). Modify your script and try again; if it still doesn't work, come back here again. Some days ago I posted an example using fmin_cobyla to the list. Have a look at that, too. Regards, Christian From w.northcott at unsw.edu.au Sat Sep 30 22:12:57 2006 From: w.northcott at unsw.edu.au (Bill Northcott) Date: Sun, 1 Oct 2006 12:12:57 +1000 Subject: [SciPy-user] Pros and Cons of Python verses other array environments In-Reply-To: References: Message-ID: On 30/09/2006, at 5:23 PM, David Finlayson wrote: > control the .NET framework. So, I think that average engineers and > scientists are aware of Python the language and would be receptive > to a data > analysis package in Python so long as it was polished and well > done. For > example, the R statistical language has done a good job of > packaging up R > for Windows (I wish it were as well integrated into Gnome). I think this is a very good comparison. Python desperately needs something like CTAN, CPAN, CRAN to bring some order out of the current state of creative chaos. These repositories make it simple to extend environments like TeX, Perl and R. R itself has much in common with Pylab. Matlab is often used to do the sort of advanced stats that might also be done in R. R, however, has nice potted Windows and MacOS distributions. The MacOS package also includes all the necessary dependencies (X11, Tcl/Tk and gfortran) which are not in Apple's Developer Tools. Bill Northcott
Eggs are starting to appear as a package standard, and Enthought is working on an installer for Windows capable of downloading eggs from repositories (see http://code.enthought.com/enthon/enstaller.shtml for an alpha version). Gaël
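As a coda to the fmin_cobyla thread above, Christian's calling convention can be made concrete with a toy problem (this objective and constraint are illustrative only, not joe's actual model): the optimizer hands the objective a flat 1-D array, the objective returns one scalar, and each constraint function returns a value that is positive when the constraint is satisfied.

```python
import numpy as np
from scipy.optimize import fmin_cobyla

def objective(x):
    # Scalar value to minimize; x arrives as a flat 1-D array.
    return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2

def constraint(x):
    # >= 0 when satisfied: keep the solution inside x0 + x1 <= 2.
    return 2.0 - x[0] - x[1]

x0 = np.array([0.0, 0.0])
xopt = fmin_cobyla(objective, x0, [constraint], rhoend=1e-7, disp=0)
print(xopt)   # close to [1.5, 0.5]
```

If the parameters naturally live in an N x N array, ravel them into the x0 vector and reshape inside the objective, exactly as Christian suggests.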