From cimrman3 at ntc.zcu.cz Wed Jan 2 05:03:10 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 02 Jan 2008 11:03:10 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: <477B615E.1060702@ntc.zcu.cz> Nathan Bell wrote: > On Dec 27, 2007 9:11 PM, Jarrod Millman wrote: >>> I'd like to see arpack in the sparse folder (?) very fast as some of my code >>> would need a sparse solver (I proposed that it could be moved in a scikit >>> but it makes sense to keep it in scipy so that sparse solvers are available >>> in scipy). >> Yes, arpack should go into the sparse package. If you have the time, >> it would be great if you could help get it moved over. Ideally, we >> can get it moved into scipy.sparse before the 0.7 release around the >> end of March. > > How do you see sparse being structured? Currently sparse contains > only the sparse matrix classes and a handful of creation functions > (e.g. spdiags) and the iterative solvers live in > scipy.linalg.iterative. IMHO iterative solvers (eigen- or linear) do not care about the format of matrices they work with - all they need is the matrix action on a vector. Due to this I think they do not belong under scipy.sparse - they should work without change for dense matrices, or even for matrix-like objects that have only the matrix action (A*x) implemented. lobpcg.py works like that already - a user can pass in a dense/sparse matrix or a function. > It would be strange to put an eigensolver under sparse and iterative > methods for linear systems under linalg. Also, lobpcg should live > alongside arpack wherever they end up. I could imagine a structure > like: > > scipy.iterative.linear (for cg/gmres etc.) > scipy.iterative.eigen (for arpack/lobpcg etc.) +1. Maybe instead of 'iterative' I would use something like 'isolve(r(s))' to indicate their purpose better. regards, r. From wnbell at gmail.com Wed Jan 2 15:37:33 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 2 Jan 2008 14:37:33 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: <477B615E.1060702@ntc.zcu.cz> References: <477B615E.1060702@ntc.zcu.cz> Message-ID: On Jan 2, 2008 4:03 AM, Robert Cimrman wrote: > IMHO iterative solvers (eigen- or linear) do not care about the format > of matrices they work with - all they need is the matrix action on a > vector. Due to this I think they do not belong under scipy.sparse - they > should work without change for dense matrices, or even for matrix-like > objects that have only the matrix action (A*x) implemented. lobpcg.py > works like that already - a user can pass in a dense/sparse matrix or a > function. > > +1. Maybe instead of 'iterative' I would use something like > 'isolve(r(s))' to indicate their purpose better. Travis suggested creating scipy.splinalg to include the sparse solvers and other functionality. We could have:

splinalg.factor -- direct factorization methods (e.g. SuperLU)
splinalg.eigen  -- sparse eigensolvers (e.g. ARPACK, lobpcg)
splinalg.solve  -- sparse solvers for linear systems (e.g. CG, GMRES)

In the process we'd eliminate scipy.linsolve and scipy.linalg.iterative and move that code to splinalg.factor and splinalg.solve respectively. We could then draw a distinction between scipy.linalg -- dense linear algebra and splinalg -- sparse linear algebra. We could then move sparse functions spkron and spdiags to splinalg.construct or something like that.
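To make the matrix-action idea above concrete, here is a minimal sketch of an iterative solver that needs nothing beyond the product A*x. The hand-rolled conjugate gradient below is purely illustrative - it is not the proposed splinalg API, which at this point is only a proposal - and it assumes nothing about its operator except a matvec-style callable:

import numpy as np

def cg(matvec, b, x0=None, tol=1e-8, maxiter=200):
    # Conjugate gradients needing only the action x -> A*x.
    # matvec may be backed by a dense array, a sparse matrix,
    # or no stored matrix at all - the solver cannot tell.
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - matvec(x)
    p = r.copy()
    rs = np.dot(r, r)
    for _ in range(maxiter):
        Ap = matvec(p)
        alpha = rs / np.dot(p, Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = np.dot(r, r)
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# The same solver accepts a dense matrix, a sparse matrix or a plain
# function - exactly the flexibility lobpcg.py already offers:
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x1 = cg(lambda v: np.dot(A, v), b)              # dense action
x2 = cg(lambda v: np.array([4*v[0] + v[1],
                            v[0] + 3*v[1]]), b)  # matrix-free action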
I'm not married to any of the names above, so feel free to offer alternatives. I agree with your statement that such solvers should not be exclusively for sparse matrices. However it's important to set them apart from the dense solvers so new users can find the most appropriate solver for their needs. We should also explore the idea of having a scipy.gallery in the spirit of MATLAB's gallery function. This would be extremely helpful in illustrating non-trivial usage of the sparse module (in tutorials and docstrings) and would facilitate more interesting unittests. Any comments? -- Nathan Bell wnbell at gmail.com From ondrej at certik.cz Wed Jan 2 15:50:52 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Wed, 2 Jan 2008 21:50:52 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: <477B615E.1060702@ntc.zcu.cz> Message-ID: <85b5c3130801021250g652571b6ncd92a93d8954551c@mail.gmail.com> On Jan 2, 2008 9:37 PM, Nathan Bell wrote: > On Jan 2, 2008 4:03 AM, Robert Cimrman wrote: > > IMHO iterative solvers (eigen- or linear) do not care about the format > > of matrices they work with - all they need is the matrix action on a > > vector. Due to this I think they do not belong under scipy.sparse - they > > should work without change for dense matrices, or even for matrix-like > > objects that have only the matrix action (A*x) implemented. lobpcg.py > > works like that already - a user can pass in a dense/sparse matrix or a > > function. > > > > +1. Maybe instead of 'iterative' I would use something like > > 'isolve(r(s))' to indicate their purpose better. > > Travis suggested creating scipy.splinalg to include the sparse solvers > and other functionality. We could have: > > splinalg.factor -- direct factorization methods (e.g. SuperLU) > splinalg.eigen -- sparse eigensolvers (e.g. ARPACK, lobpcg) > splinalg.solve -- sparse solvers for linear systems (e.g. CG, GMRES) > > In the process we'd eliminate scipy.linsolve and > scipy.linalg.iterative and move that code to splinalg.factor and > splinalg.solve respectively. We could then draw a distinction between > scipy.linalg -- dense linear algebra and splinalg -- sparse linear > algebra. We could then move sparse functions spkron and spdiags to > splinalg.construct or something like that. > > I'm not married to any of the names above, so feel free to offer alternatives. > > I agree with your statement that such solvers should not be > exclusively for sparse matrices. However it's important to set them > apart from the dense solvers so new users can find the most > appropriate solver for their needs. > > We should also explore the idea of having a scipy.gallery in the > spirit of MATLAB's gallery function. This would be extremely helpful > in illustrating non-trivial usage of the sparse module (in tutorials > and docstrings) and would facilitate more interesting unittests. Any > comments? No comments, just agree with everything you said. Actually one comment - how could one plug more solvers to the above architecture? 
(pysparse comes to my mind) Ondrej From ondrej at certik.cz Wed Jan 2 15:51:41 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Wed, 2 Jan 2008 21:51:41 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: <85b5c3130801021250g652571b6ncd92a93d8954551c@mail.gmail.com> References: <477B615E.1060702@ntc.zcu.cz> <85b5c3130801021250g652571b6ncd92a93d8954551c@mail.gmail.com> Message-ID: <85b5c3130801021251k41c510c8p38ae9ad0efbafb03@mail.gmail.com> On Jan 2, 2008 9:50 PM, Ondrej Certik wrote: > > On Jan 2, 2008 9:37 PM, Nathan Bell wrote: > > On Jan 2, 2008 4:03 AM, Robert Cimrman wrote: > > > IMHO iterative solvers (eigen- or linear) do not care about the format > > > of matrices they work with - all they need is the matrix action on a > > > vector. Due to this I think they do not belong under scipy.sparse - they > > > should work without change for dense matrices, or even for matrix-like > > > objects that have only the matrix action (A*x) implemented. lobpcg.py > > > works like that already - a user can pass in a dense/sparse matrix or a > > > function. > > > > > > +1. Maybe instead of 'iterative' I would use something like > > > 'isolve(r(s))' to indicate their purpose better. > > > > Travis suggested creating scipy.splinalg to include the sparse solvers > > and other functionality. We could have: > > > > splinalg.factor -- direct factorization methods (e.g. SuperLU) > > splinalg.eigen -- sparse eigensolvers (e.g. ARPACK, lobpcg) > > splinalg.solve -- sparse solvers for linear systems (e.g. CG, GMRES) > > > > In the process we'd eliminate scipy.linsolve and > > scipy.linalg.iterative and move that code to splinalg.factor and > > splinalg.solve respectively. We could then draw a distinction between > > scipy.linalg -- dense linear algebra and splinalg -- sparse linear > > algebra. We could then move sparse functions spkron and spdiags to > > splinalg.construct or something like that. > > > > I'm not married to any of the names above, so feel free to offer alternatives. > > > > I agree with your statement that such solvers should not be > > exclusively for sparse matrices. However it's important to set them > > apart from the dense solvers so new users can find the most > > appropriate solver for their needs. > > > > We should also explore the idea of having a scipy.gallery in the > > spirit of MATLAB's gallery function. This would be extremely helpful > > in illustrating non-trivial usage of the sparse module (in tutorials > > and docstrings) and would facilitate more interesting unittests. Any > > comments? > > No comments, just agree with everything you said. > > Actually one comment - how could one plug more solvers to the above > architecture? > (pysparse comes to my mind) and blzpack - that one is BSD, so it could even go to scipy. Ondrej From wnbell at gmail.com Wed Jan 2 16:15:36 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 2 Jan 2008 15:15:36 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: <85b5c3130801021251k41c510c8p38ae9ad0efbafb03@mail.gmail.com> References: <477B615E.1060702@ntc.zcu.cz> <85b5c3130801021250g652571b6ncd92a93d8954551c@mail.gmail.com> <85b5c3130801021251k41c510c8p38ae9ad0efbafb03@mail.gmail.com> Message-ID: On Jan 2, 2008 2:51 PM, Ondrej Certik wrote: > > No comments, just agree with everything you said. > > > > Actually one comment - how could one plug more solvers to the above > > architecture? 
> > (pysparse comes to my mind) I would think a JDSYM solver like the one in pysparse could live in splinalg.eigen > and blzpack - that one is BSD, so it could even go to scipy. Likewise for blzpack -- Nathan Bell wnbell at gmail.com From wnbell at gmail.com Wed Jan 2 18:34:45 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 2 Jan 2008 17:34:45 -0600 Subject: [SciPy-dev] matlab5 sparse IO Message-ID: First let me thank those who have contributed to MATLAB IO in scipy. I noticed that the Matlab4 sparse writer was a little slow, so I sped it up using the coo_matrix format. Anyway, I was going to do the same for the Matlab5 writer, but it seems to be broken. It appears that the V5 code was a copy/paste of the V4 method, which it shouldn't be. http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/io/matlab/mio5.py#L662 Looking at the V5 sparse reader: http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/io/matlab/mio5.py#L349 suggests that the CSC format should be used when writing, but the present code - even if it worked properly - appears to be writing out a triplet format (as is done in V4). I don't fully understand all the magic in the matlab IO code, but I'd be happy to help whoever maintains it in adding sparse V5 support. Here's a simple test that currently fails:

from scipy import *
from scipy.sparse import *
from scipy.io import *
from numpy.testing import *

A = csr_matrix([[1.0,0.0,2.0],[0,-3.5,0]])

savemat('test4.mat', { 'A': A }, format='4')
A4 = loadmat('test4.mat')['A']
assert_equal(A.todense(),A4.todense())

savemat('test5.mat', { 'A': A }, format='5')
A5 = loadmat('test5.mat')['A']
assert_equal(A.todense(),A5.todense())

Note that the failure in savemat(... format='5') is currently just a typo (N is used as a variable and for numpy), but I think there are more fundamental issues as well. -- Nathan Bell wnbell at gmail.com From matthew.brett at gmail.com Wed Jan 2 18:58:58 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 2 Jan 2008 15:58:58 -0800 Subject: [SciPy-dev] matlab5 sparse IO In-Reply-To: References: Message-ID: <1e2af89e0801021558y10f49734o51326e6b9f919d3e@mail.gmail.com> Hi Nathan, > I noticed that the Matlab4 sparse writer was a little slow, so I sped > it up using the coo_matrix format. Anyway, I was going to do the same > for the Matlab5 writer, but it seems to be broken. It appears that > the V5 code was a copy/paste of the V4 method, which it shouldn't be. > http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/io/matlab/mio5.py#L662 I'm sorry - this was my fault - I did most of the initial rewrite of the code, but haven't been paying it enough attention since then - so thanks for your reminder. > I don't fully understand all the magic in the matlab IO code, but I'd > be happy to help whoever maintains it in adding sparse V5 support. Thanks - I am sure that will be useful. At the moment I'm trying to recover from flu, and porting the scipy tests over to nose, an intriguing combination. But it would be good to get onto this soon. Will anyone else have time to join in here? Stefan? 
Thanks again, Matthew From wnbell at gmail.com Wed Jan 2 19:11:20 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 2 Jan 2008 18:11:20 -0600 Subject: [SciPy-dev] matlab5 sparse IO In-Reply-To: <1e2af89e0801021558y10f49734o51326e6b9f919d3e@mail.gmail.com> References: <1e2af89e0801021558y10f49734o51326e6b9f919d3e@mail.gmail.com> Message-ID: On Jan 2, 2008 5:58 PM, Matthew Brett wrote: > > I'm sorry - this was my fault - I did most of the initial rewrite of > the code, but haven't been paying it enough attention since then - so > thanks for your reminder. > > Thanks - I am sure that will be useful. At the moment I'm trying to > recover from flu, and porting the scipy tests over to nose, an > intriguing combination. But it would be good to get onto this soon. > > Will anyone else have time to join in here? Stefan? Those are definitely higher priorities, so I'm willing to wait :) The V5 write support is not especially urgent since V4 writing (the default) seems to work. The user would have to specify V5 in order to produce the error. V5 reading seems to work also. -- Nathan Bell wnbell at gmail.com From stefan at sun.ac.za Wed Jan 2 19:33:35 2008 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 3 Jan 2008 02:33:35 +0200 Subject: [SciPy-dev] matlab5 sparse IO In-Reply-To: <1e2af89e0801021558y10f49734o51326e6b9f919d3e@mail.gmail.com> References: <1e2af89e0801021558y10f49734o51326e6b9f919d3e@mail.gmail.com> Message-ID: <20080103003335.GJ5753@mentat.za.net> On Wed, Jan 02, 2008 at 03:58:58PM -0800, Matthew Brett wrote: > Hi Nathan, > > > I noticed that the Matlab4 sparse writer was a little slow, so I sped > > it up using the coo_matrix format. Anyway, I was going to do the same > > for the Matlab5 writer, but it seems to be broken. It appears that > > the V5 code was a copy/paste of the V4 method, which it shouldn't be. > > http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/io/matlab/mio5.py#L662 > > I'm sorry - this was my fault - I did most of the initial rewrite of > the code, but haven't been paying it enough attention since then - so > thanks for your reminder. > > > I don't fully understand all the magic in the matlab IO code, but I'd > > be happy to help whoever maintains it in adding sparse V5 support. > > Thanks - I am sure that will be useful. At the moment I'm trying to > recover from flu, and porting the scipy tests over to nose, an > intriguing combination. But it would be good to get onto this soon. > > Will anyone else have time to join in here? Stefan? Sure, I can take a look tomorrow morning. To find the "more fundamental" issues Nathan referred to, I'll take a look at http://www.mathworks.com/access/helpdesk/help/pdf_doc/matlab/matfile_format.pdf Please let me know if there are any other relevant docs. I assume the reader functions correctly, since the unit tests pass and I see

testsparse_6.1_SOL2.mat
testsparse_6.5.1_GLNX86.mat
testsparse_7.1_GLNX86.mat
testsparsecomplex_6.1_SOL2.mat
testsparsecomplex_6.5.1_GLNX86.mat
testsparsecomplex_7.1_GLNX86.mat

in the test directory. Regards Stéfan From matthew.brett at gmail.com Wed Jan 2 19:54:24 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 2 Jan 2008 16:54:24 -0800 Subject: [SciPy-dev] matlab5 sparse IO In-Reply-To: <20080103003335.GJ5753@mentat.za.net> References: <1e2af89e0801021558y10f49734o51326e6b9f919d3e@mail.gmail.com> <20080103003335.GJ5753@mentat.za.net> Message-ID: <1e2af89e0801021654xa1f6fb2tf9f47e7bc7443806@mail.gmail.com> Hi, > Sure, I can take a look tomorrow morning. 
Excellent - thanks very much. > To find the "more fundamental" issues Nathan referred to, I'll take a look at >http://www.mathworks.com/access/helpdesk/help/pdf_doc/matlab/matfile_format.pdf That was my primary source. I think the V5 writing was a patch kindly provided - that I didn't check carefully enough. I had only put in the structure for it but didn't implement it myself. > I assume the reader functions correctly, since the unit tests pass and I see I believe it does - but it would be good to hear from anyone who can't read a matlab .mat file... Thanks a lot, Matthew From cimrman3 at ntc.zcu.cz Thu Jan 3 04:18:05 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 03 Jan 2008 10:18:05 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: <477B615E.1060702@ntc.zcu.cz> Message-ID: <477CA84D.6030307@ntc.zcu.cz> Nathan Bell wrote: > On Jan 2, 2008 4:03 AM, Robert Cimrman wrote: >> IMHO iterative solvers (eigen- or linear) do not care about the format >> of matrices they work with - all they need is the matrix action on a >> vector. Due to this I think they do not belong under scipy.sparse - they >> should work without change for dense matrices, or even for matrix-like >> objects that have only the matrix action (A*x) implemented. lobpcg.py >> works like that already - a user can pass in a dense/sparse matrix or a >> function. >> >> +1. Maybe instead of 'iterative' I would use something like >> 'isolve(r(s))' to indicate their purpose better. > > Travis suggested creating scipy.splinalg to include the sparse solvers > and other functionality. We could have: > > splinalg.factor -- direct factorization methods (e.g. SuperLU) > splinalg.eigen -- sparse eigensolvers (e.g. ARPACK, lobpcg) > splinalg.solve -- sparse solvers for linear systems (e.g. CG, GMRES) > > In the process we'd eliminate scipy.linsolve and > scipy.linalg.iterative and move that code to splinalg.factor and > splinalg.solve respectively. We could then draw a distinction between > scipy.linalg -- dense linear algebra and splinalg -- sparse linear > algebra. We could then move sparse functions spkron and spdiags to > splinalg.construct or something like that. Maybe the construction functions could live directly in the splinalg namespace? > I'm not married to any of the names above, so feel free to offer alternatives. The names are good. Now what to do with umfpack? The current situation when the wrappers are in linsolve and the solver is external is not ideal. I did it this way at time when there were no scikits and SuperLU solver included in scipy had not performed well for my matrices. Would it be a good scikit? Or we could contact Tim Davis... Like Ondrej said, more solvers are always needed. > I agree with your statement that such solvers should not be > exclusively for sparse matrices. However it's important to set them > apart from the dense solvers so new users can find the most > appropriate solver for their needs. Yes, it makes sense to separate dense and sparse solvers, at least until there is some notion of a sparse matrix in numpy and most relevant functions work for 2D dense arrays, dense matrices and sparse matrices. We could make a new sparse matrix-like class, which just implements a matrix action via a user-specified function. Is the name MatrixAction ok? Many solvers expect a particular sparse matrix format. 
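A minimal sketch of the matrix-like class just proposed. The name MatrixAction and the matvec/shape/rmatvec signature are the placeholders under discussion in this thread (the rmatvec hook for transposed products is suggested in the reply below); this is not an existing scipy class:

import numpy as np

class MatrixAction(object):
    # Wraps a user-specified function as a matrix-like object,
    # exposing only what the iterative solvers need: a shape and
    # the action of the operator on a vector. No entries stored.
    def __init__(self, matvec, shape, rmatvec=None):
        self.matvec = matvec      # x -> A*x
        self.rmatvec = rmatvec    # x -> A.T*x, when available
        self.shape = shape

    def __mul__(self, x):         # so that A * x also works
        return self.matvec(x)

# A 1D Laplacian applied stencil-wise, never assembling the matrix:
def laplacian_action(x):
    y = 2.0 * x
    y[:-1] -= x[1:]
    y[1:] -= x[:-1]
    return y

A = MatrixAction(laplacian_action, shape=(100, 100),
                 rmatvec=laplacian_action)  # symmetric: A.T*x == A*x
x = np.ones(100)
y = A * x  # applies the stencil; endpoints are 1, interior entries 0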
Apart from the fact that this should be properly documented, I would like to issue a warning when an implicit conversion of a sparse format occurs. This could be controlled by some warning-level or verbosity parameter. > We should also explore the idea of having a scipy.gallery in the > spirit of MATLAB's gallery function. This would be extremely helpful > in illustrating non-trivial usage of the sparse module (in tutorials > and docstrings) and would facilitate more interesting unittests. Any > comments? +1 I am sure Nils Wagner has lots of interesting matrices to test eigensolvers. :) r. From wnbell at gmail.com Thu Jan 3 04:54:07 2008 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 3 Jan 2008 03:54:07 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: <477CA84D.6030307@ntc.zcu.cz> References: <477B615E.1060702@ntc.zcu.cz> <477CA84D.6030307@ntc.zcu.cz> Message-ID: On Jan 3, 2008 3:18 AM, Robert Cimrman wrote: > Maybe the construction functions could live directly in the splinalg > namespace? That's possible. I can only think of a few functions, so a submodule is probably unnecessary. > Now what to do with umfpack? The current situation when the wrappers are > in linsolve and the solver is external is not ideal. I did it this way > at time when there were no scikits and SuperLU solver included in scipy > had not performed well for my matrices. Would it be a good scikit? Or we > could contact Tim Davis... I think it would make a good scikit. It seems highly unlikely that Tim Davis would make a license change for us, but it's worth asking. Perhaps he'd put an older version in the public domain? Are there BSD licensed sparse LU codes that are superior to SuperLU? > We could make a new sparse matrix-like class, which just implements a > matrix action via a user-specified function. Is the name MatrixAction ok? Currently, the iterative solvers only require a matvec(x) method and perhaps a .shape attribute. I suppose you mean a class: MatrixAction(matvec, shape, rmatvec=None) that's essentially a dummy object which satisfies our expectations in the sparse solvers? I added rmatvec because some iterative solvers require matrix vector products by the transpose. I'm not keen on the name, but the idea is quite good. It's easier to document MatrixAction once than describing the expected interface (i.e. .matvec() and .shape) in each iterative solver's docstring. > Many solvers expect a particular sparse matrix format. 
The current situation when the wrappers are >> in linsolve and the solver is external is not ideal. I did it this way >> at time when there were no scikits and SuperLU solver included in scipy >> had not performed well for my matrices. Would it be a good scikit? Or we >> could contact Tim Davis... > > I think it would make a good scikit. > > It seems highly unlikely that Tim Davis would make a license change > for us, but it's worth asking. Perhaps he'd put an older version in > the public domain? Yeah, even an older version would be good. But as every version gets (significantly) faster than the previous ones, a scikit should exist for the latest version. > Are there BSD licensed sparse LU codes that are superior to SuperLU? This could be a good project for a student to find out... >> We could make a new sparse matrix-like class, which just implements a >> matrix action via a user-specified function. Is the name MatrixAction ok? > > Currently, the iterative solvers only require a matvec(x) method and > perhaps a .shape attribute. I suppose you mean a class: > MatrixAction(matvec, shape, rmatvec=None) > that's essentially a dummy object which satisfies our expectations in > the sparse solvers? I added rmatvec because some iterative solvers > require matrix vector products by the transpose. > > I'm not keen on the name, but the idea is quite good. It's easier to > document MatrixAction once than describing the expected interface > (i.e. .matvec() and .shape) in each iterative solver's docstring. DummyMatrix/DummySparseMatrix then? I am bad at inventing names. >> Many solvers expect a particular sparse matrix format. Apart from the >> fact that this should be properly documented, I would like to issue a >> warning when an implicit conversion of a sparse format occurs. This >> could be controlled by some warning-level or verbosity parameter. > > I've added a SparseEfficiencyWarning for people who construct matrices > using the CSR/CSC formats. I suppose we could do something similar > for functions which expect a particular format. > > Should we use SparseEfficiencyWarning for this situation, or should a > new Warning be created? SparseEfficiencyWarning is perfect! Thanks for adding this. r. From matthew.brett at gmail.com Thu Jan 3 12:46:18 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 3 Jan 2008 09:46:18 -0800 Subject: [SciPy-dev] stat.models.robust.scale test failure Message-ID: <1e2af89e0801030946h2fa74688y2d76c520dc46d2fe@mail.gmail.com> Hi, I guess this one's mainly for Jonathan... I'm just moving the scipy testing framework over to nose testing, and came across a couple of test failures in scipy.stats.models in tests that the numpy testing did not appear to pick up. Output appended. Is the fix obvious? I will have a go later today if nothing comes immediately to mind... Thanks a lot, Matthew ====================================================================== ERROR: test_huber (test_scale.TestScale) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/mb312/dev_trees/testing_cleanup/scipy/stats/models/tests/test_scale.py", line 35, in test_huber m = scale.huber(X) File "/home/mb312/dev_trees/testing_cleanup/scipy/stats/models/robust/scale.py", line 82, in __call__ for donothing in self: File "/home/mb312/dev_trees/testing_cleanup/scipy/stats/models/robust/scale.py", line 102, in next scale = N.sum(subset * (a - mu)**2, axis=self.axis) / (self.n * Huber.gamma - N.sum(1. 
- subset, axis=self.axis) * Huber.c**2) File "/home/mb312/lib64/python2.5/site-packages/numpy/core/fromnumeric.py", line 866, in sum return sum(axis, dtype, out) TypeError: only length-1 arrays can be converted to Python scalars ---------------------------------------------------------------------- Ran 42 tests in 2.012s From ondrej at certik.cz Thu Jan 3 13:42:12 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 3 Jan 2008 19:42:12 +0100 Subject: [SciPy-dev] stat.models.robust.scale test failure In-Reply-To: <1e2af89e0801030946h2fa74688y2d76c520dc46d2fe@mail.gmail.com> References: <1e2af89e0801030946h2fa74688y2d76c520dc46d2fe@mail.gmail.com> Message-ID: <85b5c3130801031042q2bdb969l6bbdc3d86e2b6c0f@mail.gmail.com> On Jan 3, 2008 6:46 PM, Matthew Brett wrote: > Hi, > > I guess this one's mainly for Jonathan... > > I'm just moving the scipy testing framework over to nose testing, and That's very cool, I welcome that! Ondrej From stefan at sun.ac.za Thu Jan 3 14:36:10 2008 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 3 Jan 2008 21:36:10 +0200 Subject: [SciPy-dev] matlab5 sparse IO In-Reply-To: References: Message-ID: <20080103193610.GA21502@mentat.za.net> On Wed, Jan 02, 2008 at 05:34:45PM -0600, Nathan Bell wrote: > I noticed that the Matlab4 sparse writer was a little slow, so I sped > it up using the coo_matrix format. Anyway, I was going to do the same > for the Matlab5 writer, but it seems to be broken. It appears that > the V5 code was a copy/paste of the V4 method, which it shouldn't be. > > http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/io/matlab/mio5.py#L662 This should now be fixed (r3770). Please give it a whirl and let me know. Cheers Stéfan From wnbell at gmail.com Thu Jan 3 17:29:32 2008 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 3 Jan 2008 16:29:32 -0600 Subject: [SciPy-dev] matlab5 sparse IO In-Reply-To: <20080103193610.GA21502@mentat.za.net> References: <20080103193610.GA21502@mentat.za.net> Message-ID: On Jan 3, 2008 1:36 PM, Stefan van der Walt wrote: > This should now be fixed (r3770). Please give it a whirl and let me > know. Thanks for clearing this up Stefan. I see two remaining issues: sorted indices and non-double data types. MATLAB seems to expect sorted row indices, which can be fixed with a simple:

A = self.arr.tocsc()
A.sort_indices()

I have committed this change to SVN. When loading 'byte.mat' and 'single.mat' produced by the script below I get:

>> load byte
>> A
A =
   1.0e-316 *
   (1,1)   0.8322
   (2,1)   0
   (3,1)   0
   (4,1)   0
>> load single
>> A
A =
   (1,1)   0.0078
   (2,1)   0.0078
   (3,1)   0
   (4,1)   0

So something is amiss with the data array. 
It appears to be loading the data as double precision no matter what dtype is used. Interestingly, scipy's loadmat() works properly on byte.mat and single.mat. I'm using MATLAB R2007b (linux,32-bit).

from scipy import *
from scipy.sparse import *
from scipy.io import savemat

indptr = array([0, 4])
indices = array([3, 2, 1, 0])
data = array([ 1., 1., 1., 1.])

A = csc_matrix((data,indices,indptr),shape=(4,1))
savemat('unsorted.mat',{'A' : A }, format='5')

A.sort_indices()
savemat('sorted.mat',{'A' : A }, format='5')

A = A.astype('int8')
savemat('byte.mat',{'A' : A }, format='5')

A = A.astype('f4')
savemat('single.mat',{'A' : A }, format='5')

-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From stefan at sun.ac.za Thu Jan 3 19:30:41 2008 From: stefan at sun.ac.za (Stefan van der Walt) Date: Fri, 4 Jan 2008 02:30:41 +0200 Subject: [SciPy-dev] matlab5 sparse IO In-Reply-To: References: <20080103193610.GA21502@mentat.za.net> Message-ID: <20080104003041.GA2271@mentat.za.net> Hi Nathan On Thu, Jan 03, 2008 at 04:29:32PM -0600, Nathan Bell wrote: > I see two remaining issues: sorted indices and non-double data > types. I fixed a problem with the array labelling, but I doubt whether that would solve your problem (it does permit me to load the files into Octave, but the values are still not correct). I'll take another look tomorrow. Regards Stéfan From nwagner at iam.uni-stuttgart.de Fri Jan 4 06:16:06 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 04 Jan 2008 12:16:06 +0100 Subject: [SciPy-dev] Trouble with fmin_cg Message-ID: Hi all, If I run the following script

from scipy import *

def func(x):
    return 0.5*dot(x,dot(A,x))-dot(x,b)

def monitor(x):
    res = linalg.norm(dot(A,x)-b)
    return res

n = 10
x0 = zeros(n,float)
A = random.rand(n,n)+diag(4*ones(n))
A = 0.5*(A+A.T)
b = random.rand(n)

x = optimize.fmin_cg(func,x0,callback=monitor(x0))

I get Traceback (most recent call last): File "test_fmin_cg.py", line 16, in x = optimize.fmin_cg(func,x0,callback=monitor(x0)) File "/usr/local/lib64/python2.5/site-packages/scipy/optimize/optimize.py", line 911, in fmin_cg callback(xk) TypeError: 'numpy.float64' object is not callable Is that a bug ? Nils From dmitrey.kroshko at scipy.org Fri Jan 4 06:23:00 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Fri, 04 Jan 2008 13:23:00 +0200 Subject: [SciPy-dev] Trouble with fmin_cg In-Reply-To: References: Message-ID: <477E1714.5050007@scipy.org> I guess monitor(x0) is real number, not function. Hence it yields error. Maybe you need fmin_cg(func,x0,callback=monitor) ? BTW there is a way of connecting user-defined output func in openopt, I can explain you if you are interested (I haven't provided an example to OO documentation yet). Regards, D. Nils Wagner wrote: > Hi all, > > If I run the following script > > from scipy import * > > def func(x): > return 0.5*dot(x,dot(A,x))-dot(x,b) > > def monitor(x): > res = linalg.norm(dot(A,x)-b) > return res > > n = 10 > x0 = zeros(n,float) > A = random.rand(n,n)+diag(4*ones(n)) > A = 0.5*(A+A.T) > b = random.rand(n) > > x = optimize.fmin_cg(func,x0,callback=monitor(x0)) > > I get > > Traceback (most recent call last): > File "test_fmin_cg.py", line 16, in > x = optimize.fmin_cg(func,x0,callback=monitor(x0)) > File > "/usr/local/lib64/python2.5/site-packages/scipy/optimize/optimize.py", > line 911, in fmin_cg > callback(xk) > TypeError: 'numpy.float64' object is not callable > > Is that a bug ? 
> > Nils > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > From nwagner at iam.uni-stuttgart.de Fri Jan 4 06:31:41 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 04 Jan 2008 12:31:41 +0100 Subject: [SciPy-dev] Trouble with fmin_cg In-Reply-To: <477E1714.5050007@scipy.org> References: <477E1714.5050007@scipy.org> Message-ID: On Fri, 04 Jan 2008 13:23:00 +0200 dmitrey wrote: > I guess monitor(x0) is real number, not function. > Hence it yields error. > Maybe you need > > fmin_cg(func,x0,callback=monitor) ? > > BTW there is a way of connecting user-defined output >func in openopt, I > can explain you if you are interested (I haven't >provided an example to > OO documentation yet). > > Regards, D. > > Nils Wagner wrote: >> Hi all, >> >> If I run the following script >> >> from scipy import * >> >> def func(x): >> return 0.5*dot(x,dot(A,x))-dot(x,b) >> >> def monitor(x): >> res = linalg.norm(dot(A,x)-b) >> return res >> >> n = 10 >> x0 = zeros(n,float) >> A = random.rand(n,n)+diag(4*ones(n)) >> A = 0.5*(A+A.T) >> b = random.rand(n) >> >> x = optimize.fmin_cg(func,x0,callback=monitor(x0)) >> >> I get >> >> Traceback (most recent call last): >> File "test_fmin_cg.py", line 16, in >> x = optimize.fmin_cg(func,x0,callback=monitor(x0)) >> File >> "/usr/local/lib64/python2.5/site-packages/scipy/optimize/optimize.py", >> line 911, in fmin_cg >> callback(xk) >> TypeError: 'numpy.float64' object is not callable >> >> Is that a bug ? >> >> Nils >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev >> >> >> >> > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev Hi Dmitrey, Thank you for your prompt reply. It would be very kind of you if you could supply an OO example. Cheers Nils From nwagner at iam.uni-stuttgart.de Fri Jan 4 07:11:03 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 04 Jan 2008 13:11:03 +0100 Subject: [SciPy-dev] Pending tickets and roadmap 0.7 Message-ID: Hi all, The tickets 572, 575 and 576 can be closed. http://projects.scipy.org/scipy/scipy/query?status=new&status=assigned&status=reopened&milestone=0.7 Cheers, Nils From nwagner at iam.uni-stuttgart.de Fri Jan 4 09:02:40 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 04 Jan 2008 15:02:40 +0100 Subject: [SciPy-dev] Pending tickets and roadmap 0.7 In-Reply-To: References: Message-ID: On Fri, 04 Jan 2008 13:11:03 +0100 "Nils Wagner" wrote: > Hi all, > > The tickets 572, 575 and 576 can be closed. > > http://projects.scipy.org/scipy/scipy/query?status=new&status=assigned&status=reopened&milestone=0.7 > > Cheers, > Nils > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev How about tickets 569 and 314 ? AFAIK read_array is deprecated. Cheers Nils From jre at enthought.com Fri Jan 4 15:48:55 2008 From: jre at enthought.com (J. Ryan Earl) Date: Fri, 04 Jan 2008 14:48:55 -0600 Subject: [SciPy-dev] planet.scipy.org Message-ID: <477E9BB7.2000100@enthought.com> There was a problem with how the DNS was setup for this site, it has been corrected. There are no known outstanding DNS issues. Please tell me ASAP if you notice any problems. Thanks, J. 
Ryan Earl IT Administrator Enthought, Inc. From millman at berkeley.edu Fri Jan 4 16:00:14 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Fri, 4 Jan 2008 13:00:14 -0800 Subject: [SciPy-dev] [Enthought-dev] planet.scipy.org In-Reply-To: <477E9BB7.2000100@enthought.com> References: <477E9BB7.2000100@enthought.com> Message-ID: On Jan 4, 2008 12:48 PM, J. Ryan Earl wrote: > There was a problem with how the DNS was setup for this site, it has > been corrected. Thanks for getting this fixed! -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From fperez.net at gmail.com Mon Jan 7 02:18:44 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 7 Jan 2008 00:18:44 -0700 Subject: [SciPy-dev] Sage/Scipy Days 8 at Enthought: Feb 29/March 4 2008 Message-ID: Hi all, below is the full text of the announcement, which has also been posted here: http://wiki.sagemath.org/days8 Many thanks to Enthought for the generous support they've offered! We really look forward to this meeting being a great opportunity for collaboration between the Scipy and Sage teams. Cheers, Travis Oliphant William Stein Fernando Perez

================================
 Sage/Scipy Days 8 at Enthought
================================

-------------------------------------------------------
Connecting Pure Mathematics With Scientific Computation
-------------------------------------------------------

.. Contents::
..
   1  Introduction
   2  Location
   3  Costs and Funding
   4  Contacts and further information
   5  Preliminary agenda
      5.1  Friday 29: Talks
      5.2  Saturday 1: More talks/code planning/coding
      5.3  Sunday 2: Coding
      5.4  Monday 3: Coding
      5.5  Tuesday 4: Wrapup
..

Introduction
============

The Sage_ and Scipy_ teams and `Enthought Inc.`_ are pleased to announce the first collaborative meeting for Sage/Scipy joint development, to be held from February 29 until March 4, 2008 at the Enthought headquarters in Austin, Texas. The purpose of this meeting is to gather developers for these projects in a friendly atmosphere for a few days of technical talks and active development work. It should be clear to those interested in attending that this is *not* an academic conference, but instead an opportunity for the two teams to find common points for collaboration, joint work, better integration between the projects and future directions. The focus of the workshop will be to actually *implement* such ideas, not just to plan for them. **Sage** is a Python-based system which aims at providing an open source, free alternative to existing proprietary mathematical software and does so by integrating multiple open source projects, as well as providing its own native functionality in many areas. It includes by default the NumPy and SciPy packages. **NumPy and SciPy** are Python libraries whose focus is high-performance numerical computing, and they are widely accepted as the foundation of most Python-based scientific computing projects. **Enthought** is a scientific computing company that produces Python-based tools for many application-specific domains. Enthought has a strong commitment to open source development: it provides support and hosting for Numpy, Scipy, and many other Python scientific projects and many of its tools_ are freely available. The theme of the workshop is finding ways to best combine our strengths to create something that is significantly better than anything ever done so far.

.. _Sage: http://sagemath.org
.. _Scipy: http://scipy.org
.. _`Enthought Inc.`: http://enthought.com
.. _tools: http://code.enthought.com

Location
========

The workshop will be held at the headquarters of `Enthought Inc.`_::

  Suite 2100
  515 Congress Ave.
  Austin, TX 78701
  (512) 536-1057, voice
  (512) 536-1059, fax

.. _`Enthought Inc.`: http://enthought.com

Costs and Funding
=================

We can accommodate a total of 30 attendees at Enthought's headquarters for the meeting. There is a $100 registration fee, which will be used to cover coffee, snacks and lunches for all the days of the meeting, plus one group dinner outing. Attendees can use the wiki_ to coordinate room/car rental sharing if they so desire. Thanks to Enthought's generous offer of support, we'll be able to cover the above costs for 15 attendees, in addition to offering them housing and transportation. Please note that housing will be provided at Enthought's personal residences, so remember to bring your clean pajamas. We are currently looking into the possibility of additional funding to cover the registration fee for all attendees, and will update the wiki accordingly if that becomes possible. If you plan on coming please email Fernando.Perez at colorado.edu to let us know of your intent so we can have a better idea of the total numbers. Please indicate if you could only come under the condition that you can be hosted. We will try to offer hosting to as many of the Sage and Scipy developers as possible, but if you can fund your own expenses, this may open a slot for someone with limited funds. If the total attendance is below 15, we will offer hosting to everyone. We will close registration for those requesting hosting by Sunday, February 3, 2008. If we actually fill up all 30 available slots we will announce it, otherwise you are free to attend by letting us know anytime before the meeting, though past Feb. 1 you will be required to pay the registration fee of $100.

.. _wiki: http://wiki.sagemath.org/days8

Contacts and further information
================================

For further information, you can either contact one of the following people (in parentheses we note the topic most likely to be relevant to them):

- William Stein (Sage): wstein at gmail.com
- Fernando Perez (Scipy): Fernando.Perez at colorado.edu
- Travis Oliphant (Enthought): oliphant at enthought.com

or you can go to our wiki_ for up to date details.

Preliminary agenda
==================

Friday 29: Talks
----------------

This is a rough cut of suggested topics, along with a few notes on possible details that might be of interest. The actual material in the talks will be up to the presenters, of course. Some of these topics might just become projects to work on rather than actual talks, if we don't have a speaker available but have interested parties who wish to focus on the problem. Speakers are asked to include a slide of where they see any chances for better collaboration between the various projects (to the best of their knowledge). There will be a note-taker during the day who will try to keep tabs on this information and will summarize it as starting material for the joint work panel discussion to be held on Saturday (FPerez volunteers for this task if needed).

- Numpy internal architecture and type system.
- Sage internal type system, with emphasis on its number type system.
- A clarification of where the 'sage language' goes beyond python. Things like ``A\b`` are valid in the CLI but not the notebook. Scipy is pure python, so it would help the scipy team better understand the boundaries between the two.
- Special methods used by Sage (foo._magical_sage_method_)? If some of these make sense, we might want to agree on common protocols for numpy/scipy/sage objects to honor.
- Sage usage of numpy, sage.matrix vs numpy.arrays. Smoother integration of numpy arrays/sage matrices and vectors.
- Extended precision LAPACK. Integration in numpy/sage. The extended precision LAPACK work was done by Y. Hida and J. Demmel at UC Berkeley.
- Distributed/Parallel computing: DSage, ipython, Brian Granger's work on Global arrays for NASA...
- Scikits: these are 'toolkits' that use numpy/scipy and can contain GPL code (details of how these will work are being firmed up in the scipy lists, and will be settled by the workshop). Perhaps some of SAGE's library wrappers (like GMP, MPFR or GSL) could become scikits?
- Cython: status (inclusion in py2.6?), overview, opportunities for better numpy integration and usage.
- Enthought technologies: Traits, TVTK, Mayavi, Chaco, Envisage.
- User interface collaboration: 'sage-lite'/pylab/ipython code sharing possibilities?

Saturday 1: More talks/code planning/coding
-------------------------------------------

9-11 am: Any remaining talks that didn't fit on Friday. Only if needed.
11-12: panel for specific coding projects and ideas, spill over into lunch time.
12-1: lunch.
Rest of day: start coding! Organize in teams according to the plans made earlier and code away...

Sunday 2: Coding
----------------

Work on projects decided above.
5-6pm: brief (5-10 minutes) status updates from coding teams. Problems encountered, progress, suggestions for adjustment.

Monday 3: Coding
----------------

Same as Sunday.

Tuesday 4: Wrapup
-----------------

9-11 am: Wrapup sessions with summary from coding projects.
11-12 am: Panel discussion on future joint work options.
Afternoon: anyone left around can continue to code!

From david.huard at gmail.com Mon Jan 7 23:19:53 2008 From: david.huard at gmail.com (David Huard) Date: Mon, 7 Jan 2008 23:19:53 -0500 Subject: [SciPy-dev] Pending tickets and roadmap 0.7 In-Reply-To: References: Message-ID: <91cf711d0801072019k12778d42u702ed97e55fe2de2@mail.gmail.com> Ticket 422 can be closed too, although I'd appreciate it if someone could confirm the solution is sound (see revision 3797 and explanations in said ticket). 2008/1/4, Nils Wagner : > On Fri, 04 Jan 2008 13:11:03 +0100 > "Nils Wagner" wrote: > > Hi all, > > > > The tickets 572, 575 and 576 can be closed. > > > > > http://projects.scipy.org/scipy/scipy/query?status=new&status=assigned&status=reopened&milestone=0.7 > > > > Cheers, > > Nils > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-dev > > How about tickets 569 and 314 ? > AFAIK read_array is deprecated. > > Cheers > > Nils > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Linda.Gilbert at WPAFB.AF.MIL Tue Jan 8 13:57:07 2008 From: Linda.Gilbert at WPAFB.AF.MIL (Gilbert, Linda M CTR USAF AFRL/RXOC) Date: Tue, 8 Jan 2008 13:57:07 -0500 Subject: [SciPy-dev] Scipy, numpy and Abaqus Message-ID: <2B00361EE3107A4F88383EC1B041DC9A02D71466@VFOHMLAO01.Enterprise.afmc.ds.af.mil> > _____________________________________________ > From: Gilbert, Linda M CTR USAF AFRL/MLOC > Sent: Thursday, January 03, 2008 11:01 AM > To: 'scipy-dev at scipy.org'; 'pearu at cens.ioc.ee' > Subject: Scipy, numpy and Abaqus > > Hello, > > Do you know how to get numpy and scipy to work from within abaqus? > > Thank you in advance, > > Linda Gilbert > AFRL/RXOC > RCF Information Systems > 937-904-7632 > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nwagner at iam.uni-stuttgart.de Tue Jan 8 14:01:34 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Jan 2008 20:01:34 +0100 Subject: [SciPy-dev] Scipy, numpy and Abaqus In-Reply-To: <2B00361EE3107A4F88383EC1B041DC9A02D71466@VFOHMLAO01.Enterprise.afmc.ds.af.mil> References: <2B00361EE3107A4F88383EC1B041DC9A02D71466@VFOHMLAO01.Enterprise.afmc.ds.af.mil> Message-ID: On Tue, 8 Jan 2008 13:57:07 -0500 "Gilbert, Linda M CTR USAF AFRL/RXOC" wrote: > > >> _____________________________________________ >> From: Gilbert, Linda M CTR USAF AFRL/MLOC >> Sent: Thursday, January 03, 2008 11:01 AM >> To: 'scipy-dev at scipy.org'; 'pearu at cens.ioc.ee' >> Subject: Scipy, numpy and Abaqus >> >> Hello, >> >> Do you know how to get numpy and scipy to work from >>within abaqus? >> >> Thank you in advance, >> >> Linda Gilbert >> AFRL/RXOC >> RCF Information Systems >> 937-904-7632 >> >> Interesting question. Let me add another question wrt Abaqus. I know that you can export element matrices in MatrixMarket (*.mtx) format. How about system matrices ? Nils From matthew.brett at gmail.com Tue Jan 8 20:15:23 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 8 Jan 2008 17:15:23 -0800 Subject: [SciPy-dev] Nose testing branch - call for - er - testing Message-ID: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> Hi, I've been working for a bit on replacing numpy testing with the nose testing framework: http://somethingaboutorange.com/mrl/projects/nose/ Fernando and the guys started this at the recent Berkeley sprint, and I've since been slogging through. I've now done most of the work, and I was hoping some of you would check out the branch and see what you find: http://projects.scipy.org/scipy/scipy/browser/branches/testing_cleanup/ As before, you can run module level tests with (e.g) scipy.io.test() Now the arguments are a bit different, and select the tests using the nose framework, such as: scipy.io.test('bench') select the tests labelled with the benchmark decorators, this: scipy.io.test('fast') corresponds to the default, and this scipy.io.test('full', verbose=10) to the usual test(10, 10) idiom. You'll see hints on the nose stuff in the scipy.testing module, and the scipy.sandbox.exmplpackage module. Any thoughts from the bravest out there? 
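For orientation, a rough sketch of what a test module might look like under this scheme; the import and the dec.slow label below are illustrative guesses at the branch's API, not a confirmed interface:

from scipy.testing import *  # assumed to expose the label decorators as 'dec'

def test_basic_roundtrip():
    # a plain function with 'test' in the name is collected by
    # the default 'fast' level - no TestCase boilerplate needed
    assert 1 + 1 == 2

@dec.slow  # hypothetical label: picked up only by test('full')
def test_exhaustive_roundtrip():
    for i in range(1000):
        assert i == i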
Matthew From david at ar.media.kyoto-u.ac.jp Wed Jan 9 03:59:01 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Wed, 09 Jan 2008 17:59:01 +0900 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> Message-ID: <47848CD5.6060902@ar.media.kyoto-u.ac.jp> Matthew Brett wrote: > Hi, > > I've been working for a bit on replacing numpy testing with the nose > testing framework: > > http://somethingaboutorange.com/mrl/projects/nose/ > > Fernando and the guys started this at the recent Berkeley sprint, and > I've since been slogging through. > > I've now done most of the work, and I was hoping some of you would > check out the branch and see what you find: > > http://projects.scipy.org/scipy/scipy/browser/branches/testing_cleanup/ > > As before, you can run module level tests with (e.g) > > scipy.io.test() > > Now the arguments are a bit different, and select the tests using the > nose framework, such as: > > scipy.io.test('bench') > > select the tests labelled with the benchmark decorators, this: > > scipy.io.test('fast') > > corresponds to the default, and this > > scipy.io.test('full', verbose=10) > > to the usual test(10, 10) idiom. > > You'll see hints on the nose stuff in the scipy.testing module, and > the scipy.sandbox.exmplpackage module. > I quickly tried it, and it worked (on Ubuntu with gcc). What are the main advantages of using nose ? Does it make the actual testing framework simpler, or are there any other advantages I am not aware of ? cheers, David From matthew.brett at gmail.com Wed Jan 9 04:29:31 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 9 Jan 2008 01:29:31 -0800 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: <47848CD5.6060902@ar.media.kyoto-u.ac.jp> References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> <47848CD5.6060902@ar.media.kyoto-u.ac.jp> Message-ID: <1e2af89e0801090129n519e91fcid696080248ad216c@mail.gmail.com> Hi, > > You'll see hints on the nose stuff in the scipy.testing module, and > > the scipy.sandbox.exmplpackage module. > > > I quickly tried it, and it worked (on Ubuntu with gcc). What are the > main advantages of using nose ? Does it make the actual testing > framework simpler, or are there any other advantages I am not aware of ? Others could probably step in better than I can - but the main advantages are:

The testing framework is standard and well-maintained.
New tests have minimal overhead - as simple as writing a function with 'test' in the name
It will do doctests, if asked.
It's got a flexible test selection system from the command line
It does parametric tests rather nicely - see http://projects.scipy.org/scipy/scipy/browser/branches/testing_cleanup/scipy/sandbox/exmplpackage/tests/test_foo.py

At the moment, the main difference is only the clearing out of the numpy testing framework specific stuff. New tests should be simpler to write with less overhead. 
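As a concrete illustration of the parametric style: a nose test generator yields callable/argument pairs, and each pair runs as a separate test case. This is standard nose behaviour; the sketch below is written from scratch rather than copied from the test_foo.py linked above:

import numpy as np

def check_roundtrip(dtype):
    a = np.arange(10, dtype=dtype)
    assert (a + 0 == a).all()

def test_all_dtypes():
    # nose runs one test per yield, reporting each separately
    for dtype in ['int8', 'float32', 'float64']:
        yield check_roundtrip, dtype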
Matthew From faltet at carabos.com Wed Jan 9 04:41:35 2008 From: faltet at carabos.com (Francesc Altet) Date: Wed, 9 Jan 2008 10:41:35 +0100 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: <1e2af89e0801090129n519e91fcid696080248ad216c@mail.gmail.com> References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> <47848CD5.6060902@ar.media.kyoto-u.ac.jp> <1e2af89e0801090129n519e91fcid696080248ad216c@mail.gmail.com> Message-ID: <200801091041.36010.faltet@carabos.com> On Wednesday 09 January 2008, Matthew Brett wrote: > Hi, > > > > You'll see hints on the nose stuff in the scipy.testing module, > > > and the scipy.sandbox.exmplpackage module. > > > > I quickly tried it, and it worked (on Ubuntu with gcc). What are > > the main advantages of using nose ? Does it make the actual testing > > framework simpler, or are there any other advantages I am not aware > > of ? > > Others could probably step in better than I can - but the main > advantages are: > > The testing framework is standard and well-maintained. > New tests have minimal overhead - as simple as writing a function > with 'test' in the name > It will do doctests, if asked. > It's got a flexible test selection system from the command line > It does parametric tests rather nicely - see > http://projects.scipy.org/scipy/scipy/browser/branches/testing_cleanu >p/scipy/sandbox/exmplpackage/tests/test_foo.py > > At the moment, the main difference is only the clearing out of the > numpy testing framework specific stuff. New tests should be simpler > to write with less overhead. Maybe this has already been discussed, but what about its LGPL license? I understand that packages under this license should not be part of SciPy (correct me if I'm wrong). Cheers, -- >0,0< Francesc Altet http://www.carabos.com/ V V Cárabos Coop. V. Enjoy Data "-" From matthieu.brucher at gmail.com Wed Jan 9 04:46:01 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Jan 2008 10:46:01 +0100 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: <200801091041.36010.faltet@carabos.com> References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> <47848CD5.6060902@ar.media.kyoto-u.ac.jp> <1e2af89e0801090129n519e91fcid696080248ad216c@mail.gmail.com> <200801091041.36010.faltet@carabos.com> Message-ID: > > Maybe this has already been discussed, but what about its LGPL license? > I understand that packages under this license should not be part of > SciPy (correct me if I'm wrong). > Nose is not included in Scipy, so there are problems. A lot of such tools are GPL or LGPL (the gcc compiler for instance), but we can use them for our purpose. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Wed Jan 9 04:46:51 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Jan 2008 10:46:51 +0100 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> <47848CD5.6060902@ar.media.kyoto-u.ac.jp> <1e2af89e0801090129n519e91fcid696080248ad216c@mail.gmail.com> <200801091041.36010.faltet@carabos.com> Message-ID: > Nose is not included in Scipy, so there are problems. 
A lot of such tools > are GPL or LGPL (the gcc compiler for instance), but we can use them for our > purpose. > There are _no_ problems (I can't even write correctly...) Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher From faltet at carabos.com Wed Jan 9 04:58:27 2008 From: faltet at carabos.com (Francesc Altet) Date: Wed, 9 Jan 2008 10:58:27 +0100 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> <200801091041.36010.faltet@carabos.com> Message-ID: <200801091058.27510.faltet@carabos.com> A Wednesday 09 January 2008, Matthieu Brucher escrigué: > > Maybe this has been already discussed, but what about its LGPL > > license? I understand that packages under this license should not > > be part of SciPy (correct me if I'm wrong). > > Nose is not included in Scipy, so there are problems. A lot of such > tools are GPL or LGPL (the gcc compiler for instance), but we can use > them for our purpose. So, I understand that it should be installed by the user, so this would become an added dependency, I guess :-/ Cheers, -- >0,0< Francesc Altet http://www.carabos.com/ V V Cárabos Coop. V. Enjoy Data "-" From charlesr.harris at gmail.com Wed Jan 9 05:50:16 2008 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 9 Jan 2008 03:50:16 -0700 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: <200801091058.27510.faltet@carabos.com> References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> <200801091041.36010.faltet@carabos.com> <200801091058.27510.faltet@carabos.com> Message-ID: On Jan 9, 2008 2:58 AM, Francesc Altet wrote: > A Wednesday 09 January 2008, Matthieu Brucher escrigué: > > > Maybe this has been already discussed, but what about its LGPL > > > license? I understand that packages under this license should not > > > be part of SciPy (correct me if I'm wrong). > > > > Nose is not included in Scipy, so there are problems. A lot of such > > tools are GPL or LGPL (the gcc compiler for instance), but we can use > > them for our purpose. > > So, I understand that it should be installed by the user, so this would > become an added dependency, I guess :-/ > I wonder if we need to exclude LGPL packages from scipy? If you link to them or use them you aren't under any obligation, nor would anyone using scipy put themselves at risk of having to expose their own code. They could even remove it after running the tests. The main problem I see with the LGPL is that the meaning is pretty obscure. It's hard to figure out just what it says, although I think the intent is well understood. Chuck From fperez.net at gmail.com Thu Jan 10 02:21:15 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 10 Jan 2008 00:21:15 -0700 Subject: [SciPy-dev] Scipy server suffering again... Message-ID: Howdy, I keep on getting, frequently, the by now familiar """Internal Server Error The server encountered an internal error or misconfiguration and was unable to complete your request. """ so doing anything around the site, using trac, moin, etc, is becoming rather difficult.
I just noticed a load average on the box around 16, though no process is consuming any significant amount of CPU. If there's anything on our side (the individual project admins) we can do to help, please let us know. Cheers, f From nwagner at iam.uni-stuttgart.de Thu Jan 10 02:31:05 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 10 Jan 2008 08:31:05 +0100 Subject: [SciPy-dev] Trouble with different python installations Message-ID: Hi all, I have two different python versions on my linux box - python2.3 (global installation) and python2.5 (local installation without root privileges). I have removed the build directory and tried to install numpy for python2.5:

    ~/local/bin/python setup.py install --prefix=$HOME/local

If I try to import numpy I get:

    ~/local/bin/python
    Python 2.5.1 (r251:54863, Dec 21 2007, 09:21:07)
    [GCC 3.4.6 20060404 (Red Hat 3.4.6-3)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import numpy
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/data/home/nwagner/local/lib64/python2.3/site-packages/numpy/__init__.py", line 44, in <module>
        import core
      File "/data/home/nwagner/local/lib64/python2.3/site-packages/numpy/core/__init__.py", line 5, in <module>
        import multiarray
    ImportError: /data/home/nwagner/local/lib64/python2.3/site-packages/numpy/core/multiarray.so: undefined symbol: Py_InitModule4
    >>>

How can I resolve the conflict between 2.3 and 2.5? Nils From ondrej at certik.cz Thu Jan 10 10:11:57 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Jan 2008 16:11:57 +0100 Subject: [SciPy-dev] adding a nice progressbar to scipy Message-ID: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> Hi, in almost all of my computational projects, I use this nice progressbar: http://pypi.python.org/pypi/progressbar/ it's very easy to use:

    widgets = ["calculating", " ", progressbar.Percentage(), ' ',
               progressbar.Bar(), ' ', progressbar.ETA()]
    pbar = progressbar.ProgressBar(widgets=widgets, maxval=10000).start()
    for i in range(10000):
        pbar.update(i)
        # do some heavy calculation in each step
    pbar.finish()

and it shows the progress and the estimated time of arrival (ETA); it's completely configurable with many options, etc. I just asked the author and he made the code dual-licensed under BSD and LGPL (the original license). Especially the ETA is very handy, because I get a clue how long I need to wait for my code to finish, and I get it for free (see the above example). Would this be a good addition to scipy? This is an example of small but handy code that I am too lazy to install separately, but if it were in scipy, it would be much easier to use. I'd then send a patch and write a helper function, so that it can be used like this for users who just want the default behavior:

    pbar = progressbar(maxval=10000)
    for i in range(10000):
        pbar.update(i)
        # do some heavy calculation in each step
    pbar.finish()

Ondrej From oliphant at enthought.com Thu Jan 10 11:24:30 2008 From: oliphant at enthought.com (Travis E.
Oliphant) Date: Thu, 10 Jan 2008 10:24:30 -0600 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> Message-ID: <478646BE.80608@enthought.com> Ondrej Certik wrote: > Hi, > > in almost all of my computational projects, I use this nice progressbar: > > http://pypi.python.org/pypi/progressbar/ > > it's very easy to use: > > widgets=["calculating", " ", progressbar.Percentage(), ' ', > progressbar.Bar(), ' ', progressbar.ETA()] > pbar=progressbar.ProgressBar(widgets=widgets,maxval=10000).start() > for i in range(10000): > pbar.update(i) > #do some heavy calculation in each step > pbar.finish() > > > and it shows the progress, estimated time of arrival (ETA), it's > completely configurable with > many options, etc. I just asked the author > and he made the code dual licensed under BSD and LGPL (the original license). > > Especially the ETA is very handy, because I get a clue how long I need > to wait for my code > to finish and I get it for free (see the above example). > I tend to think that something like this might be more useful for IPython, or some other interactive environment. But, I'm open to opinions. -Travis O. From ondrej at certik.cz Thu Jan 10 12:06:50 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Jan 2008 18:06:50 +0100 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: <478646BE.80608@enthought.com> References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <478646BE.80608@enthought.com> Message-ID: <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> On Jan 10, 2008 5:24 PM, Travis E. Oliphant wrote: > Ondrej Certik wrote: > > Hi, > > > > in almost all of my computational projects, I use this nice progressbar: > > > > http://pypi.python.org/pypi/progressbar/ > > > > it's very easy to use: > > > > widgets=["calculating", " ", progressbar.Percentage(), ' ', > > progressbar.Bar(), ' ', progressbar.ETA()] > > pbar=progressbar.ProgressBar(widgets=widgets,maxval=10000).start() > > for i in range(10000): > > pbar.update(i) > > #do some heavy calculation in each step > > pbar.finish() > > > > > > and it shows the progress, estimated time of arrival (ETA), it's > > completely configurable with > > many options, etc. I just asked the author > > and he made the code dual licensed under BSD and LGPL (the original license). > > > > Especially the ETA is very handy, because I get a clue how long I need > > to wait for my code > > to finish and I get it for free (see the above example). > > > I tend to think that something like this might be more useful for > IPython, or some other interactive environment. But, I'm open to > opinions. Yes, or ipython. All I want is that it is part of some standard package, so that it's installed by default. Ondrej From nwagner at iam.uni-stuttgart.de Thu Jan 10 13:15:19 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 10 Jan 2008 19:15:19 +0100 Subject: [SciPy-dev] Trouble with different python installations In-Reply-To: References: Message-ID: On Thu, 10 Jan 2008 08:31:05 +0100 "Nils Wagner" wrote: > Hi all, > > I have two different python versions on my linux box- > python2.3 (global installation) and python2.5 (local > installation without root privileges). 
> > I have removed the build directory and tried to install > numpy > for python2.5 > ~/local/bin/python setup.py install --prefix=$HOME/local > > If I try to import numpy I get > > ~/local/bin/python > Python 2.5.1 (r251:54863, Dec 21 2007, 09:21:07) > [GCC 3.4.6 20060404 (Red Hat 3.4.6-3)] on linux2 > Type "help", "copyright", "credits" or "license" for >more > information. >>>> import numpy > Traceback (most recent call last): > File "", line 1, in > File > "/data/home/nwagner/local/lib64/python2.3/site-packages/numpy/__init__.py", > line 44, in > import core > File > "/data/home/nwagner/local/lib64/python2.3/site-packages/numpy/core/__init__.py", > line 5, in > import multiarray > ImportError: > /data/home/nwagner/local/lib64/python2.3/site-packages/numpy/core/multiarray.so: > undefined symbol: Py_InitModule4 >>>> > > How can I resolve the conflict between 2.3 and 2.5 ? > > Nils > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scoipy-dev I found some information at http://mail.python.org/pipermail/python-list/2007-March/432606.html but I still do not know how to resolve the problem. Since scipy requires python versions > 2.3 I would be happy if someone can help me with different python installations on one machine. Thanks in advance. Nils From fperez.net at gmail.com Thu Jan 10 14:08:25 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 10 Jan 2008 12:08:25 -0700 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <478646BE.80608@enthought.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> Message-ID: On Jan 10, 2008 10:06 AM, Ondrej Certik wrote: > > > Especially the ETA is very handy, because I get a clue how long I need > > > to wait for my code > > > to finish and I get it for free (see the above example). > > > > > I tend to think that something like this might be more useful for > > IPython, or some other interactive environment. But, I'm open to > > opinions. > > Yes, or ipython. All I want is that it is part of some standard > package, so that it's installed by default. Well, here we're back to the old discussion between 'integrated, all-encompassing packages' and modular libraries. The progressbar package is already in pypi, so anyone can install it with easy_install progressbar So the question is, what do we win by bundling it. Mostly convenience for end users. But then that means we have to put it somewhere. Where? Do we simply swallow it into ipython? I'm not sure the original developer would want that. Do we put it into ipython.externals (we have that for external packages we absolutely need). That's certainly a possibility, but then slowly ipython will begin to carry other third-party projects we need to sync with, etc. And at some point, that becomes the problem Sage already faces: you duplicate lots of stuff, and distributors (like Debian) start to complain that it doesn't make sense to have this type of double-packaging. In the scipy world, we've taken an approach in many ways opposite to sage: well-defined and separated tools that do one thing well each (ipython, numpy, scipy, matplotlib, etc). People are then free to assemble them as they best see fit. This has both pluses and minuses. 
Sage has gone the other way: include the entire universe inside, down to python itself, a Fortran compiler, low-level system libraries like termcap, etc. Sage is almost an operating system (in fact for windows it is, since Sage for windows ships an entire Linux distribution as a VMWare image). There's an undeniable advantage of convenience for end users here, but it also does cause problems on other fronts. Asking for progressbar *inside* ipython is asking for that kind of 'bundle everything that could be useful' approach. How do you see that balance? What's the perspective of others? I have a lot more thoughts on this question, and in fact it's something I hope to discuss in detail at the Sage/Scipy meeting, but I'd like to hear other opinions. Cheers, f From stefan at sun.ac.za Thu Jan 10 14:08:32 2008 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 10 Jan 2008 21:08:32 +0200 Subject: [SciPy-dev] Trouble with different python installations In-Reply-To: References: Message-ID: <20080110190832.GJ12476@mentat.za.net> Hi Nils, Just use a different PYTHONPATH depending on which version you are running (I assume you manually set such a path?). If 2.3 extensions are found in the PYTHONPATH whilst running 2.5, you're bound to have problems. Regards St?fan On Thu, Jan 10, 2008 at 07:15:19PM +0100, Nils Wagner wrote: > On Thu, 10 Jan 2008 08:31:05 +0100 > "Nils Wagner" wrote: > > Hi all, > > > > I have two different python versions on my linux box- > > python2.3 (global installation) and python2.5 (local > > installation without root privileges). > > > > I have removed the build directory and tried to install > > numpy > > for python2.5 > > ~/local/bin/python setup.py install --prefix=$HOME/local > > > > If I try to import numpy I get > > > > ~/local/bin/python > > Python 2.5.1 (r251:54863, Dec 21 2007, 09:21:07) > > [GCC 3.4.6 20060404 (Red Hat 3.4.6-3)] on linux2 > > Type "help", "copyright", "credits" or "license" for > >more > > information. > >>>> import numpy > > Traceback (most recent call last): > > File "", line 1, in > > File > > "/data/home/nwagner/local/lib64/python2.3/site-packages/numpy/__init__.py", > > line 44, in > > import core > > File > > "/data/home/nwagner/local/lib64/python2.3/site-packages/numpy/core/__init__.py", > > line 5, in > > import multiarray > > ImportError: > > /data/home/nwagner/local/lib64/python2.3/site-packages/numpy/core/multiarray.so: > > undefined symbol: Py_InitModule4 > >>>> > > > > How can I resolve the conflict between 2.3 and 2.5 ? > > > > Nils From ondrej at certik.cz Thu Jan 10 14:32:52 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Jan 2008 20:32:52 +0100 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <478646BE.80608@enthought.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> Message-ID: <85b5c3130801101132o31150449s3b4e3fc742bf655c@mail.gmail.com> On Jan 10, 2008 8:08 PM, Fernando Perez wrote: > On Jan 10, 2008 10:06 AM, Ondrej Certik wrote: > > > > > Especially the ETA is very handy, because I get a clue how long I need > > > > to wait for my code > > > > to finish and I get it for free (see the above example). > > > > > > > I tend to think that something like this might be more useful for > > > IPython, or some other interactive environment. But, I'm open to > > > opinions. > > > > Yes, or ipython. All I want is that it is part of some standard > > package, so that it's installed by default. 
> > Well, here we're back to the old discussion between 'integrated, > all-encompassing packages' and modular libraries. > > The progressbar package is already in pypi, so anyone can install it with > > easy_install progressbar > > So the question is, what do we win by bundling it. Mostly convenience > for end users. But then that means we have to put it somewhere. > Where? Do we simply swallow it into ipython? I'm not sure the > original developer would want that. Do we put it into > ipython.externals (we have that for external packages we absolutely > need). That's certainly a possibility, but then slowly ipython will > begin to carry other third-party projects we need to sync with, etc. > And at some point, that becomes the problem Sage already faces: you > duplicate lots of stuff, and distributors (like Debian) start to > complain that it doesn't make sense to have this type of > double-packaging. > > In the scipy world, we've taken an approach in many ways opposite to > sage: well-defined and separated tools that do one thing well each > (ipython, numpy, scipy, matplotlib, etc). People are then free to > assemble them as they best see fit. This has both pluses and minuses. > > Sage has gone the other way: include the entire universe inside, down > to python itself, a Fortran compiler, low-level system libraries like > termcap, etc. Sage is almost an operating system (in fact for windows > it is, since Sage for windows ships an entire Linux distribution as a > VMWare image). There's an undeniable advantage of convenience for end > users here, but it also does cause problems on other fronts. > > Asking for progressbar *inside* ipython is asking for that kind of > 'bundle everything that could be useful' approach. > > How do you see that balance? What's the perspective of others? I > have a lot more thoughts on this question, and in fact it's something > I hope to discuss in detail at the Sage/Scipy meeting, but I'd like to > hear other opinions. I think such a question cannot be "objectively decided". It's just a matter of personal preferences and a consensus among scipy/numpy/etc developers and users. I agree that this double or triple packaging sucks. But scipy also includes stuff like superlu and umfpack, that you can simply ask users to install. I think the scipy package should include tools, that people use frequently when dealing with scientific calculations in python. So it boils down to a question - do you find the progressbar as useful as I do and would you use it in your projects? Would you use it in your projects when (mostly) working with scipy or ipython? If the answer to both is yes, then I think it could be included (as it is small). If it's only me, then I think it shouldn't be included. Ondrej From oliphant at enthought.com Thu Jan 10 14:51:56 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Thu, 10 Jan 2008 13:51:56 -0600 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <478646BE.80608@enthought.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> Message-ID: <4786775C.9030603@enthought.com> Fernando Perez wrote: > Well, here we're back to the old discussion between 'integrated, > all-encompassing packages' and modular libraries. > > The progressbar package is already in pypi, so anyone can install it with > > easy_install progressbar > > So the question is, what do we win by bundling it. Mostly convenience > for end users. 
But then that means we have to put it somewhere. > Where? Do we simply swallow it into ipython? I'm not sure the > original developer would want that. Do we put it into > ipython.externals (we have that for external packages we absolutely > need). That's certainly a possibility, but then slowly ipython will > begin to carry other third-party projects we need to sync with, etc. > And at some point, that becomes the problem Sage already faces: you > duplicate lots of stuff, and distributors (like Debian) start to > complain that it doesn't make sense to have this type of > double-packaging. > Well, at Enthought we are working on part of this problem in that we are going to be releasing an Enthought Python Distribution which will have all of it packaged and delivered in one place. It would be a simple thing to add "progressbar" as something that is pre-packaged in that case. Right now we are testing the Windows MSI and finalizing it (actually getting an MSI that contains all these packages was way more complicated than you might first think it should be...) The plans are for a Linux version as well as Mac OS X version as well (although our initial focus is corporate accounts). There will be obvious questions about how the Linux version should integrate with other packaging systems on that platform that we have not entirely resolved. One potential source of difficulty for some is that we are going to charge commercial users for long term use of the binary distribution. Academic and hobbyist users will be able to use it for free (and that is very important for us and so will not be changing). Commercial users can also try it out for free as well. Ideally, the conversation will change as this becomes more widely available and utilized. There will still be issues to resolve in terms of how much each package contains but there will not be quite the pressure for the idea of a monolithic super-package that I once had for SciPy. By the way, if anybody uses Windows and would like to be a beta-tester for the Windows binary MSI (or other platforms as they become available over the next year), please let me know. -Travis O. > In the scipy world, we've taken an approach in many ways opposite to > sage: well-defined and separated tools that do one thing well each > (ipython, numpy, scipy, matplotlib, etc). People are then free to > assemble them as they best see fit. This has both pluses and minuses. > > Sage has gone the other way: include the entire universe inside, down > to python itself, a Fortran compiler, low-level system libraries like > termcap, etc. Sage is almost an operating system (in fact for windows > it is, since Sage for windows ships an entire Linux distribution as a > VMWare image). There's an undeniable advantage of convenience for end > users here, but it also does cause problems on other fronts. > > Asking for progressbar *inside* ipython is asking for that kind of > 'bundle everything that could be useful' approach. > > How do you see that balance? What's the perspective of others? I > have a lot more thoughts on this question, and in fact it's something > I hope to discuss in detail at the Sage/Scipy meeting, but I'd like to > hear other opinions. 
> > Cheers, > > f > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From ravi.rajagopal at amd.com Thu Jan 10 15:17:02 2008 From: ravi.rajagopal at amd.com (Ravikiran Rajagopal) Date: Thu, 10 Jan 2008 15:17:02 -0500 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> Message-ID: <200801101517.03131.ravi.rajagopal@amd.com> On Thursday 10 January 2008 02:08:25 pm Fernando Perez wrote: > Asking for progressbar *inside* ipython is asking for that kind of > 'bundle everything that could be useful' approach. > > How do you see that balance? What's the perspective of others? I > have a lot more thoughts on this question, and in fact it's something > I hope to discuss in detail at the Sage/Scipy meeting, but I'd like to > hear other opinions. Bundling is a headache for systems administrators in the *nix world, because of the packaging issues you mentioned. It is a hassle for the bundlers to track all bundled packages. As a user, I have had the problem with enthon that I needed some newer versions of some bundled packages because they made my life much easier. So I bit the bullet, uninstalled enthon and installed packages separately on Windows, and I am much happier for it. For packages like scipy that are targeted at people who write code for a living, there is usually one small package (say finite elements or time series analysis) in the user's area of expertise that one wants to track via svn/p4/git, which becomes much harder with bundles. From a user's perspective (and from a sysadmin's perspective, even if it is only 2 linux machines that I administer - my desktop and my laptop), bundling is not optimal. Regards, Ravi From jre at enthought.com Thu Jan 10 15:35:14 2008 From: jre at enthought.com (J. Ryan Earl) Date: Thu, 10 Jan 2008 14:35:14 -0600 Subject: [SciPy-dev] Scipy server suffering again... In-Reply-To: References: Message-ID: <47868182.2020004@enthought.com> Hello, I believe I have made some good short-term improvements regarding the load on the SciPy.org server. It should be much peppier now, and I will continue to monitor the effect of the work-around. If you don't care about the details then stop reading here. The following technical details explain the problem, the short-term workaround, and why the long-term fix is best, for those who care. The main problem is that the system was starved for memory and swapping excessively. The high load and low CPU usage are a result of processes waiting on free memory. The "high-memory usage"(1) is a basic result of using garbage-collected daemon processes that don't get restarted regularly, in our case FastCGI Python daemons. In particular, one virtual host uses "Zope" with a large amount of content and was using 45% of the memory alone. I restarted this FastCGI daemon and it went down to 10% memory usage, though as I write this it's just under 20% usage. Over time it'll go back up to around 40%, but I expect it to stabilize around that and go no higher as it reaches a stable heap size. Now what's happening is largely the result of heap memory fragmentation(2).
Garbage collected languages tend to fragment their heap more than non-garbage collected languages, but with both it is expected that there is a critical heapsize threshold that once reached will satisfy all out-going and in-going heap requests without the heap having to grow further. This is different than a memory leak where memory consumption will grow indefinitely. Where this threshold is depends on a variety of factors such as typical workload, but it can be empirically measured. Thus the long-term solution is to migrate to hardware that has enough memory to fit stable-sized heaps for all the Python daemons into but this will take a lot of time, effort, and testing so it's weeks out. The short-term solution is to periodically restart the Python daemon processes before they reach max heap fragmentation. However, restarting the daemons severs existing connections users may have and will likely erase any session state that isn't stored in their local web-browser so it is thus not a desirable long-term solution. Right now I'm measuring how fast memory gets fragmented so I can determine the maximum interval to use in a script to restart these processes automatically. ie I may only need to restart them once every few days instead of once per day to minimize severed connections. (1) High is a relative term here. On the scale of modern servers it's not that high, but it's high for this particular hardware. (2) http://en.wikipedia.org/wiki/Heap_fragmentation#External_fragmentation (basic introduction to heap fragmentation) Continue to let me know if you have problems, conversely, let me know if you're having less problems than you've had recently. Both are good to know. Cheers, J. Ryan Earl IT Administrator Enthought, Inc. 512.536.1057 Fernando Perez wrote: > Howdy, > > I keep on getting, frequently, the by now familiar > > """Internal Server Error > > The server encountered an internal error or misconfiguration and was > unable to complete your request. > """ > > so doing anything around the site, using trac, moin, etc, is becoming > rather difficult. I just noticed a load average on the box around 16, > though no process is consuming any significant amount of CPU. > > If there's anything on our side (the individual project admins) we can > do to help, please let us know. > > Cheers, > > f > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From fperez.net at gmail.com Fri Jan 11 00:42:58 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 10 Jan 2008 22:42:58 -0700 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> Message-ID: On Jan 8, 2008 6:15 PM, Matthew Brett wrote: > Hi, > > I've been working for a bit on replacing numpy testing with the nose > testing framework: > > http://somethingaboutorange.com/mrl/projects/nose/ > > Fernando and the guys started this at the recent Berkeley sprint, and > I've since been slogging through. For which you are to be much commended. I did the easy stuff, but the enormous amount of rather mind-numbing work to be done wasn't looking too tasty. I'm really thankful you took it on, and I'm sure so are the rest of the users. 
I really think that a cleaner, more robust, easier to use testing system will make it easier for us to enforce our 'no untested code' policies :) Anyway, here's what I'm getting: 1. With current trunk, for reference: Ran 1955 tests in 120.958s OK 2. With the testing-cleanup branch: Ran 2530 tests in 293.217s FAILED (errors=6) This is excellent: immediately, it means that there were almost 600 tests not being picked up by the old framework (somehow I doubt that in these two weeks anyone contributed that many new tests into trunk :) I'm posting the few errors below for reference: ====================================================================== ERROR: Failure: (cannot import name _bspline) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/fperez/usr/local/lib/python2.5/site-packages/nose-0.10.0-py2.5.egg/nose/loader.py", line 361, in loadTestsFromName addr.filename, addr.module) File "/home/fperez/usr/local/lib/python2.5/site-packages/nose-0.10.0-py2.5.egg/nose/importer.py", line 39, in importFromPath return self.importFromDir(dir_path, fqname) File "/home/fperez/usr/local/lib/python2.5/site-packages/nose-0.10.0-py2.5.egg/nose/importer.py", line 84, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/tests/test_bspline.py", line 9, in import scipy.stats.models.bspline as B File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/bspline.py", line 23, in from scipy.stats.models import _bspline ImportError: cannot import name _bspline ====================================================================== ERROR: test_huber (test_scale.TestScale) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/tests/test_scale.py", line 35, in test_huber m = scale.huber(X) File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 82, in __call__ for donothing in self: File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 102, in next scale = N.sum(subset * (a - mu)**2, axis=self.axis) / (self.n * Huber.gamma - N.sum(1. - subset, axis=self.axis) * Huber.c**2) File "/home/fperez/usr/opt/lib/python2.5/site-packages/numpy/core/fromnumeric.py", line 866, in sum return sum(axis, dtype, out) TypeError: only length-1 arrays can be converted to Python scalars ====================================================================== ERROR: test_huberaxes (test_scale.TestScale) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/tests/test_scale.py", line 40, in test_huberaxes m = scale.huber(X, axis=0) File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 82, in __call__ for donothing in self: File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 102, in next scale = N.sum(subset * (a - mu)**2, axis=self.axis) / (self.n * Huber.gamma - N.sum(1. 
- subset, axis=self.axis) * Huber.c**2) File "/home/fperez/usr/opt/lib/python2.5/site-packages/numpy/core/fromnumeric.py", line 866, in sum return sum(axis, dtype, out) TypeError: only length-1 arrays can be converted to Python scalars ====================================================================== ERROR: Failure: (No module named convolve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/fperez/usr/local/lib/python2.5/site-packages/nose-0.10.0-py2.5.egg/nose/loader.py", line 361, in loadTestsFromName addr.filename, addr.module) File "/home/fperez/usr/local/lib/python2.5/site-packages/nose-0.10.0-py2.5.egg/nose/importer.py", line 39, in importFromPath return self.importFromDir(dir_path, fqname) File "/home/fperez/usr/local/lib/python2.5/site-packages/nose-0.10.0-py2.5.egg/nose/importer.py", line 84, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stsci/image/__init__.py", line 2, in from _image import * File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/stsci/image/_image.py", line 2, in import convolve ImportError: No module named convolve ====================================================================== ERROR: no_test_no_check_return (test_wx_spec.TestWxConverter) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/tests/test_wx_spec.py", line 104, in no_test_no_check_return mod.compile() File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/ext_tools.py", line 365, in compile verbose = verbose, **kw) File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/build_tools.py", line 271, in build_extension setup(name = module_name, ext_modules = [ext],verbose=verb) File "/home/fperez/usr/opt/lib/python2.5/site-packages/numpy/distutils/core.py", line 178, in setup return old_setup(**new_attr) File "distutils/core.py", line 168, in setup CompileError: error: Command "g++ -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -fPIC -I/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave -I/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/scxx -I/home/fperez/usr/opt/lib/python2.5/site-packages/numpy/core/include -I/usr/include/python2.5 -c /home/fperez/fp/photo/digicam/wx_return.cpp -o /tmp/fperez/python25_intermediate/compiler_2b6a6480521b55c80243706014f9928a/home/fperez/fp/photo/digicam/wx_return.o" failed with exit status 1 ====================================================================== ERROR: test_var_in (test_wx_spec.TestWxConverter) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/tests/test_wx_spec.py", line 63, in test_var_in mod.compile() File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/ext_tools.py", line 365, in compile verbose = verbose, **kw) File "/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/build_tools.py", line 271, in build_extension setup(name = module_name, ext_modules = [ext],verbose=verb) File "/home/fperez/usr/opt/lib/python2.5/site-packages/numpy/distutils/core.py", line 178, in setup return old_setup(**new_attr) File "distutils/core.py", line 168, in setup CompileError: error: Command "g++ -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -fPIC 
-I/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave -I/home/fperez/usr/opt/lib/python2.5/site-packages/scipy/weave/scxx -I/usr/include/gtk-1.2 -I/usr/include/glib-1.2 -I/usr/lib/glib/include -I/home/fperez/usr/opt/lib/python2.5/site-packages/numpy/core/include -I/usr/include/python2.5 -c /home/fperez/fp/photo/digicam/wx_var_in.cpp -o /tmp/fperez/python25_intermediate/compiler_2b6a6480521b55c80243706014f9928a/home/fperez/fp/photo/digicam/wx_var_in.o -I/usr/lib/wx/include/gtk2-unicode-release-2.6 -I/usr/include/wx-2.6 -DGTK_NO_CHECK_CASTS -D__WXGTK__ -D_FILE_OFFSET_BITS=64 -D_LARGE_FILES -DNO_GCC_PRAGMA -I/usr/lib/wx/include/gtk2-unicode-release-2.6 -I/usr/include/wx-2.6 -DGTK_NO_CHECK_CASTS -D__WXGTK__ -D_FILE_OFFSET_BITS=64 -D_LARGE_FILES -DNO_GCC_PRAGMA" failed with exit status 1 The two last ones are in weave, and I recall Min mentioning they were simply not being picked up before, and that these were the last ones he didn't have a chance to fully clean up. The fact that they are WX-specific means I worry even less, since WX support isn't quite 'core' weave functionality (not that they shouldn't get fixed, but they don't seem critical). The other four probably warrant a look, but I'm headed to bed now, and before I want to also finish testing the ETS release... Overall, I think this is excellent work, and again, Thanks! In order to minimize the headaches of tracking branches, I'd vote for Matthew's work to be merged into the trunk ASAP. Is there any good reason to wait longer, other than further testing (which is more likely to happen if it goes into trunk)? Good job! f From fperez.net at gmail.com Fri Jan 11 00:48:26 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 10 Jan 2008 22:48:26 -0700 Subject: [SciPy-dev] Scipy server suffering again... In-Reply-To: <47868182.2020004@enthought.com> References: <47868182.2020004@enthought.com> Message-ID: Hey Ryan, On Jan 10, 2008 1:35 PM, J. Ryan Earl wrote: > Continue to let me know if you have problems, conversely, let me know if > you're having less problems than you've had recently. Both are good to > know. So far it does look more responsive, but that's just a single data point. In any case, thanks for paying attention to this, it was becoming really a drag. I'll do my best to provide further feedback and mention any oddities if I notice them. Regards, f From oliphant at enthought.com Fri Jan 11 02:08:15 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Fri, 11 Jan 2008 01:08:15 -0600 Subject: [SciPy-dev] Nose testing branch - call for - er - testing In-Reply-To: References: <1e2af89e0801081715m40bbd821n6ffd58d2772dd179@mail.gmail.com> Message-ID: <478715DF.7060407@enthought.com> > Overall, I think this is excellent work, and again, Thanks! In order > to minimize the headaches of tracking branches, I'd vote for Matthew's > work to be merged into the trunk ASAP. Is there any good reason to > wait longer, other than further testing (which is more likely to > happen if it goes into trunk)? > I see no reason to wait. Let's merge it. -Travis O. 
From nwagner at iam.uni-stuttgart.de Fri Jan 11 03:33:48 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 11 Jan 2008 09:33:48 +0100 Subject: [SciPy-dev] Trouble with different python installations In-Reply-To: <20080110190832.GJ12476@mentat.za.net> References: <20080110190832.GJ12476@mentat.za.net> Message-ID: On Thu, 10 Jan 2008 21:08:32 +0200 Stefan van der Walt wrote: > Hi Nils, > > Just use a different PYTHONPATH depending on which >version you are > running (I assume you manually set such a path?). If >2.3 extensions > are found in the PYTHONPATH whilst running 2.5, you're >bound to have > problems. > > Regards > St?fan > Hi Stefan, Thank you very much for your help ! I have changed the PYTHONPATH and everything works fine. Cheers, Nils From jre at enthought.com Fri Jan 11 03:56:32 2008 From: jre at enthought.com (J. Ryan Earl) Date: Fri, 11 Jan 2008 02:56:32 -0600 Subject: [SciPy-dev] Scipy server suffering again... In-Reply-To: References: <47868182.2020004@enthought.com> Message-ID: <47872F40.6020908@enthought.com> Fernando Perez wrote: > So far it does look more responsive, but that's just a single data > point. In any case, thanks for paying attention to this, it was > becoming really a drag. So now the CPU is pegged through the roof. That's something I can't do anything about. =( -ryan From judah at enthought.com Fri Jan 11 08:39:40 2008 From: judah at enthought.com (Judah De Paula) Date: Fri, 11 Jan 2008 07:39:40 -0600 Subject: [SciPy-dev] Scipy server suffering again... In-Reply-To: <47872F40.6020908@enthought.com> References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> Message-ID: <4787719C.7030504@enthought.com> Do you mean Scipy is just too popular of a site for one little server? Judah J. Ryan Earl wrote: > Fernando Perez wrote: > >> So far it does look more responsive, but that's just a single data >> point. In any case, thanks for paying attention to this, it was >> becoming really a drag. >> > > So now the CPU is pegged through the roof. That's something I can't do > anything about. =( > > -ryan > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From jre at enthought.com Fri Jan 11 18:12:43 2008 From: jre at enthought.com (J. Ryan Earl) Date: Fri, 11 Jan 2008 17:12:43 -0600 Subject: [SciPy-dev] Scipy server suffering again... In-Reply-To: <47872F40.6020908@enthought.com> References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> Message-ID: <4787F7EB.4070402@enthought.com> Update: Apache and thus it's Trac FastCGI scripts are set to automatically restart every 12 hours @ 10AM & 10PM CST. The heavy Zope FastCGI restarts @ 3AM & 3PM CST. So basically, make sure to save any work or changes you're making on the Wiki around those times to be safe though it may not matter. This should keep memory usage under control. Cheers, -ryan J. Ryan Earl wrote: > Fernando Perez wrote: > >> So far it does look more responsive, but that's just a single data >> point. In any case, thanks for paying attention to this, it was >> becoming really a drag. >> > > So now the CPU is pegged through the roof. That's something I can't do > anything about. 
=( > > -ryan > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From ondrej at certik.cz Fri Jan 11 20:52:11 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Sat, 12 Jan 2008 02:52:11 +0100 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: <200801101517.03131.ravi.rajagopal@amd.com> References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> <200801101517.03131.ravi.rajagopal@amd.com> Message-ID: <85b5c3130801111752q3f910aa3j220677b9615991db@mail.gmail.com> So what is the conclusion? Maybe I should stress, that the progress bar is just one file with 300 lines of code. So it's not really a package, just a feature. I was thinking of scipy.io, but ipython is maybe better. If it can contain interaction with gnuplot, why not a simple progressbar? :) Ondrej From fperez.net at gmail.com Fri Jan 11 21:26:10 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 11 Jan 2008 19:26:10 -0700 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: <85b5c3130801111752q3f910aa3j220677b9615991db@mail.gmail.com> References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> <200801101517.03131.ravi.rajagopal@amd.com> <85b5c3130801111752q3f910aa3j220677b9615991db@mail.gmail.com> Message-ID: On Jan 11, 2008 6:52 PM, Ondrej Certik wrote: > So what is the conclusion? > > Maybe I should stress, that the progress bar is just one file with 300 > lines of code. > > So it's not really a package, just a feature. I was thinking of > scipy.io, but ipython > is maybe better. If it can contain interaction with gnuplot, why not a simple > progressbar? :) A few questions: - Will it run *correctly* under Windows? I don't use it, but I don't want the text-only core of ipython to start sprouting functionality that breaks under windows (or any OS for that matter). Same thing for Linux/OSX. I know it says 'os independent', but I want independent confirmation of that statement. - It makes calls to ioctl() without checks that the import at the top actually worked. That will break somewhere, I guarantee it. - Is signal.SIGWINCH available on all OSes? - termios is also used without checks. That will likely throw an exception under windows. - Etc. This code doesn't look like it's ever been properly tested for cross-platform compatibility. Welcome to the real world. If all of the above were to be addressed, it's small enough that I'm willing to throw it into ipython.externals for convenience. 300 lines is a drop in the bucket at this point, and for interactive work it can come in very handy, I agree. No sysadmin will notice it's hiding in there :) In this case, a few things: - You said that the author would BSD license it, but the current code http://pypi.python.org/pypi/progressbar still shows LGPL-only. I won't put it in until we get from him a BSD-licensed file. - Please send us the patch you suggested writing for the more comon case. - You commit to letting us know when an update is warranted and sending us the corresponding patch, so that the ipython developers can apply it with minimal effort. But first I'd like to see the initial points addressed. 
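[Editor's note: a minimal sketch of how the POSIX-only pieces could be guarded so that the same module still imports and degrades gracefully on platforms without fcntl/termios/SIGWINCH. This is an illustration under those assumptions, not the actual progressbar code:

    import sys

    try:
        # POSIX-only modules: absent on Windows, so guard the imports themselves
        import fcntl
        import termios
        import struct
        import signal
        _HAVE_POSIX_TTY = hasattr(signal, 'SIGWINCH')
    except ImportError:
        _HAVE_POSIX_TTY = False

    def terminal_width(stream=None, default=79):
        """Best-effort terminal width, with a portable fixed fallback."""
        if stream is None:
            stream = sys.stdout
        if _HAVE_POSIX_TTY:
            try:
                # TIOCGWINSZ fills (rows, cols, xpixel, ypixel) as 4 shorts
                data = fcntl.ioctl(stream, termios.TIOCGWINSZ, '\0' * 8)
                return struct.unpack('hhhh', data)[1]
            except Exception:
                pass  # not a tty, or the ioctl failed: use the fallback
        return default

]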
Cheers, f From ondrej at certik.cz Fri Jan 11 21:54:19 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Sat, 12 Jan 2008 03:54:19 +0100 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> <200801101517.03131.ravi.rajagopal@amd.com> <85b5c3130801111752q3f910aa3j220677b9615991db@mail.gmail.com> Message-ID: <85b5c3130801111854k4cad8e13i5640356caf2ce424@mail.gmail.com> On Jan 12, 2008 3:26 AM, Fernando Perez wrote: > > On Jan 11, 2008 6:52 PM, Ondrej Certik wrote: > > So what is the conclusion? > > > > Maybe I should stress, that the progress bar is just one file with 300 > > lines of code. > > > > So it's not really a package, just a feature. I was thinking of > > scipy.io, but ipython > > is maybe better. If it can contain interaction with gnuplot, why not a simple > > progressbar? :) > > A few questions: > > - Will it run *correctly* under Windows? I don't use it, but I don't > want the text-only core of ipython to start sprouting functionality > that breaks under windows (or any OS for that matter). Same thing for > Linux/OSX. I know it says 'os independent', but I want independent > confirmation of that statement. Right, I don't know, since I don't use windows and I cannot test it. > > - It makes calls to ioctl() without checks that the import at the top > actually worked. That will break somewhere, I guarantee it. It's used in handle_resize only, which is only used in try: self.handle_resize(None,None) signal.signal(signal.SIGWINCH, self.handle_resize) self.signal_set = True except: self.term_width = 79 I am not saying it's the best programming practice, but I think it will work without ioctl quite well. > - Is signal.SIGWINCH available on all OSes? I don't know. > - termios is also used without checks. That will likely throw an > exception under windows. Again only in handle_resize(), which is only called in the code above, so it should be safe. > - Etc. This code doesn't look like it's ever been properly tested for > cross-platform compatibility. Welcome to the real world. Right. I can only say that it works well in Debian. > If all of the above were to be addressed, it's small enough that I'm > willing to throw it into ipython.externals for convenience. 300 lines > is a drop in the bucket at this point, and for interactive work it can > come in very handy, I agree. No sysadmin will notice it's hiding in > there :) > > In this case, a few things: > > - You said that the author would BSD license it, but the current code > http://pypi.python.org/pypi/progressbar still shows LGPL-only. I > won't put it in until we get from him a BSD-licensed file. Yes, he only sent an email to me, I can ask him to send an email to this list too. > - Please send us the patch you suggested writing for the more comon case. I'll send it to the ipython list. > - You commit to letting us know when an update is warranted and > sending us the corresponding patch, so that the ipython developers can > apply it with minimal effort. I commit. The final decision needs to be made by ipython developers though, if they are willing to maintain such a code if I won't be available (for any reason). But except the points you raised, I think it's very readable, so there shouldn't be a problem with that. 
Ondrej From fperez.net at gmail.com Fri Jan 11 22:13:38 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 11 Jan 2008 20:13:38 -0700 Subject: [SciPy-dev] adding a nice progressbar to scipy In-Reply-To: <85b5c3130801111854k4cad8e13i5640356caf2ce424@mail.gmail.com> References: <85b5c3130801100711u5aa7a37coff5cf8f94dc8a8ae@mail.gmail.com> <85b5c3130801100906s284e77e4g1bdaaf45cb7c3fba@mail.gmail.com> <200801101517.03131.ravi.rajagopal@amd.com> <85b5c3130801111752q3f910aa3j220677b9615991db@mail.gmail.com> <85b5c3130801111854k4cad8e13i5640356caf2ce424@mail.gmail.com> Message-ID: Note: I moved this discussion to ipython-dev, where it now belongs. Cheers f From matthew.brett at gmail.com Sat Jan 12 05:10:27 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 12 Jan 2008 02:10:27 -0800 Subject: [SciPy-dev] Nose tests - merged Message-ID: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> I can now tell you that I have full knowledge of the problems of SVN and merging. I've just committed the nose testing modifications in a series of commits. Please let me know if there are any problems. Matthew From fperez.net at gmail.com Sat Jan 12 05:29:28 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Sat, 12 Jan 2008 03:29:28 -0700 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> Message-ID: On Jan 12, 2008 3:10 AM, Matthew Brett wrote: > I can now tell you that I have full knowledge of the problems of SVN > and merging. > > I've just committed the nose testing modifications in a series of commits. > > Please let me know if there are any problems. Looks good here (same 6 errors as yesterday with test('full'), but those can be looked at later). Thanks!!! Cheers, f From wnbell at gmail.com Sat Jan 12 06:07:06 2008 From: wnbell at gmail.com (Nathan Bell) Date: Sat, 12 Jan 2008 05:07:06 -0600 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> Message-ID: On Jan 12, 2008 4:29 AM, Fernando Perez wrote: > > Looks good here (same 6 errors as yesterday with test('full'), but > those can be looked at later). > Similar results here. Great work! I like the separation of benchmarks and unittests. I'll update my sparse benchmarks accordingly. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From nwagner at iam.uni-stuttgart.de Sat Jan 12 06:24:40 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Sat, 12 Jan 2008 12:24:40 +0100 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> Message-ID: On Sat, 12 Jan 2008 05:07:06 -0600 "Nathan Bell" wrote: > On Jan 12, 2008 4:29 AM, Fernando Perez > wrote: >> >> Looks good here (same 6 errors as yesterday with >>test('full'), but >> those can be looked at later). >> > > Similar results here. Great work! > > I like the separation of benchmarks and unittests. I'll >update my > sparse benchmarks accordingly. > > > -- > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev How do I use the new test functionality in scipy ? 
>>> import scipy >>> scipy.test(1) Traceback (most recent call last): File "", line 1, in File "/usr/local/lib64/python2.5/site-packages/scipy/testing/nosetester.py", line 54, in test nose.run(argv=argv) File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/core.py", line 368, in run return TestProgram(*arg, **kw).success File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/core.py", line 219, in __init__ argv=argv, testRunner=testRunner, testLoader=testLoader) File "/usr/lib64/python2.5/unittest.py", line 758, in __init__ self.parseArgs(argv) File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/core.py", line 270, in parseArgs self.createTests() File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/core.py", line 284, in createTests self.test = self.testLoader.loadTestsFromNames(self.testNames) File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/loader.py", line 422, in loadTestsFromNames return unittest.TestLoader.loadTestsFromNames(self, names, module) File "/usr/lib64/python2.5/unittest.py", line 556, in loadTestsFromNames suites = [self.loadTestsFromName(name, module) for name in names] File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/loader.py", line 377, in loadTestsFromName module, discovered=discovered) File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/loader.py", line 282, in loadTestsFromModule if self.selector.wantClass(test): File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/selector.py", line 76, in wantClass plug_wants = self.plugins.wantClass(cls) File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/plugins/manager.py", line 81, in __call__ return self.call(*arg, **kw) File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/plugins/manager.py", line 145, in simple result = meth(*arg, **kw) File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/plugins/attrib.py", line 197, in wantClass if self.validateAttrib(cls_attr) is not False: File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/plugins/attrib.py", line 162, in validateAttrib if not value(key, attribs): File "/usr/local/lib64/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/plugins/attrib.py", line 116, in eval_in_context return eval(expr, None, ContextHelper(attribs)) TypeError: eval() arg 1 must be a string or code object Nils From wnbell at gmail.com Sat Jan 12 06:35:42 2008 From: wnbell at gmail.com (Nathan Bell) Date: Sat, 12 Jan 2008 05:35:42 -0600 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> Message-ID: On Jan 12, 2008 5:24 AM, Nils Wagner wrote: > How do I use the new test functionality in scipy ? 
> > >>> import scipy > >>> scipy.test(1) > Traceback (most recent call last): > File "", line 1, in > File > "/usr/local/lib64/python2.5/site-packages/scipy/testing/nosetester.py", > line 54, in test > nose.run(argv=argv) The numeric test levels have been replaced with names: http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/testing/examples/README.txt -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From matthew.brett at gmail.com Sat Jan 12 13:18:33 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 12 Jan 2008 10:18:33 -0800 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> Message-ID: <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> Hi Nathan, > I like the separation of benchmarks and unittests. I'll update my > sparse benchmarks accordingly. Ah - well - there's the first glitch. I see you very sensibly read the README file, which says:

Benchmark tests
===============

Routines that don't actually test correctness but instead do performance benchmarking will live in a benchmarks/ directory next to the tests/ directory of each module. There will be a scipy.benchmark() call that does benchmarking, similar to scipy.test() but separate from it.

Now, I didn't do this because it would have added an extra layer of nose complexity, as well as requiring some boilerplate in the module init to implement. What I did do was add a test decorator specific for benchmarks. So,

    test()
    test('fast')
    test('full')

do not run tests with a label of 'bench', but

    test('bench')

does. Tests can be labeled thus:

    @dec.bench
    def test_algorithm_speed():
        ...

For me, the way it works now seems to cover the need for marking benchmarks explicitly without adding an extra layer on top of the testing and code layout. But we could also go the other way. I'd be happy to hear the group's thoughts. Matthew From matthew.brett at gmail.com Sat Jan 12 15:01:33 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 12 Jan 2008 12:01:33 -0800 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> Message-ID: <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> Hi, > For me, the way it works now seems to cover the need for marking > benchmarks explicitly without adding an extra layer on top of the > testing and code layout. But we could also go the other way. I'd be > happy to hear the group's thoughts. Just to say - moving benchmark tests to a benchmark directory is fine - nose will still pick up the tests - it's just that you'll need to label the benchmark tests with @dec.bench and run them with module.test('bench') Matthew From matthew.brett at gmail.com Sat Jan 12 15:40:28 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 12 Jan 2008 12:40:28 -0800 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> Message-ID: <1e2af89e0801121240g32f4b3dbtdc475ca6c9f68f3b@mail.gmail.com> Hi, > How do I use the new test functionality in scipy ? I've put in a check for the input to test to make the error more explicit.
From matthew.brett at gmail.com Sat Jan 12 15:40:28 2008
From: matthew.brett at gmail.com (Matthew Brett)
Date: Sat, 12 Jan 2008 12:40:28 -0800
Subject: [SciPy-dev] Nose tests - merged
In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com>
Message-ID: <1e2af89e0801121240g32f4b3dbtdc475ca6c9f68f3b@mail.gmail.com>

Hi,

> How do I use the new test functionality in scipy ?

I've put in a check for the input to test, to make the error more explicit. In particular:

import scipy
help(scipy.test)

Help on method test in module scipy.testing.nosetester:

test(self, label='fast', verbose=1, doctests=False, extra_argv=None)
    method of scipy.testing.nosetester.NoseTester instance
    Module testing function

    Parameters
    ----------
    label : {'fast', 'full', '', attribute identifier}
        Identifies tests to run. This can be a string to pass to the
        nosetests executable with the '-A' option, or one of several
        special values. Special values are:
        'fast' - the default - which corresponds to nosetests -A option
            of 'not slow and not bench and not willfail'.
        'full' - fast (as above) and slow tests, as in nosetests -A
            option of 'not bench and not willfail'.
        None or '' - run all tests and benchmarks
        attribute_identifier - string passed directly to nosetests as '-A'
    verbose : integer
        verbosity value for test outputs, 1-10
    doctests : boolean
        If True, run doctests in module, default False
    extra_argv : list
        List with any extra args to pass to nosetests

From fperez.net at gmail.com Sun Jan 13 12:15:30 2008
From: fperez.net at gmail.com (Fernando Perez)
Date: Sun, 13 Jan 2008 10:15:30 -0700
Subject: [SciPy-dev] Scipy server suffering again...
In-Reply-To: <4787F7EB.4070402@enthought.com>
References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> <4787F7EB.4070402@enthought.com>
Message-ID: 

On Jan 11, 2008 4:12 PM, J. Ryan Earl wrote:
> > So now the CPU is pegged through the roof. That's something I can't do
> > anything about. =(

Well, I saw that again for a while: I was trying to do some mailman admin work and the server was just hanging forever. I've taken to the habit of always being logged into the server with top running in a terminal whenever I do anything, since it's about the only way of having an idea of whether any particular action has a hope of succeeding or not. In this case it didn't, and the culprit seemed to be an apache process pegging the CPU at 100% for a long time, with a load level of about 3. I went to do other things and after a while I could get mailman to respond, so I could finish what I was trying to do. Just more data for you...

Cheers,

f

From fperez.net at gmail.com Sun Jan 13 12:35:18 2008
From: fperez.net at gmail.com (Fernando Perez)
Date: Sun, 13 Jan 2008 10:35:18 -0700
Subject: [SciPy-dev] Scipy server suffering again...
In-Reply-To: References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> <4787F7EB.4070402@enthought.com>
Message-ID: 

On Jan 13, 2008 10:15 AM, Fernando Perez wrote:
> I went to do other things and after a while I could get mailman to
> respond, so I could finish what I was trying to do.

I take that back. It did complete some things, but then when I tried to approve a submission to the list (my own message, with a screenshot that's too big for the default limits), it just hangs until I get the by now familiar Internal Server Error page. The admin requests page for ipython-dev is large, because there's a lot of old held spam in there that I'm also trying to flush. But it seems that it simply can't complete the request.
Here's what top shows in the meantime:

top - 11:34:52 up 44 days, 23:52, 3 users, load average: 2.05, 1.72, 1.80
Tasks: 274 total, 2 running, 272 sleeping, 0 stopped, 0 zombie
Cpu(s): 62.4% us, 2.6% sy, 0.0% ni, 0.0% id, 34.4% wa, 0.2% hi, 0.3% si
Mem:  2075100k total, 2021984k used, 53116k free, 56612k buffers
Swap: 4192944k total, 1003260k used, 3189684k free, 280760k cached

 PID USER    PR NI  VIRT  RES SHR S %CPU %MEM   TIME+ COMMAND
8050 apache  25  0 15648  12m 2472 R 99.9  0.6 1:24.66 python
7121 scipy   15  0 58516  21m 2460 S 25.8  1.1 0:31.03 python
7591 ipython 15  0 40228  14m 3904 S  2.6  0.7 0:18.43 trac.fcgi

I'll keep on trying through the day, and will report if it eventually works.

Cheers,

f

From fperez.net at gmail.com Sun Jan 13 13:07:08 2008
From: fperez.net at gmail.com (Fernando Perez)
Date: Sun, 13 Jan 2008 11:07:08 -0700
Subject: [SciPy-dev] Scipy server suffering again...
In-Reply-To: References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> <4787F7EB.4070402@enthought.com>
Message-ID: 

On Jan 13, 2008 10:35 AM, Fernando Perez wrote:
> I'll keep on trying through the day, and will report if it eventually works.

OK, I doubt this is ever going to work: I've gotten 5 '500 internal error' pages in a row. The server simply can't complete the request, it seems. The issue is the admin page for ipython-dev, where you approve/delete held messages. It has tons of held spam that I was trying to flush and a single message I wanted to approve. I only visit that page rarely, and if there's a setting to just have mailman auto-delete anything that SpamAssassin marks, I'd be OK with that. But I couldn't find such a setting after looking around a bit, so all this old spam is accumulating...

Cheers,

f

From travis at enthought.com Sun Jan 13 22:16:34 2008
From: travis at enthought.com (Travis Vaught)
Date: Sun, 13 Jan 2008 21:16:34 -0600
Subject: [SciPy-dev] Scipy server suffering again...
In-Reply-To: References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> <4787F7EB.4070402@enthought.com>
Message-ID: <82E3CA7F-4D52-4F39-8779-17FA2B867DCE@enthought.com>

On Jan 13, 2008, at 12:07 PM, Fernando Perez wrote:

> On Jan 13, 2008 10:35 AM, Fernando Perez wrote:
>
>> I'll keep on trying through the day, and will report if it
>> eventually works.
>
> OK, I doubt this is ever going to work: I've gotten 5 '500 internal
> error' pages in a row. The server simply can't complete the request,
> it seems. The issue is the admin page for ipython-dev, where you
> approve/delete held messages. It has tons of held spam that I was
> trying to flush and a single message I wanted to approve. I only
> visit that page rarely, and if there's a setting to just have mailman
> auto-delete anything that SpamAssassin marks, I'd be OK with that.
> But I couldn't find such a setting after looking around a bit, so all
> this old spam is accumulating...
> ...

Same thing happened to me while trying to approve a scipy-dev message the other day ... looks like mailman expects less moderation in our moderation (sorry, couldn't resist). I'll talk with Ryan in the morning to see if there's another way to clean out all the garbage.

Travis

From strawman at astraw.com Sun Jan 13 23:32:23 2008
From: strawman at astraw.com (Andrew Straw)
Date: Sun, 13 Jan 2008 20:32:23 -0800
Subject: [SciPy-dev] Scipy server suffering again...
In-Reply-To: <82E3CA7F-4D52-4F39-8779-17FA2B867DCE@enthought.com> References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> <4787F7EB.4070402@enthought.com> <82E3CA7F-4D52-4F39-8779-17FA2B867DCE@enthought.com> Message-ID: <478AE5D7.1000300@astraw.com> I had this issue once before with Mailman, so I suspect the culprit is at least potentially Mailman itself timing out on big backlogs... Not necessarily directly caused by high server load. -Andrew Travis Vaught wrote: > On Jan 13, 2008, at 12:07 PM, Fernando Perez wrote: > > >> On Jan 13, 2008 10:35 AM, Fernando Perez wrote: >> >> >>> I'll keep on trying through the day, and will report if it >>> eventually works. >>> >> OK, I doubt this is going to work ever, I've gotten 5 '500 internal >> error' pages in a row. The server simply can't complete the request, >> it seems. The issue is the admin page for ipython-dev, where you >> approve/delete held messages. It has tons of held spam that I was >> trying to flush and a single message I wanted to approve. I only >> visit that page rarely, and if there's a setting to just have mailmain >> auto-delete anything that SpamAssassin marks, I'd be OK with that. >> But I couldn't find such a setting after looking around a bit, so all >> this old spam is accumulating... >> ... >> > > Same thing happened to me while trying to approve a scipy-dev message > the other day ... looks like mailman expects less moderation in our > moderation (sorry, couldn't resist). I'll talk with Ryan in the > morning to see if there's another way to clean out all the garbage. > > Travis > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From jre at enthought.com Mon Jan 14 01:56:05 2008 From: jre at enthought.com (J. Ryan Earl) Date: Mon, 14 Jan 2008 00:56:05 -0600 Subject: [SciPy-dev] Scipy server suffering again... In-Reply-To: <478AE5D7.1000300@astraw.com> References: <47868182.2020004@enthought.com> <47872F40.6020908@enthought.com> <4787F7EB.4070402@enthought.com> <82E3CA7F-4D52-4F39-8779-17FA2B867DCE@enthought.com> <478AE5D7.1000300@astraw.com> Message-ID: <478B0785.9020907@enthought.com> I suspect I can just find where the spam is being accumulated and delete it directly from the filesystem. -ryan Andrew Straw wrote: > I had this issue once before with Mailman, so I suspect the culprit is > at least potentially Mailman itself timing out on big backlogs... Not > necessarily directly caused by high server load. > > -Andrew > > Travis Vaught wrote: > >> On Jan 13, 2008, at 12:07 PM, Fernando Perez wrote: >> >> >> >>> On Jan 13, 2008 10:35 AM, Fernando Perez wrote: >>> >>> >>> >>>> I'll keep on trying through the day, and will report if it >>>> eventually works. >>>> >>>> >>> OK, I doubt this is going to work ever, I've gotten 5 '500 internal >>> error' pages in a row. The server simply can't complete the request, >>> it seems. The issue is the admin page for ipython-dev, where you >>> approve/delete held messages. It has tons of held spam that I was >>> trying to flush and a single message I wanted to approve. I only >>> visit that page rarely, and if there's a setting to just have mailmain >>> auto-delete anything that SpamAssassin marks, I'd be OK with that. >>> But I couldn't find such a setting after looking around a bit, so all >>> this old spam is accumulating... >>> ... >>> >>> >> Same thing happened to me while trying to approve a scipy-dev message >> the other day ... 
looks like mailman expects less moderation in our >> moderation (sorry, couldn't resist). I'll talk with Ryan in the >> morning to see if there's another way to clean out all the garbage. >> >> Travis >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > >

From lists at onerussian.com Mon Jan 14 11:34:26 2008
From: lists at onerussian.com (Yaroslav Halchenko)
Date: Mon, 14 Jan 2008 11:34:26 -0500
Subject: [SciPy-dev] slicing vs. advanced selection -- can be there smth in the middle? ;-)
Message-ID: <20080114163426.GA30915@washoe.onerussian.com>

Dear Scipy and Numpy gurus,

Per our IRC conversation with tvaught, I decided to expose my problem and wishes to the list.

At the moment, there are two possibilities to select a sub-region of an array:

1.
slicing -- efficient memory-wise -- no copying is done; it is just a view over the original data (thus even .flags.owndata=False). It is done by specifying Slice instance(s) (explicitly or not) in the index, i.e.

   b = a[1:4, 2:5]

2. advanced selection, where either a list of indexes is given or a mask:

   c = a[[1,2,3], [2,3,4]]

   In this case the data gets copied.

In the application we are developing (pymvpa) we are dealing with relatively large arrays of data (primarily 2D while processing), where the first dimension corresponds to different samples and the second to different features.

The problem is that we often need to sample from an array. For instance, to check cross-validation on an N-1 fold we have to generate N 'views' over the original array. In each such "view" one sample (row) is not present while training, and it is used as the sample to test against later on. So at the end, instead of N data records, in the current implementation we end up with N*(N-1) records (if we are to keep those views for further analysis).

But that is not only the case with the first dimension -- we have to do a similarly 'evil' selection of the features, which again leads to quite a big waste of memory.

Thus I wondered: is there any facility which could help us out (maybe by sacrificing reasonable computation cost) and give us really a view on top of an array? We don't really need a sparse representation -- we are selecting a set of rows and columns, so every column (and similarly across rows) for a given 'view' uses the same steps/increments between the elements.

I hope that my wording makes some sense ;-) If not -- please don't hesitate to buzz me to provide a real use case.

Thank you in advance

-- Yaroslav Halchenko Research Assistant, Psychology Department, Rutgers-Newark Student Ph.D. @ CS Dept. NJIT Office: (973) 353-5440x263 | FWD: 82823 | Fax: (973) 353-1171 101 Warren Str, Smith Hall, Rm 4-105, Newark NJ 07102 WWW: http://www.linkedin.com/in/yarik

From peridot.faceted at gmail.com Mon Jan 14 11:55:11 2008
From: peridot.faceted at gmail.com (Anne Archibald)
Date: Mon, 14 Jan 2008 11:55:11 -0500
Subject: [SciPy-dev] slicing vs. advanced selection -- can be there smth in the middle? ;-)
In-Reply-To: <20080114163426.GA30915@washoe.onerussian.com>
References: <20080114163426.GA30915@washoe.onerussian.com>
Message-ID: 

On 14/01/2008, Yaroslav Halchenko wrote:
> At the moment, there are two possibilities to select a sub-region of an
> array:
>
> 1. slicing -- efficient memory-wise -- no copying is done; it is just a
> view over the original data (thus even .flags.owndata=False). It is done
> by specifying Slice instance(s) (explicitly or not) in the index, i.e.
> b = a[1:4, 2:5]
>
> 2. advanced selection, where either a list of indexes is given or a mask:
> c = a[[1,2,3], [2,3,4]]
>
> In this case the data gets copied.

The first answer is that numpy cannot do what you want. Every numpy array is a contiguous block of memory, with data elements spaced evenly along it in each dimension. This is built into the C-level indexing throughout numpy. This is why fancy indexing *must* copy. Thus there's no way to do what you want and get a proper numpy array out. But read on...

If you're willing to be a little awkward, you can also make lists:

   d = [ a[i,:] for i in [2,3,7] ]

Here the data does not get copied either. Unfortunately, you lose the numpy features on the outer indexing; also note that lists introduce an overhead of at least four or eight bytes per list element, so you almost certainly do not want to use a list-of-lists.

> In the application we are developing (pymvpa) we are dealing with
> relatively large arrays of data (primarily 2D while processing), where
> the first dimension corresponds to different samples and the second to
> different features.
>
> The problem is that we often need to sample from an array. For
> instance, to check cross-validation on an N-1 fold we have to generate N
> 'views' over the original array. In each such "view" one sample (row) is
> not present while training, and it is used as the sample to test against
> later on. So at the end, instead of N data records, in the current
> implementation we end up with N*(N-1) records (if we are to keep those
> views for further analysis).
>
> But that is not only the case with the first dimension -- we have to do
> a similarly 'evil' selection of the features, which again leads to quite
> a big waste of memory.
>
> Thus I wondered: is there any facility which could help us out (maybe
> by sacrificing reasonable computation cost) and give us really a view on
> top of an array? We don't really need a sparse representation -- we are
> selecting a set of rows and columns, so every column (and similarly across
> rows) for a given 'view' uses the same steps/increments between
> the elements.

If I understand you correctly, your selections tend to be "all-but-one" selections, though maybe in both dimensions. In this case, you can get arrays that are two contiguous parts:

   v = (a[:n], a[n+1:])

These can be views. Of course indexing them is more annoying, but here you are trading convenience for runtime. If you like, you can concatenate these, producing contiguous copies, before processing, and then discard the copies.

Alternatively, if your need is simply to keep the selections around for later analysis, remember that selection is a fast process, so you can keep only the selection indices:

   b = a[l1, l2]
   analyze(b)
   keep((l1, l2))

Or even whatever was used to generate them - the number of the omitted row, or even a seed used to seed a random number generator to select random rows.

Good luck,
Anne
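A quick way to see the distinction Anne and Yaroslav are drawing, plus the two-view trick for an all-but-one-block selection (a sketch; the array and block sizes here are made up):

import numpy

a = numpy.arange(25).reshape(5, 5)

b = a[1:4, 2:5]               # basic slice: a view, no data copied
print b.flags.owndata         # False -- b references a's memory
print b.base is a             # True

c = a[[1, 2, 3], [2, 3, 4]]   # advanced selection: always a copy
print c.flags.owndata         # True

# "all but one block" of rows as two contiguous views, no copying:
n, blk = 2, 1
train = (a[:n], a[n+blk:])    # two views covering everything but the block
test = a[n:n+blk]             # the held-out block, also a view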
From wnbell at gmail.com Mon Jan 14 15:15:10 2008
From: wnbell at gmail.com (Nathan Bell)
Date: Mon, 14 Jan 2008 14:15:10 -0600
Subject: [SciPy-dev] 32bit vs 64bit doctest differences
Message-ID: 

It appears that numpy prints the 'dtype=intXX' part only when the size of int differs from the machine word size. For example, consider the following test run on a 64-bit machine:

File "/usr/lib/python2.5/site-packages/scipy/stsci/convolve/Convolve.py", line 295, in scipy.stsci.convolve.Convolve.boxcar
Failed example:
    boxcar(num.array([10, 0, 0, 0, 0, 0, 1000]), (3,), mode="wrap").astype(num.longlong)
Expected:
    array([336, 3, 0, 0, 0, 333, 336], dtype=int64)
Got:
    array([336, 3, 0, 0, 0, 333, 336])

Should we just use dtype=int to avoid this issue? It appears dtype=int chooses the native size.

-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/

From lists at onerussian.com Mon Jan 14 20:58:12 2008
From: lists at onerussian.com (Yaroslav Halchenko)
Date: Mon, 14 Jan 2008 20:58:12 -0500
Subject: [SciPy-dev] slicing vs. advanced selection -- can be there smth in the middle? ;-)
In-Reply-To: References: <20080114163426.GA30915@washoe.onerussian.com>
Message-ID: <20080115015810.GA30913@washoe.onerussian.com>

Hi Anne,

First of all -- thank you for your feedback on this question!

> If you're willing to be a little awkward, you can also make lists:

Sure thing, we could do some awkwarding with lists to stay with views instead of copying, but I guess (since I've not tried yet) most of our processing might break terribly, since in many places we rely on at least a few methods/properties of ndarrays (e.g. shape).

> If I understand you correctly, your selections tend to be
> "all-but-one" selections, though maybe in both dimensions. In this
> case, you can get arrays that are two contiguous parts:
> v = (a[:n], a[n+1:])

Not really all-but-one, but often it is all-but-one-block, i.e. maybe something like v = (a[:n], a[n+b:]), where b is the size of that block. Unfortunately it would have the same consequences as above... but if I become desperate I indeed might look into one of these methods (the first one, with lists, seems preferable since it is more generic).

> Alternatively, if your need is simply to keep the selections around
> for later analysis, remember that selection is a fast process, so you
> can keep only the selection indices:
> b = a[l1, l2]
> analyze(b)
> keep((l1, l2))

hm... or out of desperation we might simply rely on another level on top of numpy's arrays which would provide us this 'logical array manager' (logical is just after LVM's first word). We will see...

Thanks once again

-- Yaroslav Halchenko Research Assistant, Psychology Department, Rutgers-Newark Student Ph.D. @ CS Dept. NJIT Office: (973) 353-5440x263 | FWD: 82823 | Fax: (973) 353-1171 101 Warren Str, Smith Hall, Rm 4-105, Newark NJ 07102 WWW: http://www.linkedin.com/in/yarik

From ceball at users.sourceforge.net Mon Jan 14 22:07:56 2008
From: ceball at users.sourceforge.net (C. Ball)
Date: Tue, 15 Jan 2008 03:07:56 +0000 (UTC)
Subject: [SciPy-dev] Status of weave
Message-ID: 

Hi,

Recently I asked a question on the Scipy-User list about using weave with numpy arrays of dtype=object (http://article.gmane.org/gmane.comp.python.scientific.user/14642). There has been no reply, so now I'd like to find out the status of weave.

I see that various bugs have been fixed recently, and that there has been some work on the converters (e.g. r3309 from the svn log), so I guess someone is working on weave. Are there any plans to allow arrays with dtype=object to be converted?
Thanks very much,

Chris

From mani.sabri at gmail.com Tue Jan 15 04:16:30 2008
From: mani.sabri at gmail.com (mani sabri)
Date: Tue, 15 Jan 2008 12:46:30 +0330
Subject: [SciPy-dev] Status of weave
In-Reply-To: Message-ID: <478c7a3f.02ab100a.7dfa.088c@mx.google.com>

Hi,

I'm not a weave or even a numpy expert, but IMHO weave is a proxy between blitz and numpy; 'object's are a completely different deal.

Mani Sabri

>-----Original Message-----
>From: scipy-dev-bounces at scipy.org [mailto:scipy-dev-bounces at scipy.org] On
>Behalf Of C. Ball
>Hi,
>
>Recently I asked a question on the Scipy-User list about using weave
>with numpy arrays of dtype=object
>(http://article.gmane.org/gmane.comp.python.scientific.user/14642). There
>has been no reply, so now I'd like to find out the status of weave.
>
>I see that various bugs have been fixed recently, and that there has
>been some work on the converters (e.g. r3309 from the svn log), so I
>guess someone is working on weave. Are there any plans to allow arrays
>with dtype=object to be converted?
>
>Thanks very much,
>
>Chris
>
>_______________________________________________
>Scipy-dev mailing list
>Scipy-dev at scipy.org
>http://projects.scipy.org/mailman/listinfo/scipy-dev

From ondrej at certik.cz Tue Jan 15 08:20:33 2008
From: ondrej at certik.cz (Ondrej Certik)
Date: Tue, 15 Jan 2008 14:20:33 +0100
Subject: [SciPy-dev] sparse solvers in scipy rock
Message-ID: <85b5c3130801150520r18c2edceo2e7e54fe4d6c40cd@mail.gmail.com>

Hi,

yesterday I wrote a solver for the 1D time-dependent Schrodinger equation for school, using explicit and implicit methods and scipy's sparse solvers, because I was too lazy to use some fortran routines for inverting the tridiagonal matrix, and I was surprised how nicely it works in scipy. You did a great job. I am convinced that this is the way to call all sparse solvers.

Here is the code:

http://hg.certik.cz/schrod1D/

I based it on the example:

http://www.scipy.org/Cookbook/SchrodingerFDTD

which uses an explicit central-differences method, but I wrote it my way, using complex numbers directly and Cython to speed it up. I also implemented an explicit Euler method, which totally blows up after a few iterations, and an implicit method, which works really well.

Where do you think I could put it, together with some documentation on how to play with it? On the above wiki page, or should I create a new wiki page for that? It's an example of how to use Cython+Numpy+scipy sparse solvers (superlu currently).

Ondrej
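For readers who want the flavor of the sparse-solver call Ondrej describes, here is a rough backward-Euler sketch for a diffusion-type equation -- not his actual Schrodinger code, and the grid size, step sizes and initial profile are all made up:

import numpy
from scipy.sparse import spdiags
from scipy.linsolve import spsolve   # SuperLU-backed direct solver

n, dt, dx = 100, 1e-3, 1e-2
r = dt / dx**2

# tridiagonal (I - dt*Laplacian) operator for one implicit time step
main = (1 + 2*r) * numpy.ones(n)
off = -r * numpy.ones(n)
A = spdiags([off, main, off], [-1, 0, 1], n, n).tocsc()

x = (numpy.arange(n) - n/2.0) * dx
u = numpy.exp(-x**2)                 # some initial profile
u = spsolve(A, u)                    # advance one step: solve A*u_new = u_old

Each implicit step is just one sparse solve against the same tridiagonal operator, which is why the direct SuperLU path works so well here.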
From ceball at users.sourceforge.net Tue Jan 15 08:58:50 2008
From: ceball at users.sourceforge.net (C. Ball)
Date: Tue, 15 Jan 2008 13:58:50 +0000 (UTC)
Subject: [SciPy-dev] Status of weave
References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com>
Message-ID: 

> I'm not a weave or even a numpy expert, but IMHO weave is a proxy between
> blitz and numpy; 'object's are a completely different deal.

Thanks for your reply.

All I want to do is to be able to iterate through a numpy array of python objects (each of which contains its own numpy array) from within weave C code. That currently fails because each array element is of type 'O'.

Am I doing something wrong, or is it not possible to process numpy arrays of objects from within weave?

Thanks,

Chris

From matthieu.brucher at gmail.com Tue Jan 15 09:01:43 2008
From: matthieu.brucher at gmail.com (Matthieu Brucher)
Date: Tue, 15 Jan 2008 15:01:43 +0100
Subject: [SciPy-dev] Status of weave
In-Reply-To: <478c7a3f.02ab100a.7dfa.088c@mx.google.com>
References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com>
Message-ID: 

Hi,

In fact weave compiles a C++ function on the fly, but it is not restricted to Blitz. By default, it doesn't even use Blitz. It uses SCXX wrappers so that Python objects can be used from C++.

Matthieu

2008/1/15, mani sabri :
>
> Hi,
> I'm not a weave or even a numpy expert, but IMHO weave is a proxy between
> blitz and numpy; 'object's are a completely different deal.
>
> Mani Sabri
> >-----Original Message-----
> >From: scipy-dev-bounces at scipy.org [mailto:scipy-dev-bounces at scipy.org] On
> >Behalf Of C. Ball
> >Hi,
> >
> >Recently I asked a question on the Scipy-User list about using weave
> >with numpy arrays of dtype=object
> >(http://article.gmane.org/gmane.comp.python.scientific.user/14642). There
> >has been no reply, so now I'd like to find out the status of weave.
> >
> >I see that various bugs have been fixed recently, and that there has
> >been some work on the converters (e.g. r3309 from the svn log), so I
> >guess someone is working on weave. Are there any plans to allow arrays
> >with dtype=object to be converted?
> >
> >Thanks very much,
> >
> >Chris
> >
> >_______________________________________________
> >Scipy-dev mailing list
> >Scipy-dev at scipy.org
> >http://projects.scipy.org/mailman/listinfo/scipy-dev
>
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-dev

-- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From matthieu.brucher at gmail.com Tue Jan 15 09:04:08 2008
From: matthieu.brucher at gmail.com (Matthieu Brucher)
Date: Tue, 15 Jan 2008 15:04:08 +0100
Subject: [SciPy-dev] Status of weave
In-Reply-To: References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com>
Message-ID: 

2008/1/15, C. Ball :
>
> > I'm not a weave or even a numpy expert, but IMHO weave is a proxy between
> > blitz and numpy; 'object's are a completely different deal.
>
> Thanks for your reply.
>
> All I want to do is to be able to iterate through a numpy array
> of python objects (each of which contains its own numpy array)
> from within weave C code. That currently fails because each array
> element is of type 'O'.
>
> Am I doing something wrong, or is it not possible to process numpy
> arrays of objects from within weave?

I don't think (but I may be wrong) that the current code can handle this. If eventually this works, you will only get an array of PyObject*s, and you will have to cast them explicitly.

Can you post the exact error?

Matthieu

-- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From ceball at users.sourceforge.net Tue Jan 15 09:44:56 2008
From: ceball at users.sourceforge.net (C.
Ball) Date: Tue, 15 Jan 2008 14:44:56 +0000 (UTC) Subject: [SciPy-dev] Status of weave References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: Matthieu Brucher gmail.com> writes: [...] > I don't think (but I may be wrong) that the current code can handle this. If eventually this works, you will only get an array with PyObject*s, you will have to cast them explicitely. > Can you post the exact error ? Sure, an example and the error it produces are in my original post to the SciPy-Users list: http://article.gmane.org/gmane.comp.python.scientific.user/14642 Thanks From matthieu.brucher at gmail.com Tue Jan 15 10:18:10 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Tue, 15 Jan 2008 16:18:10 +0100 Subject: [SciPy-dev] Status of weave In-Reply-To: References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: > > > I don't think (but I may be wrong) that the current code can handle > this. If > eventually this works, you will only get an array with PyObject*s, you > will have > to cast them explicitely. > > Can you post the exact error ? > > Sure, an example and the error it produces are in my original post to > the SciPy-Users list: > > http://article.gmane.org/gmane.comp.python.scientific.user/14642 > Now I'm sure it is not supported. To make it work means writing the converter that will support 'O', but I don't know how to do that (for the moment). Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From ceball at users.sourceforge.net Tue Jan 15 10:33:24 2008 From: ceball at users.sourceforge.net (C. Ball) Date: Tue, 15 Jan 2008 15:33:24 +0000 (UTC) Subject: [SciPy-dev] Status of weave References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: > Now I'm sure it is not supported. To make it work means writing the > converter that will support 'O', but I don't know how to do that > (for the moment). You talk about it almost as if you are going to be learning about it sometime soon... I have no idea about how the converters work, or how much work it would be to add a conversion for 'O'. Can anyone comment on that (how much work it would be), or on any related plans he/she might have for developing weave? Thanks for the quick responses! Chris From matthieu.brucher at gmail.com Tue Jan 15 10:37:29 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Tue, 15 Jan 2008 16:37:29 +0100 Subject: [SciPy-dev] Status of weave In-Reply-To: References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: 2008/1/15, C. Ball : > > > Now I'm sure it is not supported. To make it work means writing the > > converter that will support 'O', but I don't know how to do that > > (for the moment). > > You talk about it almost as if you are going to be learning about it > sometime soon... Not soon, but perhaps sometime. I have no idea about how the converters work, or how much work it > would be to add a conversion for 'O'. Can anyone comment on that (how > much work it would be), or on any related plans he/she might have for > developing weave? > > > Thanks for the quick responses! 
> > Chris
> > _______________________________________________
> > Scipy-dev mailing list
> > Scipy-dev at scipy.org
> > http://projects.scipy.org/mailman/listinfo/scipy-dev
>

-- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From millman at berkeley.edu Tue Jan 15 10:59:50 2008
From: millman at berkeley.edu (Jarrod Millman)
Date: Tue, 15 Jan 2008 15:59:50 +0000
Subject: [SciPy-dev] Status of weave
In-Reply-To: References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com>
Message-ID: 

On Jan 15, 2008 3:33 PM, C. Ball wrote:
> I have no idea about how the converters work, or how much work it
> would be to add a conversion for 'O'. Can anyone comment on that (how
> much work it would be), or on any related plans he/she might have for
> developing weave?

As far as I know, no one is answering this question about development plans for weave because no one has plans to further develop weave. The recent work on weave was mostly a result of the SciPy sprint in December. We were basically trying to clean it up and get it passing all its tests. We decided to devote some attention to it because no one is actively developing or maintaining it. If you are interested in working on weave, that would be a great service to the community. If not, I am afraid you will unfortunately need to convince someone to work on this for you or go without this feature for now. Eventually someone may take an interest in further developing it.

Thanks,

-- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/
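For anyone landing on this thread later, the pattern Chris is after boils down to something like the following sketch. The variable names are invented, the plain-array call follows ordinary weave usage, and the final call is exactly the one that currently fails for lack of a dtype='O' converter (per Chris's report; the precise error text is in his linked post):

import numpy
from scipy import weave

# an object array whose elements are themselves arrays
cells = numpy.empty(3, dtype=object)
for i in range(3):
    cells[i] = numpy.zeros(i + 1)

# a plain float array converts fine:
data = numpy.zeros(5)
weave.inline("data[0] = 1.0;", ['data'])

# ...but passing the object array fails at conversion time,
# because no converter is registered for dtype='O':
weave.inline("return_val = 1;", ['cells'])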
From nwagner at iam.uni-stuttgart.de Tue Jan 15 11:26:00 2008
From: nwagner at iam.uni-stuttgart.de (Nils Wagner)
Date: Tue, 15 Jan 2008 17:26:00 +0100
Subject: [SciPy-dev] sandbox multigrid
Message-ID: 

Hi Nathan,

This is to let you know that your sandbox package multigrid gives an ImportError. I am using

>>> scipy.__version__
'0.7.0.dev3837'

======================================================================
ERROR: Failure: ImportError (cannot import name augment_candidates)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python2.4/site-packages/nose/loader.py", line 363, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/usr/lib/python2.4/site-packages/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/usr/lib/python2.4/site-packages/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/sandbox/multigrid/tests/test_adaptive.py", line 8, in ?
    from scipy.sandbox.multigrid.adaptive import augment_candidates
ImportError: cannot import name augment_candidates

Cheers,
Nils

From fperez.net at gmail.com Tue Jan 15 13:16:10 2008
From: fperez.net at gmail.com (Fernando Perez)
Date: Tue, 15 Jan 2008 11:16:10 -0700
Subject: [SciPy-dev] sparse solvers in scipy rock
In-Reply-To: <85b5c3130801150520r18c2edceo2e7e54fe4d6c40cd@mail.gmail.com>
References: <85b5c3130801150520r18c2edceo2e7e54fe4d6c40cd@mail.gmail.com>
Message-ID: 

On Jan 15, 2008 6:20 AM, Ondrej Certik wrote:
> Hi,
>
> yesterday I wrote a solver for the 1D time-dependent Schrodinger
> equation for school, using explicit and implicit methods and scipy's
> sparse solvers, because I was too lazy to use some fortran routines for
> inverting the tridiagonal matrix, and I was surprised how nicely it
> works in scipy. You did a great job. I am convinced that this is the way
> to call all sparse solvers.
>
> Here is the code:
>
> http://hg.certik.cz/schrod1D/
>
> I based it on the example:
>
> http://www.scipy.org/Cookbook/SchrodingerFDTD
>
> which uses an explicit central-differences method, but I wrote it my way,
> using complex numbers directly and Cython to speed it up.
>
> I also implemented an explicit Euler method, which totally blows up after
> a few iterations, and an implicit method, which works really well.
>
> Where do you think I could put it, together with some documentation on
> how to play with it? On the above wiki page, or should I create a new
> wiki page for that? It's an example of how to use Cython+Numpy+scipy
> sparse solvers (superlu currently).

I'd suggest making another Cookbook entry for it, and linking both Schrodinger ones to each other, so that an interested party can read both the simpler, pure python FDTD approach and your more sophisticated one as well. The better the cookbook gets, the easier it will be for newcomers to find examples that are similar to their needs, from which they can learn and get started.

Thanks!

Cheers,

f

From anand.prabhakar.patil at gmail.com Tue Jan 15 13:51:11 2008
From: anand.prabhakar.patil at gmail.com (Anand Patil)
Date: Tue, 15 Jan 2008 10:51:11 -0800
Subject: [SciPy-dev] Sparse BLAS in scipy
In-Reply-To: <2bc7a5a50711281901i6fbfd265yd9ee253b6e521469@mail.gmail.com>
References: <2bc7a5a50711281755s670db34br466d899609af53bc@mail.gmail.com> <2bc7a5a50711281901i6fbfd265yd9ee253b6e521469@mail.gmail.com>
Message-ID: <2bc7a5a50801151051p350c1e1bl8abad96297baa2b5@mail.gmail.com>

Hi all,

Sorry it took me so long to get to this; I am not even sure it's needed any longer, but I think the attached satisfies my earlier need for a sparse triangular solver.

Here's my thinking: Gaussian elimination is trivial for sparse upper triangular matrices, so I just wrote a wrapper for scipy.linsolve.splu that makes sure splu gets an upper triangular matrix by transposing the input matrix if necessary. Does this make sense?
Cheers,
Anand

--Begin--
# Author: Anand Patil, 2007
# License: SciPy-compatible

import scipy.sparse as sp
import scipy.linsolve as ls

def reverse_trans(trans):
    if trans=='N':
        return 'T'
    else:
        return 'N'

class sparse_trisolver(object):
    """__call__(b, trans='N' or 'T')"""

    def __init__(self, chol_fac, uplo='U'):
        self.uplo = uplo
        if uplo=='U':
            self.splu = ls.splu(chol_fac)
        else:
            self.splu = ls.splu(chol_fac.T)

    def __call__(self, b, trans='N'):
        if self.uplo=='L':
            trans = reverse_trans(trans)
        if len(b.shape)==1:
            out = self.splu.solve(b, trans)
        else:
            out = b.copy()
            for i in xrange(b.shape[1]):
                out[:,i] = self.splu.solve(b[:,i], trans)
        return out

# Test
if __name__=='__main__':
    from numpy import asmatrix
    from numpy.random import normal
    from numpy.linalg import cholesky, solve

    A_dense = asmatrix(normal(size=(5,5)))
    A_dense = cholesky(A_dense.T*A_dense).T

    A_csc = sp.csc_matrix(A_dense)
    A_csr = sp.csr_matrix(A_dense)

    U_csc = sparse_trisolver(A_csc)
    U_csr = sparse_trisolver(A_csr)
    L_csr = sparse_trisolver(A_csc.T, 'L')
    L_csc = sparse_trisolver(A_csr.T, 'L')

    B = normal(size=(5,30))
    U_real = solve(A_dense, B)
    L_real = solve(A_dense.T, B)

    U_csc_sol = U_csc(B)
    U_csr_sol = U_csr(B)
    L_csc_sol = L_csc(B)
    L_csr_sol = L_csr(B)

    for upper in U_csc_sol, U_csr_sol:
        print abs(upper - U_real).max()
    for lower in L_csc_sol, L_csr_sol:
        print abs(lower - L_real).max()
--End--

On Nov 28, 2007 7:01 PM, Anand Patil wrote:
> On Nov 28, 2007 5:55 PM, Anand Patil wrote:
> >
> > OK, if you've done the research already and come to that conclusion I
> should just do it! Regardless of the backend (LU, gauss-seidel or from
> scratch) here's what I'm proposing for the Python interface:
> >
> > B=trimult(A, B, uplo='U', side='L', inplace=True)
> > B=trisolve(A, B, uplo, side, inplace)
> >
> I just sat down to write the 'trimult' routines and realized there's no
> need, the sparse general matrix-matrix multiplication saves the work anyway!
> Sorry for being slow. I'll just work on the 'trisolve' one.
>

From ollinger at wisc.edu Tue Jan 15 14:00:25 2008
From: ollinger at wisc.edu (John Ollinger)
Date: Tue, 15 Jan 2008 13:00:25 -0600
Subject: [SciPy-dev] Mac leopard and scipy sparse status
Message-ID: <5b9ba9310801151100j15bcadc9ld28c016d68b63d0a@mail.gmail.com>

I have been porting my numpy/scipy apps to my new imac for the last week or so, and have been getting seg faults in the sparse matrix package for the available binary build. I rebuilt lapack, atlas, umfpack and numpy from sources without problems, but the scipy build crashes while wrapping umfpack. I recall seeing some traffic about modifications to the sparse matrix package, and am wondering if I should get the svn version of this package before I start working on the problem. It is probably just a change in the swig include file, but I use this package a lot and would like to start with a more recent version if there have been significant changes.

John

p.s. I would be happy to package up the release and contribute it if anyone is interested. Leopard ships gcc 4.0.2, which if I recall correctly, performed pretty badly on the atlas benchmarks. I installed 4.2.2 and am using that for the build, so it might be a bit faster.

-- John Ollinger University of Wisconsin Waisman Center, T233 1500 Highland Ave Madison, WI 53711 http://brainimaging.waisman.wisc.edu/~jjo/
-------------- next part -------------- An HTML attachment was scrubbed...
URL:

From wnbell at gmail.com Wed Jan 16 05:57:36 2008
From: wnbell at gmail.com (Nathan Bell)
Date: Wed, 16 Jan 2008 04:57:36 -0600
Subject: [SciPy-dev] Mac leopard and scipy sparse status
In-Reply-To: <5b9ba9310801151100j15bcadc9ld28c016d68b63d0a@mail.gmail.com>
References: <5b9ba9310801151100j15bcadc9ld28c016d68b63d0a@mail.gmail.com>
Message-ID: 

On Jan 15, 2008 1:00 PM, John Ollinger wrote:
>
> I have been porting my numpy/scipy apps to my new imac for the last week or
> so, and have been getting seg faults in the sparse matrix package for the
> available binary build. I rebuilt lapack, atlas, umfpack and numpy from
> sources without problems, but the scipy build crashes while wrapping
> umfpack. I recall seeing some traffic about modifications to the sparse
> matrix package, and am wondering if I should get the svn version of this
> package before I start working on the problem. It is probably just a
> change in the swig include file, but I use this package a lot and would
> like to start with a more recent version if there have been significant
> changes.

When you say "seg faults in the sparse matrix package" do you mean problems with UMFPACK/SuperLU or general sparse problems?

Also, what error do you get when wrapping UMFPACK? Do you have a recent version of SWIG installed?

-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/

From wnbell at gmail.com Wed Jan 16 06:32:25 2008
From: wnbell at gmail.com (Nathan Bell)
Date: Wed, 16 Jan 2008 05:32:25 -0600
Subject: [SciPy-dev] Sparse BLAS in scipy
In-Reply-To: <2bc7a5a50801151051p350c1e1bl8abad96297baa2b5@mail.gmail.com>
References: <2bc7a5a50711281755s670db34br466d899609af53bc@mail.gmail.com> <2bc7a5a50711281901i6fbfd265yd9ee253b6e521469@mail.gmail.com> <2bc7a5a50801151051p350c1e1bl8abad96297baa2b5@mail.gmail.com>
Message-ID: 

On Jan 15, 2008 12:51 PM, Anand Patil wrote:
> Here's my thinking: Gaussian elimination is trivial for sparse upper
> triangular matrices, so I just wrote a wrapper for scipy.linsolve.splu
> that makes sure splu gets an upper triangular matrix by transposing
> the input matrix if necessary. Does this make sense?

When you had originally mentioned sparse triangular solves, I thought you wanted something more lightweight.

Are you sure that spsolve doesn't already do what you want? I would imagine that a reasonably smart LU factorization method would do little work when given an upper *or* lower triangular system.

-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/
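One cheap way to probe Nathan's conjecture empirically is a sketch along these lines (the bidiagonal matrix is an arbitrary stand-in, and the timing is deliberately crude):

import time
import numpy
import scipy.sparse as sp
from scipy.linsolve import spsolve

n = 2000
A = sp.lil_matrix((n, n))
for i in xrange(n):          # a lower-triangular (bidiagonal) system
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
A = A.tocsc()
b = numpy.ones(n)

t0 = time.time()
x = spsolve(A, b)            # does SuperLU notice there is nothing to factor?
print 'solve took %.4f s, residual %g' % (time.time() - t0, abs(A*x - b).max())

If the time scales like a plain triangular back-substitution, the wrapper above is already close to optimal; if not, a dedicated trisolve would still earn its keep.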
From matthew.brett at gmail.com Wed Jan 16 08:04:32 2008
From: matthew.brett at gmail.com (Matthew Brett)
Date: Wed, 16 Jan 2008 13:04:32 +0000
Subject: [SciPy-dev] Nose tests - merged
In-Reply-To: <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com>
References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com>
Message-ID: <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com>

Hi,

> > For me, the way it works now seems to cover the need for something
> > obviously benchmarking without adding an extra layer on top of the
> > testing and code layout. But, we could also go the other way. I'd be
> > happy to hear group thoughts.
>
> Just to say - that moving benchmark tests to a benchmark directory is
> fine

Reading further and testing, I find that I was wrong about this. Nose, by default, will discover tests within package directories, or directories matching the usual nose test file regexp, and not otherwise. 'benchmarks' does not match this regexp. So, allowing tests in benchmark directories will involve some further non-default nose tweaking (ah, that name). Personally, I would rather keep our use of nose as close to default as possible, and therefore do benchmarks in the test directory, as before, with

@dec.bench

decorators to identify them. Do y'all agree?

Matthew

From wnbell at gmail.com Wed Jan 16 08:41:42 2008
From: wnbell at gmail.com (Nathan Bell)
Date: Wed, 16 Jan 2008 07:41:42 -0600
Subject: [SciPy-dev] Nose tests - merged
In-Reply-To: <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com>
References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com>
Message-ID: 

On Jan 16, 2008 7:04 AM, Matthew Brett wrote:
> nose tweaking (ah, that name). Personally, I would rather keep our use
> of nose as close to default as possible, and therefore do benchmarks
> in the test directory, as before, with
>
> @dec.bench
>
> decorators to identify them. Do y'all agree?

Sounds good to me.

-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/

From ollinger at wisc.edu Wed Jan 16 09:29:12 2008
From: ollinger at wisc.edu (John Ollinger)
Date: Wed, 16 Jan 2008 08:29:12 -0600
Subject: [SciPy-dev] Mac leopard and scipy sparse status
In-Reply-To: References: <5b9ba9310801151100j15bcadc9ld28c016d68b63d0a@mail.gmail.com>
Message-ID: <5b9ba9310801160629x31b31d71xd2fc398dc6ddb8e1@mail.gmail.com>

I was premature when I concluded that the problem was in the sparse matrix package. At this point I don't know where it is occurring. Thanks for the response, and I apologize for the unnecessary post.

John

On Jan 16, 2008 4:57 AM, Nathan Bell wrote:
> On Jan 15, 2008 1:00 PM, John Ollinger wrote:
> >
> > I have been porting my numpy/scipy apps to my new imac for the last week or
> > so, and have been getting seg faults in the sparse matrix package for the
> > available binary build. I rebuilt lapack, atlas, umfpack and numpy from
> > sources without problems, but the scipy build crashes while wrapping
> > umfpack. I recall seeing some traffic about modifications to the sparse
> > matrix package, and am wondering if I should get the svn version of this
> > package before I start working on the problem. It is probably just a
> > change in the swig include file, but I use this package a lot and would
> > like to start with a more recent version if there have been significant
> > changes.
>
> When you say "seg faults in the sparse matrix package" do you mean
> problems with UMFPACK/SuperLU or general sparse problems?
>
> Also, what error do you get when wrapping UMFPACK? Do you have a
> recent version of SWIG installed?
>
> -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-dev

-- John Ollinger University of Wisconsin Waisman Center, T233 1500 Highland Ave Madison, WI 53711 http://brainimaging.waisman.wisc.edu/~jjo/
-------------- next part -------------- An HTML attachment was scrubbed...
URL: From wnbell at gmail.com Wed Jan 16 10:03:38 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 16 Jan 2008 09:03:38 -0600 Subject: [SciPy-dev] Mac leopard and scipy sparse status In-Reply-To: <5b9ba9310801160629x31b31d71xd2fc398dc6ddb8e1@mail.gmail.com> References: <5b9ba9310801151100j15bcadc9ld28c016d68b63d0a@mail.gmail.com> <5b9ba9310801160629x31b31d71xd2fc398dc6ddb8e1@mail.gmail.com> Message-ID: On Jan 16, 2008 8:29 AM, John Ollinger wrote: > > I was premature when I concluded that the problem was in the sparse matrix > package. At this point I don't know where it is occurring. Thanks for the > response and I apolgize for the unnecessary post. Not a problem. We've had a handful of reports in the past regarding difficulties with older GCC versions and OSX, so it's possible that some platform-specific issues exist. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From peter.skomoroch at gmail.com Wed Jan 16 10:22:51 2008 From: peter.skomoroch at gmail.com (Peter Skomoroch) Date: Wed, 16 Jan 2008 10:22:51 -0500 Subject: [SciPy-dev] Fedora Core 6 x86_64? Message-ID: I'm having some trouble tracking down the best recipe for building an optimized numpy on FC6 x86_64 with BLAS/LAPACK. David's repository seems empty right now: http://software.opensuse.org/download/home:/ashigabou/ -Pete -- Peter N. Skomoroch peter.skomoroch at gmail.com http://www.datawrangling.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From millman at berkeley.edu Wed Jan 16 11:52:03 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 16 Jan 2008 08:52:03 -0800 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com> References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com> Message-ID: On Jan 16, 2008 5:04 AM, Matthew Brett wrote: > Reading further and testing, I find that I was wrong about this. > Nose, by default, will discover tests within package directories, or > directories matching the usual nose test file regexp, and not > otherwise. 'benchmarks' does not match this regexp. So, allowing > tests in benchmark directories will involve some further non-default > nose tweaking (ah that name). Personally, I would rather keep our use > of nose as close to default as possible, and therefore do benchmarks > in the test directory, as before, with > > @dec.bench > > decorators to identify them. Do y'all agree? +1 -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From fperez.net at gmail.com Wed Jan 16 12:02:24 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Wed, 16 Jan 2008 10:02:24 -0700 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com> Message-ID: On Jan 16, 2008 9:52 AM, Jarrod Millman wrote: > On Jan 16, 2008 5:04 AM, Matthew Brett wrote: > > Reading further and testing, I find that I was wrong about this. 
> > Nose, by default, will discover tests within package directories, or > > directories matching the usual nose test file regexp, and not > > otherwise. 'benchmarks' does not match this regexp. So, allowing > > tests in benchmark directories will involve some further non-default > > nose tweaking (ah that name). Personally, I would rather keep our use > > of nose as close to default as possible, and therefore do benchmarks > > in the test directory, as before, with > > > > @dec.bench > > > > decorators to identify them. Do y'all agree? > > +1 As we discussed the other day, we could also have a convention for benchmark-only files, *still in the test directory*, to be named something like bench_test_*.py This way nose still picks them up, yet to the human reader it's clear that those are benchmarks. Cheers, f From matthew.brett at gmail.com Wed Jan 16 13:17:54 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 16 Jan 2008 18:17:54 +0000 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com> Message-ID: <1e2af89e0801161017i13997a4ocbd65efabbb2993d@mail.gmail.com> Actually, it would not be too hard to have tests in a benchmark directory, with names beginning with 'bench', no decorators. It means that the tests have to be renamed from test_ to bench_, and that the tests would not, by default, be found by the vanilla 'nosetests /my/directory/somewhere', but I guess that's OK (backtracking on previous opinion). Better? Matthew From robert.kern at gmail.com Wed Jan 16 14:07:22 2008 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 16 Jan 2008 13:07:22 -0600 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: <1e2af89e0801161017i13997a4ocbd65efabbb2993d@mail.gmail.com> References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com> <1e2af89e0801161017i13997a4ocbd65efabbb2993d@mail.gmail.com> Message-ID: <478E55EA.1060204@gmail.com> Matthew Brett wrote: > Actually, it would not be too hard to have tests in a benchmark > directory, with names beginning with 'bench', no decorators. It means > that the tests have to be renamed from test_ to bench_, and that the > tests would not, by default, be found by the vanilla 'nosetests > /my/directory/somewhere', but I guess that's OK (backtracking on > previous opinion). Better? I would like the benchmarks to be *not* found unless specifically asked for, even when just using nosetests instead of scipy.test(). Our use of nose to collect and run these benchmarks is convenient (and a good idea, IMO), but it is not the purpose of the tool or the framework to run benchmarks. I think that explicit action should be required to convince nose to run the benchmarks. We/I can rig up a simple script (nosebench?) that will configure nose with the appropriate regexes to find the benchmarks. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From nwagner at iam.uni-stuttgart.de Wed Jan 16 14:50:04 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 16 Jan 2008 20:50:04 +0100 Subject: [SciPy-dev] SyntaxError in scikits/learn arffread.py Message-ID: Hi all, I have installed the scikits package learn from scratch. Compiling /usr/lib/python2.4/site-packages/scikits/learn/utils/arffread.py ... File "/usr/lib/python2.4/site-packages/scikits/learn/utils/arffread.py", line 401 finally: ^ SyntaxError: invalid syntax Cheers, Nils From nwagner at iam.uni-stuttgart.de Wed Jan 16 15:00:13 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 16 Jan 2008 21:00:13 +0100 Subject: [SciPy-dev] Trouble with scikits/ann Message-ID: Hi all, I had some trouble with the installation of the scikits package ann. Traceback (most recent call last): File "setup.py", line 110, in ? test_suite = 'scikits.ann.tests.test_ann', File "/usr/lib/python2.4/site-packages/numpy/distutils/core.py", line 147, in setup config = configuration() File "setup.py", line 90, in configuration config.add_subpackage(DISTNAME) File "/usr/lib/python2.4/site-packages/numpy/distutils/misc_util.py", line 834, in add_subpackage caller_level = 2) File "/usr/lib/python2.4/site-packages/numpy/distutils/misc_util.py", line 817, in get_subpackage caller_level = caller_level + 1) File "/usr/lib/python2.4/site-packages/numpy/distutils/misc_util.py", line 749, in _get_configuration_from_setup_py ('.py', 'U', 1)) File "scikits/ann/setup.py", line 38, in ? ANN_ROOT = os.path.join(os.path.sep,*(__file__.split(os.path.sep)[-6])) IndexError: list index out of range How can I resolve the problem ? Cheers, Nils From nwagner at iam.uni-stuttgart.de Wed Jan 16 15:09:35 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 16 Jan 2008 21:09:35 +0100 Subject: [SciPy-dev] KeyError in scikits/audiolab Message-ID: Hi all, The installation of scikits/audiolab also fails. Am I missing something ? /usr/bin/python setup.py install sndfile_info: libraries sndfile not found in /usr/local/lib FOUND: libraries = ['sndfile'] library_dirs = ['/usr/lib'] fulllibloc = /usr/lib/libsndfile.so.1 fullheadloc = /usr/include/sndfile.h include_dirs = ['/usr/include'] Traceback (most recent call last): File "setup.py", line 214, in ? classifiers = File "/usr/lib/python2.4/site-packages/numpy/distutils/core.py", line 147, in setup config = configuration() File "setup.py", line 158, in configuration from scikits.audiolab.info import VERSION as audiolab_version File "/home/nwagner/svn/audiolab/scikits/audiolab/__init__.py", line 24, in ? from pysndfile import formatinfo, sndfile File "/home/nwagner/svn/audiolab/scikits/audiolab/pysndfile.py", line 216, in ? py_to_snd_file_format_dic = { KeyError: 'SF_FORMAT_FLAC' Cheers, Nils From matthieu.brucher at gmail.com Wed Jan 16 15:53:43 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 16 Jan 2008 21:53:43 +0100 Subject: [SciPy-dev] SyntaxError in scikits/learn arffread.py In-Reply-To: References: Message-ID: Hi, What Python version ? Matthieu 2008/1/16, Nils Wagner : > > Hi all, > > I have installed the scikits package learn from scratch. > > Compiling > /usr/lib/python2.4/site-packages/scikits/learn/utils/arffread.py > ... 
> File > "/usr/lib/python2.4/site-packages/scikits/learn/utils/arffread.py", > line 401 > finally: > ^ > SyntaxError: invalid syntax > > Cheers, > Nils > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Wed Jan 16 15:59:21 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 16 Jan 2008 20:59:21 +0000 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: <478E55EA.1060204@gmail.com> References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com> <1e2af89e0801161017i13997a4ocbd65efabbb2993d@mail.gmail.com> <478E55EA.1060204@gmail.com> Message-ID: <1e2af89e0801161259u679a5b4u2a5beea0f861a935@mail.gmail.com> > I would like the benchmarks to be *not* found unless specifically asked for, > even when just using nosetests instead of scipy.test(). Our use of nose to > collect and run these benchmarks is convenient (and a good idea, IMO), but it is > not the purpose of the tool or the framework to run benchmarks. I think that > explicit action should be required to convince nose to run the benchmarks. > > We/I can rig up a simple script (nosebench?) that will configure nose with the > appropriate regexes to find the benchmarks. Ok - I think that's a vote for the separate benchmarks directory and tests named bench_something within it, run with module.bench() or nosetests --match '(?:^|[\b_\./-])[Bb]ench' or equivalent. Matthew Matthew From nwagner at iam.uni-stuttgart.de Wed Jan 16 17:16:36 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 16 Jan 2008 23:16:36 +0100 Subject: [SciPy-dev] SyntaxError in scikits/learn arffread.py In-Reply-To: References: Message-ID: On Wed, 16 Jan 2008 21:53:43 +0100 "Matthieu Brucher" wrote: > Hi, > > What Python version ? python2.4 From anand.prabhakar.patil at gmail.com Wed Jan 16 18:11:03 2008 From: anand.prabhakar.patil at gmail.com (Anand Patil) Date: Wed, 16 Jan 2008 15:11:03 -0800 Subject: [SciPy-dev] Sparse BLAS in scipy In-Reply-To: References: <2bc7a5a50711281755s670db34br466d899609af53bc@mail.gmail.com> <2bc7a5a50711281901i6fbfd265yd9ee253b6e521469@mail.gmail.com> <2bc7a5a50801151051p350c1e1bl8abad96297baa2b5@mail.gmail.com> Message-ID: <2bc7a5a50801161511p491002cfi593a230f6fb1c2cb@mail.gmail.com> > When you had originally mentioned sparse triangular solves, I thought > you wanted something more lightweight. Ideally, I did... but this solution is trivial to write & maintain, and I figured it would waste no floating-point ops and only relatively few memory and logic ops. > Are you sure that spsolve doesn't already do what you want? I would > imagine that a reasonably smart LU factorization method would do > little work when given an upper *or* lower triangular system. I emailed the maintainer of SuperLU just now, and she said she doubted it would do anything when asked to factorize an upper triangular system, but didn't comment on a lower triangular system. 
Is there any way to get the lu factors out of a factorized_lu object and check? Anand From matthieu.brucher at gmail.com Thu Jan 17 02:06:24 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 17 Jan 2008 08:06:24 +0100 Subject: [SciPy-dev] SyntaxError in scikits/learn arffread.py In-Reply-To: References: Message-ID: This may then be related to: http://docs.python.org/whatsnew/pep-341.html Matthieu 2008/1/16, Nils Wagner <nwagner at iam.uni-stuttgart.de>: > > On Wed, 16 Jan 2008 21:53:43 +0100 > "Matthieu Brucher" wrote: > > Hi, > > > > What Python version ? > > python2.4 > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Thu Jan 17 08:55:51 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 17 Jan 2008 13:55:51 +0000 Subject: [SciPy-dev] Nose tests - merged In-Reply-To: <1e2af89e0801161259u679a5b4u2a5beea0f861a935@mail.gmail.com> References: <1e2af89e0801120210h6be3e4d4rdb4c9d96143d7591@mail.gmail.com> <1e2af89e0801121018h76bdbd71od4ef89c1b4006bc@mail.gmail.com> <1e2af89e0801121201h677c2f2cl28d464d9bb591216@mail.gmail.com> <1e2af89e0801160504r252450bdvfa65def2544eab99@mail.gmail.com> <1e2af89e0801161017i13997a4ocbd65efabbb2993d@mail.gmail.com> <478E55EA.1060204@gmail.com> <1e2af89e0801161259u679a5b4u2a5beea0f861a935@mail.gmail.com> Message-ID: <1e2af89e0801170555m6dd209a1hb9eaa5e7be5e5ca9@mail.gmail.com> > Ok - I think that's a vote for the separate benchmarks directory and > tests named bench_something within it, run with module.bench() or > > nosetests --match '(?:^|[\b_\./-])[Bb]ench' > > or equivalent. I'll go ahead and implement this unless I hear otherwise from the team... Matthew From ceball at users.sourceforge.net Thu Jan 17 11:09:54 2008 From: ceball at users.sourceforge.net (C. Ball) Date: Thu, 17 Jan 2008 16:09:54 +0000 (UTC) Subject: [SciPy-dev] Status of weave References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: Jarrod Millman <millman at berkeley.edu> writes: [...] > As far as I know, no one is answering this question about development > plans for weave because no one has plans to further develop weave. Thanks for the information. Weave is important for our software [1], so we'd like to try to work on this problem. Before we start, any example of an existing converter of this type - or some guidance about where to start in the weave code - would be very helpful. Presumably the required additions will be quite small: All we need to do is allow conversion of dtype='O' (as indicated by the example and error given earlier [2]), since looping through a list of objects, for instance, is already no problem [3]. Thanks again, Chris [1] Topographica (topographica.org) - a package for computational modeling of neural maps [2] http://article.gmane.org/gmane.comp.python.scientific.user/14642 [3]

import weave

def test_list_loop(alist):
    """
    Loop through alist (assumed to be 2D), setting 'through_loop'
    to be True on every item.
    """
    rows,cols = len(alist),len(alist[0])
    # py_alist is the raw PyObject* that weave exposes for alist
    code = """
           for (int r=0; r<rows; r++) {
               PyObject* row = PyList_GetItem(py_alist, r);
               for (int c=0; c<cols; c++) {
                   PyObject* item = PyList_GetItem(row, c);
                   PyObject_SetAttrString(item, "through_loop", Py_True);
               }
           }
           """
    weave.inline(code, ['alist', 'rows', 'cols'])

From oliphant at enthought.com Thu Jan 17 11:18:35 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Thu, 17 Jan 2008 10:18:35 -0600 Subject: [SciPy-dev] Status of weave In-Reply-To: References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: <478F7FDB.9020304@enthought.com> Jarrod Millman wrote: > On Jan 15, 2008 3:33 PM, C.
Ball wrote: > >> I have no idea about how the converters work, or how much work it >> would be to add a conversion for 'O'. Can anyone comment on that (how >> much work it would be), or on any related plans he/she might have for >> developing weave? >> > > As far as I know, no one is answering this question about development > plans for weave because no one has plans to further develop weave. > This is not quite accurate. Eric has spent time recently, in fact on updating and fixing the weave type converters. I don't know of plans to support 'O' types, but I don't believe it would be difficult with what I know about it. There has also been talk of merging f2py and weave together into a larger compiled-code library and/or subsuming weave into f2py (depending on who does what work...) But, absolutely, you have to either convince someone to work on it, or dig in and do it yourself. -Travis O. From ceball at users.sourceforge.net Thu Jan 17 11:19:40 2008 From: ceball at users.sourceforge.net (C. Ball) Date: Thu, 17 Jan 2008 16:19:40 +0000 (UTC) Subject: [SciPy-dev] Status of weave References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: writes: > test_function(list_of_objects) Sorry, the last line of my previous email should have been: test_list_loop(list_of_object) From oliphant at enthought.com Thu Jan 17 11:20:28 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Thu, 17 Jan 2008 10:20:28 -0600 Subject: [SciPy-dev] Status of weave In-Reply-To: References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> Message-ID: <478F804C.6080304@enthought.com> C. Ball wrote: > Jarrod Millman berkeley.edu> writes: > [...] > >> As far as I know, no one is answering this question about development >> plans for weave because no one has plans to further develop weave. >> > > Thanks for the information. > > Weave is important for our software [1], so we'd like to try to work > on this problem. > Weave is also important to our software, and so it is not disappearing you may be assured. > Before we start, any example of an existing converter of this type - > or some guidance about where to start in the weave code - would be > very helpful. > > Presumably the required additions will be quite small: All we need to > do is allow conversion of dtype='O' (as indicated by the example and > error given earlier [2]), since looping through a list of objects, for > instance, is already no problem [3]. > I suspect it would not be difficult to add and kind of surprised it is not already there. The problem might be handling reference counting, correctly, but I'm not really sure. -Travis O. From pearu at cens.ioc.ee Thu Jan 17 11:42:31 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Thu, 17 Jan 2008 18:42:31 +0200 (EET) Subject: [SciPy-dev] Status of weave In-Reply-To: <478F7FDB.9020304@enthought.com> References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> <478F7FDB.9020304@enthought.com> Message-ID: <52643.85.166.31.187.1200588151.squirrel@cens.ioc.ee> On Thu, January 17, 2008 6:18 pm, Travis E. Oliphant wrote: > > This is not quite accurate. Eric has spent time recently, in fact on > updating and fixing the weave type converters. I don't know of plans > to support 'O' types, but I don't believe it would be difficult with > what I know about it. > > There has also been talk of merging f2py and weave together into a > larger compiled-code library and/or subsuming weave into f2py (depending > on who does what work...) 
Anyone who is planning to take this work, please let me know as this task is in the top part of my todo list. We could create a google project, for instance, that would aim at merging f2py/weave/extgen and other extension generator tools to use a unified extension generator library. I have done lots of work in this direction already. At the moment I am involved in developing sympycore but the above would be my next project when sympycore takes its shape. Regards, Pearu From ellisonbg.net at gmail.com Thu Jan 17 12:19:01 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 17 Jan 2008 10:19:01 -0700 Subject: [SciPy-dev] Scipy website and svn down? Message-ID: <6ce0ac130801170919p5095f081l8961c0370c017f46@mail.gmail.com> Hi, I look like all of the scipy/numpy stuff is down. I can't reach the website/repo for numpy/scipy/ipython/mpi4py/etc. Brian From fperez.net at gmail.com Thu Jan 17 12:30:11 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 17 Jan 2008 10:30:11 -0700 Subject: [SciPy-dev] Scipy website and svn down? In-Reply-To: <6ce0ac130801170919p5095f081l8961c0370c017f46@mail.gmail.com> References: <6ce0ac130801170919p5095f081l8961c0370c017f46@mail.gmail.com> Message-ID: On Jan 17, 2008 10:19 AM, Brian Granger wrote: > Hi, > > I look like all of the scipy/numpy stuff is down. I can't reach the > website/repo for numpy/scipy/ipython/mpi4py/etc. Yup, neither can I. I can ssh in, but that's it. I don't seem to see any apache processes running, perhaps they're restarting something? The machine looks fine otherwise (low load, responsive, etc). Cheers, f From jre at enthought.com Thu Jan 17 12:42:11 2008 From: jre at enthought.com (J. Ryan Earl) Date: Thu, 17 Jan 2008 11:42:11 -0600 Subject: [SciPy-dev] Scipy website and svn down? In-Reply-To: References: <6ce0ac130801170919p5095f081l8961c0370c017f46@mail.gmail.com> Message-ID: <478F9373.40106@enthought.com> The webserver didn't come up after the bi-daily restart at 10AM. I'm not sure why; I'm still investigating. The primary error when starting is:

[Thu Jan 17 10:48:23 2008] [error] (28)No space left on device: Cannot create SSLMutex
Configuration Failed

Yet there is definitely space:

Filesystem            Size  Used Avail Use% Mounted on
/dev/md1              108G   71G   33G  69% /
/dev/md0              122M   96M   21M  83% /boot
none                 1014M     0 1014M   0% /dev/shm

And inodes:

[root at scipy logs]# df -i
Filesystem             Inodes   IUsed     IFree IUse% Mounted on
/dev/md1             14385152  816449  13568703    6% /
/dev/md0                32128     128     32000    1% /boot
none                   223879       1    223878    1% /dev/shm

I'm working the issue. -ryan Fernando Perez wrote: > On Jan 17, 2008 10:19 AM, Brian Granger wrote: > >> Hi, >> >> I look like all of the scipy/numpy stuff is down. I can't reach the >> website/repo for numpy/scipy/ipython/mpi4py/etc. >> > > Yup, neither can I. I can ssh in, but that's it. I don't seem to see > any apache processes running, perhaps they're restarting something? > The machine looks fine otherwise (low load, responsive, etc). > > Cheers, > > f > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From jre at enthought.com Thu Jan 17 12:49:06 2008 From: jre at enthought.com (J. Ryan Earl) Date: Thu, 17 Jan 2008 11:49:06 -0600 Subject: [SciPy-dev] Scipy website and svn down? In-Reply-To: <478F9373.40106@enthought.com> References: <6ce0ac130801170919p5095f081l8961c0370c017f46@mail.gmail.com> <478F9373.40106@enthought.com> Message-ID: <478F9512.7070308@enthought.com> It appears to be an IPC issue.
Looks like it can't create a semaphore because they aren't getting cleaned up for some reason. -ryan J. Ryan Earl wrote: > The webserver didn't come up after the bi-daily restart at 10AM. I'm > not sure why I'm still investigating. The primary error when starting is: > > [Thu Jan 17 10:48:23 2008] [error] (28)No space left on device: Cannot create SSLMutex > Configuration Failed > > > Yet there is definitely space: > > Filesystem Size Used Avail Use% Mounted on > /dev/md1 108G 71G 33G 69% / > /dev/md0 122M 96M 21M 83% /boot > none 1014M 0 1014M 0% /dev/shm > > And inodes: > > [root at scipy logs]# df -i > Filesystem Inodes IUsed IFree IUse% Mounted on > /dev/md1 14385152 816449 13568703 6% / > /dev/md0 32128 128 32000 1% /boot > none 223879 1 223878 1% /dev/shm > > > > > I'm working the issue. > -ryan > > Fernando Perez wrote: > >> On Jan 17, 2008 10:19 AM, Brian Granger wrote: >> >> >>> Hi, >>> >>> I look like all of the scipy/numpy stuff is down. I can't reach the >>> website/repo for numpy/scipy/ipython/mpi4py/etc. >>> >>> >> Yup, neither can I. I can ssh in, but that's it. I don't seem to see >> any apache processes running, perhaps they're restarting something? >> The machine looks fine otherwise (low load, responsive, etc). >> >> Cheers, >> >> f >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev >> >> >> > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From jre at enthought.com Thu Jan 17 13:08:01 2008 From: jre at enthought.com (J. Ryan Earl) Date: Thu, 17 Jan 2008 12:08:01 -0600 Subject: [SciPy-dev] Scipy website and svn down? In-Reply-To: <478F9512.7070308@enthought.com> References: <6ce0ac130801170919p5095f081l8961c0370c017f46@mail.gmail.com> <478F9373.40106@enthought.com> <478F9512.7070308@enthought.com> Message-ID: <478F9981.3090004@enthought.com> Increasing the number of system semaphores was the quickest, least intrusive fix. The web-server is running again. -ryan J. Ryan Earl wrote: > It appears to be an IPC issue. Looks like it can't create a semaphore > because they aren't getting cleaned up for some reason. > > -ryan > > J. Ryan Earl wrote: > >> The webserver didn't come up after the bi-daily restart at 10AM. I'm >> not sure why I'm still investigating. The primary error when starting is: >> >> [Thu Jan 17 10:48:23 2008] [error] (28)No space left on device: Cannot create SSLMutex >> Configuration Failed >> >> >> Yet there is definitely space: >> >> Filesystem Size Used Avail Use% Mounted on >> /dev/md1 108G 71G 33G 69% / >> /dev/md0 122M 96M 21M 83% /boot >> none 1014M 0 1014M 0% /dev/shm >> >> And inodes: >> >> [root at scipy logs]# df -i >> Filesystem Inodes IUsed IFree IUse% Mounted on >> /dev/md1 14385152 816449 13568703 6% / >> /dev/md0 32128 128 32000 1% /boot >> none 223879 1 223878 1% /dev/shm >> >> >> >> >> I'm working the issue. >> -ryan >> >> Fernando Perez wrote: >> >> >>> On Jan 17, 2008 10:19 AM, Brian Granger wrote: >>> >>> >>> >>>> Hi, >>>> >>>> I look like all of the scipy/numpy stuff is down. I can't reach the >>>> website/repo for numpy/scipy/ipython/mpi4py/etc. >>>> >>>> >>>> >>> Yup, neither can I. I can ssh in, but that's it. I don't seem to see >>> any apache processes running, perhaps they're restarting something? >>> The machine looks fine otherwise (low load, responsive, etc). 
>>> >>> Cheers, >>> >>> f >>> _______________________________________________ >>> Scipy-dev mailing list >>> Scipy-dev at scipy.org >>> http://projects.scipy.org/mailman/listinfo/scipy-dev >>> >>> >>> >>> >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev >> >> >> > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From david at ar.media.kyoto-u.ac.jp Thu Jan 17 23:49:54 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 18 Jan 2008 13:49:54 +0900 Subject: [SciPy-dev] SyntaxError in scikits/learn arffread.py In-Reply-To: References: Message-ID: <47902FF2.6050505@ar.media.kyoto-u.ac.jp> Nils Wagner wrote: > Hi all, > > I have installed the scikits package learn from scratch. > > Compiling > /usr/lib/python2.4/site-packages/scikits/learn/utils/arffread.py > ... > File > "/usr/lib/python2.4/site-packages/scikits/learn/utils/arffread.py", > line 401 > finally: > ^ > SyntaxError: invalid syntax > The problem is that I used the try/except/finally syntax, which was introduced in python 2.5 only. The learn package is a bit in limbo because I focused my free-time (and more :) ) on numscons, but once this is finished, I intend to go back on scikits.learn (there is a strong guarantee I will work on it in the next few months, if only for the reason that I need it for my PhD). cheers, David From david at ar.media.kyoto-u.ac.jp Fri Jan 18 00:05:45 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 18 Jan 2008 14:05:45 +0900 Subject: [SciPy-dev] scikits: mailing list for tickets, etc... Message-ID: <479033A9.9050501@ar.media.kyoto-u.ac.jp> Hi, Since the deprecation of the sandboxes, I think it is reasonable to expect more people getting scikits. Would it be possible to get at least a ML for the tickets and svn commit ? Ideally, a ML for users/dev would be good, too, cheers, David From fperez.net at gmail.com Fri Jan 18 02:26:43 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 18 Jan 2008 00:26:43 -0700 Subject: [SciPy-dev] scikits: mailing list for tickets, etc... In-Reply-To: <479033A9.9050501@ar.media.kyoto-u.ac.jp> References: <479033A9.9050501@ar.media.kyoto-u.ac.jp> Message-ID: On Jan 17, 2008 10:05 PM, David Cournapeau wrote: > Hi, > > Since the deprecation of the sandboxes, I think it is reasonable to > expect more people getting scikits. Would it be possible to get at least > a ML for the tickets and svn commit ? Ideally, a ML for users/dev would > be good, too, -1 on separate user/dev lists. I think if the traffic for scikits really grows out of control that may become necessary, but at least initially it might be beneficial to keep those discussions together with the rest of scipy. That would provide a mechanism for all to learn about useful toolkits, it would promote better interoperability and flow of ideas/code into the core (when appropriate), etc. Having separate lists seems to me like an unnecessary extra barrier in this case. 
Cheers, f From stefan at sun.ac.za Fri Jan 18 02:51:34 2008 From: stefan at sun.ac.za (Stefan van der Walt) Date: Fri, 18 Jan 2008 09:51:34 +0200 Subject: [SciPy-dev] 32bit vs 64bit doctest differences In-Reply-To: References: Message-ID: <20080118075134.GA11941@mentat.za.net> Hi Nathan On Mon, Jan 14, 2008 at 02:15:10PM -0600, Nathan Bell wrote: > It appears that numpy prints the 'dtype=intXX' part only when the size > of int differs from the machine word size. For example, consider the > following test run on a 64-bit machine: > > File "/usr/lib/python2.5/site-packages/scipy/stsci/convolve/Convolve.py", > line 295, in scipy.stsci.convolve.Convolve.boxcar > Failed example: > boxcar(num.array([10, 0, 0, 0, 0, 0, 1000]), (3,), > mode="wrap").astype(num.longlong) > Expected: > array([336, 3, 0, 0, 0, 333, 336], dtype=int64) > Got: > array([336, 3, 0, 0, 0, 333, 336]) > > > Should we just use dtype=int to avoid this issue? It appears > dtype=int chooses the native size. This situation does make tests more difficult to write. We can't use "int", because then interpreting x.__repr__ won't necessarily give x. Why don't we always print the dtype, whether it is an int32, int64, float32 or float64? Regards St\u00e9fan From matthieu.brucher at gmail.com Fri Jan 18 04:32:17 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Fri, 18 Jan 2008 10:32:17 +0100 Subject: [SciPy-dev] Status of arpack Message-ID: Hi, Did someone take the job of putting it into the trunk in some place, patch it and test it ? I'd like to use it in the near future :) Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From arnar.flatberg at gmail.com Fri Jan 18 07:30:12 2008 From: arnar.flatberg at gmail.com (Arnar Flatberg) Date: Fri, 18 Jan 2008 13:30:12 +0100 Subject: [SciPy-dev] Status of arpack In-Reply-To: References: Message-ID: <5d3194020801180430v492606delcc22a99241085de2@mail.gmail.com> Hi, I've been using the sandbox version for a while, and have experienced no problems with the *patched* version. However, FYI, the high-level interface (arpack.eigen and arpack.eigen_symmetric) still doesn't implement the generalised eigendecomposition, which may be important to you. (I'm guessing that you will use this for the manifold learning scikit, and lots of these algorithms commonly use gen.eig). I don't think this would be terribly hard to implement (arpack.speigs supports it) and would be nice to add prior to inclusion in the trunk. Arnar On Jan 18, 2008 10:32 AM, Matthieu Brucher wrote: > Hi, > > Did someone take the job of putting it into the trunk in some place, patch > it and test it ?
> I'd like to use it in the near future :) > > Matthieu > -- > French PhD student > Website : http://matthieu-brucher.developpez.com/ > Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 > LinkedIn : http://www.linkedin.com/in/matthieubrucher > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From matthieu.brucher at gmail.com Fri Jan 18 07:50:45 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Fri, 18 Jan 2008 13:50:45 +0100 Subject: [SciPy-dev] Status of arpack In-Reply-To: <5d3194020801180430v492606delcc22a99241085de2@mail.gmail.com> References: <5d3194020801180430v492606delcc22a99241085de2@mail.gmail.com> Message-ID: 2008/1/18, Arnar Flatberg : > > Hi, > Ive been using the sandbox version for a while, and have experienced > no problems with the *patched* version. OK, I'll just wait for someone to put it in the correct module. However, FYI, the highlevel > interface (arpack.eigen and arpack.eigen_symmetric) still doesnt > implement the generalised eigendecomposition, which may be important > to you. (Im guessing that you will use this for the manifold learning > scikit, and lots of these algortihms commonly use gen.eig). In fact, it is not needed, it can be rewritten as a classic eigendecomposition problem (at least for Laplacian Eigenmaps). I dont > think this would be terrible hard to implement (arpack.speigs supports > it) and would be nice to add prior to inclusion in the trunk. > Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From mattknox_ca at hotmail.com Fri Jan 18 09:15:58 2008 From: mattknox_ca at hotmail.com (Matt Knox) Date: Fri, 18 Jan 2008 14:15:58 +0000 (UTC) Subject: [SciPy-dev] scikits: mailing list for tickets, etc... References: <479033A9.9050501@ar.media.kyoto-u.ac.jp> Message-ID: > -1 on separate user/dev lists. I think if the traffic for scikits > really grows out of control that may become necessary, but at least > initially it might be beneficial to keep those discussions together > with the rest of scipy. That would provide a mechanism for all to > learn about useful toolkits, it would promote better interoperability > and flow of ideas/code into the core (when appropriate), etc. Having > separate lists seems to me like an unnecessary extra barrier in this > case. I agree completely. Actually, I think the current situation of three lists could probably be trimmed down to two even (one for numpy, one for scipy), but I won't lose any sleep over it one way or the other. The numpy list has both development and "user" topics on it and seems to do just fine. I think scipy could just as easily work with a unified mailing list. People often send "development" related posts to the scipy-user list already anyway. - Matt From david at ar.media.kyoto-u.ac.jp Fri Jan 18 09:09:59 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 18 Jan 2008 23:09:59 +0900 Subject: [SciPy-dev] scikits: mailing list for tickets, etc... In-Reply-To: References: <479033A9.9050501@ar.media.kyoto-u.ac.jp> Message-ID: <4790B337.1040004@ar.media.kyoto-u.ac.jp> Matt Knox wrote: >> -1 on separate user/dev lists. 
I think if the traffic for scikits >> really grows out of control that may become necessary, but at least >> initially it might be beneficial to keep those discussions together >> with the rest of scipy. That would provide a mechanism for all to >> learn about useful toolkits, it would promote better interoperability >> and flow of ideas/code into the core (when appropriate), etc. Having >> separate lists seems to me like an unnecessary extra barrier in this >> case. >> > > I agree completely. Actually, I think the current situation of three lists could > probably be trimmed down to two even (one for numpy, one for scipy), but I won't > lose any sleep over it one way or the other. The numpy list has both development > and "user" topics on it and seems to do just fine. I think scipy could just as > easily work with a unified mailing list. People often send "development" related > posts to the scipy-user list already anyway. > Actually, I agree too, it was stupid to suggest the user/dev ML. I would still prefer having the ML for tickets and commit, though, cheers, David From guyer at nist.gov Fri Jan 18 09:55:18 2008 From: guyer at nist.gov (Jonathan Guyer) Date: Fri, 18 Jan 2008 09:55:18 -0500 Subject: [SciPy-dev] scikits: mailing list for tickets, etc... In-Reply-To: <4790B337.1040004@ar.media.kyoto-u.ac.jp> References: <479033A9.9050501@ar.media.kyoto-u.ac.jp> <4790B337.1040004@ar.media.kyoto-u.ac.jp> Message-ID: <92E335E0-7104-4129-AF1B-B1BAD14D7B56@nist.gov> On Jan 18, 2008, at 9:09 AM, David Cournapeau wrote: > Actually, I agree too, it was stupid to suggest the user/dev ML. I > would > still prefer having the ML for tickets and commit, though, Are you aware that you can get an RSS feed of this info? There are various points you can tap in, but the equivalent of http:// projects.scipy.org/scipy/scipy/timeline is what I use to monitor our FiPy project. Much nicer than the stream of emails that I used to get. From david at ar.media.kyoto-u.ac.jp Sun Jan 20 08:47:43 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sun, 20 Jan 2008 22:47:43 +0900 Subject: [SciPy-dev] scikits: mailing list for tickets, etc... In-Reply-To: <92E335E0-7104-4129-AF1B-B1BAD14D7B56@nist.gov> References: <479033A9.9050501@ar.media.kyoto-u.ac.jp> <4790B337.1040004@ar.media.kyoto-u.ac.jp> <92E335E0-7104-4129-AF1B-B1BAD14D7B56@nist.gov> Message-ID: <479350FF.8070609@ar.media.kyoto-u.ac.jp> Jonathan Guyer wrote: > On Jan 18, 2008, at 9:09 AM, David Cournapeau wrote: > > >> Actually, I agree too, it was stupid to suggest the user/dev ML. I >> would >> still prefer having the ML for tickets and commit, though, >> > > Are you aware that you can get an RSS feed of this info? There are > various points you can tap in, but the equivalent of http:// > projects.scipy.org/scipy/scipy/timeline is what I use to monitor our > FiPy project. Much nicer than the stream of emails that I used to get. > Well, I guess this is a matter of preferences, but I don't like RSS so much (not even 30, and already new-tech averse :) ). cheers, David From ggellner at uoguelph.ca Sun Jan 20 10:36:53 2008 From: ggellner at uoguelph.ca (Gabriel Gellner) Date: Sun, 20 Jan 2008 08:36:53 -0700 Subject: [SciPy-dev] WxPython required? Message-ID: <20080120153653.GA12003@giton> Upon a recent svn update, I get this error when I run scipy.test(): ====================================================================== ERROR: Failure: (Could not locate wxPython base directory.) 
---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.5/site-packages/nose-0.10.0b1-py2.5.egg/nose/loader.py", line 344, in loadTestsFromName addr.filename, addr.module) File "/usr/lib/python2.5/site-packages/nose-0.10.0b1-py2.5.egg/nose/importer.py", line 39, in importFromPath return self.importFromDir(dir_path, fqname) File "/usr/lib/python2.5/site-packages/nose-0.10.0b1-py2.5.egg/nose/importer.py", line 84, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/home/ggellner/lib/python/lib/python2.5/site-packages/scipy/weave/tests/test_wx_spec.py", line 13, in from scipy.weave import ext_tools, wx_spec File "/home/ggellner/lib/python/lib/python2.5/site-packages/scipy/weave/wx_spec.py", line 20, in wx_base = find_base_dir() File "/home/ggellner/lib/python/lib/python2.5/site-packages/scipy/weave/wx_spec.py", line 16, in find_base_dir raise RuntimeError("Could not locate wxPython base directory.") RuntimeError: Could not locate wxPython base directory. ---------------------------------------------------------------------- Upon installing wxpython this error disappears, does this mean scipy depends on wxpython now? Gabriel From travis at enthought.com Sun Jan 20 15:56:20 2008 From: travis at enthought.com (Travis Vaught) Date: Sun, 20 Jan 2008 14:56:20 -0600 Subject: [SciPy-dev] WxPython required? In-Reply-To: <20080120153653.GA12003@giton> References: <20080120153653.GA12003@giton> Message-ID: <39911DC7-A9A8-4489-9F39-40DF398ABF11@enthought.com> On Jan 20, 2008, at 9:36 AM, Gabriel Gellner wrote: > Upon a recent svn update, I get this error when I run scipy.test(): > > ====================================================================== > ERROR: Failure: (Could not locate > wxPython base directory.) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.5/site-packages/nose-0.10.0b1-py2.5.egg/nose/ > loader.py", line 344, in loadTestsFromName > addr.filename, addr.module) > File "/usr/lib/python2.5/site-packages/nose-0.10.0b1-py2.5.egg/nose/ > importer.py", line 39, in importFromPath > return self.importFromDir(dir_path, fqname) > File "/usr/lib/python2.5/site-packages/nose-0.10.0b1-py2.5.egg/nose/ > importer.py", line 84, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "/home/ggellner/lib/python/lib/python2.5/site-packages/scipy/ > weave/tests/test_wx_spec.py", line 13, in > from scipy.weave import ext_tools, wx_spec > File "/home/ggellner/lib/python/lib/python2.5/site-packages/scipy/ > weave/wx_spec.py", line 20, in > wx_base = find_base_dir() > File "/home/ggellner/lib/python/lib/python2.5/site-packages/scipy/ > weave/wx_spec.py", line 16, in find_base_dir > raise RuntimeError("Could not locate wxPython base directory.") > RuntimeError: Could not locate wxPython base directory. > > ---------------------------------------------------------------------- > > Upon installing wxpython this error disappears, does this mean scipy > depends on wxpython now? Absolutely not. This is an issue we come across quite a bit with the enthought tool suite packages (now 'Projects') wherein tests are not really 'unit' tests (they are integration or functional test) and introduce a dependency in order to test integration with another lib or a particular interface function. 
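One wrinkle worth a sketch here: even detecting such an optional dependency can be risky, because importing a GUI toolkit can abort the interpreter outright rather than raise ImportError. A cautious guard probes the import in a child process; can_import is a hypothetical helper, not part of scipy.testing or nose:

import subprocess
import sys

def can_import(module_name):
    # Attempt the import in a fresh interpreter; a hard crash then
    # shows up as a nonzero exit code instead of killing this process.
    cmd = [sys.executable, '-c', 'import %s' % module_name]
    return subprocess.call(cmd) == 0

HAVE_WX = can_import('wx')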
There are various approaches to solve this (different test levels comes to mind, the decorator approach to performance tests seen on this list as well). The default test level certainly shouldn't call this test, though. Travis (V.) > > > Gabriel > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From ggellner at uoguelph.ca Sun Jan 20 17:01:21 2008 From: ggellner at uoguelph.ca (Gabriel Gellner) Date: Sun, 20 Jan 2008 15:01:21 -0700 Subject: [SciPy-dev] sparse solvers in scipy rock In-Reply-To: References: <85b5c3130801150520r18c2edceo2e7e54fe4d6c40cd@mail.gmail.com> Message-ID: <20080120220120.GA6338@giton> Yes! Please, if you have time, make a cookbook entry . . . I would love to see how cython is used in this way. Gabriel On Tue, Jan 15, 2008 at 11:16:10AM -0700, Fernando Perez wrote: > On Jan 15, 2008 6:20 AM, Ondrej Certik wrote: > > Hi, > > > > yesterday I wrote some solver for 1D time dependent schrodinger > > equation to my school, > > using explicit and implicit methods and using scipy sparse solvers, > > because I was lazy to use > > some fortran routines for inverting the tridiagonal matrix, and I was > > surprised how nice it works in scipy. > > You did a great job. I am convinced that is the way to call all sparse solvers. > > > > Here is the code: > > > > http://hg.certik.cz/schrod1D/ > > > > I based it on the example: > > > > http://www.scipy.org/Cookbook/SchrodingerFDTD > > > > which uses central differences explicit method, but I wrote it my way, > > using complex numbers directly and Cython to speed it up. > > > > I also implemented euler differences explicit method, > > which totally blows up after a few iterations, and also an implicit > > method, which works really well. > > > > Where do you think I could put it together with some documentation how > > to play with it? to the above wiki, or should I create a new wiki for > > that? > > It's an example how to use Cython+Numpy+scipy sparse solvers (superlu > > currently). > > I'd suggest making another Cookbook entry for it, and linking both > Schrodinger ones to each other, so that an interested party can read > both the simpler, pure python FDTD approach and your more > sophisticated one as well. > > The better the cookbook gets, the easier it will be for newcomers to > find examples that are similart to their needs, from which they can > learn and get started. > > Thanks! > > Cheers, > > f > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev From matthew.brett at gmail.com Sun Jan 20 18:14:23 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Sun, 20 Jan 2008 23:14:23 +0000 Subject: [SciPy-dev] WxPython required? In-Reply-To: <39911DC7-A9A8-4489-9F39-40DF398ABF11@enthought.com> References: <20080120153653.GA12003@giton> <39911DC7-A9A8-4489-9F39-40DF398ABF11@enthought.com> Message-ID: <1e2af89e0801201514x3415e9aeo16fb7d904aa4e5bc@mail.gmail.com> Hi, > > Upon installing wxpython this error disappears, does this mean scipy > > depends on wxpython now? > > There are various approaches to > solve this (different test levels comes to mind, the decorator > approach to performance tests seen on this list as well). The default > test level certainly shouldn't call this test, though. This prompts me to ask for advice on what to do in this kind of situation. 
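For concreteness, the decorator approach mentioned above might look something like the following sketch, which uses nose's SkipTest so skipped tests are still reported rather than silently hidden; optional_dep is a hypothetical name, not existing scipy.testing API:

from nose.plugins.skip import SkipTest

def optional_dep(module_name):
    # Skip, rather than hide, any test whose optional dependency is missing.
    def decorate(test_func):
        def wrapper(*args, **kw):
            try:
                __import__(module_name)
            except ImportError:
                raise SkipTest('%s not available' % module_name)
            return test_func(*args, **kw)
        wrapper.__name__ = test_func.__name__
        return wrapper
    return decorate

@optional_dep('PIL.Image')
def test_fromimage():
    pass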
When I ported the tests to nose, there were a couple of tests that errored due to failed dependencies, on umfpack, and PIL. I just decorated the tests with an import-time check for the dependency, so they cannot be run without umfpack and PIL respectively, for example misc/tests/test_pilutil.py

try:
    import PIL.Image
except ImportError:
    _have_PIL = False
else:
    _have_PIL = True
import scipy.misc.pilutil as pilutil
TestCase.__test__ = _have_PIL

I guess the options are:

Completely disable tests with absent optional dependencies (as above)

Add such tests at the 'full' testing level with a decorator specific for optional dependencies, like @optdeps

Don't run these tests with standard test levels, just allow them to be run with:

>>> module.test('optdeps') or nosetests -A optdeps /path/to/module

Preferences anyone? Matthew From robert.kern at gmail.com Sun Jan 20 20:24:35 2008 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 20 Jan 2008 19:24:35 -0600 Subject: [SciPy-dev] WxPython required? In-Reply-To: <1e2af89e0801201514x3415e9aeo16fb7d904aa4e5bc@mail.gmail.com> References: <20080120153653.GA12003@giton> <39911DC7-A9A8-4489-9F39-40DF398ABF11@enthought.com> <1e2af89e0801201514x3415e9aeo16fb7d904aa4e5bc@mail.gmail.com> Message-ID: <4793F453.4060008@gmail.com> Matthew Brett wrote: > Hi, > >>> Upon installing wxpython this error disappears, does this mean scipy >>> depends on wxpython now? >> There are various approaches to >> solve this (different test levels comes to mind, the decorator >> approach to performance tests seen on this list as well). The default >> test level certainly shouldn't call this test, though. > > This prompts me to ask for advice on what to do in this kind of situation. > > When I ported the tests to nose, there were a couple of tests that > errored due to failed dependencies, on umfpack, and PIL. I just > decorated the tests with an import-time check for the dependency, so > they cannot be run without umfpack and PIL respectively, for example > misc/tests/test_pilutil.py > > try: > import PIL.Image > except ImportError: > _have_PIL = False > else: > _have_PIL = True > import scipy.misc.pilutil as pilutil > TestCase.__test__ = _have_PIL > > I guess the options are: > > Completely disable tests with absent optional dependencies (as above) > > Add such tests at the 'full' testing level with a decorator specific > for optional dependencies, like @optdeps > > Don't run these tests with standard test levels, just allow them to be run with: >>>> module.test('optdeps') or nosetests -A optdeps /path/to/module > > Preferences anyone? One thing to keep in mind is that trying to import wx is not always a safe thing to do. On some Linux machines, I have ssh'ed into them without a local X server. Trying to import wx on them caused the process to exit immediately; and ImportError was not raised. I'm happy to try to import PIL and have nose skip the tests if PIL is not found; however, this approach cannot be safely extended to wx. You might want to raise SkipTest instead of assigning TestCase.__test__=False, though. That tells the user that a test is being skipped and gives them a command line option to force nosetests to run the test anyways. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
-- Umberto Eco From fperez.net at gmail.com Sun Jan 20 20:51:11 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Sun, 20 Jan 2008 18:51:11 -0700 Subject: [SciPy-dev] WxPython required? In-Reply-To: <4793F453.4060008@gmail.com> References: <20080120153653.GA12003@giton> <39911DC7-A9A8-4489-9F39-40DF398ABF11@enthought.com> <1e2af89e0801201514x3415e9aeo16fb7d904aa4e5bc@mail.gmail.com> <4793F453.4060008@gmail.com> Message-ID: On Jan 20, 2008 6:24 PM, Robert Kern wrote: > > Matthew Brett wrote: > > Hi, > > > >>> Upon installing wxpython this error disappears, does this mean scipy > >>> depends on wxpython now? > >> There are various approaches to > >> solve this (different test levels comes to mind, the decorator > >> approach to performance tests seen on this list as well). The default > >> test level certainly shouldn't call this test, though. > > > > This prompts me to ask for advice on what to do in this kind of situation. > > > > When I ported the tests to nose, there were a couple of tests that > > errored due to failed dependencies, on umfpack, and PIL. I just > > decorated the tests with an import-time check for the dependency, so > > they cannot be run without umfpack and PIL respectively, for example > > misc/tests/test_pilutil.py > > > > try: > > import PIL.Image > > except ImportError: > > _have_PIL = False > > else: > > _have_PIL = True > > import scipy.misc.pilutil as pilutil > > TestCase.__test__ = _have_PIL > > > > I guess the options are: > > > > Completely disable tests with absent optional dependencies (as above) > > > > Add such tests at the 'full' testing level with a decorator specific > > for optional dependencies, like @optdeps > > > > Don't run these tests with standard test levels, just allow them to be run with: > >>>> module.test('optdeps') or nosetests -A optdeps /path/to/module > > > > Preferences anyone? > > One thing to keep in mind is that trying to import wx is not always a safe thing > to do. On some Linux machines, I have ssh'ed into them without a local X server. > Trying to import wx on them caused the process to exit immediately; and > ImportError was not raised. > > I'm happy to try to import PIL and have nose skip the tests if PIL is not found; > however, this approach cannot be safely extended to wx. You might want to raise > SkipTest instead of assigning TestCase.__test__=False, though. That tells the > user that a test is being skipped and gives them a command line option to force > nosetests to run the test anyways. I was wondering if for things like GUI tests, we could use nosepipe: http://pypi.python.org/pypi/nosepipe/ The issue is that even if you do have WX, GTK and Qt installed, there's a very good chance that importing and then using more than one of them in a single process will make something go bonk. We could then have a double-pronged approach to this problem: - Tests with optional dependencies start by testing for their dependency and raising SkipTest if it's not there. This way users still see that they were skipped, but no harm is done. - Tests that are potentially unsafe in-process (such as GUI ones) are marked with a @dec.process or similar decorator, so that they *always* get run in a separate process. Would this work and solve the existing issues? Cheers, f From ceball at users.sourceforge.net Mon Jan 21 01:11:15 2008 From: ceball at users.sourceforge.net (C. Ball) Date: Mon, 21 Jan 2008 06:11:15 +0000 (UTC) Subject: [SciPy-dev] numpy and scipy required for weave? 
Message-ID: > Hi, I'm trying to use the latest SVN version of weave, but I find I can't use it unless I have both numpy and scipy installed. First I tried checking out weave alone, but I got the following error when I tried to install: $ svn update At revision 3852. $ python setup.py install Traceback (most recent call last): File "setup.py", line 16, in ? from numpy.distutils.core import setup ImportError: No module named numpy.distutils.core A reply to someone else having this problem mentions that weave should work alone: http://thread.gmane.org/gmane.comp.python.scientific.user/9096/focus=9097 Installing numpy is easy, so weave depending on it is not really a problem from my point of view. After installing numpy, however, I find that scipy is also required: $ python [...] >>> import weave Traceback (most recent call last): File "", line 1, in File "[...]/lib/python2.5/site-packages/weave/__init__.py", line 21, in from scipy.testing.pkgtester import Tester ImportError: No module named scipy.testing.pkgtester >>> In the past, weave did not require scipy. To prevent this error, all I have to do is comment out the last two lines of weave/__init__.py: # from scipy.testing.pkgtester import Tester # test = Tester().test Our software depends on weave, but we cannot require our users to install scipy. Can I file a tracker item about this, or is the plan for weave to become inseparably dependent on scipy? Thanks, Chris From matthew.brett at gmail.com Mon Jan 21 04:54:51 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 21 Jan 2008 09:54:51 +0000 Subject: [SciPy-dev] numpy and scipy required for weave? In-Reply-To: References: Message-ID: <1e2af89e0801210154l27b4951ej1ac3f43c42b45500@mail.gmail.com> Hi, > Installing numpy is easy, so weave depending on it is not really a > problem from my point of view. After installing numpy, however, I find > that scipy is also required: > > $ python > [...] > >>> import weave > Traceback (most recent call last): > File "", line 1, in > File "[...]/lib/python2.5/site-packages/weave/__init__.py", line 21, in > from scipy.testing.pkgtester import Tester > ImportError: No module named scipy.testing.pkgtester We recently changed over the scipy testing to have a different framework from numpy testing, and that's the cause of your error. As a temporary hack, you could make a pretend scipy package on your python path, with a scipy directory, empty __init__.py file, and a copy of the scipy source testing directory - attached. But it shouldn't be like that of course. The plan was to have both of the numpy and scipy testing supported from the numpy testing utilities - this has not been done yet, partly because I don't have numpy commit access, and partly because the resulting namespace for numpy.testing would be a somewhat confusing mix between the two incompatible frameworks. I think they'd both work, but it wouldn't be pretty. The long term plan is to split off weave and f2py into a separate package. Perhaps it's time to think about a schedule for that? Any opinions on the testing fusion? Matthew -------------- next part -------------- A non-text attachment was scrubbed... Name: scipy_testing_only.tar.gz Type: application/x-gzip Size: 3348 bytes Desc: not available URL: From matthew.brett at gmail.com Mon Jan 21 05:12:15 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 21 Jan 2008 10:12:15 +0000 Subject: [SciPy-dev] WxPython required? 
In-Reply-To: References: <20080120153653.GA12003@giton> <39911DC7-A9A8-4489-9F39-40DF398ABF11@enthought.com> <1e2af89e0801201514x3415e9aeo16fb7d904aa4e5bc@mail.gmail.com> <4793F453.4060008@gmail.com> Message-ID: <1e2af89e0801210212q43076228j2807df1759ab298b@mail.gmail.com> Hi, > > I'm happy to try to import PIL and have nose skip the tests if PIL is not found; > > however, this approach cannot be safely extended to wx. You might want to raise > > SkipTest instead of assigning TestCase.__test__=False, though. That tells the > > user that a test is being skipped and gives them a command line option to force > > nosetests to run the test anyways. Thanks - SkipTest does seem like the right thing to do. > I was wondering if for things like GUI tests, we could use nosepipe: > > http://pypi.python.org/pypi/nosepipe/ Good pointer. The rule of thumb being that any test that imported wx or other GUI thing would use nosepipe, and nosetests vanilla otherwise. Things may start to get a bit complex though. First we've got the extra dependency on nosepipe - minor. Then we've got the question of whether we want to run all tests for one module through nosepipe if only one test has GUI code - there's only one nosetests call per module. The alternative, would be to decorate tests destined for nosepipe, and have two test calls, one for the nosepipe tests, and one for the non-nosepipe tests, as in: module.test() ... def test(): nose.run(argv = ['', directory_name, '-A', 'not needpipe']) nose.run(argv = ['', directory_name, '-A', 'needpipe', '--with-process-isolation']) This moves us further away from the simple nosetests /my/scipy/directory from the command line, but... So - more functionality, less simplicity and transparency - the usual thing. Matthew From ondrej at certik.cz Mon Jan 21 06:50:25 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 21 Jan 2008 12:50:25 +0100 Subject: [SciPy-dev] sparse solvers in scipy rock In-Reply-To: <20080120220120.GA6338@giton> References: <85b5c3130801150520r18c2edceo2e7e54fe4d6c40cd@mail.gmail.com> <20080120220120.GA6338@giton> Message-ID: <85b5c3130801210350o4818b1cbhbddb1d6de6617af1@mail.gmail.com> On Jan 20, 2008 11:01 PM, Gabriel Gellner wrote: > Yes! Please, if you have time, make a cookbook entry . . . I would love to see > how cython is used in this way. Yes, I'll do it soon, I was currently busy with school, but it's on my todo. Ondrej From ceball at users.sourceforge.net Mon Jan 21 13:40:32 2008 From: ceball at users.sourceforge.net (C. Ball) Date: Mon, 21 Jan 2008 18:40:32 +0000 (UTC) Subject: [SciPy-dev] Status of weave References: <478c7a3f.02ab100a.7dfa.088c@mx.google.com> <478F7FDB.9020304@enthought.com> Message-ID: Travis E. Oliphant enthought.com> writes: [...] > This is not quite accurate. Eric has spent time recently, in fact on > updating and fixing the weave type converters. I don't know of plans > to support 'O' types, but I don't believe it would be difficult with > what I know about it. [...] > But, absolutely, you have to either convince someone to work on it, or > dig in and do it yourself. Ok, we (at topographica.org) had a go at adding a converter for 'O' types. In fact, we did almost nothing to the weave code; I have included a diff against r3652 [1]. These changes allowed us to run the previous example I posted, but then we went on to run something a little more realistic [2]. There seems to be a bizarre problem with the indexing. 
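Before the details below, one quick fact that bears on any indexing oddity with dtype='O': the buffer of an object array holds bare PyObject* pointers, so C-side pointer arithmetic must step in units of the pointer size; stepping by sizeof(PyObject), or by a wrapper type of a different size, reads neighbouring slots or garbage, which is at least consistent with the symptom described below. A small sanity check in plain numpy (nothing here is weave-specific):

import numpy as np

objs = np.array([object() for i in range(4)], dtype='O')
print objs.itemsize   # sizeof(PyObject*), i.e. 4 or 8 - not sizeof(PyObject)
print objs.strides    # elements are itemsize bytes apart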
In test_loop_objs(), "j-1" appears rather than "j" because it was the only way to get the correct elements. Using "j", the numbers returned were all for the next element (rather than the expected one), with garbage or a segmentation fault at the end. Somehow it seems that the starting value is off by maybe the size of one PyObject. I've also included a version of test_loop_objs() that does not use a macro to access the elements; we have the same problem with this version. (The macros are described in the Guide to NumPy, page 336.) There might be a stupid mistake in the example, or we might have missed something, so I'd be really grateful if someone could take a look. For one thing, the Guide to NumPy says (also on page 336): "ensure the array is aligned and in correct byte-swap order in order to get useful results". I don't understand what that means, so that could be the problem. Thanks very much, Chris

[1] Changes to weave:

--- lib/python2.5/site-packages/weave/c_spec.py
+++ lib/python2.5/site-packages/weave/c_spec.py
@@ -314,6 +314,8 @@
 num_to_c_types['?'] = 'bool'
 num_to_c_types['l'] = 'long'
 num_to_c_types['L'] = 'npy_ulong'
+num_to_c_types['O'] = 'py::object'
+
 num_to_c_types['q'] = 'npy_longlong'
 num_to_c_types['Q'] = 'npy_ulonglong'

--- lib/python2.5/site-packages/weave/standard_array_spec.py
+++ lib/python2.5/site-packages/weave/standard_array_spec.py
@@ -20,6 +20,7 @@
 num_typecode['g'] = 'PyArray_LONGDOUBLE'
 num_typecode['F'] = 'PyArray_CFLOAT'
 num_typecode['D'] = 'PyArray_CDOUBLE'
 num_typecode['G'] = 'PyArray_CLONGDOUBLE'
+num_typecode['O'] = 'PyArray_OBJECT'
 type_check_code = \
 """

[2] Example code:

import weave

# Version using floats
def test_loop_floats(cfs):
    code = """
           for (int i=0; i<Ncfs[0]; i++) {
               printf("%f\\n", cfs[i]);
           }
           """
    weave.inline(code, ['cfs'])

# Version using objects (note the "j-1" discussed above)
def test_loop_objs(objs):
    code = """
           for (int j=0; j<Nobjs[0]; j++) {
               PyObject* item = *((PyObject**)PyArray_GETPTR1(objs_array, j-1));
               PyObject_Print(item, stdout, 0);
               printf("\\n");
           }
           """
    weave.inline(code, ['objs'])

# Version using objects, without the macro
def test_loop_objs_nomacro(objs):
    code = """
           PyObject** data = (PyObject**)(objs_array->data);
           for (int i=0; i<Nobjs[0]; i++) {
               PyObject_Print(data[i-1], stdout, 0);
               printf("\\n");
           }
           """
    weave.inline(code, ['objs'])

From mattknox_ca at hotmail.com Mon Jan 21 2008 From: mattknox_ca at hotmail.com (Matt Knox) Date: Mon, 21 Jan 2008 Subject: [SciPy-dev] scikits svn access Message-ID: Hi there, could someone grant me access to the scikits svn repository so I can begin porting the sandbox timeseries module to a scikit package? Pierre GM will also require access at some point too if you want to set that up at the same time. If I could also get access to the scikits trac wiki so I can create a page for the timeseries scikit, that would be greatly appreciated as well. I created an account on the trac site with the username "mattknox_ca", so if you can tweak the permissions for that account, that would be fine. Or if creating a new account is easier, that's fine too. Thanks, - Matt -------------- next part -------------- An HTML attachment was scrubbed... URL: From millman at berkeley.edu Mon Jan 21 20:12:21 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 21 Jan 2008 17:12:21 -0800 Subject: [SciPy-dev] scikits svn access In-Reply-To: References: Message-ID: Hey Matt, Thanks for taking the initiative on this. I am about to head out now, but will be back at a computer a little later tonight. If you haven't been helped by then, I will give you the access you need. I will just contact you directly about this later tonight. Thanks, On Jan 21, 2008 12:13 PM, Matt Knox wrote: > > > Hi there, > > could someone grant me access to the scikits svn repository so I can begin > porting the sandbox timeseries module to a scikit package? Pierre GM will > also require access at some point too if you want to set that up at the same > time. > > If I could also get access to the scikits trac wiki so I can create a page > for the timeseries scikit, that would be greatly appreciated as well. I > created an account on the trac site with the username "mattknox_ca", so if > you can tweak the permissions for that account, that would be fine.
Or if > creating a new account is easier, that's fine too. > > Thanks, > > - Matt > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From jre at enthought.com Mon Jan 21 20:59:43 2008 From: jre at enthought.com (J. Ryan Earl) Date: Mon, 21 Jan 2008 19:59:43 -0600 Subject: [SciPy-dev] scikits svn access In-Reply-To: References: Message-ID: <47954E0F.8060507@enthought.com> I probably should have written about this. I looked at it and figured out where the permissions needed to be added, but then I had other tasks pull me away. Does Pierre GM have an account already set up? If you finish this before I do Jarrod, let me know. I have an internal ticket on. -ryan Jarrod Millman wrote: > Hey Matt, > > Thanks for taking the initiative on this. I am about to head out now, > but will be back at a computer a little later tonight. If you haven't > been helped by then, I will give you the access you need. I will just > contact you directly about this later tonight. > > Thanks, > > On Jan 21, 2008 12:13 PM, Matt Knox wrote: > >> Hi there, >> >> could someone grant me access to the scikits svn repository so I can begin >> porting the sandbox timeseries module to a scikit package? Pierre GM will >> also require access at some point too if you want to set that up at the same >> time. >> >> If I could also get access to the scikits trac wiki so I can create a page >> for the timeseries scikit, that would be greatly appreciated as well. I >> created an account on the trac site with the username "mattknox_ca", so if >> you can tweak the permissions for that account, that would be fine. Or if >> creating a new account is easier, that's fine too. >> >> Thanks, >> >> - Matt >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev >> >> >> > > > > From fperez.net at gmail.com Tue Jan 22 01:49:04 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 21 Jan 2008 23:49:04 -0700 Subject: [SciPy-dev] Sage/Scipy Days 8 reminder: Feb 29-March 4. Message-ID: Hi all, Just a quick reminder for all about the upcoming Sage/Scipy Days 8 at Enthought collaborative meeting: http://wiki.sagemath.org/days8 Email me directly (Fernando.Perez at Colorado.edu) if you plan on coming, so we can have a proper count and plan accordingly. Cheers, f From jre at enthought.com Tue Jan 22 02:30:33 2008 From: jre at enthought.com (J. Ryan Earl) Date: Tue, 22 Jan 2008 01:30:33 -0600 Subject: [SciPy-dev] VirtualMin usage Message-ID: <47959B99.3020802@enthought.com> I was wondering, how many of you use or have used VirtualMin on the SciPy site? -ryan From millman at berkeley.edu Tue Jan 22 02:56:36 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 21 Jan 2008 23:56:36 -0800 Subject: [SciPy-dev] scikits svn access In-Reply-To: <47954E0F.8060507@enthought.com> References: <47954E0F.8060507@enthought.com> Message-ID: On Jan 21, 2008 5:59 PM, J. Ryan Earl wrote: > Does Pierre GM have an account already set up? If you finish this > before I do Jarrod, let me know. I have an internal ticket on. Hi Ryan, It looks like you already gave both Matt and Pierre scikits svn access. I just gave Matt and Pierre developer rights to the scikits' Trac site. 
Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From jre at enthought.com Tue Jan 22 03:07:10 2008 From: jre at enthought.com (J. Ryan Earl) Date: Tue, 22 Jan 2008 02:07:10 -0600 Subject: [SciPy-dev] scikits svn access In-Reply-To: References: <47954E0F.8060507@enthought.com> Message-ID: <4795A42E.2010008@enthought.com> You can thank Robert Kern for that. -ryan Jarrod Millman wrote: > On Jan 21, 2008 5:59 PM, J. Ryan Earl wrote: > >> Does Pierre GM have an account already set up? If you finish this >> before I do Jarrod, let me know. I have an internal ticket on. >> > > Hi Ryan, > > It looks like you already gave both Matt and Pierre scikits svn > access. I just gave Matt and Pierre developer rights to the scikits' > Trac site. > > Thanks, > > From matthieu.brucher at gmail.com Tue Jan 22 08:43:38 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Tue, 22 Jan 2008 14:43:38 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: > > > I'd like to see arpack in the sparse folder (?) very fast as some my > code > > would need a sparse solver (I proposed that it could be moved in a > scikit > > but it makes sense to keep it in scipy so that sparse solvers are > available > > in scipy). > > Yes, arpack should go into the sparse package. If you have the time, > it would be great if you could help get it moved over. Ideally, we > can get it moved into scipy.sparse before the 0.7 release around the > end of March. I have some time to do this (besides I need the package), so if the whereabouts of arpack are solved (ie where it should be put with which interface), , I can help moving arpack from the sandbox to the trunk and apply the long awaited patch (if I'm given commits privileges). Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From wnbell at gmail.com Tue Jan 22 09:21:35 2008 From: wnbell at gmail.com (Nathan Bell) Date: Tue, 22 Jan 2008 08:21:35 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: On Jan 22, 2008 7:43 AM, Matthieu Brucher wrote: > > Yes, arpack should go into the sparse package. If you have the time, > > it would be great if you could help get it moved over. Ideally, we > > can get it moved into scipy.sparse before the 0.7 release around the > > end of March. > > I have some time to do this (besides I need the package), so if the > whereabouts of arpack are solved (ie where it should be put with which > interface), , I can help moving arpack from the sandbox to the trunk and > apply the long awaited patch (if I'm given commits privileges). AFAIK the current proposal is as follows scipy.sparse Will contain the sparse matrix classes and perhaps construction functions (e.g. spdiags) scipy.splinalg New home for sparse linear algebra (i.e. anything that has a dense analog in scipy.linalg) Possible home for sparse construction functions (e.g. 
spkron) splinalg.eigen Sparse eigensolvers: sandbox.lobpcg -> splinalg.eigen.lobpcg sandbox.arpack -> splinalg.eigen.arpack a function splinalg.eigen.eigs() should support a simplified interface to ARPACK, without exposing many ARPACK-specific parameters (allowing the backend to be changed in the future) splinalg.isolve Iterative solvers for linear systems (e.g. cg, gmres): linalg.iterative -> splinalg.isolve splinalg.dsolve Direct solvers for linear systems (e.g. SuperLU): scipy.linsolve -> splinalg.dsolve scipy.linsolve.umfpack -> scikit -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From fperez.net at gmail.com Tue Jan 22 12:30:55 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 22 Jan 2008 10:30:55 -0700 Subject: [SciPy-dev] VirtualMin usage In-Reply-To: <47959B99.3020802@enthought.com> References: <47959B99.3020802@enthought.com> Message-ID: On Jan 22, 2008 12:30 AM, J. Ryan Earl wrote: > I was wondering, how many of you use or have used VirtualMin on the > SciPy site? I do use it, mostly when adding new developers to ipython. But I have no particular attachment to it, and I'm willing to use any tool that replaces it (including command line calls directly at the console). But for my simple needs, so far it has worked OK. Cheers, f From oliphant at enthought.com Tue Jan 22 15:04:41 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Tue, 22 Jan 2008 14:04:41 -0600 Subject: [SciPy-dev] VirtualMin usage In-Reply-To: <47959B99.3020802@enthought.com> References: <47959B99.3020802@enthought.com> Message-ID: <47964C59.9040302@enthought.com> J. Ryan Earl wrote: > I was wondering, how many of you use or have used VirtualMin on the > SciPy site? > I have used it to manage users (add accounts and adjust permissions and passwords) and that is all. -Travis O. From millman at berkeley.edu Tue Jan 22 15:31:16 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Tue, 22 Jan 2008 12:31:16 -0800 Subject: [SciPy-dev] VirtualMin usage In-Reply-To: <47959B99.3020802@enthought.com> References: <47959B99.3020802@enthought.com> Message-ID: On Jan 21, 2008 11:30 PM, J. Ryan Earl wrote: > I was wondering, how many of you use or have used VirtualMin on the > SciPy site? I used it a bit, but didn't like it. I just didn't have the patience to deal with the slow interface. I haven't tried it recently, so it might have improved. -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From mattknox_ca at hotmail.com Tue Jan 22 23:28:14 2008 From: mattknox_ca at hotmail.com (Matt Knox) Date: Tue, 22 Jan 2008 23:28:14 -0500 Subject: [SciPy-dev] timeseries moved to scikits Message-ID: The timeseries module has been moved to the scikits svn repository (http://svn.scipy.org/svn/scikits/trunk/timeseries) and removed from the sandbox. It installs as a scikits namespace package as per the scikits convention. The maskedarray branch of numpy (only available in svn) is currently required for the timeseries scikit. This requirement will go away once the maskedarray merging is complete and an official release of numpy has been made with the new masked array module. I'll try to whip up a quick page on the trac site for the timeseries module some time this week and port the existing documentation over to there. - Matt -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From matthieu.brucher at gmail.com Wed Jan 23 15:46:49 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 23 Jan 2008 21:46:49 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: > > AFAIK the current proposal is as follows > > scipy.sparse > Will contain the sparse matrix classes and perhaps construction > functions (e.g. spdiags) > > scipy.splinalg > New home for sparse linear algebra (i.e. anything that has a dense > analog in scipy.linalg) > Possible home for sparse construction functions (e.g. spkron) > > splinalg.eigen > Sparse eigensolvers: > sandbox.lobpcg -> splinalg.eigen.lobpcg > sandbox.arpack -> splinalg.eigen.arpack > > a function splinalg.eigen.eigs() should support a simplified > interface to ARPACK, > without exposing many ARPACK-specific parameters (allowing the > backend to be > changed in the future) > > splinalg.isolve > Iterative solvers for linear systems (e.g. cg, gmres): > linalg.iterative -> splinalg.isolve > > splinalg.dsolve > Direct solvers for linear systems (e.g. SuperLU): > scipy.linsolve -> splinalg.dsolve > scipy.linsolve.umfpack -> scikit > Is there any concern about this ? If not, it may be time to make it happen ? Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Thu Jan 24 00:38:04 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Wed, 23 Jan 2008 23:38:04 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: <4798243C.3030303@enthought.com> Matthieu Brucher wrote: > > AFAIK the current proposal is as follows > > scipy.sparse > Will contain the sparse matrix classes and perhaps construction > functions (e.g. spdiags) > > scipy.splinalg > New home for sparse linear algebra (i.e. anything that has a dense > analog in scipy.linalg) > Possible home for sparse construction functions (e.g. spkron) > > splinalg.eigen > Sparse eigensolvers: > sandbox.lobpcg -> splinalg.eigen.lobpcg > sandbox.arpack -> splinalg.eigen.arpack > > a function splinalg.eigen.eigs() should support a simplified > interface to ARPACK, > without exposing many ARPACK-specific parameters (allowing the > backend to be > changed in the future) > > splinalg.isolve > Iterative solvers for linear systems (e.g. cg, gmres): > linalg.iterative -> splinalg.isolve > > splinalg.dsolve > Direct solvers for linear systems (e.g. SuperLU): > scipy.linsolve -> splinalg.dsolve > scipy.linsolve.umfpack -> scikit > > > Is there any concern about this ? If not, it may be time to make it > happen ? Looks good to me. -Travis From bradford.n.cross at gmail.com Thu Jan 24 01:03:35 2008 From: bradford.n.cross at gmail.com (Bradford Cross) Date: Wed, 23 Jan 2008 22:03:35 -0800 Subject: [SciPy-dev] timeseries moved to scikits In-Reply-To: References: Message-ID: congrats matt...i am working with someone now on merging our code that creates a time series repository using numy + pytables + timeseries ... ability to store timeseries in pytables as timeseries, numpy arrays, or ad hoc for heterogeneous arrays... 
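A rough sketch of what such a repository layout could look like with the PyTables 2.x-era API (one group per series, enlargeable arrays for the values, ISO date strings, and a separate mask, anticipating details discussed later in this thread; the file name, group name, and 19-character date length are illustrative assumptions, not the prototype's actual code):

    import numpy as np
    import tables  # PyTables 2.x-era API assumed

    h5 = tables.openFile('series.h5', mode='w')
    grp = h5.createGroup('/', 'price', 'one time series per group')

    # Enlargeable arrays: values, ISO-formatted dates, and the mask kept apart.
    values = h5.createEArray(grp, 'values', tables.Float64Atom(), (0,))
    dates = h5.createEArray(grp, 'dates', tables.StringAtom(itemsize=19), (0,))
    mask = h5.createEArray(grp, 'mask', tables.BoolAtom(), (0,))

    values.append(np.array([1.0, 2.5, 3.0]))
    dates.append(np.array(['2008-01-22T00:00:00',
                           '2008-01-23T00:00:00',
                           '2008-01-24T00:00:00'], dtype='S19'))
    mask.append(np.array([False, False, True]))
    h5.close()

The `expectedrows` argument mentioned later in the thread can be passed to each createEArray() call to help PyTables pick sensible chunk sizes.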
On Jan 22, 2008 8:28 PM, Matt Knox wrote: > The timeseries module has been moved to the scikits svn repository ( > http://svn.scipy.org/svn/scikits/trunk/timeseries) and removed from the > sandbox. It installs as a scikits namespace package as per the scikits > convention. > > The maskedarray branch of numpy (only available in svn) is currently > required for the timeseries scikit. This requirement will go away once the > maskedarray merging is complete and an official release of numpy has been > made with the new masked array module. > > I'll try to whip up a quick page on the trac site for the timeseries > module some time this week and port the existing documentation over to > there. > > - Matt > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pgmdevlist at gmail.com Thu Jan 24 01:08:55 2008 From: pgmdevlist at gmail.com (Pierre GM) Date: Thu, 24 Jan 2008 01:08:55 -0500 Subject: [SciPy-dev] timeseries moved to scikits In-Reply-To: References: Message-ID: <200801240108.55336.pgmdevlist@gmail.com> On Thursday 24 January 2008 01:03:35 Bradford Cross wrote: > congrats matt...i am working with someone now on merging our code that > creates a time series repository using numy + pytables + timeseries ... > ability to store timeseries in pytables as timeseries, numpy arrays, or ad > hoc for heterogeneous arrays... Sounds great ! Keep us posted. The use of recarrays as an interface could be a solution. Unfortunately, I doubt I will have much time to work on that in the near future, but I'd be quite happy to help. Thanks again P. From cimrman3 at ntc.zcu.cz Thu Jan 24 03:31:31 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 24 Jan 2008 09:31:31 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: <47984CE3.3040803@ntc.zcu.cz> Matthieu Brucher wrote: >> AFAIK the current proposal is as follows >> >> scipy.sparse >> Will contain the sparse matrix classes and perhaps construction >> functions (e.g. spdiags) >> >> scipy.splinalg >> New home for sparse linear algebra (i.e. anything that has a dense >> analog in scipy.linalg) >> Possible home for sparse construction functions (e.g. spkron) >> >> splinalg.eigen >> Sparse eigensolvers: >> sandbox.lobpcg -> splinalg.eigen.lobpcg >> sandbox.arpack -> splinalg.eigen.arpack >> >> a function splinalg.eigen.eigs() should support a simplified >> interface to ARPACK, >> without exposing many ARPACK-specific parameters (allowing the >> backend to be >> changed in the future) >> >> splinalg.isolve >> Iterative solvers for linear systems (e.g. cg, gmres): >> linalg.iterative -> splinalg.isolve >> >> splinalg.dsolve >> Direct solvers for linear systems (e.g. SuperLU): >> scipy.linsolve -> splinalg.dsolve >> scipy.linsolve.umfpack -> scikit >> > > Is there any concern about this ? If not, it may be time to make it happen ? +1 Besides the sparse stuff, I would also like to have support for the symmetric eigenvalue functions of lapack that are currently in the symeig package, but are not in scipy.linalg. lobpcg needs them to work correctly. r. 
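For readers unfamiliar with symeig: the routines in question solve the symmetric generalized eigenproblem A x = lambda B x, with A symmetric and B symmetric positive definite. A numpy-only sketch of the standard Cholesky reduction conveys the idea (an illustration of the math for small dense problems, not symeig's actual LAPACK path):

    import numpy as np

    def sym_geneig(A, B):
        # Solve A x = lam B x with A symmetric, B symmetric positive definite.
        L = np.linalg.cholesky(B)            # B = L L^T
        Linv = np.linalg.inv(L)
        C = np.dot(Linv, np.dot(A, Linv.T))  # C = L^-1 A L^-T is symmetric
        lam, Y = np.linalg.eigh(C)           # standard symmetric eigenproblem
        X = np.linalg.solve(L.T, Y)          # map back: x = L^-T y
        return lam, X

LAPACK performs this reduction directly in its *sygv family of routines, which is roughly what symeig exposes.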
From faltet at carabos.com Thu Jan 24 04:27:18 2008 From: faltet at carabos.com (Francesc Altet) Date: Thu, 24 Jan 2008 10:27:18 +0100 Subject: [SciPy-dev] timeseries moved to scikits In-Reply-To: References: Message-ID: <200801241027.18699.faltet@carabos.com> A Thursday 24 January 2008, Bradford Cross escrigué: > congrats matt...i am working with someone now on merging our code > that creates a time series repository using numy + pytables + > timeseries ... ability to store timeseries in pytables as timeseries, > numpy arrays, or ad hoc for heterogeneous arrays... Very interesting. I've followed your discussion about this subject in the PyTables list a couple of months ago. We would like to see your efforts succeed, so, if there is something that we can do on the PyTables side, please tell us about it. Cheers, -- >0,0< Francesc Altet http://www.carabos.com/ V V Cárabos Coop. V. Enjoy Data "-" From wnbell at gmail.com Thu Jan 24 07:53:22 2008 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 24 Jan 2008 06:53:22 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: On Jan 23, 2008 2:46 PM, Matthieu Brucher wrote: > > Is there any concern about this ? If not, it may be time to make it happen ? > I should have some time this weekend to make the changes. You're welcome to start now if you like. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Thu Jan 24 08:16:54 2008 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 24 Jan 2008 07:16:54 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: <47984CE3.3040803@ntc.zcu.cz> References: <47984CE3.3040803@ntc.zcu.cz> Message-ID: On Jan 24, 2008 2:31 AM, Robert Cimrman wrote: > Besides the sparse stuff, I would also like to have support for the > symmetric eigenvalue functions of lapack that are currently in the > symeig package, but are not in scipy.linalg. lobpcg needs them to work > correctly. We should formalize your idea to make a standard "dummy" matrix soon also. How about the name LinearOperator? Our goal with this should be to eliminate all of the code that checks for matvec(), psolve(), etc. I imagine a definition like so: class LinearOperator: def __init__(self, shape, matvec, rmatvec=None, psolve=None): self.shape = shape self.matvec = matvec if rmatvec is not None: self.rmatvec else: def rmatvec(x): raise NotImplementedError('LinearOperator does not define the operation x*A') A question arises when dealing with psolve() (the preconditioner). We could either continue checking for the existence of psolve() in each method, or we could make LinearOperator have a dummy routine psolve(x) -> x and then write the methods so that the preconditioner is always applied. The downside to this approach is that unnecessary copies may be performed. OTOH one could write the method to avoid such problems (at worst, by checking for psolve() as is currently done). Ideas?
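Assembling that sketch into runnable form (with the missing `self.rmatvec = rmatvec` assignment that the reply below supplies, and, as one possible answer to the psolve question, a no-op identity default; that default is an assumption of this sketch, not a decision from the thread):

    class LinearOperator:
        # Matrix-free operator: only shape and matvec are required.
        def __init__(self, shape, matvec, rmatvec=None, psolve=None):
            self.shape = shape
            self.matvec = matvec                 # computes A*x

            if rmatvec is not None:
                self.rmatvec = rmatvec           # computes x*A
            else:
                def rmatvec(x):
                    raise NotImplementedError('LinearOperator does not '
                                              'define the operation x*A')
                self.rmatvec = rmatvec

            if psolve is not None:
                self.psolve = psolve             # applies the preconditioner
            else:
                self.psolve = lambda x: x        # dummy: identity, always callable

Wrapping a dense matrix M is then a one-liner, e.g. LinearOperator(M.shape, matvec=lambda x: numpy.dot(M, x)).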
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From cimrman3 at ntc.zcu.cz Thu Jan 24 09:02:39 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 24 Jan 2008 15:02:39 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: <47984CE3.3040803@ntc.zcu.cz> Message-ID: <47989A7F.1050900@ntc.zcu.cz> Nathan Bell wrote: > On Jan 24, 2008 2:31 AM, Robert Cimrman wrote: > >> Besides the sparse stuff, I would also like to have support for the >> symmetric eigenvalue functions of lapack that are currently in the >> symeig package, but are not in scipy.linalg. lobpcg needs them to work >> correctly. > > We should formalize your idea to make a standard "dummy" matrix soon > also. How about the name LinearOperator? Our goal with this should > be to eliminate all of the code that checks for matvec(), psolve(), > etc. Good name! > I imagine a definition like so: > > class LinearOperator: > def __init__(self, shape, matvec, rmatvec=None,psolve=None): > self.shape = shape > self.matvec = matvec > > if rmatvec is not None: > self.rmatvec self.rmatvec = rmatvec > else: > def rmatvec(x): > raise NotImplementedError('LinearOperator does not > define the operation x*A") > > > A question arises when dealing with psolve() (the preconditioner). We > could either continue checking for the existence of psolve() in each > method, or we could make LinearOperator have a dummy routine psolve(x) > -> x and then write the methods so that the preconditioner is always > applied. > > The downside to this approach is that unnecessary copies may be > performed. OTOH one could write the method to avoid such problems (at > worst, by checking to psolve() as is currently done). Ideas? I am not familiar with internals of scipy.linalg, but looking at iterative.py, it seems to me that functions in it check for psolve and if it is not defined in A, they use the no-op method psolve(x) -> x. So there is a precedent :) IMHO if one must use sparse matrices, a (possible) copy of a vector or two does not matter much, as the main memory+performance hogger is the matrix. -> I would start with the dummy psolve. When it is done, profiling may indicate some better way. A side-note: currently, the iterative solvers are defined as, for example: bicg(A, b, x0=None, tol=1e-5, maxiter=None, xtype=None, callback=None) - the preconditioner is passed in as an attribute of A. This is not too transparent, IMHO. It might be better to use instead bicg(A, b, x0=None, tol=1e-5, maxiter=None, xtype=None, callback=None, precond = None), where both A, precond would be LinearOperator instances. The preconditioning would then be performed by precond.matvec/rmatvec. just my 1.5c r. From wnbell at gmail.com Thu Jan 24 09:21:53 2008 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 24 Jan 2008 08:21:53 -0600 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: <47989A7F.1050900@ntc.zcu.cz> References: <47984CE3.3040803@ntc.zcu.cz> <47989A7F.1050900@ntc.zcu.cz> Message-ID: On Jan 24, 2008 8:02 AM, Robert Cimrman wrote: > currently, the iterative solvers are defined as, for example: > bicg(A, b, x0=None, tol=1e-5, maxiter=None, xtype=None, callback=None) > - the preconditioner is passed in as an attribute of A. This is not too > transparent, IMHO. 
> > It might be better to use instead > bicg(A, b, x0=None, tol=1e-5, maxiter=None, xtype=None, callback=None, > precond = None), > where both A, precond would be LinearOperator instances. The > preconditioning would then be performed by precond.matvec/rmatvec. I agree, that would be better. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From david.huard at gmail.com Thu Jan 24 10:22:31 2008 From: david.huard at gmail.com (David Huard) Date: Thu, 24 Jan 2008 10:22:31 -0500 Subject: [SciPy-dev] timeseries moved to scikits In-Reply-To: References: Message-ID: <91cf711d0801240722w570572a3v9cf946820fe620c4@mail.gmail.com> Hi Bradford, Before putting too much time on our respective side, maybe we should discuss the basic implementation plan to make sure we agree on requirements and the general layout of the code. 1) In my draft, I divided the date sequence, the data and the mask into three separate entities. Do this seem also reasonable to you ? 2) Data is stored in a table for record arrays, and EArrray for simple ndarrays. Using this setup, object arrays cannot be stored. One solution could be to convert them to record arrays and store them in a table with a flag indicating that when loaded, the data should be returned as an object array. 3) Dates are stored in a ISO compliant string EArray. I used EArray because they are enlargeable. This could become handy eventually, but I don't know if there are any counter-indications (file size, load speed, etc) 4) The mask is stored similary to the data. The only exception is that when the mask is scalar (eg. all masked values are False), the mask is simply stored as an attribute of the data. 5) We should decide on how and where attributes are stored (eg. fill_value, frequency). 6) Each array has its own group, so that multiple arrays can be stored in the same file. Although this is flexible, we should think about how to access each individual array, and possibly provide a list of the available arrays within a file. Waiting for your comments, David 2008/1/24, Bradford Cross : > > congrats matt...i am working with someone now on merging our code that > creates a time series repository using numy + pytables + timeseries ... > ability to store timeseries in pytables as timeseries, numpy arrays, or ad > hoc for heterogeneous arrays... > > On Jan 22, 2008 8:28 PM, Matt Knox wrote: > > > The timeseries module has been moved to the scikits svn repository ( > > http://svn.scipy.org/svn/scikits/trunk/timeseries) and removed from the > > sandbox. It installs as a scikits namespace package as per the scikits > > convention. > > > > The maskedarray branch of numpy (only available in svn) is currently > > required for the timeseries scikit. This requirement will go away once the > > maskedarray merging is complete and an official release of numpy has been > > made with the new masked array module. > > > > I'll try to whip up a quick page on the trac site for the timeseries > > module some time this week and port the existing documentation over to > > there. > > > > - Matt > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david.huard at gmail.com Thu Jan 24 10:24:40 2008 From: david.huard at gmail.com (David Huard) Date: Thu, 24 Jan 2008 10:24:40 -0500 Subject: [SciPy-dev] timeseries moved to scikits Message-ID: <91cf711d0801240724s617e4b1dm2fa1551de92c6112@mail.gmail.com> Hum. This was for Bradford. In any case, if anyone has suggestions or ideas about the implementation details, please chime in ! Cheers, David 2008/1/24, David Huard : > > Hi Bradford, > > Before putting too much time on our respective side, maybe we should > discuss the basic implementation plan to make sure we agree on requirements > and the general layout of the code. > > 1) In my draft, I divided the date sequence, the data and the mask into > three separate entities. Do this seem also reasonable to you ? > > 2) Data is stored in a table for record arrays, and EArrray for simple > ndarrays. Using this setup, object arrays cannot be stored. One solution > could be to convert them to record arrays and store them in a table with a > flag indicating that when loaded, the data should be returned as an object > array. > > 3) Dates are stored in a ISO compliant string EArray. I used EArray > because they are enlargeable. This could become handy eventually, but I > don't know if there are any counter-indications (file size, load speed, etc) > > > 4) The mask is stored similary to the data. The only exception is that > when the mask is scalar (eg. all masked values are False), the mask is > simply stored as an attribute of the data. > > 5) We should decide on how and where attributes are stored (eg. > fill_value, frequency). > > 6) Each array has its own group, so that multiple arrays can be stored in > the same file. Although this is flexible, we should think about how to > access each individual array, and possibly provide a list of the available > arrays within a file. > > Waiting for your comments, > > > David > > > > > > > > > > > 2008/1/24, Bradford Cross : > > > > congrats matt...i am working with someone now on merging our code that > > creates a time series repository using numy + pytables + timeseries ... > > ability to store timeseries in pytables as timeseries, numpy arrays, or ad > > hoc for heterogeneous arrays... > > > > On Jan 22, 2008 8:28 PM, Matt Knox < mattknox_ca at hotmail.com> wrote: > > > > > The timeseries module has been moved to the scikits svn repository ( > > > http://svn.scipy.org/svn/scikits/trunk/timeseries) and removed from > > > the sandbox. It installs as a scikits namespace package as per the scikits > > > convention. > > > > > > The maskedarray branch of numpy (only available in svn) is currently > > > required for the timeseries scikit. This requirement will go away once the > > > maskedarray merging is complete and an official release of numpy has been > > > made with the new masked array module. > > > > > > I'll try to whip up a quick page on the trac site for the timeseries > > > module some time this week and port the existing documentation over to > > > there. > > > > > > - Matt > > > > > > _______________________________________________ > > > Scipy-dev mailing list > > > Scipy-dev at scipy.org > > > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > > > > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From faltet at carabos.com Thu Jan 24 11:16:35 2008 From: faltet at carabos.com (Francesc Altet) Date: Thu, 24 Jan 2008 17:16:35 +0100 Subject: [SciPy-dev] timeseries moved to scikits In-Reply-To: <91cf711d0801240724s617e4b1dm2fa1551de92c6112@mail.gmail.com> References: <91cf711d0801240724s617e4b1dm2fa1551de92c6112@mail.gmail.com> Message-ID: <200801241716.35843.faltet@carabos.com> A Thursday 24 January 2008, David Huard escrigué: > Hum. This was for Bradford. In any case, if anyone has suggestions or > ideas about the implementation details, please chime in ! Yeah. I've some > 2008/1/24, David Huard : > > Hi Bradford, > > > > Before putting too much time on our respective side, maybe we > > should discuss the basic implementation plan to make sure we agree > > on requirements and the general layout of the code. > > > > 1) In my draft, I divided the date sequence, the data and the mask > > into three separate entities. Do this seem also reasonable to you ? > > > > 2) Data is stored in a table for record arrays, and EArrray for > > simple ndarrays. Using this setup, object arrays cannot be stored. > > One solution could be to convert them to record arrays and store > > them in a table with a flag indicating that when loaded, the data > > should be returned as an object array. Perhaps another more elegant solution would be to use a VLArray using an ObjectAtom pseudo-atom. > > 3) Dates are stored in a ISO compliant string EArray. I used EArray > > because they are enlargeable. This could become handy eventually, > > but I don't know if there are any counter-indications (file size, > > load speed, etc) Do not expect problems with that, but if you want to be sure that you are getting maximum load speed, you should specify the `expectedrows` argument in EArray constructor (see section 5.1 of manual for an explanation on why). Cheers, -- >0,0< Francesc Altet http://www.carabos.com/ V V Cárabos Coop. V. Enjoy Data "-" From david.huard at gmail.com Thu Jan 24 11:24:26 2008 From: david.huard at gmail.com (David Huard) Date: Thu, 24 Jan 2008 11:24:26 -0500 Subject: [SciPy-dev] timeseries moved to scikits In-Reply-To: <200801241716.35843.faltet@carabos.com> References: <91cf711d0801240724s617e4b1dm2fa1551de92c6112@mail.gmail.com> <200801241716.35843.faltet@carabos.com> Message-ID: <91cf711d0801240824t577546a6xa5916d0630bfb7bd@mail.gmail.com> 2008/1/24, Francesc Altet : > > A Thursday 24 January 2008, David Huard escrigué: > > Hum. This was for Bradford. In any case, if anyone has suggestions or > > ideas about the implementation details, please chime in ! > > Yeah. I've some > > > 2008/1/24, David Huard : > > > Hi Bradford, > > > > > > Before putting too much time on our respective side, maybe we > > > should discuss the basic implementation plan to make sure we agree > > > on requirements and the general layout of the code. > > > > > > 1) In my draft, I divided the date sequence, the data and the mask > > > into three separate entities. Do this seem also reasonable to you ? > > > > > > 2) Data is stored in a table for record arrays, and EArrray for > > > simple ndarrays. Using this setup, object arrays cannot be stored. > > > One solution could be to convert them to record arrays and store > > > them in a table with a flag indicating that when loaded, the data > > > should be returned as an object array. > > Perhaps another more elegant solution would be to use a VLArray using an > ObjectAtom pseudo-atom. Thanks for the tip. I didn't know about those.
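Concretely, the suggestion looks like this in the PyTables 2.x-era API (a minimal sketch; with an ObjectAtom, each append() pickles one arbitrary Python object into one row):

    import tables

    h5 = tables.openFile('objects.h5', mode='w')
    vla = h5.createVLArray(h5.root, 'objs', tables.ObjectAtom())
    vla.append({'date': '2008-01-24', 'value': 3.14})  # any picklable object
    vla.append([1, 2, 3])
    first = vla[0]   # read back: unpickled automatically
    h5.close()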
> > 3) Dates are stored in a ISO compliant string EArray. I used EArray > > > because they are enlargeable. This could become handy eventually, > > > but I don't know if there are any counter-indications (file size, > > > load speed, etc) > > Do not expect problems with that, but if you want to be sure that you > are getting maximum load speed, you should specify the `expectedrows` > argument in EArray constructor (see section 5.1 of manual for an > explanation on why). Ok, thanks. Cheers, > > -- > >0,0< Francesc Altet http://www.carabos.com/ > V V C?rabos Coop. V. Enjoy Data > "-" > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bradford.n.cross at gmail.com Thu Jan 24 12:40:09 2008 From: bradford.n.cross at gmail.com (Bradford Cross) Date: Thu, 24 Jan 2008 09:40:09 -0800 Subject: [SciPy-dev] timeseries moved to scikits In-Reply-To: <91cf711d0801240722w570572a3v9cf946820fe620c4@mail.gmail.com> References: <91cf711d0801240722w570572a3v9cf946820fe620c4@mail.gmail.com> Message-ID: On Jan 24, 2008 7:22 AM, David Huard wrote: > Hi Bradford, > > Before putting too much time on our respective side, maybe we should > discuss the basic implementation plan to make sure we agree on requirements > and the general layout of the code. > > 1) In my draft, I divided the date sequence, the data and the mask into > three separate entities. Do this seem also reasonable to you ? Yes, we really have no reasonable alternative. Homogeneous data can be stored as arrays, mask needs to be stored on its own, and datetime must be converted to ISO string - there is no other good way to store datetime without losing milliseconds, which is unacceptable for many kinds of timeseries (that is also one of the problems with the timeseries package's internal dates, no current sub second implementation.) > > > 2) Data is stored in a table for record arrays, and EArrray for simple > ndarrays. Using this setup, object arrays cannot be stored. One solution > could be to convert them to record arrays and store them in a table with a > flag indicating that when loaded, the data should be returned as an object > array. We can use the VLArray as mentioned in the reply from Francesc. As I did in the first prototype, I think we can provide a nice API based on the Repository pattern that encapsulates our pyTables mapping implementation regardless of whether we are using the timeseries package, user defined time series based on objects, etc. We can try a few things - maybe even the mapper function approach that I took in the first prototype that maps from objects to rows in a pytables - this allows for element-by-element reads/writes and steals the idea from the DataMapper patterns that are often seen in O/R mapping frameworks. > > 3) Dates are stored in a ISO compliant string EArray. I used EArray > because they are enlargeable. This could become handy eventually, but I > don't know if there are any counter-indications (file size, load speed, etc) > Yep, sounds good. Nice tip in response from Francesc. > > > 4) The mask is stored similary to the data. The only exception is that > when the mask is scalar (eg. all masked values are False), the mask is > simply stored as an attribute of the data. Cool. > > > 5) We should decide on how and where attributes are stored (eg. > fill_value, frequency). 
It has been a couple months since I did the first prototype, but I think I recall that the pyTables docs have something about storing metadata associated with a table. > > > 6) Each array has its own group, so that multiple arrays can be stored in > the same file. Although this is flexible, we should think about how to > access each individual array, and possibly provide a list of the available > arrays within a file. I am a fan of the one-timeseries-per-table approach, which can scale nicely to distributed databases for parallelization. One of the cool parts about the initial prototype that I did is that I noticed you can drill into pyTables hierarchically with the same syntax that you drill into the unix file system hierarchy, which makes it easy to scale in and out from pyTables files into distributed pyTables. In this case the hierarchical structure just needs to be laid out in a way that makes sense for the domain that the data comes from, but I am pretty sure that the functions/objects I used in the initial prototype can be generalized for that. It makes accessing each array easy, scaling to a distributed database easy, and providing a list of arrays at any level in the hierarchy easy. > > Waiting for your comments, > > > David > > > > > > > > > > > 2008/1/24, Bradford Cross : > > > congrats matt...i am working with someone now on merging our code that > > creates a time series repository using numy + pytables + timeseries ... > > ability to store timeseries in pytables as timeseries, numpy arrays, or ad > > hoc for heterogeneous arrays... > > > > On Jan 22, 2008 8:28 PM, Matt Knox < mattknox_ca at hotmail.com> wrote: > > > > > The timeseries module has been moved to the scikits svn repository ( > > > http://svn.scipy.org/svn/scikits/trunk/timeseries) and removed from > > > the sandbox. It installs as a scikits namespace package as per the scikits > > > convention. > > > > > > The maskedarray branch of numpy (only available in svn) is currently > > > required for the timeseries scikit. This requirement will go away once the > > > maskedarray merging is complete and an official release of numpy has been > > > made with the new masked array module. > > > > > > I'll try to whip up a quick page on the trac site for the timeseries > > > module some time this week and port the existing documentation over to > > > there. > > > > > > - Matt > > > > > > _______________________________________________ > > > Scipy-dev mailing list > > > Scipy-dev at scipy.org > > > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > > > > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Fri Jan 25 02:55:56 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Fri, 25 Jan 2008 08:55:56 +0100 Subject: [SciPy-dev] The future of the scipy.sandbox and a reminder of upcoming doc-day In-Reply-To: References: Message-ID: 2008/1/24, Nathan Bell : > > On Jan 23, 2008 2:46 PM, Matthieu Brucher > wrote: > > > > Is there any concern about this ? If not, it may be time to make it > happen ? > > > > I should have some time this weekend to make the changes. You're > welcome to start now if you like. 
> Unfortunately I don't think I have commit privileges, so I'm looking forward to see your future changes :) Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From mattknox_ca at hotmail.com Fri Jan 25 09:58:25 2008 From: mattknox_ca at hotmail.com (Matt Knox) Date: Fri, 25 Jan 2008 14:58:25 +0000 (UTC) Subject: [SciPy-dev] scikits binaries, tarballs, etc... Message-ID: Has there been any decisions / discussion regarding a standard location for distributing binaries, tarballs, docs, etc... for various scikits? I see that the mlabwrap scikit has a sourceforge page, and the openopt scikit has a file attached to the wiki page. I would tend to lean towards having a unified sourceforge (or equivalent) page for scikits (similar to what the mingw project has for example: http://sourceforge.net/project/showfiles.php?group_id=2435). Also, while I'm on the topic of scikits... the first line of the scikits front page (http://scipy.org/scipy/scikits/wiki) says "This is a wiki meant for the "hard-hat" area of the SciPy? ToolKits?." To me this makes it sound like a "sand box" or play ground of sorts, which isn't necessarily reflective of all the scikits (well, maybe it is reflective of the current state somewhat, but probably won't be in the future). I'd vote for just chopping off that leading sentence completely because I don't think getting rid of it makes things less clear - and in fact makes it clearer from my point of view. - Matt From oliphant at enthought.com Fri Jan 25 10:39:46 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Fri, 25 Jan 2008 09:39:46 -0600 Subject: [SciPy-dev] Doc-days are today! Come join on irc.freenode.net (#scipy) Message-ID: <479A02C2.2060606@enthought.com> From wnbell at gmail.com Fri Jan 25 15:30:18 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 25 Jan 2008 14:30:18 -0600 Subject: [SciPy-dev] scipy.splinalg or bust Message-ID: I've started migrating code to scipy.splinalg. Completed tasks: (1) linalg.iterative -> splinalg.isolve http://projects.scipy.org/scipy/scipy/changeset/3861 For backwards compatibility scipy.linalg imports all the iterative solvers from splinalg.isolve . This is needed to avoid breaking things like 'from scipy.linalg import cg' What's the best way to inform the user that these functions have moved? Is there a nice way to decorate these functions to issue a warning? Remaining tasks: (1) sandbox.lobpcg -> splinalg.eigen.lobpcg (2) sandbox.arpack -> splinalg.eigen.arpack (3) create splinalg.eigen.speigs (4) scipy.linsolve -> splinalg.dsolve (5) scipy.linsolve.umfpack -> scikit (6) create LinearOperator wrapper class and document it (7) document splinalg.dsolve (8) document splinalg.isolve (9) document splinalg.eigen I will start working on (4) now. Should I remove umfpack in the process? linalg Robert C, can you give us a timeline on (1) and (5)? Can (1) be done without symeig (perhaps using scipy.linalg.eig as a fallback)? Any volunteers for (2) and (3)? 
For reference: http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper Here are the remaining ARPACK tickets: http://projects.scipy.org/scipy/scipy/ticket/231 http://projects.scipy.org/scipy/scipy/ticket/366 http://projects.scipy.org/scipy/scipy/ticket/554 We discussed a plan for (6) in this thread: http://thread.gmane.org/gmane.comp.python.scientific.devel/7070/focus=7285 Any comments/suggestions are most welcome. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From hagberg at lanl.gov Fri Jan 25 16:08:19 2008 From: hagberg at lanl.gov (Aric Hagberg) Date: Fri, 25 Jan 2008 14:08:19 -0700 Subject: [SciPy-dev] scipy.splinalg or bust In-Reply-To: References: Message-ID: <20080125210819.GB3543@bigjim1.lanl.gov> On Fri, Jan 25, 2008 at 02:30:18PM -0600, Nathan Bell wrote: > > Any volunteers for (2) and (3)? For reference: > http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper > Here are the remaining ARPACK tickets: > http://projects.scipy.org/scipy/scipy/ticket/231 > http://projects.scipy.org/scipy/scipy/ticket/366 > http://projects.scipy.org/scipy/scipy/ticket/554 I can look at those ARPACK tickets this weekend and help get the code moved. Aric From wnbell at gmail.com Fri Jan 25 17:45:41 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 25 Jan 2008 16:45:41 -0600 Subject: [SciPy-dev] scipy.splinalg or bust In-Reply-To: <20080125210819.GB3543@bigjim1.lanl.gov> References: <20080125210819.GB3543@bigjim1.lanl.gov> Message-ID: On Jan 25, 2008 3:08 PM, Aric Hagberg wrote: > I can look at those ARPACK tickets this weekend and help get the code moved. > Aric Excellent. I know a lot of people (myself included) look forward to mainline ARPACK support. I've taken care of item (4). For now, UMFPACK support is retained, but this should be removed by 0.7 Let me know if I've broken anything in the process. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From oliphant at enthought.com Fri Jan 25 23:08:52 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Fri, 25 Jan 2008 22:08:52 -0600 Subject: [SciPy-dev] scipy.splinalg or bust In-Reply-To: References: Message-ID: <479AB254.5030004@enthought.com> Nathan Bell wrote: > I've started migrating code to scipy.splinalg. > > Completed tasks: > > (1) linalg.iterative -> splinalg.isolve > http://projects.scipy.org/scipy/scipy/changeset/3861 > > For backwards compatibility scipy.linalg imports all the > iterative solvers from splinalg.isolve . This is needed to > avoid breaking things like 'from scipy.linalg import cg' > What's the best way to inform the user that these functions > have moved? Is there a nice way to decorate these functions > to issue a warning? > Yes. There are decorators in numpy specifically for this purpose. deprecate_with_doc deprecate Thanks so much for tackling this one. I look forward to the improved organization. -Travis O. From jh at physics.ucf.edu Fri Jan 25 23:19:17 2008 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 25 Jan 2008 23:19:17 -0500 Subject: [SciPy-dev] changes to Developer Zone Message-ID: I have reorganized and updated the Developer Zone page. I also edited down some of the text, hopefully without removing any information. My goal is to make clear how we operate, who is doing what, what status of each effort is, what help is needed, and whom to contact to volunteer. It can do all these things with your help. However, there is now a lot of FILL IN to be done. Please search for FILL IN and see if you can do so. 
Some of the names on the old page were several years old and I haven't seen many of those people on the lists for a long time. So, I've put those names at the bottoms of the sections with a note to move them up into the team lists if they are still active. I'll remove them completely in a couple of weeks if they're still there. I've also made notes ("FILL IN") asking teams to fill in information about what they do. Teams, this may be your best recruiting tool, so I hope you'll identify your leader(s) and anyone who has taken on a key area, as well as what it is that you do and how. If this information is somewhere else, such as in Trac, please link it. The more links into actual work areas the better. I've identified Travis as coding lead. I think there is a packaging lead but I don't recall offhand who that is (Jarrod?). Are there "official" doc and web leads? Thanks for your help in whipping the new page into shape. --jh-- From jh at physics.ucf.edu Sat Jan 26 00:20:16 2008 From: jh at physics.ucf.edu (Joe Harrington) Date: Sat, 26 Jan 2008 00:20:16 -0500 Subject: [SciPy-dev] documentation - looking to hire Message-ID: I'd like to use Python to teach my astronomical data analysis class next semester. I tried using it last semester, and it was a big problem because the students could not find docs for the routines. They (and I, it was my first real use of Python) spent many hours at a stretch looking for docs on basic routines that had little or no docstrings. Lots of other things were missing, too, like searchable lists of routines, routines categorized by topic areas, getting-started and user guides, etc. Travis's book was too high-level for this purpose; it's written for a different audience. To use Python next semester, I'd need at least adequate docstrings for all of numpy and certain parts of scipy, and for some of the above-mentioned lists to be made, ideally automatically. I'm looking for options to hire one or more people to do the job, under some guidance from me and in concert with the community efforts. This could be as a direct hire or on a subcontract to Enthought or another institution. I have not made any detailed plans for this, but I'm guessing that one full time or two half-time people (a coder and a writer) could do a good-enough first cut at the job, provided they were already competent at numpy/scipy and were good writers. If you are interested in taking on this work, from roughly now until the end of the summer, please email me directly and let me know a bit about yourself and your availability. If you have ideas on how this might be done to best integrate with the community effort, respond to the list. Your collective response will shape my thinking on how to do this best, such as the level at which to hire. If you look at http://scipy.org/Developer_Zone , you'll see that I've written some basic description of the first two needs (which needs editing, but it is late), but that I haven't identified a doc lead nor a doc process. It would be very helpful if someone could describe for me what this process is. Is it as simple as writing a docstring according to http://projects.scipy.org/scipy/numpy/browser/trunk/numpy/doc/HOWTO_DOCUMENT.txt and http://svn.scipy.org/svn/numpy/trunk/numpy/doc/example.py and entering it as a Trac ticket? Is there any review? How about keywords for indexing? What tools exist or are contemplated for searching? 
Is there any tool for extracting all the docstrings and putting them out either as files or in one big file, for searching or reading while not running python (or even being on the web)? Thanks, --jh-- Prof. Joseph Harrington Department of Physics MAP 414 4000 Central Florida Blvd. University of Central Florida Orlando, FL 32816-2385 (407) 823-3416 voice (407) 823-5112 fax (407) 823-2325 physics office jh at physics.ucf.edu From nwagner at iam.uni-stuttgart.de Sat Jan 26 02:19:57 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Sat, 26 Jan 2008 08:19:57 +0100 Subject: [SciPy-dev] scipy.splinalg or bust In-Reply-To: References: Message-ID: On Fri, 25 Jan 2008 14:30:18 -0600 "Nathan Bell" wrote: > I've started migrating code to scipy.splinalg. > > Completed tasks: > > (1) linalg.iterative -> splinalg.isolve > http://projects.scipy.org/scipy/scipy/changeset/3861 > > For backwards compatibility scipy.linalg >imports all the > iterative solvers from splinalg.isolve . > This is needed to > avoid breaking things like 'from scipy.linalg >import cg' > What's the best way to inform the user that >these functions > have moved? Is there a nice way to decorate >these functions > to issue a warning? > > Remaining tasks: > > (1) sandbox.lobpcg -> splinalg.eigen.lobpcg > (2) sandbox.arpack -> splinalg.eigen.arpack > (3) create splinalg.eigen.speigs > (4) scipy.linsolve -> splinalg.dsolve > (5) scipy.linsolve.umfpack -> scikit > (6) create LinearOperator wrapper class and >document it > (7) document splinalg.dsolve > (8) document splinalg.isolve > (9) document splinalg.eigen > > I will start working on (4) now. Should I remove >umfpack in the process? > linalg > Robert C, can you give us a timeline on (1) and (5)? > Can (1) be done > without symeig (perhaps using scipy.linalg.eig as a >fallback)? > > Any volunteers for (2) and (3)? For reference: > http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper > Here are the remaining ARPACK tickets: > http://projects.scipy.org/scipy/scipy/ticket/231 > http://projects.scipy.org/scipy/scipy/ticket/366 > http://projects.scipy.org/scipy/scipy/ticket/554 > > There is another ticket with respect to ARPACK http://projects.scipy.org/scipy/scipy/ticket/418 Cheers, Nils > We discussed a plan for (6) in this thread: > http://thread.gmane.org/gmane.comp.python.scientific.devel/7070/focus=7285 > Any comments/suggestions are most welcome. > > > > -- > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev From nwagner at iam.uni-stuttgart.de Sat Jan 26 02:22:35 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Sat, 26 Jan 2008 08:22:35 +0100 Subject: [SciPy-dev] Ticket #593 Message-ID: Hi all, Please can someone confirm ticket #593. Thanks in advance. Nils From dmitrey.kroshko at scipy.org Sat Jan 26 04:30:57 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Sat, 26 Jan 2008 11:30:57 +0200 Subject: [SciPy-dev] what happened to scikits svn? Message-ID: <479AFDD1.90004@scipy.org> yields svn: PROPFIND request failed on '/svn/scikits/!svn/vcc/default' svn: PROPFIND of '/svn/scikits/!svn/vcc/default': could not connect to server (http://svn.scipy.org) From karol.langner at kn.pl Sat Jan 26 06:28:35 2008 From: karol.langner at kn.pl (Karol M. 
Langner) Date: Sat, 26 Jan 2008 12:28:35 +0100 Subject: [SciPy-dev] Ticket #593 In-Reply-To: References: Message-ID: <200801261228.35222.karol.langner@kn.pl> On Saturday 26 January 2008 08:22, Nils Wagner wrote: > Hi all, > > Please can someone confirm ticket #593. > > Thanks in advance. > > > Nils I tried the test on two 64-bit machines, both running Linux. One has no problem with it, the other gives the same error you posted. What information do you need about the installations? Karol -- written by Karol Langner Sat Jan 26 12:27:26 CET 2008 From dmitrey.kroshko at scipy.org Sat Jan 26 08:01:29 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Sat, 26 Jan 2008 15:01:29 +0200 Subject: [SciPy-dev] svn.scipy.org still down Message-ID: <479B2F29.3030408@scipy.org> AFAIK it was up for several seconds but it's down again for now, at least for several hours. ping svn.scipy.org PING new.scipy.org (216.62.213.231) 56(84) bytes of data. ^C --- new.scipy.org ping statistics --- 362 packets transmitted, 0 received, 100% packet loss, time 361196ms D. From matthew.brett at gmail.com Sat Jan 26 08:14:08 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 26 Jan 2008 13:14:08 +0000 Subject: [SciPy-dev] svn.scipy.org still down In-Reply-To: <479B2F29.3030408@scipy.org> References: <479B2F29.3030408@scipy.org> Message-ID: <1e2af89e0801260514ve4c6d6fp1e2e581be03b4aae@mail.gmail.com> It's been up for me for the last few hours, as it is now... On Jan 26, 2008 1:01 PM, dmitrey wrote: > AFAIK it was up for several seconds but it's down again for now, at > least for several hours. > > ping svn.scipy.org > PING new.scipy.org (216.62.213.231) 56(84) bytes of data. > ^C > --- new.scipy.org ping statistics --- > 362 packets transmitted, 0 received, 100% packet loss, time 361196ms > > D. > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From robince at gmail.com Sat Jan 26 08:14:41 2008 From: robince at gmail.com (Robin) Date: Sat, 26 Jan 2008 13:14:41 +0000 Subject: [SciPy-dev] svn.scipy.org still down In-Reply-To: <479B2F29.3030408@scipy.org> References: <479B2F29.3030408@scipy.org> Message-ID: Ping fails for me too, but I can still access the server through the web: http://svn.scipy.org/svn/numpy/trunk/numpy/doc/example.py for example loads fine. I suspect it's behind a router/firewall that blocks ICMP ping packets. svn update also works, although I do get an error: svn: Failed to add directory 'scipy/linsolve': object of the same name already exists I guess I should delete the existing linsolve directory. Robin On Jan 26, 2008 1:01 PM, dmitrey wrote: > AFAIK it was up for several seconds but it's down again for now, at > least for several hours. > > ping svn.scipy.org > PING new.scipy.org (216.62.213.231) 56(84) bytes of data. > ^C > --- new.scipy.org ping statistics --- > 362 packets transmitted, 0 received, 100% packet loss, time 361196ms > > D. 
> _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From wnbell at gmail.com Sat Jan 26 08:58:48 2008 From: wnbell at gmail.com (Nathan Bell) Date: Sat, 26 Jan 2008 07:58:48 -0600 Subject: [SciPy-dev] svn.scipy.org still down In-Reply-To: References: <479B2F29.3030408@scipy.org> Message-ID: On Jan 26, 2008 7:14 AM, Robin wrote: > svn update also works, although I do get an error: > svn: Failed to add directory 'scipy/linsolve': object of the same name > already exists > > I guess I should delete the existing linsolve directory. Correct. That was my fault :) -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From jre at enthought.com Sun Jan 27 05:57:40 2008 From: jre at enthought.com (J. Ryan Earl) Date: Sun, 27 Jan 2008 04:57:40 -0600 Subject: [SciPy-dev] svn.scipy.org still down In-Reply-To: <479B2F29.3030408@scipy.org> References: <479B2F29.3030408@scipy.org> Message-ID: <479C63A4.7040906@enthought.com> The firewall blocks ICMP packets. This is normal. -ryan dmitrey wrote: > AFAIK it was up for several seconds but it's down again for now, at > least for several hours. > > ping svn.scipy.org > PING new.scipy.org (216.62.213.231) 56(84) bytes of data. > ^C > --- new.scipy.org ping statistics --- > 362 packets transmitted, 0 received, 100% packet loss, time 361196ms > > D. > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From matthieu.brucher at gmail.com Mon Jan 28 03:23:01 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Mon, 28 Jan 2008 09:23:01 +0100 Subject: [SciPy-dev] scipy.splinalg or bust In-Reply-To: <20080125210819.GB3543@bigjim1.lanl.gov> References: <20080125210819.GB3543@bigjim1.lanl.gov> Message-ID: Was it done ? If it was not, I could do it today. Matthieu 2008/1/25, Aric Hagberg : > > On Fri, Jan 25, 2008 at 02:30:18PM -0600, Nathan Bell wrote: > > > > Any volunteers for (2) and (3)? For reference: > > http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper > > Here are the remaining ARPACK tickets: > > http://projects.scipy.org/scipy/scipy/ticket/231 > > http://projects.scipy.org/scipy/scipy/ticket/366 > > http://projects.scipy.org/scipy/scipy/ticket/554 > > I can look at those ARPACK tickets this weekend and help get the code > moved. > Aric > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From cimrman3 at ntc.zcu.cz Mon Jan 28 05:34:26 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 28 Jan 2008 11:34:26 +0100 Subject: [SciPy-dev] scipy.splinalg or bust In-Reply-To: References: Message-ID: <479DAFB2.8070200@ntc.zcu.cz> Nathan Bell wrote: > I've started migrating code to scipy.splinalg. > > Completed tasks: > > (1) linalg.iterative -> splinalg.isolve > http://projects.scipy.org/scipy/scipy/changeset/3861 > > For backwards compatibility scipy.linalg imports all the > iterative solvers from splinalg.isolve . 
This is needed to > avoid breaking things like 'from scipy.linalg import cg' > What's the best way to inform the user that these functions > have moved? Is there a nice way to decorate these functions > to issue a warning? > > Remaining tasks: > > (1) sandbox.lobpcg -> splinalg.eigen.lobpcg > (2) sandbox.arpack -> splinalg.eigen.arpack > (3) create splinalg.eigen.speigs > (4) scipy.linsolve -> splinalg.dsolve > (5) scipy.linsolve.umfpack -> scikit > (6) create LinearOperator wrapper class and document it > (7) document splinalg.dsolve > (8) document splinalg.isolve > (9) document splinalg.eigen > > I will start working on (4) now. Should I remove umfpack in the process? > linalg > Robert C, can you give us a timeline on (1) and (5)? Can (1) be done > without symeig (perhaps using scipy.linalg.eig as a fallback)? Our entire lab is now moving into a new building, my computer currently sits on a table in a room full of boxes and stuff, so I am not sure about a timeline. Moving-man career awaits me. I will try to make a scikit of umfpack asap, keep it functional untill it is done, please. (1) can wait a bit more imho. The problem with using scipy.linalg.eig is that the LOBPCG algorithm does not work correctly with it - originally I was using it but kept getting large numerical errors when comparing step by step with the reference matlab implementation by A. Knyazev, the author of the algorithm. Symeig solved those problems. thanks for tackling this! r. From cimrman3 at ntc.zcu.cz Mon Jan 28 09:43:43 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 28 Jan 2008 15:43:43 +0100 Subject: [SciPy-dev] umfpack scikit Message-ID: <479DEA1F.7070901@ntc.zcu.cz> Hi I am learning about scikits from http://scipy.org/scipy/scikits/wiki/ScikitsForDevelopers. I have tried to check-out the scikits trunk with the following result: svn co http://projects.scipy.org/scipy/scikits/trunk scikits svn: PROPFIND request failed on '/scipy/scikits' svn: PROPFIND of '/scipy/scikits': 200 Ok (http://projects.scipy.org) do I need some special account to do that (I have normal svn access (username rc))? I admit I did not follow all the discussion on scikits very closely, so sorry if I ask something obvious. After I get the sources, I should be able to copy shamelessly the structure of another scikit project to get started :) r. From hagberg at lanl.gov Mon Jan 28 10:48:27 2008 From: hagberg at lanl.gov (Aric Hagberg) Date: Mon, 28 Jan 2008 08:48:27 -0700 Subject: [SciPy-dev] scipy.splinalg or bust In-Reply-To: References: <20080125210819.GB3543@bigjim1.lanl.gov> Message-ID: <20080128154827.GB10459@bigjim1.lanl.gov> On Mon, Jan 28, 2008 at 09:23:01AM +0100, Matthieu Brucher wrote: > Was it done ? If it was not, I could do it today. > > 2008/1/25, Aric Hagberg <[1]hagberg at lanl.gov>: > > I can look at those ARPACK tickets this weekend and help get the code > moved. > Aric I'm working on it. The tickets are fairly easily addressed except for adding more eigensolver modes (shifted etc) and a high-level interface. None of the complex matrix methods work for me on an Intel Mac (with OSX 10.5.1 fink, gfortran) and the tests result in a bus error. I'd like to track that down before I check it into the trunk. It appears at this point to be a LAPACK problem. Are there any known issues with this setup? 
From pav at iki.fi Mon Jan 28 18:37:09 2008
From: pav at iki.fi (Pauli Virtanen)
Date: Tue, 29 Jan 2008 01:37:09 +0200
Subject: [SciPy-dev] Docstrings: scipy.linalg, scipy.integrate.ode, scipy.integrate.odepack
In-Reply-To: References: Message-ID: <1201563429.7396.29.camel@localhost.localdomain>

Hi all,

During the last doc-day, I did some ReSTful cleaning up of the
docstrings in scipy.linalg, and polished up scipy.integrate a bit.

I put the patches on Trac, SciPy tickets #596 and #597. Review is
appreciated, although I attempted to be careful.

There's also Epydoc-generated output at
http://www.iki.fi/pav/tmp/scipy/doc/index.html if you want to check
what the final result looks like. Unfortunately, not all of it is
well-formed ReST, so I (or someone) will need to fix the markup a bit
later. Also, the default Epydoc stylesheet is a bit hard on the eyes.

Without ReST formatting: http://www.iki.fi/pav/tmp/scipy/scipy-part.html

Some questions popped up while writing this:

- Where to put documentation that concerns whole modules, e.g.
  scipy.integrate.ode (currently in the module docstring) or
  scipy.integrate.quadpack (currently printed by quad_explain())?
  Typically these submodule docstrings in scipy are not easily
  accessible, as module contents are imported via "from xxx import *".
  Should they be appended to the main module's docstring?

- Should there be "official" Epydoc-generated API documentation for
  SciPy somewhere?

- Grouping related parameters together sometimes makes sense:

      minx : float
      maxx : float
          Specify that x must lie in the range [minx, maxx]

  However, this is not valid ReST, because in a definition list each
  entry must have its own separate description. What to do?

- Epydoc generates lots of pages that are of no interest to the end
  user. How can the situation be improved?

.. #596 http://scipy.org/scipy/scipy/attachment/ticket/596/scipy-linalg-doc.patch
.. #597 http://scipy.org/scipy/scipy/attachment/ticket/597/scipy-integrate-ode-odepack-doc.patch

--
Pauli Virtanen

From charlesr.harris at gmail.com Tue Jan 29 01:27:54 2008
From: charlesr.harris at gmail.com (Charles R Harris)
Date: Mon, 28 Jan 2008 23:27:54 -0700
Subject: [SciPy-dev] Docstrings: scipy.linalg, scipy.integrate.ode, scipy.integrate.odepack
In-Reply-To: <1201563429.7396.29.camel@localhost.localdomain>
References: <1201563429.7396.29.camel@localhost.localdomain>
Message-ID:

On Jan 28, 2008 4:37 PM, Pauli Virtanen wrote:
> There's also Epydoc-generated output at
> http://www.iki.fi/pav/tmp/scipy/doc/index.html if you want to check
> what the final result looks like. Unfortunately, not all of it is
> well-formed ReST, so I (or someone) will need to fix the markup a bit
> later.

You need to check out the latest version of Epydoc from svn; the older
versions don't handle the new document format. I had the same problem.

Chuck
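One way around the grouped-parameter problem Pauli raises is to give
each parameter its own entry with a short, partly repeated description,
which keeps the definition list valid ReST at the cost of some
verbosity. A hypothetical sketch (the function clip_range and its names
are made up for illustration):

    def clip_range(x, minx, maxx):
        """Clip x to the range [minx, maxx].

        Parameters
        ----------
        minx : float
            Lower end of the range in which x must lie.
        maxx : float
            Upper end of the range in which x must lie.
        """
        return min(max(x, minx), maxx)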
From cimrman3 at ntc.zcu.cz Tue Jan 29 04:26:03 2008
From: cimrman3 at ntc.zcu.cz (Robert Cimrman)
Date: Tue, 29 Jan 2008 10:26:03 +0100
Subject: [SciPy-dev] umfpack scikit
In-Reply-To: <479DEA1F.7070901@ntc.zcu.cz>
References: <479DEA1F.7070901@ntc.zcu.cz>
Message-ID: <479EF12B.90801@ntc.zcu.cz>

Robert Cimrman wrote:
> I have tried to check out the scikits trunk, with the following result:
>
> svn co http://projects.scipy.org/scipy/scikits/trunk scikits
> svn: PROPFIND request failed on '/scipy/scikits'
> svn: PROPFIND of '/scipy/scikits': 200 Ok (http://projects.scipy.org)

So I guessed the correct command/URL:

svn co http://svn.scipy.org/svn/scikits/trunk scikits

but IMHO it should be mentioned on
http://scipy.org/scipy/scikits/wiki/ScikitsForDevelopers. I do not seem
to be able to edit that page.

r.

From wnbell at gmail.com Wed Jan 30 22:06:02 2008
From: wnbell at gmail.com (Nathan Bell)
Date: Wed, 30 Jan 2008 21:06:02 -0600
Subject: [SciPy-dev] scipy.splinalg or bust
In-Reply-To: <479DAFB2.8070200@ntc.zcu.cz>
References: <479DAFB2.8070200@ntc.zcu.cz>
Message-ID:

On Jan 28, 2008 4:34 AM, Robert Cimrman wrote:
> I will try to make a scikit of umfpack ASAP; please keep it functional
> until that is done.

Will do.

I've added LinearOperator and aslinearoperator() to SVN:
http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/splinalg/interface.py

aslinearoperator() accepts all of the things that made sense for the
'A' parameter of the iterative solvers (dense matrices, sparse
matrices, and objects with .shape and .matvec() attributes). Take a
look at it and let me know if you see any issues. I included a dtype
attribute in LinearOperator to take the place of the xtype parameter
of the iterative solvers.

Next on the TODO list:
- update the iterative solvers to use aslinearoperator()/LinearOperator
- replace A.psolve with a precond=M parameter (where M is also a
  LinearOperator)
- clean up the MINRES translation:
  http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/splinalg/isolve/minres.py

And possibly:
- generalize the stopping criteria used by the iterative solvers

--
Nathan Bell
wnbell at gmail.com
http://graphics.cs.uiuc.edu/~wnbell/

From cimrman3 at ntc.zcu.cz Thu Jan 31 10:21:58 2008
From: cimrman3 at ntc.zcu.cz (Robert Cimrman)
Date: Thu, 31 Jan 2008 16:21:58 +0100
Subject: [SciPy-dev] scipy.splinalg or bust
In-Reply-To: References: <479DAFB2.8070200@ntc.zcu.cz>
Message-ID: <47A1E796.5040607@ntc.zcu.cz>

Nathan Bell wrote:
> I've added LinearOperator and aslinearoperator() to SVN:
> http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/splinalg/interface.py
>
> aslinearoperator() accepts all of the things that made sense for the
> 'A' parameter of the iterative solvers (dense matrices, sparse
> matrices, and objects with .shape and .matvec() attributes)

Great! What about allowing LinearOperator.__init__() to take care of
the cases that aslinearoperator() takes care of? Or a
LinearOperator.fromany()/fromAny() static/class method?

cheers,
r.

From wnbell at gmail.com Thu Jan 31 12:29:42 2008
From: wnbell at gmail.com (Nathan Bell)
Date: Thu, 31 Jan 2008 11:29:42 -0600
Subject: [SciPy-dev] scipy.splinalg or bust
In-Reply-To: <47A1E796.5040607@ntc.zcu.cz>
References: <479DAFB2.8070200@ntc.zcu.cz> <47A1E796.5040607@ntc.zcu.cz>
Message-ID:

On Jan 31, 2008 9:21 AM, Robert Cimrman wrote:
> Great! What about allowing LinearOperator.__init__() to take care of
> the cases that aslinearoperator() takes care of? Or a
> LinearOperator.fromany()/fromAny() static/class method?

I considered that approach but decided against it. I like keeping
LinearOperator dead simple, since it's what end users will have to
understand if they want to hook their matrix-like objects up to scipy
solvers. Currently aslinearoperator() is the part that "knows" about
scipy/numpy, while LinearOperator is just an abstract interface.

Another practical issue is that LinearOperator needs to be initialized
with at least two arguments (shape and matvec), while
aslinearoperator() accepts only one. The classmethod approach avoids
this, but I have a (possibly irrational) dislike of classmethods :)

Scipy users should understand asarray(), so I think aslinearoperator()
is a fairly natural extension.

--
Nathan Bell
wnbell at gmail.com
http://graphics.cs.uiuc.edu/~wnbell/
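To make the division of labor discussed above concrete: end users
describe their operator with just a shape and a matvec, while dense
arrays and sparse matrices go through aslinearoperator(). A small
sketch against the in-progress scipy.splinalg layout from this thread;
the module path and constructor details may differ in a released SciPy:

    import numpy as np
    from scipy.splinalg.interface import LinearOperator, aslinearoperator

    n = 5

    # A matrix-free operator: apply a 1-D Laplacian without ever
    # forming the matrix (x is assumed to be a float ndarray).
    def laplacian_matvec(x):
        y = 2.0 * x
        y[:-1] -= x[1:]
        y[1:] -= x[:-1]
        return y

    A = LinearOperator((n, n), matvec=laplacian_matvec)

    # Dense arrays (and sparse matrices) are wrapped, not converted:
    B = aslinearoperator(np.eye(n))

    x = np.ones(n)
    print A.matvec(x)   # [ 1.  0.  0.  0.  1.]
    print B.matvec(x)   # [ 1.  1.  1.  1.  1.]

Once the iterative solvers are updated to call aslinearoperator() on
their 'A' argument, as Nathan's TODO list proposes, either object
should be usable directly as the system matrix.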