From ctw at cogsci.info Wed Apr 2 23:10:40 2008 From: ctw at cogsci.info (Christoph T. Weidemann) Date: Wed, 2 Apr 2008 23:10:40 -0400 Subject: [SciPy-dev] Scipy-dev Digest, Vol 53, Issue 28 In-Reply-To: References: Message-ID: On Thu, Mar 20, 2008 at 00:49:36 +0100, Stéfan van der Walt wrote: > Hi Christoph > > On Wed, Mar 19, 2008 at 6:48 PM, Christoph T. Weidemann wrote: > > I'll try to write a test case. Once I have the code, what's the best > > way for me to submit it? > > For now, please attach it to a ticket. As promised, I've added several new unit tests for the scipy.signal.wavelets.morlet() function. The ticket is here: http://scipy.org/scipy/scipy/ticket/628 Cheers, Christoph From lists at onerussian.com Wed Apr 2 23:21:11 2008 From: lists at onerussian.com (Yaroslav Halchenko) Date: Wed, 2 Apr 2008 23:21:11 -0400 Subject: [SciPy-dev] Google Summer of Code and scipy.learn (another trying) In-Reply-To: <11abf2720803310313w63f5922dl2dfe972ea2ec863b@mail.gmail.com> References: <11abf2720803160541i7ce7d5ebq7c3e1b762d11ff20@mail.gmail.com> <11abf2720803310313w63f5922dl2dfe972ea2ec863b@mail.gmail.com> Message-ID: <20080403032110.GM26722@washoe.rutgers.edu> Hi Anton, Thank you for your interest in PyMVPA ;-) Indeed, PyMVPA itself might not be the best basis to jump off from for scikits.learn, since we see PyMVPA as more of a framework than a toolbox, though as I said earlier, some ideas might be borrowed from it. Please don't get me wrong -- if you think that PyMVPA looks like a good starting point to build on, that is cool, and you are welcome to join the team. If, on the other hand, you develop .learn independently, we will be interested in migrating some of our functionality into the wider-audience .learn and simply relying on importing .learn internally within PyMVPA. Maybe we could have a quick (or not so quick) chat (voice via skype or any other VoIP, plus VNC screen sharing) some time next week.
There is additional discussion happening on the exppsy-pymvpa mailing list, which you are welcome to join as well: http://lists.alioth.debian.org/mailman/listinfo/pkg-exppsy-pymvpa With best regards, Yaroslav On Mon, 31 Mar 2008, Anton Slesarev wrote: > The main problem is how to connect it with your idea to use PyMVPA. I don't > think that it is a good idea to duplicate its functionality. But I hope > we can find an approach to cooperate without writing useless code. > I hope you find a mentor who wants to take this application. -- .-. =------------------------------ /v\ ----------------------------= Keep in touch // \\ (yoh@|www.)onerussian.com Yaroslav Halchenko /( )\ ICQ#: 60653192 Linux User ^^-^^ [175555] From lists at onerussian.com Thu Apr 3 02:06:04 2008 From: lists at onerussian.com (Yaroslav Halchenko) Date: Thu, 3 Apr 2008 02:06:04 -0400 Subject: [SciPy-dev] Google Summer of Code and scipy.learn (another trying) In-Reply-To: References: <11abf2720803160541i7ce7d5ebq7c3e1b762d11ff20@mail.gmail.com> <11abf2720803240241j69b5ec0au1e5dc132949e6be8@mail.gmail.com> <20080324143230.GF26722@washoe.rutgers.edu> <20080324174332.GG26722@washoe.rutgers.edu> Message-ID: <20080403060603.GN26722@washoe.rutgers.edu> > That's what I want, yes. What is the value it contains if not the wx+b > one ? With the shogun interface (branch _tent/sg), for a binary classifier it is. (For multiclass it loses its meaning, since you will have multiple values and the number would depend on the scheme used to perform multiclass classification.) For libsvm, as I said, since it wasn't exposed in the API we just stored probabilities and never got around to fixing it -- but it will be fixed (or will go away by itself if we switch completely over to shogun). -- .-.
=------------------------------ /v\ ----------------------------= Keep in touch // \\ (yoh@|www.)onerussian.com Yaroslav Halchenko /( )\ ICQ#: 60653192 Linux User ^^-^^ [175555] From matthieu.brucher at gmail.com Thu Apr 3 02:22:25 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 3 Apr 2008 08:22:25 +0200 Subject: [SciPy-dev] Google Summer of Code and scipy.learn (another trying) In-Reply-To: <20080403060603.GN26722@washoe.rutgers.edu> References: <11abf2720803160541i7ce7d5ebq7c3e1b762d11ff20@mail.gmail.com> <11abf2720803240241j69b5ec0au1e5dc132949e6be8@mail.gmail.com> <20080324143230.GF26722@washoe.rutgers.edu> <20080324174332.GG26722@washoe.rutgers.edu> <20080403060603.GN26722@washoe.rutgers.edu> Message-ID: 2008/4/3, Yaroslav Halchenko : > > > That's what I want, yes. What is the value it contains if not the > wx+b > > one ? > > With the shogun interface (branch _tent/sg), for a binary classifier it is. > (For multiclass it loses its meaning, since you will have multiple > values and the number would depend on the scheme used to perform > multiclass classification.) For libsvm, as I said, since it wasn't exposed in the API we just > stored probabilities and never got around to fixing it -- but it will be > fixed (or will go away by itself if we switch completely over to shogun) Then I will try today with Shogun :D Thanks a lot, Yarick! Matthieu -- French PhD student Website: http://matthieu-brucher.developpez.com/ Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn: http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed...
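The "wx+b" value discussed in this thread is the linear decision function of an SVM; a toy numpy-only sketch (the weights, bias, and sample below are made up for illustration, not anything produced by PyMVPA, shogun, or libsvm):

```python
import numpy as np

# Hypothetical, hand-picked parameters of a trained linear classifier.
w = np.array([0.5, -1.0])   # weight vector
b = 0.25                    # bias term
x = np.array([2.0, 1.0])    # a test sample

# The w.x + b quantity: for a binary SVM, its sign gives the predicted
# class and its magnitude a (scale-dependent) confidence.
decision_value = np.dot(w, x) + b
predicted_label = 1 if decision_value >= 0 else -1
```

For multiclass schemes (one-vs-one, one-vs-rest) there is one such value per underlying binary problem, which is why a single number loses its meaning, as noted above.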
URL: From matthieu.brucher at gmail.com Thu Apr 3 03:49:51 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 3 Apr 2008 09:49:51 +0200 Subject: [SciPy-dev] Google Summer of Code and scipy.learn (another trying) In-Reply-To: <20080403032110.GM26722@washoe.rutgers.edu> References: <11abf2720803160541i7ce7d5ebq7c3e1b762d11ff20@mail.gmail.com> <11abf2720803310313w63f5922dl2dfe972ea2ec863b@mail.gmail.com> <20080403032110.GM26722@washoe.rutgers.edu> Message-ID: 2008/4/3, Yaroslav Halchenko : > > Hi Anton > > Thank you for your interest in PyMVPA ;-) Indeed, PyMVPA itself might > not be the best basis to jump off from for scikits.learn, since we see > PyMVPA as more of a framework, not a toolbox. As I said earlier, on the > other hand, some ideas might be borrowed from PyMVPA. Please don't get > me wrong -- if you think that PyMVPA looks like a good starting point to > develop on -- that is cool, you are welcome to join the team. On the other > hand, if you develop .learn independently we will be interested in > migrating some of the functionality into the wider-audience .learn and just rely > on importing .learn internally within PyMVPA This would be great; there are a lot of things that could be useful for a toolbox (like the Shogun wrapper). The learn scikit has some good ideas as well, for instance the libsvm wrapper. The wrapper builds the library as well; this way you have the correct version (I had to get libsvm from the Debian repository, as Fedora 8 does not provide the correct one), with the correct patch, and no include/library-dir sweat. > Maybe we could have a quick (or not so quick) chat (voice via > skype or any other VoIP, plus VNC screen sharing) some time next week.
> There is additional discussion happening on the exppsy-pymvpa mailing list, which you are welcome to join as well: > http://lists.alioth.debian.org/mailman/listinfo/pkg-exppsy-pymvpa Matthieu -- French PhD student Website: http://matthieu-brucher.developpez.com/ Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn: http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From millman at berkeley.edu Fri Apr 4 03:35:41 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Fri, 4 Apr 2008 00:35:41 -0700 Subject: [SciPy-dev] NumPy/SciPy sprint in Berkeley next week Message-ID: Hello, I have just organized a little NumPy/SciPy mini-sprint for next week. David Cournapeau is visiting me for a week, and several other developers (Eric Jones, Robert Kern, Peter Wang, Jonathan Taylor, and Karl Young) will be stopping by during the week to work with the Berkeley team (Fernando Perez, Chris Burns, Tom Waite, and me). There may be a few others who will join us as well. I am still working on a preliminary list of topics that I hope we can work on, and I will send it out to the list before Monday. Among other things, I will be trying to push NumPy 1.0.5 out the door. I will send out another announcement as the date draws near, but I hope that some of you will be able to join us next week at irc.freenode.net (channel scipy). Please consider next week an official Bug/Doc week. Once we get NumPy 1.0.5 released, I will start focusing on getting SciPy 0.7.0 released--which means we need to start squashing SciPy bugs as relentlessly as we have been squashing NumPy's bugs during the last month. Thanks to everyone who is working so hard to make NumPy/SciPy the best foundation for scientific and numerical computing.
Cheers, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From jh at physics.ucf.edu Fri Apr 4 21:24:26 2008 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 04 Apr 2008 21:24:26 -0400 Subject: [SciPy-dev] job postings on the web site Message-ID: I've just had a pretty tough search to hire a couple of people who have skill in numpy/scipy. A search on Monster for "numpy" got just one (1) hit out of a few million resumes, yet I know they must be out there. How would people feel about a page on the website where people could post job ads and job-seeker ads for work using, or related to, numerical Python? Or how about just for development work *on* numpy/scipy/matplotlib/etc.? We'd have to find some reasonable set of constraints, but how does the principle of it strike you? This would be a commercial use of the site, and the first, so I didn't want to make a page following the "wiki way" without some feedback. --jh-- From aisaac at american.edu Fri Apr 4 22:08:07 2008 From: aisaac at american.edu (Alan G Isaac) Date: Fri, 4 Apr 2008 22:08:07 -0400 Subject: [SciPy-dev] job postings on the web site In-Reply-To: References: Message-ID: I hope a user response is welcome. After all, users have a strong interest in seeing a growing and successful SciPy community. Not only would it be great to have such a wiki page, but I would hope that once a week someone would scrape any new postings off the page and send them to the NumPy users list. (With a consistent header, so that they can be filtered out by anyone who does not want to see them.)
Cheers, Alan Isaac From robert.kern at gmail.com Fri Apr 4 22:27:04 2008 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 4 Apr 2008 21:27:04 -0500 Subject: [SciPy-dev] job postings on the web site In-Reply-To: References: Message-ID: <3d375d730804041927n14d4579am664062b02095a753@mail.gmail.com> On Fri, Apr 4, 2008 at 8:24 PM, Joe Harrington wrote: > I've just had a pretty tough search to hire a couple of people who > have skill in numpy/scipy. A search on Monster for "numpy" got just > one (1) hit out of a few million resumes, yet I know they must be out > there. How would people feel about a page on the website where people > could post job ads and job-seeker ads for work using or related to > numerical uses of Python? Or how about just for development work *on* > numpy/scipy/matplotlib/etc.? We'd have to find some reasonable set of > constraints, but how does the principle of it strike you? This would > be a commercial use of the site, and the first, so I didn't want to > make a page following the "wiki way" without some feedback. I suspect that most of the time, the page will be mostly empty. Previously, I've stated that it would be permissible to post job announcements to the numpy and scipy mailing lists if they have a [JOB] marker in the subject line (and no one objected, so it's essentially official policy). For the level of activity that I'm estimating, it seems that that would be more appropriate. Fixtures like job pages don't really work unless there is a de minimis amount of activity; if there aren't usually any jobs posted, no one looking for a job checks the page. I suspect that you would get more quality hits by posting to the general Python Job Board than to a numpy-specific one. http://www.python.org/community/jobs/ -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
-- Umberto Eco From millman at berkeley.edu Fri Apr 4 23:17:56 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Fri, 4 Apr 2008 20:17:56 -0700 Subject: [SciPy-dev] job postings on the web site In-Reply-To: <3d375d730804041927n14d4579am664062b02095a753@mail.gmail.com> References: <3d375d730804041927n14d4579am664062b02095a753@mail.gmail.com> Message-ID: On Fri, Apr 4, 2008 at 7:27 PM, Robert Kern wrote: > I suspect that most of the time, the page will be mostly empty. > Previously, I've stated that it would be permissible to post job > announcements to the numpy and scipy mailing lists if they have a > [JOB] marker in the subject line (and no one objected, so it's > essentially official policy). For the level of activity that I'm > estimating, it seems that that would be more appropriate. Fixtures > like job pages don't really work unless if there is a de minimis > amount of activity; if there aren't usually any jobs posted, no one > looking for a job checks it. I suspect that you would get more quality > hits by posting to the general Python Job Board than a numpy-specific > one. > > http://www.python.org/community/jobs/ I agree with Robert on this; I don't think having a job posting page would be of any use (I don't mind if you want to set it up; but there are plenty of useful things you could do for the project instead). I think that sending an email to the mailing lists and posting on the Python Job Board make more sense. I have hired several programmers to work on or with NumPy/SciPy over the last several years. Finding good people is hard and personal networking is the most effective solution. This isn't a NumPy/SciPy problem; finding good technical people is hard in all fields (e.g., sysadmins, network admins, DB admins, scientific programmers, informatics programmers). I tend to use services like Monster or Craigslist, but they are rarely (if ever) useful. Personal contacts always work best. 
For NumPy/SciPy programmers, I would recommend trying the following (at least, these are the types of things I have found the most useful):

1. Become an active member of NumPy/SciPy. Follow the mailing list discussions and code development. Contribute code and documentation yourself. Attend the SciPy conference and let people know what you need help with.

2. Become active in your local Python group. This will put you in touch with local Python programmers at least.

3. Form a local NumPy/SciPy community. This has been, by far, the most useful thing I have done. Host coding sprints, meetings, or hands-on workshops (e.g., invite Eric and Travis or Fernando and John Hunter to give one of their introductions to scientific programming with Python).

Each of these activities brings in different groups of people (the sprints and meetings will increase your exposure to the NumPy/SciPy developers, while the workshops will introduce you to developers/scientists who are starting to use Python for scientific development). An important side-effect of these activities is that they also give you the opportunity to develop local NumPy/SciPy expertise as well as help build the community. When I host sprints or meetings at Berkeley or elsewhere, I often meet new people whom I may work with in the future. I would be interested in hearing what other people have done to find good NumPy/SciPy programmers. By the way, Joe, did you post both jobs to the list? I only remember one job posting of yours. Did I miss the other? -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From ctw at cogsci.info Sun Apr 6 15:51:21 2008 From: ctw at cogsci.info (Christoph T. Weidemann) Date: Sun, 6 Apr 2008 15:51:21 -0400 Subject: [SciPy-dev] bug(s) in scipy.stats Message-ID: Hi all!
I just submitted the following ticket but wanted to summarize its main message here, because it raises some issues of general concern: http://scipy.org/scipy/scipy/ticket/631 There is currently a bug in the scipy.stats.ttest_1samp function that stems from the fact that it assumes scipy.stats.var defaults to axis=None (like the numpy.var function) when it really defaults to axis=0. The function scipy.stats.obrientransform might also be affected by this bug. Of course, this could be easily fixed by specifying the axis keyword in the calls to scipy.stats.var, but I think it is not a good idea to have multiple functions with the same name but different interfaces that calculate (slightly) different things. I think it would be much better if all var functions (in numpy and scipy) had the same interface, with a boolean keyword to determine whether they return the biased or the unbiased variance (with identical defaults). Of course, changing the interface (and default behavior) of a basic function such as scipy.stats.var is likely to break code, so it would require some thought about how best to achieve more consistency. I'd like to hear what people here think about this issue. Best, Christoph PS: The ticket raises some more specific concerns with the code for scipy.stats.ttest_1samp, but I thought I'd focus on the scipy.stats.var function in this message. From vshah at interactivesupercomputing.com Sun Apr 6 23:22:06 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Sun, 6 Apr 2008 20:22:06 -0700 Subject: [SciPy-dev] Sparse matrix indexing. Message-ID: <660143EE-C355-4CEB-8BB6-57A9D549B957@interactivesupercomputing.com> Hello, We are working to convert Circuitscape (a landscape ecology tool) from MATLAB to Python. http://www.nceas.ucsb.edu/~mcrae/circuitscape.html It exercises a lot of sparse array functionality (indexing, assignment, concatenation, MATLAB-style sparse()/find(), and sparse linear algebra).
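The axis-default mismatch in the scipy.stats.var discussion above is easy to see with numpy alone; a minimal sketch of the pitfall (using numpy.var, whose axis=None default contrasts with the axis=0 default the ticket describes for scipy.stats.var):

```python
import numpy as np

x = np.array([[1.0, 2.0],
              [3.0, 5.0]])

# axis=None (numpy's default): flatten first, return a single variance.
v_flat = np.var(x)              # variance of [1, 2, 3, 5]

# axis=0 (the old scipy.stats.var default): one variance per column.
v_cols = np.var(x, axis=0)

# numpy expresses the biased/unbiased choice via ddof:
# ddof=0 divides by n (biased), ddof=1 by n-1 (unbiased).
v_unbiased = np.var(x, axis=0, ddof=1)
```

A caller who assumes the numpy semantics gets a scalar where the other function returns a per-column array, which is exactly the kind of silent breakage the ticket describes.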
I am new to Numpy/Scipy, and I'd be delighted to figure out if there is a better way to do sparse matrix indexing. The general array indexing didn't seem to work for arbitrary index sets, at least in the latest release. I did come across this on scipy-dev, but it seems like it works only for CSR? http://article.gmane.org/gmane.comp.python.scientific.devel/6702/match=sparse+indexing I currently use a wrapper that implements general sparse matrix indexing and row/col deletion using sparse matrix multiplication. Deletion is then implemented by simply taking arange(0,len), removing the indices you want to delete, and then calling the indexing code. This may be a useful way to complete the functionality for data structures that do not have full indexing yet. I'm not sure if this can be made to handle slices correctly. This is the code snippet I use. If it's useful, I can help make it robust for inclusion.

    # Implement B = A[I, J]
    from numpy import ones, arange, c_
    from scipy import sparse

    def subsref(self, A, I, J):
        nr = A.shape[0]
        nc = A.shape[1]
        nI = len(I)
        nJ = len(J)
        # Selection matrices: IM picks rows, JM picks columns.
        # (Newer scipy versions spell the dims keyword "shape".)
        IM = sparse.csc_matrix((ones(nI), c_[arange(0, nI), I].T), dims=[nI, nr])
        JM = sparse.csc_matrix((ones(nJ), c_[J, arange(0, nJ)].T), dims=[nc, nJ])
        B = IM * A * JM
        return B

-viral From wnbell at gmail.com Sun Apr 6 23:35:27 2008 From: wnbell at gmail.com (Nathan Bell) Date: Sun, 6 Apr 2008 22:35:27 -0500 Subject: [SciPy-dev] Sparse matrix indexing. In-Reply-To: <660143EE-C355-4CEB-8BB6-57A9D549B957@interactivesupercomputing.com> References: <660143EE-C355-4CEB-8BB6-57A9D549B957@interactivesupercomputing.com> Message-ID: On Sun, Apr 6, 2008 at 10:22 PM, Viral Shah wrote: > I did come across this on scipy-dev, but it seems like it works > only for CSR? > http://article.gmane.org/gmane.comp.python.scientific.devel/6702/match=sparse+indexing Viral, thanks for your interest in scipy.sparse. Both csr_matrix and csc_matrix now support fancy indexing (in the current developer's version).
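For reference, with a scipy.sparse that supports fancy indexing, the MATLAB-style submatrix B = A[I, J] (all row/column combinations, not the element list) can be obtained by chained row and column selection; a small sketch of the idea, not the original subsref() wrapper:

```python
import numpy as np
from scipy import sparse

A = sparse.csr_matrix(np.arange(16.0).reshape(4, 4))
I = [0, 2]   # row indices to keep
J = [1, 3]   # column indices to keep

# Select rows first, then columns; each step returns a sparse matrix.
B = A[I, :][:, J]
dense = B.toarray()   # rows 0 and 2 restricted to columns 1 and 3
```

Row/column deletion then falls out the same way: build I (or J) as arange(n) with the unwanted indices removed.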
http://article.gmane.org/gmane.comp.python.scientific.devel/7392 FWIW, the internals of csr_matrix.__getitem__ use an approach similar to your subsref() function, so their efficiency should be comparable. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From millman at berkeley.edu Mon Apr 7 06:00:44 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 7 Apr 2008 03:00:44 -0700 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: <200712240856.08804.faltet@carabos.com> References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca> <200712240856.08804.faltet@carabos.com> Message-ID: Hello, I would like to remove numexpr from the scipy sandbox now that it is a stand-alone project: http://code.google.com/p/numexpr/ If there are any problems with this, please let me know in the next couple of days. Otherwise I am going to go ahead and clean it up. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From millman at berkeley.edu Mon Apr 7 06:56:14 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 7 Apr 2008 03:56:14 -0700 Subject: [SciPy-dev] Status of sandbox.spline and my other code. In-Reply-To: <3a1077e70706100955p4a705acy3b26d9cf9cf0298b@mail.gmail.com> References: <3a1077e70706100955p4a705acy3b26d9cf9cf0298b@mail.gmail.com> Message-ID: On Sun, Jun 10, 2007 at 9:55 AM, John Travers wrote: > 1. sandbox.spline stuff > The module here was supposed to be a tidy-up of scipy.interpolate. All > of the dierckx functionality has been moved to f2py wrappers, which I > think makes maintenance easier. However, it does not add any new > functionality, and in retrospect appears to have been a waste of my > time.
> So I'll leave it to you to decide if you want to integrate it into > scipy.interpolate or not; though it was originally planned to be a > separate module to clear up the ambiguity between interpolation and > smoothing-spline functionality. > However, I have added 20 unit tests to the code (most of which simply > check the wrapper against the pure Fortran output, but still useful I > think) which could easily be moved over to the current > scipy.interpolate module. Hey John, I am hosting a NumPy/SciPy sprint this week at UC Berkeley and, among other things, am hoping to make some progress on cleaning up the various interpolation/spline code in SciPy as well as continue removing the scipy sandbox. Do you have any more plans for your sandbox.spline code? Any thoughts on where it should go or what still needs to be done? Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From matthieu.brucher at gmail.com Mon Apr 7 08:19:44 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Mon, 7 Apr 2008 14:19:44 +0200 Subject: [SciPy-dev] Creating a swig C++ wrapper with numpy.distutils Message-ID: Hi, I'd like to create an extension with SWIG. By default, the extension is a C one, but I need to create a C++ one with swig, that is, I need to add the -c++ flag to the swig command line. With setuptools, I just have to add the argument options={'build_ext':{'swig_opts':'-c++'}} to the setup() command. Is there something like this in numpy.distutils? If yes, is it possible to add the option to only one extension with the configuration tool? Matthieu -- French PhD student Website: http://matthieu-brucher.developpez.com/ Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn: http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed...
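A hypothetical setup.py sketch of the per-extension approach asked about above (the package and file names are invented, and it assumes Configuration.add_extension forwards the distutils-style swig_opts keyword to the underlying Extension object):

```python
# Hypothetical setup.py: 'mypkg', '_mymod', and the source file names
# are placeholders, not real modules.
def configuration(parent_package='', top_path=None):
    from numpy.distutils.misc_util import Configuration
    config = Configuration('mypkg', parent_package, top_path)
    # swig_opts applies only to this extension, so other extensions
    # in the same configuration keep the default (C-mode) swig.
    config.add_extension('_mymod',
                         sources=['mymod.i', 'mymod_impl.cpp'],
                         swig_opts=['-c++'])
    return config

if __name__ == '__main__':
    from numpy.distutils.core import setup
    setup(configuration=configuration)
```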
URL: From aisaac at american.edu Mon Apr 7 09:53:29 2008 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 7 Apr 2008 09:53:29 -0400 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca><200712240856.08804.faltet@carabos.com> Message-ID: On Mon, 7 Apr 2008, Jarrod Millman apparently wrote: > I would like remove numexpr from the scipy sandbox now > that it is a stand alone project: > http://code.google.com/p/numexpr/ Out of curiosity, why is it not a scikit, and how are new users supposed to find it now? (I suppose David and Tim have already discussed this.) Cheers, Alan Isaac From ondrej at certik.cz Mon Apr 7 10:03:12 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 7 Apr 2008 16:03:12 +0200 Subject: [SciPy-dev] 0.7.0 release Message-ID: <85b5c3130804070703q42714e89he9522e378cb1239e@mail.gmail.com> Hi, what are the release plans for 0.7.0? SciPy svn contains a lot of new improvements (especially the sparse solvers) that I need. I tried to create a Debian package from the scipy svn version, just to see if it works and it seems to be ok. However, when I run tests, I got a lot of errors (see below). Are those known problems, or just Debian related? Running on: In [2]: import scipy In [3]: scipy.__version__ Out[3]: '0.7.0.dev4087' Ondrej ------------------ ondra at fuji:~/debian/packages/result$ ipython Python 2.4.5 (#2, Mar 12 2008, 00:15:51) Type "copyright", "credits" or "license" for more information. IPython 0.8.2 -- An enhanced Interactive Python. ? -> Introduction and overview of IPython's features. %quickref -> Quick reference. help -> Python's own help system. object? -> Details about 'object'. ?object also works, ?? prints more. 
In [1]: import scipy In [2]: scipy.test() /usr/lib/python2.4/site-packages/scipy/linsolve/__init__.py:4: DeprecationWarning: scipy.linsolve has moved to scipy.sparse.linalg.dsolve warn('scipy.linsolve has moved to scipy.sparse.linalg.dsolve', DeprecationWarning) /usr/lib/python2.4/site-packages/scipy/splinalg/__init__.py:3: DeprecationWarning: scipy.splinalg has moved to scipy.sparse.linalg warn('scipy.splinalg has moved to scipy.sparse.linalg', DeprecationWarning) E.............EE...caxpy:n=4 ..caxpy:n=3 ....ccopy:n=4 ..ccopy:n=3 .............cscal:n=4 ....cswap:n=4 ..cswap:n=3 .....daxpy:n=4 ..daxpy:n=3 ....dcopy:n=4 ..dcopy:n=3 .............dscal:n=4 ....dswap:n=4 ..dswap:n=3 .....saxpy:n=4 ..saxpy:n=3 ....scopy:n=4 ..scopy:n=3 .............sscal:n=4 ....sswap:n=4 ..sswap:n=3 .....zaxpy:n=4 ..zaxpy:n=3 ....zcopy:n=4 ..zcopy:n=3 .............zscal:n=4 ....zswap:n=4 ..zswap:n=3 ..EEE............................................................................................................................................................................................................................................................................................................................................................................................................/usr/lib/python2.4/site-packages/scipy/ndimage/_segmenter.py:30: UserWarning: The segmentation code is under heavy development and therefore the public API will change in the future. The NIPY group is actively working on this code, and has every intention of generalizing this for the Scipy community. Use this module minimally, if at all, until it this warning is removed. 
warnings.warn(_msg, UserWarning) .................................EE..............................................................................................................................................................................................................................................................................................................................0.2 0.2 0.2 ......0.2 ..0.2 0.2 0.2 0.2 0.2 ...................EE...warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations ...warning: specified build_dir '..' does not exist or is not writable. Trying default locations ..warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations ...warning: specified build_dir '..' does not exist or is not writable. Trying default locations ............................building extensions here: /home/ondra/.python24_compiled/m0 .Numpy is installed in /usr/lib/python2.4/site-packages/numpy Numpy version 1.0.4 Python version 2.4.5 (#2, Mar 12 2008, 00:15:51) [GCC 4.2.3 (Debian 4.2.3-2)] Found 10/10 tests for numpy.core.defmatrix Found 36/36 tests for numpy.core.ma Found 223/223 tests for numpy.core.multiarray Found 62/62 tests for numpy.core.numeric Found 31/31 tests for numpy.core.numerictypes Found 12/12 tests for numpy.core.records Found 6/6 tests for numpy.core.scalarmath Found 14/14 tests for numpy.core.umath Found 5/5 tests for numpy.distutils.misc_util Found 1/1 tests for numpy.fft.fftpack Found 3/3 tests for numpy.fft.helper Found 9/9 tests for numpy.lib.arraysetops Found 46/46 tests for numpy.lib.function_base Found 5/5 tests for numpy.lib.getlimits Found 4/4 tests for numpy.lib.index_tricks Found 3/3 tests for numpy.lib.polynomial Found 49/49 tests for numpy.lib.shape_base Found 15/15 tests for numpy.lib.twodim_base Found 43/43 tests for numpy.lib.type_check Found 1/1 tests for numpy.lib.ufunclike Found 40/40 tests for numpy.linalg Found 2/2 tests for 
numpy.random Found 0/0 tests for __main__ ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. ---------------------------------------------------------------------- Ran 685 tests in 0.623s OK .............................................................................Numpy is installed in /usr/lib/python2.4/site-packages/numpy Numpy version 1.0.4 Python version 2.4.5 (#2, Mar 12 2008, 00:15:51) [GCC 4.2.3 (Debian 4.2.3-2)] Found 10/10 tests for numpy.core.defmatrix Found 36/36 tests for numpy.core.ma Found 223/223 tests for numpy.core.multiarray Found 62/62 tests for numpy.core.numeric Found 31/31 tests for numpy.core.numerictypes Found 12/12 tests for numpy.core.records Found 6/6 tests for numpy.core.scalarmath Found 14/14 tests for numpy.core.umath Found 5/5 tests for numpy.distutils.misc_util Found 1/1 tests for numpy.fft.fftpack Found 3/3 tests for numpy.fft.helper Found 9/9 tests for numpy.lib.arraysetops Found 46/46 tests for numpy.lib.function_base Found 5/5 tests for numpy.lib.getlimits Found 4/4 tests for numpy.lib.index_tricks Found 3/3 tests for numpy.lib.polynomial Found 49/49 tests for numpy.lib.shape_base Found 15/15 tests for numpy.lib.twodim_base Found 43/43 tests for numpy.lib.type_check Found 1/1 tests for numpy.lib.ufunclike Found 40/40 tests for numpy.linalg Found 2/2 tests for numpy.random Found 0/0 tests for __main__ 
............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. ---------------------------------------------------------------------- Ran 685 tests in 0.493s OK ....................Numpy is installed in /usr/lib/python2.4/site-packages/numpy Numpy version 1.0.4 Python version 2.4.5 (#2, Mar 12 2008, 00:15:51) [GCC 4.2.3 (Debian 4.2.3-2)] Found 10/10 tests for numpy.core.defmatrix Found 36/36 tests for numpy.core.ma Found 223/223 tests for numpy.core.multiarray Found 62/62 tests for numpy.core.numeric Found 31/31 tests for numpy.core.numerictypes Found 12/12 tests for numpy.core.records Found 6/6 tests for numpy.core.scalarmath Found 14/14 tests for numpy.core.umath Found 5/5 tests for numpy.distutils.misc_util Found 1/1 tests for numpy.fft.fftpack Found 3/3 tests for numpy.fft.helper Found 9/9 tests for numpy.lib.arraysetops Found 46/46 tests for numpy.lib.function_base Found 5/5 tests for numpy.lib.getlimits Found 4/4 tests for numpy.lib.index_tricks Found 3/3 tests for numpy.lib.polynomial Found 49/49 tests for numpy.lib.shape_base Found 15/15 tests for numpy.lib.twodim_base Found 43/43 tests for numpy.lib.type_check Found 1/1 tests for numpy.lib.ufunclike Found 40/40 tests for numpy.linalg Found 2/2 tests for numpy.random Found 0/0 tests for __main__ 
.............................................................................................................................................
----------------------------------------------------------------------
Ran 685 tests in 0.513s

OK
.
======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/integrate/tests/test_integrate.py", line 9, in ?
    from scipy.linalg import norm
  File "/usr/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 13, in ?
    from iterative import *
  File "/usr/lib/python2.4/site-packages/scipy/linalg/iterative.py", line 5, in ?
    from scipy.sparse.linalg import isolve
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/interpolate/__init__.py", line 7, in ?
    from interpolate import *
  File "/usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py", line 13, in ?
    import scipy.linalg as slin
  File "/usr/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 13, in ?
    from iterative import *
  File "/usr/lib/python2.4/site-packages/scipy/linalg/iterative.py", line 5, in ?
    from scipy.sparse.linalg import isolve
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: ImportError (cannot import name deprecate_with_doc)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/io/__init__.py", line 7, in ?
    from numpy import deprecate_with_doc
ImportError: cannot import name deprecate_with_doc

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 13, in ?
    from iterative import *
  File "/usr/lib/python2.4/site-packages/scipy/linalg/iterative.py", line 5, in ?
    from scipy.sparse.linalg import isolve
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/linsolve/__init__.py", line 6, in ?
    from scipy.sparse.linalg.dsolve import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/maxentropy/__init__.py", line 9, in ?
    from maxentropy import *
  File "/usr/lib/python2.4/site-packages/scipy/maxentropy/maxentropy.py", line 74, in ?
    from scipy import optimize, sparse
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/signal/__init__.py", line 11, in ?
    from ltisys import *
  File "/usr/lib/python2.4/site-packages/scipy/signal/ltisys.py", line 9, in ?
    import scipy.interpolate as interpolate
  File "/usr/lib/python2.4/site-packages/scipy/interpolate/__init__.py", line 7, in ?
    from interpolate import *
  File "/usr/lib/python2.4/site-packages/scipy/interpolate/interpolate.py", line 13, in ?
    import scipy.linalg as slin
  File "/usr/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 13, in ?
    from iterative import *
  File "/usr/lib/python2.4/site-packages/scipy/linalg/iterative.py", line 5, in ?
    from scipy.sparse.linalg import isolve
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/splinalg/__init__.py", line 5, in ?
    from scipy.sparse.linalg import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

======================================================================
ERROR: Failure: TypeError (deprecate() takes exactly 3 arguments (1 given))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/loader.py", line 351, in loadTestsFromName
    module = self.importer.importFromPath(
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/ondra/lib/lib/python/nose-0.10.2.dev_r442-py2.4.egg/nose/importer.py", line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib/python2.4/site-packages/scipy/stats/__init__.py", line 7, in ?
    from stats import *
  File "/usr/lib/python2.4/site-packages/scipy/stats/stats.py", line 192, in ?
    import scipy.linalg as linalg
  File "/usr/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 13, in ?
    from iterative import *
  File "/usr/lib/python2.4/site-packages/scipy/linalg/iterative.py", line 5, in ?
    from scipy.sparse.linalg import isolve
  File "/usr/lib/python2.4/site-packages/scipy/sparse/__init__.py", line 5, in ?
    from base import *
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 45, in ?
    class spmatrix(object):
  File "/usr/lib/python2.4/site-packages/scipy/sparse/base.py", line 139, in spmatrix
    @deprecate
TypeError: deprecate() takes exactly 3 arguments (1 given)

----------------------------------------------------------------------
Ran 1063 tests in 9.364s

FAILED (errors=10)

In [3]:

Ondrej

From millman at berkeley.edu Mon Apr 7 11:46:01 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 7 Apr 2008 08:46:01 -0700 Subject: [SciPy-dev] 0.7.0 release In-Reply-To: <85b5c3130804070703q42714e89he9522e378cb1239e@mail.gmail.com> References: <85b5c3130804070703q42714e89he9522e378cb1239e@mail.gmail.com> Message-ID:

On Mon, Apr 7, 2008 at 7:03 AM, Ondrej Certik wrote:
> what are the release plans for 0.7.0? SciPy svn contains a lot of new improvements (especially the sparse solvers) that I need. I tried to create a Debian package from the scipy svn version, just to see if it works, and it seems to be ok. However, when I run tests, I get a lot of errors (see below).

I have been busy trying to get NumPy 1.0.5 out, which I hope will happen this week. As soon as that is released I will turn my attention to SciPy. SciPy 0.7 requires NumPy 1.0.5, so you may need to use the NumPy trunk. As far as I am concerned, getting SciPy 0.7 out will mostly require making sure that all the tests run and that there are no other regressions. -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/

From millman at berkeley.edu Mon Apr 7 11:52:03 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 7 Apr 2008 08:52:03 -0700 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca> <200712240856.08804.faltet@carabos.com> Message-ID:

On Mon, Apr 7, 2008 at 6:53 AM, Alan G Isaac wrote:
> Out of curiosity, why is it not a scikit, and how are new users supposed to find it now?
> (I suppose David and Tim have already discussed this.)

As far as I recall, the only discussion has been this thread: http://projects.scipy.org/pipermail/scipy-dev/2007-December/008103.html I found it by searching for "numexpr google code". I included the link in my original email so that it would be easier to find. -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/

From aisaac at american.edu Mon Apr 7 12:17:02 2008 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 7 Apr 2008 12:17:02 -0400 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca><200712240856.08804.faltet@carabos.com> Message-ID:

On Mon, 7 Apr 2008, Jarrod Millman apparently wrote:
> http://projects.scipy.org/pipermail/scipy-dev/2007-December/008103.html

Oh yes, I had forgotten about that. A user's perspective: 1. I do not accept the "not 'sciency' enough" argument. 2. I understand the need for separate/separable bug databases, and this seems like a general problem to be addressed for scikits. I would hate to see scikits start spinning off to new locations based on this consideration. 3. I think a tendency for projects to "cluster" around SciPy is very healthy for SciPy as a whole. At the *very* least I would hope to see such projects list themselves at the (nonexistent!) page http://www.scipy.org/SciPyPackages. By the way, will need to be changed ...
Cheers, Alan Isaac

From ondrej at certik.cz Mon Apr 7 12:35:20 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 7 Apr 2008 18:35:20 +0200 Subject: [SciPy-dev] 0.7.0 release In-Reply-To: References: <85b5c3130804070703q42714e89he9522e378cb1239e@mail.gmail.com> Message-ID: <85b5c3130804070935y73cdd315oa71106592f05d587@mail.gmail.com>

On Mon, Apr 7, 2008 at 5:46 PM, Jarrod Millman wrote:
> On Mon, Apr 7, 2008 at 7:03 AM, Ondrej Certik wrote:
> > what are the release plans for 0.7.0? SciPy svn contains a lot of new improvements (especially the sparse solvers) that I need. I tried to create a Debian package from the scipy svn version, just to see if it works, and it seems to be ok. However, when I run tests, I get a lot of errors (see below).
>
> I have been busy trying to get NumPy 1.0.5 out, which I hope will happen this week. As soon as that is released I will turn my attention to SciPy. SciPy 0.7 requires NumPy 1.0.5, so you may need to use the NumPy trunk.

Ah, that's it. I just had to comment out the @deprecate decorators and it started to work for me -- so I guess those were changed in numpy 1.0.5.

> As far as I am concerned, getting SciPy 0.7 out will mostly require making sure that all the tests run and that there are no other regressions.

Yes, thanks for your work on it. Let's get numpy 1.0.5 out, then into Debian and other distributions, see if all is ok, and then fix scipy. Ondrej

From millman at berkeley.edu Mon Apr 7 12:36:42 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 7 Apr 2008 09:36:42 -0700 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca> <200712240856.08804.faltet@carabos.com> Message-ID:

On Mon, Apr 7, 2008 at 9:17 AM, Alan G Isaac wrote:
> 1. I do not accept the "not 'sciency' enough" argument.

I can't really speak for the authors, but I would guess they were looking for a large audience.
Rather than numexpr being rejected from the scikits namespace as "not 'sciency' enough", the authors wanted their project not to appear to be restricted to scientific uses. You will need to convince the authors that their code should be a scikit rather than a stand-alone project. Personally, I think it makes complete sense as a stand-alone project, but I would be happy to see it as a scikit as well.

> 2. I understand the need for separate/separable bug databases, and this seems like a general problem to be addressed for scikits. I would hate to see scikits start spinning off to new locations based on this consideration.

As far as I can tell, this doesn't seem to have been an issue for them. I believe this is more of an issue for the OpenOpt project.

> 3. I think a tendency for projects to "cluster" around SciPy is very healthy for SciPy as a whole. At the *very* least I would hope to see such projects list themselves at the (nonexistent!) page http://www.scipy.org/SciPyPackages.

Go for it! I would prefer you call the page http://www.scipy.org/SciKits. Even better would be to spend some time working on the scikits' project page and then adding a link to the scipy page somewhere. That way you won't have to keep two pages in sync. This is much-needed work: it would go a long way toward giving the scikits more publicity, and it would be much appreciated.

> By the way, will need to be changed ...

It is a wiki, so go ahead and update it yourself. (I am assuming the authors haven't added access restrictions; if they have, you should send them an email directly with your suggested changes.)
Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/

From millman at berkeley.edu Mon Apr 7 12:38:54 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 7 Apr 2008 09:38:54 -0700 Subject: [SciPy-dev] 0.7.0 release In-Reply-To: <85b5c3130804070935y73cdd315oa71106592f05d587@mail.gmail.com> References: <85b5c3130804070703q42714e89he9522e378cb1239e@mail.gmail.com> <85b5c3130804070935y73cdd315oa71106592f05d587@mail.gmail.com> Message-ID:

On Mon, Apr 7, 2008 at 9:35 AM, Ondrej Certik wrote:
> > As far as I am concerned, getting SciPy 0.7 out will mostly require making sure that all the tests run and that there are no other regressions.
>
> Yes, thanks for your work on it. Let's get numpy 1.0.5 out, then into Debian and other distributions, see if all is ok, and then fix scipy.

Hey Ondrej, I will need some help verifying that NumPy 1.0.5 builds correctly on Debian later this week. So, hopefully, you will have a little free time! Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/

From guyer at nist.gov Mon Apr 7 13:07:20 2008 From: guyer at nist.gov (Jonathan Guyer) Date: Mon, 7 Apr 2008 13:07:20 -0400 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca> <200712240856.08804.faltet@carabos.com> Message-ID: <47394214-82AD-4221-A59F-BB641B5DD515@nist.gov>

On Apr 7, 2008, at 12:36 PM, Jarrod Millman wrote:
> I can't really speak for the authors, but I would guess they were looking for a large audience. Rather than numexpr being rejected from the scikits namespace as "not 'sciency' enough", the authors wanted their project not to appear to be restricted to scientific uses.

This is just a guess, but they also might not have wanted to appear to require SciPy, specifically.
Speaking from our own experience: although FiPy uses and recommends SciPy, we don't require it, because it's frankly a headache to install and we know that support requests from our users will come to us, not to the SciPy developers. -- Jonathan E. Guyer, PhD Metallurgy Division National Institute of Standards and Technology

From aisaac at american.edu Mon Apr 7 13:28:54 2008 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 7 Apr 2008 13:28:54 -0400 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca><200712240856.08804.faltet@carabos.com> Message-ID:

> On Mon, Apr 7, 2008 at 9:17 AM, Alan G Isaac wrote:
>> 2. I understand the need for separate/separable bug databases, and this seems like a general problem to be addressed for scikits. I would hate to see scikits start spinning off to new locations based on this consideration.

On Mon, 7 Apr 2008, Jarrod Millman apparently wrote:
> As far as I can tell, this doesn't seem to have been an issue for them.

Quoting from: "I don't like sharing a bug database with other, separate, projects" Cheers, Alan Isaac

From millman at berkeley.edu Mon Apr 7 13:43:43 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 7 Apr 2008 10:43:43 -0700 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca> <200712240856.08804.faltet@carabos.com> Message-ID:

On Mon, Apr 7, 2008 at 10:28 AM, Alan G Isaac wrote:
> On Mon, 7 Apr 2008, Jarrod Millman apparently wrote:
> > As far as I can tell, this doesn't seem to have been an issue for them.
>
> Quoting from:
> "I don't like sharing a bug database with other, separate, projects"

Sorry, I am not getting enough sleep. OK, so that is another reason they don't want to make this a scikit.
-- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/

From ondrej at certik.cz Mon Apr 7 13:55:33 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 7 Apr 2008 19:55:33 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers Message-ID: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com>

Hi, these are probably mainly questions for Nathan. :)

First let me say that the new sparse functionality of scipy is awesome, I really like it; it's just so easy and natural to work with. Thanks for your work on it. Here I have some points:

1) Why do you want to remove the umfpack wrappers from scipy? I suggest leaving the wrappers in there (those can be BSD, can't they?); otherwise it will be much harder for scipy users to use umfpack (they would have to install scikits too). If so, could this warning message be removed?

scipy/sparse/linalg/dsolve/linsolve.py:20:

if isUmfpack and noScikit:
    warn( 'scipy.sparse.linalg.dsolve.umfpack will be removed,'
          ' install scikits.umfpack instead', DeprecationWarning )

2) Why was pysparse removed? Are there any objections to adding it alongside the arpack and lobpcg solvers?

3) Sometimes I also use blzpack -- are there any objections if I implement it (if I find time, of course) the same way lobpcg and arpack are in there?

4) I would love to see all available open-source sparse solvers and eigensolvers be callable from scipy. Is this the intention of scipy.sparse? If so, I'd like to help with this too -- there are other good open-source solvers out there, for example Primme (this one is GPLed, but that's not a problem: the wrappers could be in scipy, and users would install the solver themselves, or from scikits).
5) The documentation in the info.py files is quite good; there is also some documentation at: http://scipy.org/SciPy_Tutorial#head-c60163f2fd2bab79edd94be43682414f18b90df7 I have some improvements; for example, scipy/sparse/linalg/eigen/info.py is missing lobpcg, and the attached patch fixes that. I used to have commit access, but I forgot my username && password -- can I ask for a renewal please? :) Those are enough questions so far; depending on the answers, I'll probably have some more regarding the interface. :) Ondrej

-------------- next part -------------- A non-text attachment was scrubbed... Name: scipy-doc.patch Type: text/x-patch Size: 386 bytes Desc: not available URL:

From robert.kern at gmail.com Mon Apr 7 14:07:42 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 7 Apr 2008 11:07:42 -0700 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> Message-ID: <3d375d730804071107q594d9184s7dd7f824be4c189@mail.gmail.com>

On Mon, Apr 7, 2008 at 10:55 AM, Ondrej Certik wrote:
> 1) Why do you want to remove the umfpack wrappers from scipy? I suggest leaving the wrappers in there (those can be BSD, can't they?),

That just confuses the issue. If you are going to use the UMFPACK functionality, you will need UMFPACK and the GPL. Consequently, the wrappers must be optional; all of scipy must be buildable and usable without any GPLed code. However, having optional functionality is bad: in order to depend on it, you can't just tell your users "install scipy" but "install scipy with the UMFPACK wrappers installed".
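The linsolve.py fragment quoted above is an instance of the usual optional-import pattern that creates this packaging problem; a minimal sketch of it (only the module name scikits.umfpack comes from the warning itself -- the flag and helper names here are illustrative):

```python
# Optional-dependency pattern: probe for the GPL-licensed wrapper at import
# time and fall back to the BSD-licensed default when it is absent.
try:
    import scikits.umfpack as umfpack  # optional, GPL-licensed
    isUmfpack = True
except ImportError:
    umfpack = None
    isUmfpack = False

def direct_solver_name():
    """Illustrative helper: report which sparse direct solver would be used."""
    if isUmfpack:
        return "umfpack"
    return "superlu"  # scipy's bundled, BSD-licensed fallback
```

The point of the surrounding discussion is exactly that code calling such a helper behaves differently depending on what happens to be installed, which is what makes the optional dependency awkward for distributors.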
This makes it especially difficult for things like Linux distributions. They have to choose whether to include the GPL code or not. If they do, they can't include a non-GPL-compatible package that depends on scipy. If they don't, they can't include a package that needs the UMFPACK wrappers. I would have spoken against the initial inclusion of the UMFPACK wrappers if I had been paying attention at the time. I am glad to see them being removed. However, just moving the wrappers to a scikit doesn't solve any problem if scipy code requires the scikit. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco

From ondrej at certik.cz Mon Apr 7 14:08:56 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 7 Apr 2008 20:08:56 +0200 Subject: [SciPy-dev] 0.7.0 release In-Reply-To: References: <85b5c3130804070703q42714e89he9522e378cb1239e@mail.gmail.com> <85b5c3130804070935y73cdd315oa71106592f05d587@mail.gmail.com> Message-ID: <85b5c3130804071108m12ac9b28iad7df98f5f195772@mail.gmail.com>

On Mon, Apr 7, 2008 at 6:38 PM, Jarrod Millman wrote:
> On Mon, Apr 7, 2008 at 9:35 AM, Ondrej Certik wrote:
> > Yes, thanks for your work on it. Let's get numpy 1.0.5 out, then into Debian and other distributions, see if all is ok, and then fix scipy.
>
> Hey Ondrej, I will need some help verifying that NumPy 1.0.5 builds correctly on Debian later this week. So, hopefully, you will have a little free time!

Yep. Even though I am in Munich now doing my research, since I use numpy I'll definitely test it.
Ondrej

From nwagner at iam.uni-stuttgart.de Mon Apr 7 14:09:23 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Mon, 07 Apr 2008 20:09:23 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> Message-ID:

On Mon, 7 Apr 2008 19:55:33 +0200 "Ondrej Certik" wrote:
> Hi,
> these are probably mainly questions to Nathan. :)
> First let me say that the new sparse functionality of scipy is awesome, I really like it, it's just so easy and natural to work with it. Thanks for your work on it.
> Here I have some points:
> 1) Why do you want to remove umfpack wrappers from scipy? I suggest to leave wrappers in there (those can be BSD, cannot they?), otherwise it will be much harder for scipy users to use umfpack (they would have to install scikits too). If so, could this warning message be removed?
> scipy/sparse/linalg/dsolve/linsolve.py:20:
> if isUmfpack and noScikit:
>     warn( 'scipy.sparse.linalg.dsolve.umfpack will be removed,'
>           ' install scikits.umfpack instead', DeprecationWarning )
> 2) why was pysparse removed? are there any objections of adding it among arpack and lobpcg solvers?
> 3) sometimes I also use blzpack - are there any objections if I implement it (if I find time of course) the same way lobpcg and arpack is in there?
> 4) I would love to see all available open source sparse solvers and eigensolvers to be callable from scipy. Is this the intention of scipy.sparse?
> If this is the intention, I'd like to help with this too -- there are other good open source solvers out there, for example Primme (this one is GPLed, but that's not a problem, the wrappers could be in scipy, but users will install them themselves, or from scikits).
> > 5) The documentation in the info.py files is quite good; there is also > some documentation at: > > http://scipy.org/SciPy_Tutorial#head-c60163f2fd2bab79edd94be43682414f18b90df7 > > I have some improvements; for example, > scipy/sparse/linalg/eigen/info.py is missing lobpcg - the attached > patch fixes that. lobpcg is currently disabled. See ticket http://projects.scipy.org/scipy/scipy/ticket/630 BTW, an SVD for sparse matrices would be nice, too. Nils From robert.kern at gmail.com Mon Apr 7 14:35:36 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 7 Apr 2008 11:35:36 -0700 Subject: [SciPy-dev] Creating a swig C++ wrapper with numpy.distutils In-Reply-To: References: Message-ID: <3d375d730804071135t6aab5027ic210a6cec9a8d15e@mail.gmail.com> On Mon, Apr 7, 2008 at 5:19 AM, Matthieu Brucher wrote: > hi, > > I'd like to create an extension with SWIG. By default, the extension is a C > one, but I need to create a C++ one with swig, that is, adding the -c++ flag > to the swig command line. With setuptools, I just have to add the argument > options={'build_ext':{'swig_opts':'-c++'}} to the setup() command. > Is there something like this in numpy.distutils? If yes, is it possible to > add the option only to one extension with the configuration tool? Don't add swig_opts to the build_ext command; it affects every SWIG wrapper, which may not be what you want. Instead, add swig_opts='-c++' to the Extension() constructor (or add_extension() method call) of the extension that needs it. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
-- Umberto Eco From vshah at interactivesupercomputing.com Mon Apr 7 14:53:11 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Mon, 7 Apr 2008 11:53:11 -0700 Subject: [SciPy-dev] Scipy equivalents of MATLAB's sparse/find/accumarray Message-ID: <9F089E41-DE21-4925-B2AF-B67B666FA87F@interactivesupercomputing.com> Nathan, thanks for your earlier response on sparse matrix indexing. The other thing that I found cumbersome with the current sparse implementation in scipy is not being able to figure out the exact equivalents of Matlab's sparse/accumarray and find. In the following examples, I will denote Matlab code with M>> and Python with P>> M>> S = sparse (I, J, V, m, n) The equivalent in Python seems to be: P>> S = sparse.csc_matrix ((obj, ij), dims) Or P>> S = sparse.csr_matrix ((obj, ij), dims) It wasn't clear from the documentation what happens when there are duplicate row/col values. Matlab's sparse() adds duplicates, and accumarray() provides a more general way to combine duplicates. I was wondering if it would be possible to add an extra argument that would allow the user to specify what to do with duplicates. Options could be last, any, sum, prod, max, min, argmax, argmin, or a user-specified function. For M>> F = find (a) The equivalent seems to be: P>> f = where(a) But where doesn't seem to work for sparse matrices in 0.6. -viral From vshah at interactivesupercomputing.com Mon Apr 7 15:15:19 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Mon, 7 Apr 2008 12:15:19 -0700 Subject: [SciPy-dev] Sparse data structures. Message-ID: <30F099C6-37D7-48EE-984A-615D895E538D@interactivesupercomputing.com> Sparse folks, I had a couple of other general questions/comments about Python's sparse matrix infrastructure. Coming from the Matlab land, I am used to the system providing me just one data storage format for sparse matrices. Python provides 5.
The rationale for Matlab's sparse matrix design is described in great detail in: http://www.hpl.hp.com/personal/Robert_Schreiber/papers/Sparse%20Matrices%20in%20Matlab/simax.pdf For linear algebra operations such as direct solvers and iterative methods, it doesn't really matter what format you start out in. The cost of conversion is small compared to the cost of the operation. Most of these are provided in libraries, and hence not a huge source of worry. The real questions are array operations that are performed in code that a Python programmer may write. These include operations such as assembly, indexing, assignment, and concatenation. This is where the choice of data structure becomes important. Offering 5 choices to the naive programmer who doesn't understand the details about sparse matrix implementations may make Scipy look daunting. I am wondering what's the best way to address this? I summarize my understanding of the different formats below. I'd love to have a conversation with someone on how to make this useful in a coherent way (one default data structure, for example), or at the very least, update the documentation with some information to guide someone. For row/col indexing, CSR and CSC formats can be decent - especially if you are extracting columns from CSC and rows from CSR. Random access is decent, but not the greatest. These formats are also good for matrix-vector multiplication, the workhorse of many iterative solvers - but one could use a library like OSKI for that. CSR/CSC are terrible for assembly and assignment. They can be decent for concatenation along the preferred dimension. A dictionary representation is great for random access, assembly, and concatenation, but not very efficient for submatrix indexing and assignment. Co-ordinate representation can be good for operations, given that the right rule for combining duplicates is used. The good thing about both these representations is that they can be extended to N-d sparse arrays.
The list of lists approach lies somewhere in the middle. I am assuming these lists are kept sorted. The main drawback is the excessive memory usage to store all the pointers. I haven't yet looked at the code, and I could be wrong. Thanks, -viral From nwagner at iam.uni-stuttgart.de Mon Apr 7 15:15:49 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Mon, 07 Apr 2008 21:15:49 +0200 Subject: [SciPy-dev] 0.7.0 release In-Reply-To: References: <85b5c3130804070703q42714e89he9522e378cb1239e@mail.gmail.com> Message-ID: On Mon, 7 Apr 2008 08:46:01 -0700 "Jarrod Millman" wrote: > On Mon, Apr 7, 2008 at 7:03 AM, Ondrej Certik wrote: >> What are the release plans for 0.7.0? SciPy svn contains a lot of new >> improvements (especially the sparse solvers) that I need. I tried to >> create a Debian package from the scipy svn version, just to see if it >> works, and it seems to be ok. However, when I run tests, I got a lot of >> errors (see below). > > I have been busy trying to get NumPy 1.0.5 out, which I hope will > happen this week. As soon as that is released I will turn my > attention to SciPy. SciPy 0.7 requires NumPy 1.0.5, so you may need > to use the NumPy trunk. > > As far as I am concerned, I think getting SciPy 0.7 out will mostly > require making sure all the tests run and that there are no other regressions.
Here are the corresponding tickets http://projects.scipy.org/scipy/scipy/ticket/375 http://projects.scipy.org/scipy/scipy/ticket/584 http://projects.scipy.org/scipy/scipy/ticket/586 http://projects.scipy.org/scipy/scipy/ticket/611 BTW, I found two failures ====================================================================== FAIL: test_imresize (test_pilutil.TestPILUtil) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/testing/decorators.py", line 83, in skipper return f(*args, **kwargs) File "/usr/lib/python2.4/site-packages/scipy/misc/tests/test_pilutil.py", line 25, in test_imresize assert_equal(im1.shape,(11,22)) File "/usr/lib/python2.4/site-packages/numpy/testing/utils.py", line 137, in assert_equal assert_equal(len(actual),len(desired),err_msg,verbose) File "/usr/lib/python2.4/site-packages/numpy/testing/utils.py", line 145, in assert_equal assert desired == actual, msg AssertionError: Items are not equal: ACTUAL: 0 DESIRED: 2 ====================================================================== FAIL: test_namespace (test_formula.TestFormula) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/stats/models/tests/test_formula.py", line 119, in test_namespace self.assertEqual(xx.namespace, Y.namespace) AssertionError: {} != {'Y': array([ 0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70, 72, 74, 76, 78, 80, 82, 84, 86, 88, 90, 92, 94, 96, 98]), 'X': array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49])} ---------------------------------------------------------------------- Ran 2025 tests in 55.207s FAILED (failures=2, errors=5) Nils 
From wnbell at gmail.com Mon Apr 7 16:39:25 2008 From: wnbell at gmail.com (Nathan Bell) Date: Mon, 7 Apr 2008 15:39:25 -0500 Subject: [SciPy-dev] Scipy equivalents of MATLAB's sparse/find/accumarray In-Reply-To: <9F089E41-DE21-4925-B2AF-B67B666FA87F@interactivesupercomputing.com> References: <9F089E41-DE21-4925-B2AF-B67B666FA87F@interactivesupercomputing.com> Message-ID: On Mon, Apr 7, 2008 at 1:53 PM, Viral Shah wrote: > Nathan, thanks for your earlier response on sparse matrix indexing. > > The other thing that I found cumbersome with the current sparse > implementation in scipy is not being able to figure out the exact > equivalents of Matlab's sparse/accumarray and find. In the following > examples, I will denote Matlab codes with M>> and Python with P>> > > M>> S = sparse (I, J, V, m, n) > > The equivalent in Python seems to be: > P>> S = sparse.csc_matrix ((obj, ij), dims) > Or > P>> S = sparse.csr_matrix ((obj, ij), dims) That's correct. These are just wrappers for coo_matrix((obj,ij)).tocsc() and coo_matrix((obj,ij)).tocsr(). Likewise, MATLAB's find() is implemented by .tocoo(). > It wasn't clear looking at the documentation, what happens when there > are duplicate row/col values. Matlab's sparse() adds duplicates, and > accumarray() provides a more general way to combine duplicates. I was > wondering, if it would be possible to add an extra argument that would > allow the user to specify what to do with duplicates. Options could be > last, any, sum, prod, max, min, argmax, argmin, or a user specified > function. Currently the coo->csr and coo->csc conversions sum duplicates together. accumarray() is probably better accomplished by argsorting the index tuples and performing whatever transformation you wish on the permuted values.
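The argsort approach Nathan describes can be sketched in a few lines of NumPy. This is a hypothetical illustration, not scipy API: the index arrays are made-up data, and `np.lexsort` plus `reduceat` stand in for the "argsort then combine" step (sum duplicates as MATLAB's sparse() does, or take the max as one accumarray() option):

```python
import numpy as np

# Made-up (row, col, value) triplets with two duplicated coordinates.
rows = np.array([0, 1, 0, 1])
cols = np.array([2, 0, 2, 0])
vals = np.array([1.0, 2.0, 3.0, 4.0])

# Lexsort the index tuples so duplicates become adjacent
# (last key is the primary one: sort by row, then column).
order = np.lexsort((cols, rows))
r, c, v = rows[order], cols[order], vals[order]

# Positions where a new (row, col) pair starts.
starts = np.flatnonzero(np.r_[True, (r[1:] != r[:-1]) | (c[1:] != c[:-1])])

summed = np.add.reduceat(v, starts)     # sum duplicates, as sparse() would
maxed = np.maximum.reduceat(v, starts)  # max over duplicates, accumarray-style
```

Here `r[starts]`, `c[starts]`, and `summed` are exactly the deduplicated triplets one would feed back into a COO constructor.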
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From slesarev.anton at gmail.com Mon Apr 7 17:36:11 2008 From: slesarev.anton at gmail.com (Anton Slesarev) Date: Tue, 8 Apr 2008 01:36:11 +0400 Subject: [SciPy-dev] Google Summer of Code and scipy.learn (another trying) In-Reply-To: <20080403032110.GM26722@washoe.rutgers.edu> References: <11abf2720803160541i7ce7d5ebq7c3e1b762d11ff20@mail.gmail.com> <11abf2720803310313w63f5922dl2dfe972ea2ec863b@mail.gmail.com> <20080403032110.GM26722@washoe.rutgers.edu> Message-ID: <11abf2720804071436p16ac7e5dv6a0b59fa05a26547@mail.gmail.com> On Thu, Apr 3, 2008 at 7:21 AM, Yaroslav Halchenko wrote: > Hi Anton > > Thank you for the interest in PyMVPA ;-) Indeed PyMVPA itself might be > not the best basis to jump off for scikits.learn since we see > PyMVPA more of a framework, not a toolbox. As I said earlier, on the > other hand some ideas might be borrowed from PyMVPA. Please don't take > me wrong -- if you think that PyMVPA looks like a good starting point to > develop on -- that is cool, you are welcome to the team. On the other > hand, if you develop .learn independently we will be interested into > migrating some of functionality into wider-audience .learn and just rely > on importing .learn internally within PyMVPA I don't have enough experience to decide what the best jumping-off point for scikits.learn would be, but as I understand from David's and Jarrod's posts, they have some ideas about integrating PyMVPA. And if some mentor from scikits wants to work with me in the Google Summer of Code format, I should take their work into account. If you want to talk with me about my participation in this, I'll be happy to talk with you. But if there are no mentors who want to take me on, I am not going to write it sooner than the end of summer. And I hope that by that time the structure of the package will have become clearer. OK, send me your contact details and a time that suits you, and I'll try to find a place to talk with you.
(Skype is not my usual way of communication.) > > May be we could have some quick (or not so quick) chit chat (voice via > skype or any other voip + VNC screen) some time next week. There is > additional discussion happening on exppsy-pymvpa mailing list which you > are welcome to join as well > http://lists.alioth.debian.org/mailman/listinfo/pkg-exppsy-pymvpa > > With best regards > Yaroslav > > On Mon, 31 Mar 2008, Anton Slesarev wrote: > > Main problem how to connect it with your idea to use PyMVPA. I don't > > think that it is a good idea to duplicate its functionality. But I > hope > > we can find approach to cooperate without writing useless code. > > I hope you find mentor who want to take this application. > -- > .-. > =------------------------------ /v\ ----------------------------= > Keep in touch // \\ (yoh@|www.)onerussian.com > Yaroslav Halchenko /( )\ ICQ#: 60653192 > Linux User ^^-^^ [175555] > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -- Anton Slesarev -------------- next part -------------- An HTML attachment was scrubbed... URL: From halish at kofeina.org Mon Apr 7 19:53:40 2008 From: halish at kofeina.org (Mateusz Haligowski) Date: Tue, 8 Apr 2008 01:53:40 +0200 Subject: [SciPy-dev] GSoC: Time Series Analysis Message-ID: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> Hello, I have posted my application for Google Summer of Code, for the Python Software Foundation. I want to develop some time series functions that have not yet been implemented in SciPy. Its name is "Time series analysis tools for SciPy". Unfortunately, I have no idea how mentors can access this :) I can post a copy of it, if there are requests. I'm looking forward to comments on that. Regards, Mateusz Haligowski -------------- next part -------------- An HTML attachment was scrubbed...
URL: From pgmdevlist at gmail.com Mon Apr 7 20:59:08 2008 From: pgmdevlist at gmail.com (Pierre GM) Date: Mon, 7 Apr 2008 20:59:08 -0400 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> Message-ID: <200804072059.09561.pgmdevlist@gmail.com> On Monday 07 April 2008 19:53:40 Mateusz Haligowski wrote: > I want to develop some time series functions that have not yet been > implemented in SciPy. Its name is "Time series analysis tools for > SciPy" <http://code.google.com/soc/app.html?org=psf&csaid=mhaligowski%40gmail.com%3A2b38cac5%3A-5a027c84>. Unfortunately, I have no idea how mentors can > access this :) I can post a copy of it, if there are requests. Please do. We already have some functions developed in scikits.timeseries, but help is always welcome. From halish at kofeina.org Mon Apr 7 21:12:34 2008 From: halish at kofeina.org (Mateusz Haligowski) Date: Tue, 8 Apr 2008 03:12:34 +0200 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <200804072059.09561.pgmdevlist@gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> <200804072059.09561.pgmdevlist@gmail.com> Message-ID: <62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> 2008/4/8, Pierre GM : > > On Monday 07 April 2008 19:53:40 Mateusz Haligowski wrote: > > > I want to develop some time series functions that have not yet been > > implemented in SciPy. Its name is "Time series analysis tools for > > SciPy" <http://code.google.com/soc/app.html?org=psf&csaid=mhaligowski%40gmail.com%3A2b38cac5%3A-5a027c84>. Unfortunately, I have no idea how mentors can > > > access this :) I can post a copy of it, if there are requests. > > > Please do. We already have some functions developed in scikits.timeseries, but > help is always welcome.
> _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > I took a look at the scikits trac and there isn't much yet on time series. I was thinking of a rather econometric/statistical approach. Anyway, my application is here (I can still change it if something comes into my mind): http://pastebin.com/m67ae5c83 -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Mon Apr 7 21:24:15 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 7 Apr 2008 18:24:15 -0700 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> <200804072059.09561.pgmdevlist@gmail.com> <62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> Message-ID: <3d375d730804071824l5565085di4272f73fd0bed93@mail.gmail.com> On Mon, Apr 7, 2008 at 6:12 PM, Mateusz Haligowski wrote: > Anyway, my application is here (I can still change it if something comes > into my mind): http://pastebin.com/m67ae5c83 Can you explain this statement a little more? """In developing the project I will use the proprietary software EViews, which I am going to buy for the stipend.""" How exactly are you going to use EViews? If you just want to cross-check results, there are open source tools which implement the functionality you intend. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
-- Umberto Eco From aisaac at american.edu Mon Apr 7 22:08:59 2008 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 7 Apr 2008 22:08:59 -0400 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> Message-ID: Just to correct an apparent misimpression ... While I am definitely interested in the general area of this proposal, when I commented that "I have volunteered to mentor", I was NOT committing to this particular project but rather describing my status in the GSoC. Specifically, as I was trying to explain, I expect that the SciPy project under the PSF umbrella will select certain projects, *not* that I will do so. Because of my interest in SciPy I have volunteered to be one of their mentors, if that should prove useful to SciPy. Hope that's clearer, Alan Isaac PS It would be useful if your project description provided a little background on statistical code that you have already implemented. Also, the ADF test is rather trivial to implement, so it would be much more interesting to emphasize other unit root tests for the first bit. From robert.kern at gmail.com Mon Apr 7 22:38:56 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 7 Apr 2008 19:38:56 -0700 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> <200804072059.09561.pgmdevlist@gmail.com> <62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> Message-ID: <3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com> On Mon, Apr 7, 2008 at 7:08 PM, Alan G Isaac wrote: > Just to correct an apparent misimpression ... 
> > While I am definitely interested in the general area of this > proposal, when I commented that "I have volunteered to > mentor", I was NOT committing to this particular project but > rather describing my status in the GSoC. Specifically, > as I was trying to explain, I expect that the SciPy project > under the PSF umbrella will select certain projects, > *not* that I will do so. Rather, as a volunteer mentor for the PSF, you, along with all of the other PSF volunteer mentors, will collaborate to select projects. The SciPy project itself does not select any projects except through the various volunteer mentors who choose to be involved in the PSF's process. If you want to be a mentor, you need to follow the instructions here ASAP: http://wiki.python.org/moin/SummerOfCode -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From aisaac at american.edu Mon Apr 7 23:18:17 2008 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 7 Apr 2008 23:18:17 -0400 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com> Message-ID: On Mon, 7 Apr 2008, Robert Kern apparently wrote: > Rather, as a volunteer mentor for the PSF, you, along with > all of the other PSF volunteer mentors, will collaborate > to select projects. The SciPy project itself does not > select any projects except through the various volunteer > mentors who choose to be involved in the PSF's > process. Thanks for that clarification, which matches what I partially deduced from the discussions.
I find the process a bit odd, in that the SciPy project has no formal voice as a project but rather only through the participating mentors. Last year I was recruited to mentor, so I was a bit puzzled this year by the lack of guidance when I volunteered. Oh well, I'll adapt. Thanks, Alan From robert.kern at gmail.com Mon Apr 7 23:18:25 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 7 Apr 2008 20:18:25 -0700 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> <200804072059.09561.pgmdevlist@gmail.com> <62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> <3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com> Message-ID: <3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com> On Mon, Apr 7, 2008 at 8:18 PM, Alan G Isaac wrote: > On Mon, 7 Apr 2008, Robert Kern apparently wrote: > > Rather, as a volunteer mentor for the PSF, you, along with > > all of the other PSF volunteer mentors, will collaborate > > to select projects. The SciPy project itself does not > > select any projects except through the various volunteer > > mentors who choose to be involved in the PSF's > > process. > > Thanks for that clarification, which matches what > I partially deduced from the discussions. > > I find the process a bit odd, in that the SciPy project has > no formal voice as a project but rather only through the > participating mentors. We were asked by Google to go through the PSF just like we have in previous years. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
-- Umberto Eco From vshah at interactivesupercomputing.com Tue Apr 8 00:01:26 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Mon, 7 Apr 2008 21:01:26 -0700 Subject: [SciPy-dev] sparse linear solve crash in 0.6.0 Message-ID: I passed in a rank deficient matrix to the sparse linear solver, which resulted in a core dump. Doing so in the dense case triggers the appropriate error message. I am wondering what the procedure is to report bugs / fix them ? Should I be using the latest version from svn, check again and then open a ticket somewhere ? Once I get a little comfortable with the code in scipy, I might be able to fix some bugs as well. Thanks in advance, -viral From robert.kern at gmail.com Tue Apr 8 00:08:24 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 7 Apr 2008 21:08:24 -0700 Subject: [SciPy-dev] sparse linear solve crash in 0.6.0 In-Reply-To: References: Message-ID: <3d375d730804072108k31b85029v55e3fbc746a3d1cf@mail.gmail.com> On Mon, Apr 7, 2008 at 9:01 PM, Viral Shah wrote: > I passed in a rank deficient matrix to the sparse linear solver, which > resulted in a core dump. Doing so in the dense case triggers the > appropriate error message. > > I am wondering what the procedure is to report bugs / fix them ? > Should I be using the latest version from svn, check again and then > open a ticket somewhere ? Yes, if you can. The bug tracker is here: http://projects.scipy.org/scipy/scipy You will need to register an account first: http://projects.scipy.org/scipy/scipy/register > Once I get a little comfortable with the > code in scipy, I might be able to fix some bugs as well. Thank you, that would be most appreciated. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From wnbell at gmail.com Tue Apr 8 00:34:49 2008 From: wnbell at gmail.com (Nathan Bell) Date: Mon, 7 Apr 2008 23:34:49 -0500 Subject: [SciPy-dev] sparse linear solve crash in 0.6.0 In-Reply-To: References: Message-ID: On Mon, Apr 7, 2008 at 11:01 PM, Viral Shah wrote: > I passed in a rank deficient matrix to the sparse linear solver, which > resulted in a core dump. Doing so in the dense case triggers the > appropriate error message. It's a known problem: http://projects.scipy.org/scipy/scipy/ticket/553 The problem could lie in the SuperLU bindings or SuperLU itself (we're using an older version I believe). To my knowledge, no one has expressed an interest in resolving this problem. So if you want it, it's all yours :) -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Tue Apr 8 00:54:05 2008 From: wnbell at gmail.com (Nathan Bell) Date: Mon, 7 Apr 2008 23:54:05 -0500 Subject: [SciPy-dev] Sparse data structures. In-Reply-To: <30F099C6-37D7-48EE-984A-615D895E538D@interactivesupercomputing.com> References: <30F099C6-37D7-48EE-984A-615D895E538D@interactivesupercomputing.com> Message-ID: On Mon, Apr 7, 2008 at 2:15 PM, Viral Shah wrote: > Offering 5 choices to the naive programmer who doesn't understand the > details about sparse matrix implementations may make Scipy look > daunting. I am wondering whats the best way to address this ? I Actually, there are 7 choices now :) http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/sparse I've made an attempt to explain the rationale for each format in the docstrings of each class. Ideally we'd have a nice wiki page to describe the formats in greater detail. I don't have a great deal of time at my disposal currently, but I would contribute to such a page when I have time. > summarize my understanding of the different formats below. 
I'd love to > have a conversation with someone on how to make this useful in a > coherent way (one default data structure, for example), or at the very > least, update the documentation with some information to guide someone. Most of the formats are directly related to those in SPARSKIT: http://www-users.cs.umn.edu/~saad/software/SPARSKIT/sparskit.html You may find the SPARSKIT documentation helpful: http://www-users.cs.umn.edu/~saad/software/SPARSKIT/paper.ps I don't know of a "one size fits all" sparse format, so having a default would be dangerous. IMO MATLAB's use of CSC by default *is* simple, but that simplicity comes at the cost of completely miserable performance in many situations. > For row/col indexing, CSR and CSC formats can be decent - especially > if you are extracting columns from CSC and rows from CSR. Random > access is decent, but not the greatest. These formats are also good > for matrix-vector multiplication, the workhorse of many iterative > solvers - but one could use a library like OSKI for that. CSR/CSC are > terrible for assembly and assignment. They can be decent for > concatenation along the preferred dimension. > > A dictionary representation is great for random access, assembly, and > concatenation, but not very efficient for submatrix indexing and > assignment. Co-ordinate representation can be good for operations, > given that the right rule for combining duplicates is used. The good > thing about both these representations is that they can be extended to > N-d sparse arrays. > > The list of lists approach lies somewhere in the middle. I am assuming > these lists are kept sorted. The main drawback is the excessive memory > usage to store all the pointers. I haven't yet looked at the code, and > I could be wrong. Your assessment looks correct to me. Currently LIL and DOK are a little too slow for constructing anything large-scale IMO (despite their asymptotic complexity).
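The "pick a format per task" workflow under discussion can be sketched briefly. This is a hedged illustration against the current scipy.sparse API (the constructor keyword is spelled `shape` in later releases, rather than the `dims` used in 0.6; the tridiagonal matrix is a made-up example): assemble either incrementally in LIL or in one shot from COO triplets, then convert to CSR for arithmetic.

```python
import numpy as np
from scipy import sparse

n = 5

# Incremental assembly: LIL supports cheap per-entry writes.
A = sparse.lil_matrix((n, n))
for k in range(n):
    A[k, k] = 2.0
    if k + 1 < n:
        A[k, k + 1] = -1.0
        A[k + 1, k] = -1.0
A = A.tocsr()  # convert once, then do fast matrix-vector products

# Batch assembly: build (data, (i, j)) triplets for the diagonal,
# superdiagonal, and subdiagonal; COO->CSR conversion sorts them
# (and would sum any duplicate entries).
i = np.concatenate([np.arange(n), np.arange(n - 1), np.arange(1, n)])
j = np.concatenate([np.arange(n), np.arange(1, n), np.arange(n - 1)])
v = np.concatenate([2.0 * np.ones(n), -np.ones(n - 1), -np.ones(n - 1)])
B = sparse.coo_matrix((v, (i, j)), shape=(n, n)).tocsr()

assert np.array_equal(A.toarray(), B.toarray())  # both routes agree
```

For large matrices, the batch COO route is usually the faster of the two, which is one concrete reason a single default format is hard to pick.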
I highly recommend that you install the development versions of NumPy and SciPy. The sparse module has seen a number of improvements since the last release. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From jtravs at gmail.com Tue Apr 8 04:56:24 2008 From: jtravs at gmail.com (John Travers) Date: Tue, 8 Apr 2008 09:56:24 +0100 Subject: [SciPy-dev] Status of sandbox.spline and my other code. In-Reply-To: <3a1077e70804080155p22fad7bcj9f805155a95fefa8@mail.gmail.com> References: <3a1077e70706100955p4a705acy3b26d9cf9cf0298b@mail.gmail.com> <3a1077e70804080155p22fad7bcj9f805155a95fefa8@mail.gmail.com> Message-ID: <3a1077e70804080156g3ce29db3k2f63895b0003aeda@mail.gmail.com> ---------- Forwarded message ---------- From: John Travers Date: Tue, Apr 8, 2008 at 9:55 AM Subject: Re: [SciPy-dev] Status of sandbox.spline and my other code. To: Jarrod Millman On Mon, Apr 7, 2008 at 11:56 AM, Jarrod Millman wrote: > > Hey John, > > I am hosting a NumPy/SciPy sprint this week at UC Berkeley and among > other things am hoping to make some progress on cleaning up the > various interpolation/spline code in SciPy as well as continue > removing the scipy sandbox. Do you have any more plans for your > sandbox.spline code? Any thoughts on where it should go or what still > needs to be done? > The code in sandbox.spline should be a complete replacement for the code in scipy.interpolate.fitpack and fitpack2 including the f2py wrappers of the fitpack fortran code (i.e. replacing the hand crafted c wrappers). Apart from using only f2py, a number of improvements were made to the docstrings and error checking and some bugs fixed and quite a few unit tests added. Originally I was advocating that we have a separate spline package with the fitpack stuff and a separate interpolate package that can utilise either the spline or rbf of delaunay etc. packages as these can all be used for more than just interpolation. 
This didn't seem to gain any traction on the mailing list, so I left it. I would still advocate this, but recognise the reasons that people might not want to expand the number of packages so much. I left this code about a year ago and then got bogged down with my thesis work, which I finished a few weeks back. If the code/improvements might be used, then I can find the time this week to check it is all working and still up to date. Then I could either supply a set of patches against the main interpolate code, which could be more easily understood, or provide a mercurial/bzr branch for someone else to merge from (if the bzr/mercurial stuff is set up yet). Let me know if that might be useful. There are also further things that were on my todo list for interpolate, including using RectBivariateSpline where possible, as it is much better for interpolation than the currently used surfit. (I have a unit test which segfaults the current code, but not the RectBivariateSpline code.) Hope this helps, John From halish at kofeina.org Tue Apr 8 05:58:08 2008 From: halish at kofeina.org (Mateusz Haligowski) Date: Tue, 8 Apr 2008 11:58:08 +0200 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com> <200804072059.09561.pgmdevlist@gmail.com> <62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com> <3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com> <3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com> Message-ID: <62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> 2008/4/8, Robert Kern : > > On Mon, Apr 7, 2008 at 8:18 PM, Alan G Isaac wrote: > > On Mon, 7 Apr 2008, Robert Kern apparently wrote: > > > Rather, as a volunteer mentor for the PSF, you along with > > > all of the other PSF volunteer mentors, will collaborate > > > to select projects.
The SciPy project itself does not > > > select any projects except through various volunteer > > > mentors who choose to be involved in the PSF's > > > process. > > > > Thanks for that clarification, which matches what > > I partially deduced from the discussions. > > > > I find the process a bit odd, in that the SciPy project has > > no formal voice as a project but rather only through the > > participating mentors. > > > We were asked by Google to go through the PSF just like we have in > previous years. > > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma that is made terrible by our own mad attempt to interpret it as > though it had an underlying truth." > -- Umberto Eco > _______________________________________________ > > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > Robert: 1. By using EViews, I meant comparing results, as well as finding some new ideas in there. Personally, I use gretl (http://gretl.sourceforge.net), but I was told EViews has wonderful time series capabilities. Alan: 1. So, can we confirm that in case the project is selected by the PSF, Alan Isaac is my mentor? That's what I meant by the statement "agreed to volunteer as a mentor". 2. I included the tests I had come across during my studies. Having done a further search, I found the KPSS stationarity test and the Leybourne-McCabe test. Unfortunately, I don't have enough resources for them - I'll get more as soon as I get Hamilton's book from Amazon (which can take up to 2 weeks). Are these tests fine? 3. What do you mean by background to the code I've written? I have made some examples on how to use NumPy (array manipulation), an optimization code which I used at my university, and some rather easy statistical analysis of stock prices (i.e. descriptive analysis + CAPM and Sharpe's model for the beta coefficient).
I don't use SciPy every day, as my university requires gretl and no other statistical software is allowed for econometric analysis. I am looking forward to doing my research with SciPy. Regards, Mateusz Haligowski -------------- next part -------------- An HTML attachment was scrubbed... URL: From cimrman3 at ntc.zcu.cz Tue Apr 8 07:50:38 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 08 Apr 2008 13:50:38 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions Message-ID: <47FB5C0E.6010603@ntc.zcu.cz> Hi, I am now trying to add the following LAPACK functions into scipy: ssygv, dsygv, chegv, zhegv, to be able to solve 'Ax = lambda Bx' with symmetric or Hermitian matrices efficiently (cf. http://www.netlib.org/lapack/lug/node34.html). Having those functions is necessary for adding the lobpcg solver to scipy. It must use the symeig package (LGPL) for now, which prevents its proper inclusion. I am getting familiar with the scipy.linalg implementation right now, slowly getting lost :[ It seems to me that the wrappers should go into generic_flapack.pyf, as there are already other similar functions (*heev etc.). The file claims, however, that it is generated automatically - how do I regenerate it so that it contains the missing functions? Is that really true? interface_gen.py seems to just read the file and transform it into the module in the build directory. Google does not help much, as I do not know what exactly to look for. Can someone point me to documentation on this, or give advice? thanks! r.
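For readers of the archive: once such wrappers exist, the generalized problem 'Ax = lambda Bx' can be driven from Python directly. The sketch below uses the present-day scipy.linalg.eigh interface, which dispatches to the LAPACK *sygv/*hegv drivers Robert mentions; it illustrates the computation, not the wrapper code under discussion.

```python
import numpy as np
from scipy.linalg import eigh

# Build a symmetric A and a symmetric positive definite B for A x = lambda B x.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M + M.T
B = np.eye(5) + 0.1 * (M @ M.T)  # identity plus a PSD term, hence positive definite

# Passing a second matrix to eigh selects the generalized *sygv/*hegv path.
w, v = eigh(A, B)

# Each column of v satisfies A v_i = w_i B v_i, and v is B-orthonormal.
assert np.allclose(A @ v, (B @ v) * w)
assert np.allclose(v.T @ B @ v, np.eye(5))
```

For complex Hermitian A (and Hermitian positive definite B), the same call dispatches to the *hegv routines instead.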
From pearu at cens.ioc.ee Tue Apr 8 08:22:59 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 08 Apr 2008 14:22:59 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <47FB5C0E.6010603@ntc.zcu.cz> References: <47FB5C0E.6010603@ntc.zcu.cz> Message-ID: <47FB63A3.2090906@cens.ioc.ee> Robert Cimrman wrote: > Hi, > > I am now trying to add the following LAPACK functions into scipy: > ssygv, dsygv, chegv, zhegv, to be able to solve 'Ax = lambda Bx' with > symmetric or Hermitian matrices efficiently (cf. > http://www.netlib.org/lapack/lug/node34.html). > > Having those functions is necessary for adding the lobpcg solver to > scipy. It must use symeig package (LGPL) for now, which prevents its > proper inclusion. > > I am getting familiar with scipy.linalg implementation right now, slowly > getting lost :[ New wrappers to lapack functions should be defined in scipy.lib.lapack. Eventually we should get rid of lapack wrappers in scipy.linalg. Grep for flapack_*.pyf.src files to see in which file the signature for a given lapack function should be inserted. Btw, scipy.lib.lapack already has wrappers for sygv and hegv functions. HTH, Pearu From cimrman3 at ntc.zcu.cz Tue Apr 8 09:14:09 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 08 Apr 2008 15:14:09 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <47FB63A3.2090906@cens.ioc.ee> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> Message-ID: <47FB6FA1.9000906@ntc.zcu.cz> Pearu Peterson wrote: > > Robert Cimrman wrote: >> I am now trying to add the following LAPACK functions into scipy: >> ssygv, dsygv, chegv, zhegv, to be able to solve 'Ax = lambda Bx' with >> symmetric or Hermitian matrices efficiently (cf. >> http://www.netlib.org/lapack/lug/node34.html). > > New wrappers to lapack functions should be defined in scipy.lib.lapack. > Eventually we should get rid of lapack wrappers in scipy.linalg. 
> Grep for flapack_*.pyf.src files to see in which file the signature > for a given lapack function should be inserted. > > Btw, scipy.lib.lapack already has wrappers for sygv and hegv functions. *** to Pearu: Great, thanks! Does it mean that scipy.linalg should be updated to use the wrappers in scipy.lib.lapack? *** to all: IMHO scipy.linalg.decomp deserves a major refitting in the spirit of what is now going on with scipy.sparse - the arguments of functions with the same functionality have different names (both compared to the sparse equivalents and within decomp.py), there are too many eig* functions, etc. There could even be proper solver classes (LinearSolver, EigenvalueSolver, etc.) with a common interface in each solver family. Below, as an example, is the code we use in sfepy. As you can see, the solver classes have only the __init__ method (which can be used e.g. to pre-factorize a matrix in the case of a linear solver) and the __call__ method (application of the solver). Would it be interesting to have something in that spirit in scipy? Then, in the meantime, in lobpcg, I will use directly something like:

import scipy.lib.lapack as ll
ll.get_lapack_funcs( ['hegv'] )

best regards, r. ----

class Solver( Struct ):
    def __init__( self, conf, **kwargs ):
        Struct.__init__( self, conf = conf, **kwargs )

    def __call__( self, **kwargs ):
        print 'called an abstract Solver instance!'
        raise ValueError

class LinearSolver( Solver ):
    def __init__( self, conf, mtx = None, status = None, **kwargs ):
        Solver.__init__( self, conf = conf, mtx = mtx, status = status,
                         **kwargs )

    def __call__( self, rhs, conf = None, mtx = None, status = None ):
        print 'called an abstract LinearSolver instance!'
        raise ValueError

class NonlinearSolver( Solver ):
    def __init__( self, conf, evaluator = None, linSolver = None,
                  status = None, **kwargs ):
        Solver.__init__( self, conf = conf, evaluator = evaluator,
                         linSolver = linSolver, status = status, **kwargs )

    def __call__( self, state0, conf = None, evaluator = None,
                  linSolver = None, status = None ):
        print 'called an abstract NonlinearSolver instance!'
        raise ValueError

class EigenvalueSolver( Solver ):
    def __init__( self, conf, mtxA = None, mtxB = None, nEigs = None,
                  eigenvectors = None, status = None ):
        Solver.__init__( self, conf = conf, mtxA = mtxA, mtxB = mtxB,
                         nEigs = nEigs, eigenvectors = eigenvectors,
                         status = status )

    def __call__( self, mtxA, mtxB = None, nEigs = None,
                  eigenvectors = None, status = None, conf = None ):
        print 'called an abstract EigenvalueSolver instance!'
        raise ValueError

From zachary.pincus at yale.edu Tue Apr 8 09:18:58 2008 From: zachary.pincus at yale.edu (Zachary Pincus) Date: Tue, 8 Apr 2008 09:18:58 -0400 Subject: [SciPy-dev] Status of sandbox.spline and my other code. In-Reply-To: <3a1077e70804080156g3ce29db3k2f63895b0003aeda@mail.gmail.com> References: <3a1077e70706100955p4a705acy3b26d9cf9cf0298b@mail.gmail.com> <3a1077e70804080155p22fad7bcj9f805155a95fefa8@mail.gmail.com> <3a1077e70804080156g3ce29db3k2f63895b0003aeda@mail.gmail.com> Message-ID: > The code in sandbox.spline should be a complete replacement for the > code in scipy.interpolate.fitpack and fitpack2 including the f2py > wrappers of the fitpack fortran code (i.e. replacing the hand crafted > c wrappers). Apart from using only f2py, a number of improvements were > made to the docstrings and error checking and some bugs fixed and > quite a few unit tests added. This should also make it easier to add wrappers for additional functions from the dierckx code, no? E.g. I now use hand-made wrappers for 'insert' to convert splines to Bezier curves (for graphic output) using knot insertion, but it would be nice to just have that f2py'd.
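As an aside for readers: the FITPACK knot-insertion routine Zachary mentions is exposed in current SciPy as scipy.interpolate.insert, so the basic operation can be sketched without hand-made wrappers (the full Bezier conversion, which raises every interior knot to full multiplicity, is left out here):

```python
import numpy as np
from scipy.interpolate import splrep, splev, insert

x = np.linspace(0, 2 * np.pi, 30)
tck = splrep(x, np.sin(x))   # interpolating cubic B-spline as a (t, c, k) tuple

# Insert one extra knot at x = 2.0. Knot insertion refines the
# representation without changing the curve itself -- the first step
# towards splitting the spline into Bezier segments.
tck2 = insert(2.0, tck)

xs = np.linspace(0, 2 * np.pi, 200)
assert np.allclose(splev(xs, tck), splev(xs, tck2))  # curve is unchanged
assert len(tck2[0]) == len(tck[0]) + 1               # one more knot
```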
Zach From aisaac at american.edu Tue Apr 8 09:37:17 2008 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 8 Apr 2008 09:37:17 -0400 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com><3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com><62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> Message-ID: On Tue, 8 Apr 2008, Mateusz Haligowski apparently wrote: > my university requires gretl and no other statistical > software is allowed to make econometric analysis gretl is very nice. I am pleased to hear about this usage! Cheers, Alan Isaac From aisaac at american.edu Tue Apr 8 09:57:52 2008 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 8 Apr 2008 09:57:52 -0400 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: <62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com><3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com><62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> Message-ID: On Tue, 8 Apr 2008, Mateusz Haligowski apparently wrote: > can we confirm that in case the project is selected by > PSF, Alan Isaac is my mentor? I can only mentor one project. There are currently two (including yours) for which I might be a suitable mentor. I expect I will mentor the other project. I wonder if Nils Wagner nwagner at mecha.uni-stuttgart.de might be interested in mentoring your project. He would have to sign up very fast. 
Cheers, Alan Isaac From aisaac at american.edu Tue Apr 8 10:01:47 2008 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 8 Apr 2008 10:01:47 -0400 Subject: [SciPy-dev] GSoC: Time Series Analysis In-Reply-To: References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com><3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com><62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> Message-ID: On Tue, 8 Apr 2008, Alan G Isaac apparently wrote: > I wonder if Nils Wagner > nwagner at mecha.uni-stuttgart.de > might be interested in mentoring your project. Or Pierre GM Cheers, Alan Isaac From cimrman3 at ntc.zcu.cz Tue Apr 8 09:58:56 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 08 Apr 2008 15:58:56 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence Message-ID: <47FB7A20.2020703@ntc.zcu.cz> Hi, so it is done, thanks again, Pearu, for your hint. could someone try on the fresh SVN version (4114): $ cd /scipy/sparse/linalg/eigen/lobpcg $ python lobpcg.py the output should end with: [ 4. 5. 6.] r. From jtravs at gmail.com Tue Apr 8 10:48:38 2008 From: jtravs at gmail.com (John Travers) Date: Tue, 8 Apr 2008 15:48:38 +0100 Subject: [SciPy-dev] Status of sandbox.spline and my other code. In-Reply-To: References: <3a1077e70706100955p4a705acy3b26d9cf9cf0298b@mail.gmail.com> <3a1077e70804080155p22fad7bcj9f805155a95fefa8@mail.gmail.com> <3a1077e70804080156g3ce29db3k2f63895b0003aeda@mail.gmail.com> Message-ID: <3a1077e70804080748v653484c8p54efa7240e83715a@mail.gmail.com> On Tue, Apr 8, 2008 at 2:18 PM, Zachary Pincus wrote: > > The code in sandbox.spline should be a complete replacement for the > > code in scipy.interpolate.fitpack and fitpack2 including the f2py > > wrappers of the fitpack fortran code (i.e. replacing the hand crafted > > c wrappers). 
Apart from using only f2py, a number of improvements were > > made to the docstrings and error checking and some bugs fixed and > > quite a few unit tests added. > > This should also make it easier to add wrappers for additional > functions from the dierckx code, no? E.g. I now use hand-made wrappers > for 'insert' to convert splines to Bezier curves (for graphic output) > using knot insertion, but it would be nice to just have that f2py'd. > It doesn't make it much easier, because some functions are using f2py in the original module anyway. But more consistent/easier to maintain. BTW I added the insert wrapper using f2py to the sandbox version of the code about a year ago (when you first posted your patches I think). I also have wrappers for some other dierckx functions, which I'll add when I get time. Cheers, John From matthieu.brucher at gmail.com Tue Apr 8 10:34:45 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Tue, 8 Apr 2008 16:34:45 +0200 Subject: [SciPy-dev] Creating a swig C++ wrapper with numpy.distutils In-Reply-To: <3d375d730804071135t6aab5027ic210a6cec9a8d15e@mail.gmail.com> References: <3d375d730804071135t6aab5027ic210a6cec9a8d15e@mail.gmail.com> Message-ID: 2008/4/7, Robert Kern : > > On Mon, Apr 7, 2008 at 5:19 AM, Matthieu Brucher > wrote: > > hi, > > > > I'd like to create an extension with SWIG. By default, the extension is > a C > > one, but I need to create a C++ one with swig, that is adding the -c++ > flag > > to the swig commandline. With setuptools, I just have to add the > argument > > options={'build_ext':{'swig_opts':'-c++'}} to the setup() command. > > Is there something like this in numpy.distutils ? If yes is it possible > to > > add the option only to one extension with the configuration tool ? > > > Don't add swig_opts to the build_ext command; it affects every SWIG > wrapper, which may not be what you want. 
Instead, add swig_opts='-c++' > to the Extension() constructor (or add_extension() method call) of the > extension that needs it. > I managed to use this, but then the wrapper's extension is .c and not .cpp. I dug into the code and found that the builder can infer the type of the generated file from its first line, so I added: //-*-c++-*- without the additional options, and everything went smoothly. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From doutriaux1 at llnl.gov Tue Apr 8 11:15:12 2008 From: doutriaux1 at llnl.gov (Charles Doutriaux) Date: Tue, 08 Apr 2008 08:15:12 -0700 Subject: [SciPy-dev] forcing scipy to use sources instead of system libraries? In-Reply-To: References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com><3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com><62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> Message-ID: <47FB8C00.6010303@llnl.gov> Hello, The blas installed on my system is apparently corrupted; is there a way to force scipy to use the sources instead of the libs it finds on the system? I do set the env variables LAPACK_SRC and BLAS_SRC, but it does not pick this up because it finds some blas/lapack in /usr/lib. Any idea? Thanks, C.
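To summarise the SWIG sub-thread above in one place: Robert's suggestion amounts to a setup.py fragment along the following lines (package, module, and file names here are hypothetical), while Matthieu's workaround is the //-*-c++-*- marker on the first line of the interface file.

```python
# Hypothetical setup.py fragment: pass -c++ only to the one extension
# that needs it, rather than globally through build_ext's swig_opts.
from numpy.distutils.misc_util import Configuration

def configuration(parent_package='', top_path=None):
    config = Configuration('mypkg', parent_package, top_path)
    config.add_extension('_wrapper',
                         sources=['wrapper.i', 'impl.cpp'],  # hypothetical files
                         swig_opts=['-c++'])
    return config

if __name__ == '__main__':
    from numpy.distutils.core import setup
    setup(configuration=configuration)
```

Extensions without swig_opts keep the default C-mode SWIG invocation, which is the behaviour Robert describes.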
From nwagner at iam.uni-stuttgart.de Tue Apr 8 11:49:34 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Apr 2008 17:49:34 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: <47FB7A20.2020703@ntc.zcu.cz> References: <47FB7A20.2020703@ntc.zcu.cz> Message-ID: On Tue, 08 Apr 2008 15:58:56 +0200 Robert Cimrman wrote: > Hi, > > so it is done, thanks again, Pearu, for your hint. > > could someone try on the fresh SVN version (4114): > > $ cd >/scipy/sparse/linalg/eigen/lobpcg > $ python lobpcg.py > > the output should end with: > [ 4. 5. 6.] > > r. Hi Robert, Works for me. Thank you very much !! Cheers Nils Here is the full output python lobpcg.py /usr/lib/python2.4/site-packages/numpy/lib/utils.py:111: DeprecationWarning: ('speye is deprecated, use scipy.sparse.eye',) warnings.warn(str1, DeprecationWarning) Solving generalized eigenvalue problem with preconditioning matrix size 100 block size 3 3 constraints iteration 0 current block size: 3 eigenvalue: [ 50.16603296 51.25237413 55.80526208] residual norms: [ 28.41337687 28.73028907 29.82590999] iteration 1 current block size: 3 eigenvalue: [ 5.69942153 9.10470514 11.78759315] residual norms: [ 5.60832687 7.51508501 7.88648963] iteration 2 current block size: 3 eigenvalue: [ 4.13720512 5.45430399 7.66390996] residual norms: [ 1.37078739 2.47960467 2.52320209] iteration 3 current block size: 3 eigenvalue: [ 4.00936839 5.03454726 7.09804759] residual norms: [ 0.38117189 0.71527311 2.20847017] iteration 4 current block size: 3 eigenvalue: [ 4.00043005 5.00184096 6.27945283] residual norms: [ 0.08700255 0.21163672 1.88467025] iteration 5 current block size: 3 eigenvalue: [ 4.00000785 5.00006727 6.02517101] residual norms: [ 0.01364081 0.03834687 0.59292732] iteration 6 current block size: 3 eigenvalue: [ 4.00000012 5.00000191 6.00286877] residual norms: [ 0.00155469 0.0072571 0.19477203] iteration 7 current block size: 3 eigenvalue: [ 4. 
5.00000006 6.00037219] residual norms: [ 0.00020097 0.00111362 0.08918231] iteration 8 current block size: 2 eigenvalue: [ 4. 5.00000001 6.00004161] residual norms: [ 3.24468469e-05 3.10698977e-04 2.47415455e-02] iteration 9 current block size: 1 eigenvalue: [ 4. 5. 6.00000402] residual norms: [ 1.44290611e-05 8.30172695e-05 9.47072140e-03] iteration 10 current block size: 1 eigenvalue: [ 4. 5. 6.00000052] residual norms: [ 1.42516389e-05 7.25968963e-05 2.56545826e-03] iteration 11 current block size: 1 eigenvalue: [ 4. 5. 6.00000015] residual norms: [ 1.39884037e-05 7.28790926e-05 9.90709310e-04] iteration 12 current block size: 1 eigenvalue: [ 4. 5. 6.00000005] residual norms: [ 1.38176556e-05 7.11253299e-05 6.92343064e-04] iteration 13 current block size: 1 eigenvalue: [ 4. 5. 6.00000001] residual norms: [ 1.40526840e-05 7.49691661e-05 4.12650573e-04] iteration 14 current block size: 1 eigenvalue: [ 4. 5. 6.] residual norms: [ 1.38113346e-05 6.82994247e-05 1.00085913e-04] iteration 15 final eigenvalue: [ 4. 5. 6.] 
final residual norms: [ 1.37083440e-05 6.79150257e-05 4.57570552e-05] solution time: 0.18 [[ 0.00000000e+00 0.00000000e+00 0.00000000e+00] [ 0.00000000e+00 0.00000000e+00 0.00000000e+00] [ 0.00000000e+00 0.00000000e+00 0.00000000e+00] [ -1.00000000e-00 -2.70166100e-12 -4.49407572e-13] [ 3.45135382e-12 -1.00000000e-00 -3.31004270e-11] [ 1.43272265e-12 4.16009456e-11 -1.00000000e-00] [ -3.01389337e-08 1.04266891e-09 -2.03536622e-06] [ -2.09568203e-07 -1.60001860e-06 -4.27908244e-06] [ -1.34192993e-07 8.93748787e-08 1.11527829e-06] [ -9.54056973e-09 1.65761534e-08 -7.28293414e-07] [ 5.80623027e-08 -6.78924269e-07 1.28523801e-07] [ 2.46652882e-08 2.81056345e-07 8.41732884e-07] [ -2.81842377e-07 2.11865121e-07 -5.11925778e-07] [ 3.79119237e-08 -4.94179195e-07 -9.66429188e-07] [ 3.87690758e-07 7.47167491e-07 1.46782419e-08] [ -2.78202949e-08 -1.26548930e-06 4.29164218e-07] [ 1.59183735e-07 1.61269484e-06 7.67351913e-07] [ -1.98747612e-08 -5.08476559e-07 -2.03110237e-08] [ -2.06671565e-07 4.88730427e-08 1.59871138e-07] [ 5.27846912e-08 1.83780694e-07 -5.21864962e-07] [ -2.77838525e-08 4.18872863e-08 -4.20896367e-07] [ 8.03607930e-09 2.21719430e-07 1.74850575e-08] [ -6.65735713e-08 -9.96554778e-08 -2.66443931e-07] [ -5.86778608e-09 3.58000119e-07 5.33751877e-08] [ -7.04258383e-08 -1.11932530e-06 -2.46227247e-07] [ 1.39589354e-08 -2.44617669e-07 -1.19422327e-07] [ -8.78909615e-09 -1.21292653e-07 2.67872685e-07] [ -4.36250627e-08 -2.18902034e-07 3.97840323e-07] [ 8.33167577e-08 -5.12521808e-07 -2.32577046e-07] [ 1.91312002e-08 -8.86341001e-08 1.51961315e-07] [ -3.77761175e-08 2.11412046e-08 3.88146759e-07] [ 1.25522710e-07 -3.78933326e-07 -2.14288715e-08] [ -1.13179554e-09 -1.30570575e-07 1.01778213e-07] [ 1.62448418e-08 3.44350353e-08 -2.75283159e-08] [ 1.71774982e-08 -2.20277907e-07 3.82335074e-08] [ -2.62120118e-08 -1.77989065e-07 -1.19618064e-07] [ -4.28509480e-08 -3.11927590e-07 -1.61041412e-07] [ -1.42879894e-08 -2.01377283e-07 -1.58396601e-07] [ 4.55012564e-08 
2.81283225e-07 6.95463005e-08] [ 3.01461234e-09 2.47687731e-07 3.22956377e-08] [ 1.49037369e-09 8.06908264e-08 2.08576615e-08] [ -1.76935675e-08 1.27197878e-08 -9.27809622e-08] [ -2.21573205e-08 2.36424514e-07 4.14499817e-08] [ -2.25901032e-09 8.98625917e-08 9.71048798e-08] [ -1.19762775e-08 1.66612827e-08 2.99754656e-08] [ 3.58501864e-08 -2.85612797e-07 -8.90575990e-09] [ -1.01618372e-08 1.89155074e-08 -1.92595357e-08] [ -9.79953541e-09 -3.66627417e-08 4.34605597e-08] [ -8.81422746e-08 3.25839765e-07 5.96807928e-08] [ -5.28638450e-08 1.41856053e-07 3.45967672e-08] [ -5.18085576e-08 6.93031478e-08 8.80196848e-08] [ -1.52218053e-08 -1.37899359e-07 1.04478564e-07] [ -2.69589941e-08 -5.93106849e-08 1.08700966e-07] [ -3.53545293e-08 8.19458663e-08 7.20559393e-08] [ 1.45946266e-09 3.31589157e-08 5.47101871e-08] [ -5.57443827e-10 -4.21819711e-08 2.67719546e-08] [ -3.55174528e-08 8.36400386e-08 9.51545029e-08] [ 3.24829200e-09 6.65526482e-08 4.00465143e-08] [ 1.34456481e-08 -6.03579314e-08 1.58145041e-07] [ 1.94699367e-08 -9.97358384e-08 1.70036092e-07] [ -1.15524633e-08 -2.12045092e-07 1.10068499e-07] [ 8.07683355e-09 1.26033397e-08 8.47700021e-08] [ -8.97404149e-09 -4.63495668e-08 4.12649793e-09] [ -7.26964073e-09 -1.76366897e-07 1.10018048e-07] [ -2.79732271e-09 -6.42588660e-08 -1.74486154e-08] [ 4.36118457e-09 -9.54737197e-08 -2.61915884e-09] [ -5.51719031e-09 -1.50083864e-07 8.42135138e-09] [ -2.44170002e-09 -1.35763732e-07 -3.17630000e-08] [ 1.58753411e-08 -5.95644323e-08 -1.26448328e-08] [ -9.60388044e-09 -3.16397182e-08 7.64129944e-08] [ 1.63884199e-08 -7.09138585e-08 1.62298772e-08] [ 1.57829668e-09 -2.45320884e-08 2.00587307e-08] [ -2.00664528e-09 -1.73441494e-08 2.94430307e-08] [ 2.43650546e-08 -9.48698823e-08 -1.84003993e-08] [ -8.54173260e-09 1.19209149e-08 2.76477653e-08] [ -1.81906746e-08 6.75961095e-08 5.55658348e-08] [ -1.91225406e-08 4.44607252e-08 1.76471499e-08] [ 3.59126559e-08 -9.07891824e-08 3.30484683e-09] [ -1.49658559e-08 5.63993043e-09 
-1.19076779e-08] [ 4.59041803e-08 -1.35202754e-07 1.62406623e-09] [ 1.89965469e-08 -5.78088704e-08 -6.41497552e-09] [ 1.14943654e-09 -6.67217334e-08 -2.91756305e-08] [ 2.13037665e-08 -9.37118168e-08 -5.55942868e-09] [ -9.19599377e-09 1.04534807e-07 -7.11401211e-09] [ 1.90727833e-08 -1.79862773e-07 -9.87066470e-09] [ 3.76495941e-08 -2.04832062e-07 2.31035468e-08] [ 9.00764569e-09 -1.21537959e-07 1.19765925e-08] [ -1.93891210e-09 -3.68929944e-08 -1.34838406e-08] [ 1.08746120e-08 -3.10136621e-08 1.06506222e-08] [ -9.79450248e-10 1.70759840e-08 -1.22923235e-08] [ -5.38118244e-09 1.31639610e-07 -9.29980232e-08] [ -9.87081445e-10 1.17067886e-08 -4.15305552e-09] [ -5.25364357e-09 3.43422921e-08 1.57743361e-08] [ -8.88358916e-09 -5.62225135e-08 8.34315725e-08] [ -6.18017047e-09 -7.98718649e-09 3.50353549e-08] [ 4.98672743e-09 1.26197815e-07 5.40157214e-08] [ -5.11599732e-08 -6.47092989e-08 2.17719994e-07] [ 2.72320287e-09 -3.24432174e-08 -1.43940020e-07] [ -3.17446906e-09 -3.00975281e-08 -9.30229880e-08] [ 7.62584618e-09 1.00928621e-07 8.24283838e-08]] [ 4. 5. 6.] From nwagner at iam.uni-stuttgart.de Tue Apr 8 11:52:09 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Apr 2008 17:52:09 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: <47FB7A20.2020703@ntc.zcu.cz> References: <47FB7A20.2020703@ntc.zcu.cz> Message-ID: On Tue, 08 Apr 2008 15:58:56 +0200 Robert Cimrman wrote: > Hi, > > so it is done, thanks again, Pearu, for your hint. > > could someone try on the fresh SVN version (4114): > > $ cd >/scipy/sparse/linalg/eigen/lobpcg > $ python lobpcg.py > > the output should end with: > [ 4. 5. 6.] > > r. 
> _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev Oops, I forgot to ask about the possibility of handling Hermitian matrices within lobpcg http://projects.scipy.org/scipy/scipy/ticket/452 Cheers, Nils From cimrman3 at ntc.zcu.cz Tue Apr 8 11:58:21 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 08 Apr 2008 17:58:21 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: References: <47FB7A20.2020703@ntc.zcu.cz> Message-ID: <47FB961D.50609@ntc.zcu.cz> Hi Nils, Nils Wagner wrote: > Oops, I forgot to ask about the possibility of handling > Hermitian matrices within lobpcg > > http://projects.scipy.org/scipy/scipy/ticket/452 Can you write a (failing) test case for this? You could also help by detecting the places in lobpcg.py causing it not to work, if you can. cheers, r. From wnbell at gmail.com Tue Apr 8 12:28:30 2008 From: wnbell at gmail.com (Nathan Bell) Date: Tue, 8 Apr 2008 11:28:30 -0500 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: <47FB7A20.2020703@ntc.zcu.cz> References: <47FB7A20.2020703@ntc.zcu.cz> Message-ID: On Tue, Apr 8, 2008 at 8:58 AM, Robert Cimrman wrote: > Hi, > > so it is done, thanks again, Pearu, for your hint. > > could someone try on the fresh SVN version (4114): > > $ cd /scipy/sparse/linalg/eigen/lobpcg > $ python lobpcg.py > > the output should end with: > [ 4. 5. 6.] Works for me! Thanks for tackling this issue. I was worried that lobpcg wouldn't be ready for the next release. Robert, could we switch the order of the A and BlockVectorX parameters? Having A first matches the convention used by other iterative solvers in SciPy. Also, we should provide a short description of what the method computes in terms of A, B, lambda, etc. A small example would be nice too (since the interface is a little different than, e.g. ARPACK). Thanks again!
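For the record, a small example of the kind Nathan asks for can be sketched as follows. It is written against the later scipy.sparse.linalg.lobpcg interface (A first, as Nathan proposes), so the parameter order and defaults are assumptions relative to the code being discussed here.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

n = 100
A = diags(np.arange(1, n + 1, dtype=float))  # eigenvalues are exactly 1..n

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 3))              # random initial block of 3 vectors

# Smallest three eigenpairs of the standard problem A x = lambda x
# (pass B for a generalized problem, M for a preconditioner).
w, v = lobpcg(A, X, largest=False, tol=1e-6, maxiter=200)

assert np.allclose(np.sort(w), [1.0, 2.0, 3.0], atol=1e-4)
```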
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From ondrej at certik.cz Tue Apr 8 12:39:18 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Tue, 8 Apr 2008 18:39:18 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: <3d375d730804071107q594d9184s7dd7f824be4c189@mail.gmail.com> References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> <3d375d730804071107q594d9184s7dd7f824be4c189@mail.gmail.com> Message-ID: <85b5c3130804080939n6e4176bdmf3dfb1617ff983f7@mail.gmail.com> On Mon, Apr 7, 2008 at 8:07 PM, Robert Kern wrote: > On Mon, Apr 7, 2008 at 10:55 AM, Ondrej Certik wrote: > > Hi, > > > > these are probably mainly questions for Nathan. :) > > > > First let me say that the new sparse functionality of scipy is > > awesome, I really like it, it's just so easy and natural to work with > > it. Thanks for your work on it. > > > > Here I have some points: > > > > 1) Why do you want to remove the umfpack wrappers from scipy? I suggest > > leaving the wrappers in there (they can be BSD, can't they?), > > That just confuses the issue. If you are going to use the UMFPACK > functionality, you will need UMFPACK and the GPL. Consequently, the > wrappers must be optional; all of scipy must be buildable and usable > without any GPLed code. Absolutely, scipy needs to just compile and just work, with BSD stuff only by default. > However, having optional functionality is bad; > in order to depend on it you can't just tell your users "install > scipy" but "install scipy with the UMFPACK wrappers installed". This > makes it especially difficult for things like Linux distributions. > They have to make the choice whether to include the GPL code or not. > If they do, then they can't include a non-GPL-compatible package that > depends on scipy. If they don't, they can't include a package that > needs the UMFPACK wrappers. Unrelated: is there any package depending on scipy that is incompatible with the GPL?
> > I would have spoken against the initial inclusion of the UMFPACK > wrappers if I had been paying attention at the time. I am glad to see > them being removed. However, just moving the wrappers to a scikit > doesn't solve any problem if scipy code is requiring the scikit. OK, I see your point. Honestly, I don't like this option either (having to decide at compile time which stuff is going in). In this case, scipy should be pure BSD and that's it. It should not depend on scikits; however, when the scikit is installed, the umfpack functionality will be back in. BTW, I don't like the current code in scipy/sparse/linalg/dsolve/linsolve.py, which switches between superlu and umfpack semiautomatically. I think there should be an easy-to-use solver for superlu (in scipy) and a separate one for umfpack (in scikits), ideally with the same interface. On Mon, Apr 7, 2008 at 8:09 PM, Nils Wagner wrote: [...] > > I have some improvements, for example > > scipy/sparse/linalg/eigen/info.py is missing the lobpcg, > > the attached > > patch fixes that. > > lobpcg is currently disabled. See ticket > http://projects.scipy.org/scipy/scipy/ticket/630 Right - I think Robert just fixed that now. Robert - could you please commit the doc fix, if lobpcg is in now? > BTW, an SVD for sparse matrices would be nice, too. Yep. So 1) and 5) were answered. Nathan, do you have any comments on 2), 3) and 4)? For example, here is code that converts a scipy matrix for petsc4py solvers:

from petsc4py.PETSc import Mat

def convert_csr2petsc(mtx):
    assert isspmatrix_csr(mtx)
    A = Mat().createSeqAIJ(mtx.shape, nz=mtx.shape[0])
    i = mtx.indptr
    j = mtx.indices
    v = mtx.data
    A.setValuesCSR(i, j, v)
    A.assemble()
    return A

I understand that scipy wants to be BSDed and self-contained, but I think all similar code (either external or non-BSD) should go to scikits then. I mean - I am using all these different solvers and I would like to have them all under one roof, using the same interfaces ideally.
So I wanted to check if this fits well with the scipy and scipy.sparse intentions, or whether I should rather think about how to do it in some separate project (scikit); but even in this case I'd like to put as many things into scipy as possible and only reuse scipy + add more things. Ondrej From cimrman3 at ntc.zcu.cz Tue Apr 8 12:43:57 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 08 Apr 2008 18:43:57 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: References: <47FB7A20.2020703@ntc.zcu.cz> Message-ID: <47FBA0CD.4030903@ntc.zcu.cz> Nathan Bell wrote: > On Tue, Apr 8, 2008 at 8:58 AM, Robert Cimrman wrote: >> Hi, >> >> so it is done, thanks again, Pearu, for your hint. >> >> could someone try on the fresh SVN version (4114): >> >> $ cd /scipy/sparse/linalg/eigen/lobpcg >> $ python lobpcg.py >> >> the output should end with: >> [ 4. 5. 6.] > > Works for me! But note that I just ran the single example that was already in the file... No power/time to make more :) > Thanks for tackling this issue. I was worried that lobpcg wouldn't be > ready for the next release. > > Robert, could we switch the order of the A and BlockVectorX > parameters? Having A first matches the convention used by other > iterative solvers in SciPy. Sure, go ahead, please. I propose even: A, B = None, M = None, blockVectorX = , ... > Also, we should provide a short description of what the method > computes in terms of A, B, lambda, etc. A small example would be nice > too (since the interface is a little different than, e.g. ARPACK). There are already some examples contributed by Nils in the tests directory. Concerning the method, I coded it following a paper by Andrew Knyazev (see info.py) quite some time ago, and now remember zero about what is going on :]. Thanks for testing it! cheers, r.
From pearu at cens.ioc.ee Tue Apr 8 13:51:29 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 8 Apr 2008 20:51:29 +0300 (EEST) Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <47FB6FA1.9000906@ntc.zcu.cz> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> Message-ID: <60458.88.89.195.89.1207677089.squirrel@cens.ioc.ee> On Tue, April 8, 2008 4:14 pm, Robert Cimrman wrote: > Does it mean that scipy.linalg should be updated to use the wrappers in > scipy.lib.lapack? Yes. And wrappers in scipy.linalg can be gradually removed, may be providing some backward compatibility hooks in case some user code uses scipy.linalg lapack wrappers. The same applies to blas wrappers. Pearu From pearu at cens.ioc.ee Tue Apr 8 13:52:19 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 8 Apr 2008 20:52:19 +0300 (EEST) Subject: [SciPy-dev] forcing scipy to use sources instead of system libraries? In-Reply-To: <47FB8C00.6010303@llnl.gov> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com><3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com><62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> <47FB8C00.6010303@llnl.gov> Message-ID: <60537.88.89.195.89.1207677139.squirrel@cens.ioc.ee> On Tue, April 8, 2008 6:15 pm, Charles Doutriaux wrote: > Hello, > > The blas installed on my system is apparently corrupted, is there a way > to force scipy to use the sources instead of the libs it finds on the > system? 
> > I do set the env variables: LAPACK_SRC and BLAS_SRC but it does not pick > this up because it finds some blas/lapack in /usr/lib Set also LAPACK=None BLAS=None Pearu From ondrej at certik.cz Tue Apr 8 13:52:31 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Tue, 8 Apr 2008 19:52:31 +0200 Subject: [SciPy-dev] typo in linalg/interface.py Message-ID: <85b5c3130804081052m70439440y81decab5350f623f@mail.gmail.com> Hi, I think this patch should be applied: $ svn di interface.py Index: interface.py =================================================================== --- interface.py (revision 4115) +++ interface.py (working copy) @@ -18,7 +18,7 @@ Required Parameters ------------------- shape : tuple of matrix dimensions (M,N) - matvec(x) : function that returns A * v + matvec(v) : function that returns A * v Optional Parameters ------------------- Ondrej From robert.kern at gmail.com Tue Apr 8 14:03:49 2008 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 8 Apr 2008 11:03:49 -0700 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <60458.88.89.195.89.1207677089.squirrel@cens.ioc.ee> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> <60458.88.89.195.89.1207677089.squirrel@cens.ioc.ee> Message-ID: <3d375d730804081103g3016ce01w63574788d94f760d@mail.gmail.com> On Tue, Apr 8, 2008 at 10:51 AM, Pearu Peterson wrote: > On Tue, April 8, 2008 4:14 pm, Robert Cimrman wrote: > > > Does it mean that scipy.linalg should be updated to use the wrappers in > > scipy.lib.lapack? > > Yes. And wrappers in scipy.linalg can be gradually removed, may be > providing some backward compatibility hooks in case some user code > uses scipy.linalg lapack wrappers. > > The same applies to blas wrappers. Can you specify a deadline as to when you will commit to getting this done? 
-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From pearu at cens.ioc.ee Tue Apr 8 14:15:16 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 8 Apr 2008 21:15:16 +0300 (EEST) Subject: [SciPy-dev] forcing scipy to use sources instead of system libraries? In-Reply-To: <60537.88.89.195.89.1207677139.squirrel@cens.ioc.ee> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com><3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com><62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> <47FB8C00.6010303@llnl.gov> <60537.88.89.195.89.1207677139.squirrel@cens.ioc.ee> Message-ID: <63751.88.89.195.89.1207678516.squirrel@cens.ioc.ee> On Tue, April 8, 2008 8:52 pm, Pearu Peterson wrote: > On Tue, April 8, 2008 6:15 pm, Charles Doutriaux wrote: >> Hello, >> >> The blas installed on my system is apparently corrupted, is there a way >> to force scipy to use the sources instead of the libs it finds on the >> system? >> >> I do set the env variables: LAPACK_SRC and BLAS_SRC but it does not pick >> this up because it finds some blas/lapack in /usr/lib > > Set also > > LAPACK=None > BLAS=None Also ATLAS=None might be needed. 
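Putting Pearu's suggestions together, a build that forces
numpy.distutils to compile BLAS/LAPACK from source rather than use the
corrupted system libraries would be set up roughly like this (the
source paths are placeholders for your own unpacked trees):

```shell
# Point the build at local BLAS/LAPACK sources (placeholder paths):
export BLAS_SRC=/path/to/blas/sources
export LAPACK_SRC=/path/to/lapack/sources
# Tell library detection to ignore everything found in /usr/lib:
export BLAS=None
export LAPACK=None
export ATLAS=None
echo "BLAS=$BLAS LAPACK=$LAPACK ATLAS=$ATLAS"
# then: python setup.py build
```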
Pearu From pearu at cens.ioc.ee Tue Apr 8 14:24:22 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 8 Apr 2008 21:24:22 +0300 (EEST) Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <3d375d730804081103g3016ce01w63574788d94f760d@mail.gmail.com> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> <60458.88.89.195.89.1207677089.squirrel@cens.ioc.ee> <3d375d730804081103g3016ce01w63574788d94f760d@mail.gmail.com> Message-ID: <64779.88.89.195.89.1207679062.squirrel@cens.ioc.ee> On Tue, April 8, 2008 9:03 pm, Robert Kern wrote: > On Tue, Apr 8, 2008 at 10:51 AM, Pearu Peterson wrote: >> On Tue, April 8, 2008 4:14 pm, Robert Cimrman wrote: >> >> > Does it mean that scipy.linalg should be updated to use the wrappers >> in >> > scipy.lib.lapack? >> >> Yes. And wrappers in scipy.linalg can be gradually removed, may be >> providing some backward compatibility hooks in case some user code >> uses scipy.linalg lapack wrappers. >> >> The same applies to blas wrappers. > > Can you specify a deadline as to when you will commit to getting this > done? It is difficult to say for me, I'd like to do that as soon as possible but at the moment I don't have resources for it. The task itself should be simple to carry out (most of it is copy and paste) but it also requires careful testing which might take more time as some signatures may be slightly different in scipy.lib.lapack, iirc. I know that in the coming two months I'll be busy, so let's set the deadline after that.. unless I'll take a weekend but I will not promise that now. Pearu From doutriaux1 at llnl.gov Tue Apr 8 15:23:56 2008 From: doutriaux1 at llnl.gov (Charles Doutriaux) Date: Tue, 08 Apr 2008 12:23:56 -0700 Subject: [SciPy-dev] forcing scipy to use sources instead of system libraries? 
In-Reply-To: <63751.88.89.195.89.1207678516.squirrel@cens.ioc.ee> References: <62894b9a0804071653u4f3b5f18kb441cb73502e88c6@mail.gmail.com><200804072059.09561.pgmdevlist@gmail.com><62894b9a0804071812m4b1632a9n1827f8bc85f24d93@mail.gmail.com><3d375d730804071938r6a6c66aci36ddb5d15a45eea1@mail.gmail.com><3d375d730804072018q54ecd5d6m9f0115bedcbc785a@mail.gmail.com><62894b9a0804080258y5824d689j48f97814c26c7733@mail.gmail.com> <47FB8C00.6010303@llnl.gov> <60537.88.89.195.89.1207677139.squirrel@cens.ioc.ee> <63751.88.89.195.89.1207678516.squirrel@cens.ioc.ee> Message-ID: <47FBC64C.9030500@llnl.gov> Thx Pearu, C. Pearu Peterson wrote: > On Tue, April 8, 2008 8:52 pm, Pearu Peterson wrote: > >> On Tue, April 8, 2008 6:15 pm, Charles Doutriaux wrote: >> >>> Hello, >>> >>> The blas installed on my system is apparently corrupted, is there a way >>> to force scipy to use the sources instead of the libs it finds on the >>> system? >>> >>> I do set the env variables: LAPACK_SRC and BLAS_SRC but it does not pick >>> this up because it finds some blas/lapack in /usr/lib >>> >> Set also >> >> LAPACK=None >> BLAS=None >> > > Also > ATLAS=None > might be needed. > > Pearu > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > From matthieu.brucher at gmail.com Tue Apr 8 15:42:07 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Tue, 8 Apr 2008 21:42:07 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment Message-ID: Hi, The current repository head of openopt is unstable because we are in the process of enhancing its behaviour: - There will be a registry where everyone will be able to... register its own openopt compatible wrapper of an optimizer. 
This way, the burden of developing these wrappers will not only rely on dmitrey, everyone will be able to write and test new ones for one's favorite solver - The sys.path list will not be modified by openopt anymore (ticket #50 on the scikits trac). This should make "python setup.py develop" possible. We have to check every solver so that it makes the correct imports and add it to the openopt registry, and then everything should be fine. Currently only the standalone solvers are available (I don't have the other packages installed). Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefan at sun.ac.za Tue Apr 8 15:54:36 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Tue, 8 Apr 2008 21:54:36 +0200 Subject: [SciPy-dev] typo in linalg/interface.py In-Reply-To: <85b5c3130804081052m70439440y81decab5350f623f@mail.gmail.com> References: <85b5c3130804081052m70439440y81decab5350f623f@mail.gmail.com> Message-ID: <9457e7c80804081254q66749086gf4d2911d5b7973a9@mail.gmail.com> Thanks, Ondrej. Cleaned up in r4117. 
On 08/04/2008, Ondrej Certik wrote: > Hi, > > I think this patch should be applied: > > $ svn di interface.py > Index: interface.py From nwagner at iam.uni-stuttgart.de Tue Apr 8 15:59:46 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Apr 2008 21:59:46 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: <47FB961D.50609@ntc.zcu.cz> References: <47FB7A20.2020703@ntc.zcu.cz> <47FB961D.50609@ntc.zcu.cz> Message-ID: On Tue, 08 Apr 2008 17:58:21 +0200 Robert Cimrman wrote: > Hi Nils, > > Nils Wagner wrote: >> Oops, I forgot to ask for the possibbility of handling >> Hermitian matrices within lobpcg >> >> http://projects.scipy.org/scipy/scipy/ticket/452 > > Can you write a (failing) test case for this? You could >also help by > detecting the places in lobpcg.py causing it not to >work, if you can. > > cheers, > r. > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev Finally, I found a complex hermitian test matrix. See http://www.cise.ufl.edu/research/sparse/matrices/Bai/mhd1280b.html So I will try to write a test case asap. and thanks for http://projects.scipy.org/scipy/scipy/changeset/4116 Now everyone can use lobpcg for testing. Cheers, Nils From ondrej at certik.cz Tue Apr 8 16:00:38 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Tue, 8 Apr 2008 22:00:38 +0200 Subject: [SciPy-dev] typo in linalg/interface.py In-Reply-To: <9457e7c80804081254q66749086gf4d2911d5b7973a9@mail.gmail.com> References: <85b5c3130804081052m70439440y81decab5350f623f@mail.gmail.com> <9457e7c80804081254q66749086gf4d2911d5b7973a9@mail.gmail.com> Message-ID: <85b5c3130804081300v3870c155ucd5ef5c8627a6aae@mail.gmail.com> Thanks Stefan. On Tue, Apr 8, 2008 at 9:54 PM, St?fan van der Walt wrote: > Thanks, Ondrej. Cleaned up in r4117. 
> > On 08/04/2008, Ondrej Certik wrote:
> > Hi,
> >
> > I think this patch should be applied:
> >
> > $ svn di interface.py
> > Index: interface.py
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-dev
>

From ondrej at certik.cz Tue Apr 8 16:05:08 2008
From: ondrej at certik.cz (Ondrej Certik)
Date: Tue, 8 Apr 2008 22:05:08 +0200
Subject: [SciPy-dev] docstring for leastsq (in scipy.optimize) incorrectly describes 'ier'
Message-ID: <85b5c3130804081305l62c73d7fu1455d0316d9951c6@mail.gmail.com>

See this bugreport for details:

http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=474930

This still applies to the svn version.

Ondrej

From dmitrey.kroshko at scipy.org Tue Apr 8 16:08:53 2008
From: dmitrey.kroshko at scipy.org (dmitrey)
Date: Tue, 08 Apr 2008 23:08:53 +0300
Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment
In-Reply-To:
References:
Message-ID: <47FBD0D5.2030208@scipy.org>

Matthieu Brucher wrote:
> Hi,
>
> The current repository head of openopt is unstable because we are in
> the process of enhancing its behaviour:
> - There will be a registry where everyone will be able to... register
> its own openopt compatible wrapper of an optimizer. This way, the
> burden of developing these wrappers will not only rely on dmitrey,
> everyone will be able to write and test new ones for one's favorite solver

The problem is I just don't see those dozens of people willing to
contribute sufficient solvers to OO. Some people answered they could do
it for the sake of money, but that's inappropriate, neither for me nor
for others.

> - The sys.path list will not be modified by openopt anymore (ticket
> #50 on the scikits trac). This should make "python setup.py develop"
> possible.

I still think the path issue should be left for the more distant future.
> > We have to check every solver so that it makes the correct imports and > add it to the openopt registry, and then everything should be fine. I still don't think it's a good idea > Currently only the standalone solvers are available (I don't have the > other packages installed). Regards, D. From wnbell at gmail.com Tue Apr 8 16:22:32 2008 From: wnbell at gmail.com (Nathan Bell) Date: Tue, 8 Apr 2008 15:22:32 -0500 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> Message-ID: On Mon, Apr 7, 2008 at 12:55 PM, Ondrej Certik wrote: Sorry for the late reply, I'm a little busy at the moment. Your concerns/goals are quite reasonable and > 1) Why do you want to remove umfpack wrappers from scipy? I suggest to > leave wrappers in there (those can be BSD, cannot they?), otherwise > it will be much harder for scipy users to use umfpack (they would have > to install scikits too). If so, could this warning message be removed? I have to agree with Robert K. on this point. Self-containment is a big deal IMO. > 2) why was pysparse removed? are there any objections of adding it > among arpack and lobpcg solvers? If I'm not mistaken, pysparse lived in the sandbox and the core functionality has largely been replaced/superceeded by that in scipy.sparse. I have no opposition to reintroducing pysparse solvers that are missing in scipy.sparse, provided that someone is willing to maintain them. > 3) sometimes I also use blzpack - are there any objections if I > implement it (if I find time of course) the same way lobpcg and arpack > is in there? I'm not familiar with blzpack, but I don't see any reason to prohibit it (assuming it's compatible with the BSD). > 4) I would love to see all available open source sparse solvers and > eigensolvers to be callable from scipy. Is this the intention of > scipy.sparse? 
Supporting a common interface to the various solvers would be quite useful. However, like (2) we should not support interfaces to optional libraries/methods. > Those are enough questions so far, depending on the answers, I'll > probably have some more then, regarding the interface. :) This is a valuable idea, however given the scope of the solvers you'd like to include, a scikit is more appropriate. It would be very confusing to users that something under the scipy namespace required UMFPACK/PETSc/etc. A scikit could support all these solvers while remaining self-contained. I doubt it would require much time before such a scikit became a commonly-installed component. Furthermore, living outside SciPy would permit us to add functionality on a separate timetable. As you've suggested, any functionality with an appropriate license (and sufficient appeal) could be moved to SciPy itself. We should start a wiki page (like http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper ) to discuss the interface. > I have some improvements, for example > scipy/sparse/linalg/eigen/info.py is missing the lobpcg, the attached > patch fixes that. Added in r4118 -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From matthieu.brucher at gmail.com Tue Apr 8 16:32:14 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Tue, 8 Apr 2008 22:32:14 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FBD0D5.2030208@scipy.org> References: <47FBD0D5.2030208@scipy.org> Message-ID: 2008/4/8, dmitrey : > > Matthieu Brucher wrote: > > Hi, > > > > The current repository head of openopt is unstable because we are in > > the process of enhancing its behaviour: > > - There will be a registry where everyone will be able to... register > > its own openopt compatible wrapper of an optimizer. 
This way, the > > burden of developing these wrappers will not only rely on dmitrey, > > everyone will be able to write and test new ones for one's favorite > solver > > The problem is I just mere don't see those dozens of people with willing > to contribute sufficient solvers to OO. Some people answered they could > do it in the sake of money, but it's inappropriate neither to me no to > others. I didn't say dozens of people. If one developer contributes one wrapper for a solver, it is worth it. > - The sys.path list will not be modified by openopt anymore (ticket > > #50 on the scikits trac). This should make "python setup.py develop" > > possible. > > I still think the path issue must be done in more far future. This has been an issue for more than a year because of a design issue. This is now solved in the lastest revision of the scikit. It didn't take me an evening. Other people asked for it, so this was high on my todo list. This doesn't change a thing for you, except from the fact that you have to use namespace, which is a good thing. > > > We have to check every solver so that it makes the correct imports and > > add it to the openopt registry, and then everything should be fine. > > I still don't think it's a good idea Openopt should not mess with the sys.path the way it does. Now, openopt has a classic way of importing subpckages, and it will not change a thing for the user. Even less namespace conflicts can happen now, and this is a good thing. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From ondrej at certik.cz Tue Apr 8 16:46:34 2008
From: ondrej at certik.cz (Ondrej Certik)
Date: Tue, 8 Apr 2008 22:46:34 +0200
Subject: [SciPy-dev] the current state and future directions of the sparse solvers
In-Reply-To:
References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com>
Message-ID: <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com>

Hi Nathan!

On Tue, Apr 8, 2008 at 10:22 PM, Nathan Bell wrote:
> On Mon, Apr 7, 2008 at 12:55 PM, Ondrej Certik wrote:
>
> Sorry for the late reply, I'm a little busy at the moment.

Thanks for the reply. I am quite busy too at the moment. :)

> > 2) why was pysparse removed? are there any objections of adding it
> > among arpack and lobpcg solvers?
>
> If I'm not mistaken, pysparse lived in the sandbox and the core
> functionality has largely been replaced/superseded by that in
> scipy.sparse.
>
> I have no opposition to reintroducing pysparse solvers that are
> missing in scipy.sparse, provided that someone is willing to maintain
> them.

OK, thanks, I am willing to maintain it. I'll try to substitute it with
the scipy solvers and if I find that pysparse has some advantages, I'll
prepare a patch.

> > 3) sometimes I also use blzpack - are there any objections if I
> > implement it (if I find time of course) the same way lobpcg and arpack
> > is in there?
>
> I'm not familiar with blzpack, but I don't see any reason to prohibit
> it (assuming it's compatible with the BSD).

Yes, it's BSD.

> > 4) I would love to see all available open source sparse solvers and
> > eigensolvers to be callable from scipy. Is this the intention of
> > scipy.sparse?
>
> Supporting a common interface to the various solvers would be quite
> useful. However, like (2) we should not support interfaces to
> optional libraries/methods.

You mean support in scipy by default, right? I would like to have the
unified interface to all solvers (including petsc and others).
> > Those are enough questions so far, depending on the answers, I'll > > probably have some more then, regarding the interface. :) > > This is a valuable idea, however given the scope of the solvers you'd > like to include, a scikit is more appropriate. It would be very > confusing to users that something under the scipy namespace required > UMFPACK/PETSc/etc. A scikit could support all these solvers while > remaining self-contained. I doubt it would require much time before > such a scikit became a commonly-installed component. Furthermore, > living outside SciPy would permit us to add functionality on a > separate timetable. As you've suggested, any functionality with an > appropriate license (and sufficient appeal) could be moved to SciPy > itself. OK, that sounds very good. I am going to study how to start such a scikit and we are going to move our additional solvers we have in sfepy to it (pysparse, soon petsc4py) -- are you ok with that Robert? :) Plus umfpack. Plus the eigensolvers, like primme, blzpack and others. And then we'll see how it goes and if some part is useful (like pysparse or blzpack) and with BSD license, it can later be moved to scipy directly. > > We should start a wiki page (like > http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper ) to discuss > the interface. ok, I'll start it when we move things to the scikit. Do you have some idea about the interface? I noticed the linalg/interface.py, that's the way to go. This is only for the iterative solvers though. We need to devise some unified interface for the direct solvers and also for eigensolvers. Ok, I'll try to follow the arpack example and then we'll see if it's ok, or it needs some changes for example in the blzpack case. For example the lobpcg now in scipy has a different interface, doesn't it? > > I have some improvements, for example > > scipy/sparse/linalg/eigen/info.py is missing the lobpcg, the attached > > patch fixes that. 
> > Added in r4118 Thanks, Ondrej From aisaac at american.edu Tue Apr 8 17:00:15 2008 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 8 Apr 2008 17:00:15 -0400 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FBD0D5.2030208@scipy.org> References: <47FBD0D5.2030208@scipy.org> Message-ID: >> The sys.path list will not be modified by openopt anymore >> (ticket #50 on the scikits trac). This should make >> "python setup.py develop" possible. On Tue, 08 Apr 2008, dmitrey apparently wrote: > I still think the path issue must be done in more far future. Certainly modifying sys.path is a generally a bad idea in a package that others will install. Can you explain why you would prefer to postpone this change? (You can always modify PYTHONPATH for your own use.) Cheers, Alan Isaac From nwagner at iam.uni-stuttgart.de Tue Apr 8 17:00:28 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Apr 2008 23:00:28 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com> References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com> Message-ID: On Tue, 8 Apr 2008 22:46:34 +0200 "Ondrej Certik" wrote: > Hi Nathan! > > On Tue, Apr 8, 2008 at 10:22 PM, Nathan Bell > wrote: >> On Mon, Apr 7, 2008 at 12:55 PM, Ondrej Certik >> wrote: >> >> Sorry for the late reply, I'm a little busy at the >>moment. > > Thanks for the reply. I am quite busy too at the moment. >:) > >> > 2) why was pysparse removed? are there any >>objections of adding it >> > among arpack and lobpcg solvers? >> >> If I'm not mistaken, pysparse lived in the sandbox and >>the core >> functionality has largely been replaced/superceeded by >>that in >> scipy.sparse. 
>> >> I have no opposition to reintroducing pysparse solvers >>that are >> missing in scipy.sparse, provided that someone is >>willing to maintain >> them. > > OK, thanks, I am willing to maintain it. I'll try to >substitute it > with the scipy solvers and if I find that pyspare has >some advantages, > I'll prepare a patch. > >> > 3) sometimes I also use blzpack - are there any >>objections if I >> > implement it (if I find time of course) the same way >>lobpcg and arpack >> > is in there? >> >> I'm not familiar with blzpack, but I don't see any >>reason to prohibit >> it (assuming it's compatible with the BSD). > > Yes, it's BSD. > >> > 4) I would love to see all available open source >>sparse solvers and >> > eigensolvers to be callable from scipy. Is this the >>intention of >> > scipy.sparse? >> >> Supporting a common interface to the various solvers >>would be quite >> useful. However, like (2) we should not support >>interfaces to >> optional libraries/methods. > > You mean support in scipy by default, right? I would >like to have the > unified interface to all solvers (including petsc and >others). > >> > Those are enough questions so far, depending on the >>answers, I'll >> > probably have some more then, regarding the >>interface. :) >> >> This is a valuable idea, however given the scope of the >>solvers you'd >> like to include, a scikit is more appropriate. It >>would be very >> confusing to users that something under the scipy >>namespace required >> UMFPACK/PETSc/etc. A scikit could support all these >>solvers while >> remaining self-contained. I doubt it would require >>much time before >> such a scikit became a commonly-installed component. >> Furthermore, >> living outside SciPy would permit us to add >>functionality on a >> separate timetable. As you've suggested, any >>functionality with an >> appropriate license (and sufficient appeal) could be >>moved to SciPy >> itself. > > OK, that sounds very good. 
I am going to study how to start such a scikit and we are going to move our additional solvers we have in sfepy to it (pysparse, soon petsc4py) -- are you ok with that Robert? :) Plus umfpack. Plus the eigensolvers, like primme, blzpack and others.

AFAIK Jacobi-Davidson is part of pysparse. And you probably know slepc

http://www.grycap.upv.es/slepc/
http://t2.unl.edu/documentation/slepc4py

And polyeig ... How about that?

Cheers, Nils

For nonlinear eigenvalue problems, see
http://www.mims.manchester.ac.uk/research/numerical-analysis/nlevp.html

From matthieu.brucher at gmail.com Tue Apr 8 17:03:20 2008
From: matthieu.brucher at gmail.com (Matthieu Brucher)
Date: Tue, 8 Apr 2008 23:03:20 +0200
Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment
In-Reply-To: <47FBD0D5.2030208@scipy.org>
References: <47FBD0D5.2030208@scipy.org>
Message-ID:

2008/4/8, dmitrey :
>
> Matthieu Brucher wrote:
> > Hi,
> >
> > The current repository head of openopt is unstable because we are in
> > the process of enhancing its behaviour:
> > - There will be a registry where everyone will be able to... register
> > its own openopt compatible wrapper of an optimizer. This way, the
> > burden of developing these wrappers will not only rely on dmitrey,
> > everyone will be able to write and test new ones for one's favorite
> > solver
>
> The problem is I just don't see those dozens of people willing to
> contribute sufficient solvers to OO. Some people answered they could
> do it for the sake of money, but that's inappropriate, neither for me
> nor for others.

I forgot ;) An additional advantage is that one can display the keys to
know which solver is available (because one forgot the name or wants to
select the best solver).
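A solver registry of the kind described here can be sketched in a few
lines; the names below are illustrative, not openopt's actual API:

```python
# Minimal solver registry: each wrapper registers under a string key,
# users look solvers up or list the available keys. (Hypothetical names.)
_solver_registry = {}

def register_solver(name, factory):
    """Called by each wrapper module to make its solver available."""
    _solver_registry[name] = factory

def get_solver(name):
    if name not in _solver_registry:
        raise KeyError("unknown solver %r; available: %s"
                       % (name, sorted(_solver_registry)))
    return _solver_registry[name]()

def available_solvers():
    """Display the keys, e.g. when one forgot a solver's name."""
    return sorted(_solver_registry)

register_solver("ralg", lambda: "ralg solver instance")
print(available_solvers())  # ['ralg']
```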
Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From nwagner at iam.uni-stuttgart.de Tue Apr 8 17:04:52 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Apr 2008 23:04:52 +0200 Subject: [SciPy-dev] arffread.py and python2.4 Message-ID: python2.4 doesn't support the following File "/usr/lib/python2.4/site-packages/scipy/io/arff/arffread.py", line 453 finally: ^ SyntaxError: invalid syntax Nils From robert.kern at gmail.com Tue Apr 8 17:10:10 2008 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 8 Apr 2008 14:10:10 -0700 Subject: [SciPy-dev] arffread.py and python2.4 In-Reply-To: References: Message-ID: <3d375d730804081410v7a9b8deej74a0259e485259d5@mail.gmail.com> Fixed in SVN. Thank you. On Tue, Apr 8, 2008 at 2:04 PM, Nils Wagner wrote: > python2.4 doesn't support the following > > File > "/usr/lib/python2.4/site-packages/scipy/io/arff/arffread.py", > line 453 > finally: > ^ > SyntaxError: invalid syntax > > Nils -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From nwagner at iam.uni-stuttgart.de Tue Apr 8 17:12:38 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Apr 2008 23:12:38 +0200 Subject: [SciPy-dev] import scipy.linalg in svn Message-ID: I guess I found a blocker in scipy svn >>> scipy.__version__ '0.7.0.dev4118' >>> import scipy.linalg Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 13, in ? from iterative import * File "/usr/lib/python2.4/site-packages/scipy/linalg/iterative.py", line 5, in ? 
from scipy.sparse.linalg import isolve File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/__init__.py", line 8, in ? from eigen import * File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/__init__.py", line 6, in ? from lobpcg import * File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/__init__.py", line 4, in ? import lobpcg File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/lobpcg.py", line 21, in ? import scipy.linalg as sla AttributeError: 'module' object has no attribute 'linalg' >>> from scipy import linalg Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 13, in ? from iterative import * File "/usr/lib/python2.4/site-packages/scipy/linalg/iterative.py", line 5, in ? from scipy.sparse.linalg import isolve File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/__init__.py", line 8, in ? from eigen import * File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/__init__.py", line 6, in ? from lobpcg import * File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/__init__.py", line 4, in ? import lobpcg File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/lobpcg.py", line 21, in ? import scipy.linalg as sla AttributeError: 'module' object has no attribute 'linalg' Am I missing something ? Nils From wnbell at gmail.com Tue Apr 8 17:13:18 2008 From: wnbell at gmail.com (Nathan Bell) Date: Tue, 8 Apr 2008 16:13:18 -0500 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com> References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com> Message-ID: On Tue, Apr 8, 2008 at 3:46 PM, Ondrej Certik wrote: > > Supporting a common interface to the various solvers would be quite > > useful. 
However, like (2) we should not support interfaces to > > optional libraries/methods. > > You mean support in scipy by default, right? I would like to have the > unified interface to all solvers (including petsc and others). I think that a common interface to UMFPACK/PETSc/etc. should live in a scikit. Ideally, the scikit would include those libraries itself. > OK, that sounds very good. I am going to study how to start such a > scikit and we are going to move our additional solvers we have in > sfepy to it (pysparse, soon petsc4py) -- are you ok with that Robert? > :) Plus umfpack. Plus the eigensolvers, like primme, blzpack and > others. I just recently saw a talk by some people that work on providing abstract interfaces to the solvers in Trilinos: http://trilinos.sandia.gov/packages/docs/r8.0/packages/stratimikos/doc/html/index.html http://trilinos.sandia.gov/packages/belos/ I have no experience with these, but they may provide some guidance. > And then we'll see how it goes and if some part is useful (like > pysparse or blzpack) and with BSD license, it can later be moved to > scipy directly. Absolutely. > Do you have some idea about the interface? I noticed the > linalg/interface.py, that's the way to go. This is only for the > iterative solvers though. LinearOperator was designed to formalize the interface expected by isolve/iterative.py I haven't thought about it much since then, so it's possible that additions/modifications will be necessary. > We need to devise some unified interface for the direct solvers and > also for eigensolvers. Ok, I'll try to follow the arpack example and > then we'll see if it's ok, or it needs some changes for example in the > blzpack case. > > For example the lobpcg now in scipy has a different interface, doesn't it? Only slightly. Since lobpcg operates on block vectors I added a .matmat() to LinearOperator. If an operator provides a matvec() method, but no matmat(), then the matvec() is applied to each column.
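[Editor's note: the matvec()-to-matmat() fallback Nathan describes can be sketched in plain Python. MinimalOperator below is a hypothetical stand-in for scipy.sparse.linalg.LinearOperator, not the real class; block vectors are represented as lists of column lists.]

```python
# Sketch of the matvec() -> matmat() fallback described above.
# MinimalOperator is a hypothetical stand-in, not the real
# scipy.sparse.linalg.LinearOperator; a block vector is a plain
# list of column vectors (each itself a list of numbers).

class MinimalOperator:
    def __init__(self, matvec, matmat=None):
        self.matvec = matvec
        if matmat is not None:
            # An operator with an optimized block version can supply it;
            # the instance attribute shadows the generic method below.
            self.matmat = matmat

    def matmat(self, columns):
        # Generic fallback: apply matvec to each column in turn.
        return [self.matvec(col) for col in columns]

# An operator that doubles its input vector; it only provides
# matvec, so matmat falls back to the column-by-column loop.
double = MinimalOperator(matvec=lambda v: [2 * x for x in v])
print(double.matmat([[1, 2], [3, 4]]))   # -> [[2, 4], [6, 8]]
```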
In the future some sparse matrices may have optimized matmat() for blockvectors. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Tue Apr 8 17:24:04 2008 From: wnbell at gmail.com (Nathan Bell) Date: Tue, 8 Apr 2008 16:24:04 -0500 Subject: [SciPy-dev] import scipy.linalg in svn In-Reply-To: References: Message-ID: On Tue, Apr 8, 2008 at 4:12 PM, Nils Wagner wrote: > I guess I found a blocker in scipy svn > >>> scipy.__version__ > '0.7.0.dev4118' > "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/lobpcg.py", > line 21, in ? > import scipy.linalg as sla > AttributeError: 'module' object has no attribute 'linalg' > > Am I missing something ? I committed a workaround in r4120. There's a circular chain of imports due to the deprecated iterative solvers that lived in scipy.linalg.iterative and now reside in scipy.sparse.linalg.isolve http://projects.scipy.org/scipy/scipy/changeset/4120 Please let me know if there's a better way to fix this problem. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From robert.kern at gmail.com Tue Apr 8 17:31:55 2008 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 8 Apr 2008 14:31:55 -0700 Subject: [SciPy-dev] import scipy.linalg in svn In-Reply-To: References: Message-ID: <3d375d730804081431r5a8f5483uafae4a2850248b3f@mail.gmail.com> On Tue, Apr 8, 2008 at 2:24 PM, Nathan Bell wrote: > On Tue, Apr 8, 2008 at 4:12 PM, Nils Wagner > wrote: > > I guess I found a blocker in scipy svn > > >>> scipy.__version__ > > '0.7.0.dev4118' > > > "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/lobpcg.py", > > line 21, in ? > > import scipy.linalg as sla > > AttributeError: 'module' object has no attribute 'linalg' > > > > Am I missing something ? > > I committed a workaround in r4120. 
There's a circular chain of > imports due to the deprecated iterative solvers that lived in > scipy.linalg.iterative and now reside in scipy.sparse.linalg.isolve > http://projects.scipy.org/scipy/scipy/changeset/4120 > > Please let me know if there's a better way to fix this problem. Except for removing the deprecated parts in scipy.linalg.iterative, this is a perfectly acceptable workaround. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From nwagner at iam.uni-stuttgart.de Tue Apr 8 17:49:12 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 08 Apr 2008 23:49:12 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: <47FB961D.50609@ntc.zcu.cz> References: <47FB7A20.2020703@ntc.zcu.cz> <47FB961D.50609@ntc.zcu.cz> Message-ID: On Tue, 08 Apr 2008 17:58:21 +0200 Robert Cimrman wrote: > Hi Nils, > > Nils Wagner wrote: >> Oops, I forgot to ask for the possibbility of handling >> Hermitian matrices within lobpcg >> >> http://projects.scipy.org/scipy/scipy/ticket/452 > > Can you write a (failing) test case for this? You could >also help by > detecting the places in lobpcg.py causing it not to >work, if you can. > > cheers, > r. > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev The attached script fails Traceback (most recent call last): File "test_hermitian.py", line 22, in ? 
eigs,vecs, resnh = lobpcg(X,A,residualTolerance = 1e-6, maxIterations =500, retResidualNormsHistory=1) File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/lobpcg.py", line 384, in lobpcg aux = b_orthonormalize( B, activeBlockVectorR ) File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/lobpcg.py", line 130, in b_orthonormalize gramVBV = sla.cholesky( gramVBV ) File "/usr/lib/python2.4/site-packages/scipy/linalg/decomp.py", line 898, in cholesky if info>0: raise LinAlgError, "matrix not positive definite" numpy.linalg.linalg.LinAlgError: matrix not positive definite But for what reason ? Nils -------------- next part -------------- A non-text attachment was scrubbed... Name: test_hermitian.py Type: text/x-python Size: 562 bytes Desc: not available URL: From aisaac at american.edu Tue Apr 8 23:45:35 2008 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 8 Apr 2008 23:45:35 -0400 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FBD0D5.2030208@scipy.org> References: <47FBD0D5.2030208@scipy.org> Message-ID: > Matthieu: >> We have to check every solver so that it makes the >> correct imports and add it to the openopt registry, and >> then everything should be fine. On Tue, 08 Apr 2008, dmitrey wrote: > I still don't think it's a good idea Are you two going to work this out in private or are you going to add some details about the design considerations and seek input from the SciPy community? 
Cheers, Alan From matthieu.brucher at gmail.com Wed Apr 9 02:01:55 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 08:01:55 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> Message-ID: > > On Tue, 08 Apr 2008, dmitrey wrote: > > I still don't think it's a good idea > > > Are you two going to work this out in private or are you > going to add some details about the design considerations > and seek input from the SciPy community? > I'd like to have some feedback. For the moment, the changes will not be seen by the typical end-user. No API was changed. For the sys.path modification to be removed, I had to find a way of importing the corresponding solver that is stored somewhere in the hierarchy, generally in a module named scikits.openopt.solvers.*.solver_oo. So now that the parent modules (scikits.openopt.solvers.*) are no longer in the path, the easiest way (at least for me) was to add this information in a registry, so the available_solvers registry was created. In this configuration, everything that is in solvers uses the Kernel, but the Kernel does not use anything in solvers, they are only called by the registry. I think that dmitrey did a great job in having a clean cut here. For the moment, the registry is not advertised and not accessible as scikits.openopt.available_solvers. It is still hidden in scikits.openopt.Kernel.runProbSolver, but this can be easily modified. Now people can register a new wrapper by calling: scikits.openopt.Kernel.runProbSolver.available_solvers['name'] = 'path.to.module' I agree with dmitrey, there will not be a lot of people that will use this, but if we can get one contributed wrapper this way, I think it will be great for everyone, dmitrey included, as he is the one behind the current wrappers.
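[Editor's note: the registry pattern described above boils down to a dict of dotted module paths plus a small import helper. The sketch below is a hypothetical reconstruction, not the actual runProbSolver code; the 'join_solver' entry uses the stdlib os.path module as a stand-in for a real solver module.]

```python
# Hypothetical sketch of the available_solvers registry pattern:
# solver names map to dotted module paths that are imported lazily
# when the solver is first requested.

def my_import(dotted_name):
    # __import__('a.b.c') returns the top-level package 'a', so
    # walk down the remaining components by hand.
    module = __import__(dotted_name)
    for part in dotted_name.split('.')[1:]:
        module = getattr(module, part)
    return module

# Stand-in registry: a real entry would look something like
# {'ALGENCAN': 'scikits.openopt.solvers.BrasilOpt.ALGENCAN_oo'}.
# Here 'join_solver' points at the stdlib os.path module instead.
available_solvers = {'join_solver': 'os.path'}

def run_solver(name, *args):
    module = my_import(available_solvers[name])
    return module.join(*args)    # look up the solver's entry point

print(run_solver('join_solver', 'a', 'b'))
```

Third parties can then plug in a wrapper simply by adding a key to the dict, without touching any existing file.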
On a side note, the generic framework is finally importable like any other scikit (this was not the case until two or three days ago, because of this design, and the Technology Preview in scikits.learn can heavily depend on it): from openopt.scikits.solvers import optimizers So the available_solvers registry is the only addition in the scikit, and I think that wrapper developers will appreciate not having to add files to the scikit. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmitrey.kroshko at scipy.org Wed Apr 9 02:26:04 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 09:26:04 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> Message-ID: <47FC617C.1080703@scipy.org> So now nlp_ALGENCAN.py and milp_1.py just don't work, maybe some others as well. D. P.S. OO installation page says extracting OO from svn is recommended way. So users should be informed of the issue, not just scipy-dev. On the other hand, I prefer to check all /examples/*.py pass ok BEFORE committing anything to svn. From matthieu.brucher at gmail.com Wed Apr 9 02:45:05 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 08:45:05 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FC617C.1080703@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> Message-ID: 2008/4/9, dmitrey : > > So now nlp_ALGENCAN.py and milp_1.py just don't work, maybe some others as > well. Perhaps with the errors, I could find out what is happening? P.S. OO installation page says extracting OO from svn is recommended > way. So users should be informed of the issue, not just scipy-dev.
That's what packages are made for. When the package is in a stable state, one can think about modifying files so that the module is enhanced. This fix waited far too long. And besides, I think that the code I provided in the scikit should be available for people who want it (generic framework that you hid in the solvers). I once started a similar fix and you rolled back everything. I'm sorry, but you are not the only one that spent days, weeks, to create the package. I spent hours trying to do something you said you would do months ago (this fix). But sometimes, decisions have to be made, because other code depends on it. I already waited too long to fix this design (this was one of the reasons I did not add new stuff to the package, as I had no way of advertising it to the outside world). On > the other hand, I prefer to check all /examples/*.py pass ok BEFORE > committing anything to svn. > That cannot always be the case. Besides, since your last release, I don't think you committed code, so people who want to use the last stable release can do so. And people who want to use the full power of the package can do so now. Sorry for the harsh criticism, but I've been thinking about this for several months. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Wed Apr 9 02:59:00 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 08:59:00 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FC617C.1080703@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> Message-ID: 2008/4/9, dmitrey : > > So now nlp_ALGENCAN.py and milp_1.py just don't work, maybe some others as > well.
With a fresh SVN checkout, I only get: "OO Error:incorrect solver is called, maybe the solver "ALGENCAN" is not installed. Maybe setting p.debug=1 could specify the matter more precisely" Which is the correct way. If you don't have a fresh SVN, you will have bogus packages remaining in your Kernel folder (BTW, this is not my mistake, I didn't add or remove any folder in Kernel, so the issue/bug must have been present for some weeks). Restart from scratch and everything will be fine (well, I didn't test with algencan, but I suppose the traceback you got is the one I got). When it comes to milp_1.py, I just checked on the SVN repository, and it seems that the error is due to revision 900. So you commit code without checking every example as well. That's what tests are made for. I think you should read Test-Driven Development from Kent Beck as well as Refactoring to Patterns from Martin Fowler. Tests are what you are looking for to automatically check that the solvers are in a clean state. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmitrey.kroshko at scipy.org Wed Apr 9 03:14:56 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 10:14:56 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> Message-ID: <47FC6CF0.2000005@scipy.org> Matthieu Brucher wrote: > > > P.S. OO installation page says extracting OO from svn is recommended > way. So users should be informed of the issue, not just scipy-dev. > > > > That's what packages are made for. When the package is in a stable > state, one can think about modifying files so that the module is > enhanced. This fix waited far too long.
And besides, I think that the > code I provided in the scikit should be available for people who want > it (generic framework that you hid in the solvers). In the letter from 09/12/2007 I had proposed that you install your optimizers to the same path OpenOpt is installed to, i.e. using from scikits.genericopt import ... You had answered: "Do what you think is the best, I don't mind using from openopt... import optimizers" > I once started a similar fix and you rolled back everything. I only got to know about those changes when Nils Wagner informed me that the latest OO from subversion just doesn't work. > I'm sorry, but you are not the only one that spent days, weeks, to > create the package. I spent hours trying to do something you said you > would do months ago (this fix). But sometimes, decisions have to be > made, because other code depends on it. I already waited too long to > fix this design (this was one of the reasons I did not add new stuff > to the package, as I had no way of advertising it to the outside world). Do you realize what any project will look like if anyone implements his own changes wrt his own needs only, w/o even informing others? > > > On > the other hand, I prefer to check all /examples/*.py pass ok BEFORE > committing anything to svn. > > > That cannot always be the case. Besides, since your last release, I > don't think you committed code, I have committed some since v 0.17 (changes to ralg + bugfix to ALGENCAN), and I'm just waiting for the GSoC decision that has to be announced 11 April. Since I had mentioned my intentions of participating in GSoC in my blog that had been connected to scipy-planet by someone, I guess you are aware of this. BTW anyone can have some holidays during the year. > so people who want to use the last stable release can do so. And > people who want to use the full power of the package can do so now.
I still don't think some days of my not-committing give rights to those abstract people you have mentioned to do whatever they want with OO code. > > Sorry for the harsh criticism, but I 'm thinking about this for > several months. > > Matthieu And after those several months you can't wait these 2 days before GSoC results? D. > > > 2008/4/9, dmitrey >: > > So now nlp_ALGENCAN.py and milp_1.py just don't work, mb some > others as > well. > > > > Perhaps with the erros, I could be able to find out what is happening ? Traceback (most recent call last): File "nlp_ALGENCAN.py", line 112, in r = p.solve('ALGENCAN') File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/BaseProblem.py", line 231, in solve return runProbSolver(self, solvers, *args, **kwargs) File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/runProbSolver.py", line 67, in runProbSolver solverClass = getattr(my_import(available_solvers[solver_str]), solver_str) File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/runProbSolver.py", line 50, in my_import mod = __import__(name) File "/usr/lib/python2.5/site-packages/scikits/openopt/solvers/BrasilOpt/ALGENCAN_oo.py", line 5, in from scikits.openopt.setDefaultIterFuncs import SMALL_DF ImportError: No module named setDefaultIterFuncs Traceback (most recent call last): File "milp_1.py", line 17, in r = p.solve('lpSolve') File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/BaseProblem.py", line 231, in solve return runProbSolver(self, solvers, *args, **kwargs) File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/runProbSolver.py", line 199, in runProbSolver solver(p) File "/usr/lib/python2.5/site-packages/scikits/openopt/solvers/lp_solve/lpSolve_oo.py", line 26, in __solver__ [obj, x_opt, duals] = lps(List(f.flatten()), List(p.Awhole), List(p.bwhole.flatten()), List(p.dwhole.flatten()), \ NameError: global name 'lps' is not defined From dmitrey.kroshko at scipy.org Wed Apr 9 03:28:38 2008 From: dmitrey.kroshko at 
scipy.org (dmitrey) Date: Wed, 09 Apr 2008 10:28:38 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> Message-ID: <47FC7026.5050802@scipy.org> Matthieu Brucher wrote: > > > 2008/4/9, dmitrey >: > > So now nlp_ALGENCAN.py and milp_1.py just don't work, mb some > others as > well. > > > With a fresh SVN checkout, I only get : > "OO Error:incorrect solver is called, maybe the solver "ALGENCAN" is > not installed. Maybe setting p.debug=1 could specify the matter more > precisely" > > Which is the correct way. If you don't have a fresh SVN, you will have > bogus packages that are remaining in your Kernel folder (BTW, this is > not my mistake, I didn't add or remove any folder in Kernel, so the > issue/bug must be present for some weeks). Restart from scratch and > everything will be fine (well, I didn't test with algencan, but I > suppose the traceback you got is the one I got). > > When it comes to milp_1.py, I just checked on the SVN repository, and > it seems that the error is due to revision 900. Yes, I have fixed this. > So you commit code without checking every example as well. > > That's what test are made for. I think you should read Test-Driven > Development from Kent Beck as well as Refactoring to Patterns from > Martin Fowler. Tests are what you are looking forr to automacally > check that the solvers are in a clean state. Unittests are mentioned in my GSoC application. http://scipy.org/scipy/scikits/wiki/OOTimeLine D. 
> > Matthieu > -- > French PhD student > Website : http://matthieu-brucher.developpez.com/ > Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 > LinkedIn : http://www.linkedin.com/in/matthieubrucher > ------------------------------------------------------------------------ > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From ondrej at certik.cz Wed Apr 9 03:45:28 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Wed, 9 Apr 2008 09:45:28 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com> Message-ID: <85b5c3130804090045pf8703cav41402c43ca0ef620@mail.gmail.com> On Tue, Apr 8, 2008 at 11:00 PM, Nils Wagner wrote: > On Tue, 8 Apr 2008 22:46:34 +0200 > > > OK, that sounds very good. I am going to study how to > >start such a > > scikit and we are going to move our additional solvers > >we have in > > sfepy to it (pysparse, soon petsc4py) -- are you ok with > >that Robert? > > :) Plus umfpack. Plus the eigensolvers, like primme, > >blzpack and > > others. > > > AFAIK Jacobi-Davidson is part of pysparse. Yep. > And you probably know slepc > > http://www.grycap.upv.es/slepc/ > > http://t2.unl.edu/documentation/slepc4py Of course. I even know the authors of slepc personally. It's a good project, but unfortunately, slepc is not free software, so I don't want to depend on it. I think and I hope Jose (one of the authors) will release it as open source in a couple of years, but for now it is not. > > And polyeig ... How about that ? You mean this? http://www-eleves-isia.cma.fr/documentation/matlab/techdoc/ref/polyeig.html I don't need it for my problems, so...
:) > > Cheers, > > Nils > > For nonlinear eigenvalue problems, see > > http://www.mims.manchester.ac.uk/research/numerical-analysis/nlevp.html Thanks, I didn't know about this one. Ondrej From fperez.net at gmail.com Wed Apr 9 03:52:36 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Wed, 9 Apr 2008 00:52:36 -0700 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FC7026.5050802@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> Message-ID: On Wed, Apr 9, 2008 at 12:28 AM, dmitrey wrote: > Matthieu Brucher wrote: > > That's what test are made for. I think you should read Test-Driven > > Development from Kent Beck as well as Refactoring to Patterns from > > Martin Fowler. Tests are what you are looking forr to automacally > > check that the solvers are in a clean state. > Unittests are mentioned in my GSoC application. > http://scipy.org/scipy/scikits/wiki/OOTimeLine It seems to me that if your timeline starts in May and your 'writing unittests' line item is listed in August after the 'pencils down' mark, perhaps test-driven development isn't exactly what you have in mind... But I could well be missing something, it's late and I'm tired. Cheers, f From dmitrey.kroshko at scipy.org Wed Apr 9 04:57:25 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 11:57:25 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> Message-ID: <47FC84F5.9020302@scipy.org> Fernando Perez wrote: > On Wed, Apr 9, 2008 at 12:28 AM, dmitrey wrote: > >> Matthieu Brucher wrote: >> > > >> > That's what test are made for. I think you should read Test-Driven >> > Development from Kent Beck as well as Refactoring to Patterns from >> > Martin Fowler. Tests are what you are looking forr to automacally >> > check that the solvers are in a clean state. 
>> Unittests are mentioned in my GSoC application. >> http://scipy.org/scipy/scikits/wiki/OOTimeLine >> > > It seems to me that if your timeline starts in May and your 'writing > unittests' line item is listed in August after the 'pencils down' > mark, perhaps test-driven development isn't exactly what you have in > mind... But I could well be missing something, it's late and I'm > tired. > > Cheers, > > f > Alan G Isaac told me the same. As I answered him, it's not my idea to move unittests to the GSoC end. It's from the GSoC schedule chapter: http://code.google.com/opensource/gsoc/2008/faqs.html#0.1_timeline August 11: */Suggested/* 'pencils down' date. Take a week to scrub code, write tests, improve documentation, etc. Regards, D. From stefan at sun.ac.za Wed Apr 9 05:47:56 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Wed, 9 Apr 2008 11:47:56 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FC84F5.9020302@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> <47FC84F5.9020302@scipy.org> Message-ID: <9457e7c80804090247q465162eci575b9a3afd6ec698@mail.gmail.com> On 09/04/2008, dmitrey wrote: > Alan G Isaac told me the same. > As I answered him, it's not my idea to move unittests > to the GSoC end. > It's from the GSoC schedule chapter: > http://code.google.com/opensource/gsoc/2008/faqs.html#0.1_timeline > August 11: */Suggested/* 'pencils down' date. Take a week to scrub code, > write tests, improve documentation, etc. That should probably have been "write additional tests". If you want to produce high quality code, you can't get away from writing tests as you go.
Regards Stéfan From cimrman3 at ntc.zcu.cz Wed Apr 9 06:00:03 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 09 Apr 2008 12:00:03 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com> Message-ID: <47FC93A3.7040100@ntc.zcu.cz> Hi Nathan, Nathan Bell wrote: >> OK, that sounds very good. I am going to study how to start such a >> scikit and we are going to move our additional solvers we have in >> sfepy to it (pysparse, soon petsc4py) -- are you ok with that Robert? >> :) Plus umfpack. Plus the eigensolvers, like primme, blzpack and >> others. Sure! > I just recently saw a talk by some people that work on providing > abstract interfaces to the solvers in Trilinos: > http://trilinos.sandia.gov/packages/docs/r8.0/packages/stratimikos/doc/html/index.html > http://trilinos.sandia.gov/packages/belos/ > > I have no experience with these, but they may provide some guidance. I will look at this for inspiration, too, but see below. >> Do you have some idea about the interface? I noticed the >> linalg/interface.py, that's the way to go. This is only for the >> iterative solvers though. > > LinearOperator was designed to formalize the interface expected by > isolve/iterative.py > > I haven't thought about it much since then, so it's possible that > additions/modifications will be necessary. What do you think about the proposal of the various solver classes I posted? We (with Ondrej) could try it as a proof-of-concept within the solvers scikit that is slowly emerging from this discussion. cheers, r.
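[Editor's note: a unified direct-solver interface of the kind being discussed might look roughly like the sketch below. This is a hypothetical illustration with a toy diagonal backend, not the proposal Robert actually posted; real backends would wrap UMFPACK, SuperLU, etc.]

```python
# Hypothetical sketch of a common direct-solver interface: direct
# solvers pay a one-time factorization cost, then solve cheaply for
# many right-hand sides. Vectors are plain lists of floats.

class DirectSolver:
    """Common interface: factorize once, solve many times."""

    def __init__(self, matrix):
        self.matrix = matrix
        self.factorized = False

    def factorize(self):
        # A real backend (UMFPACK, SuperLU, ...) would compute an
        # LU factorization here; this stub only records the call.
        self.factorized = True

    def solve(self, rhs):
        if not self.factorized:
            self.factorize()
        return self._solve_impl(rhs)

    def _solve_impl(self, rhs):
        raise NotImplementedError


class DiagonalSolver(DirectSolver):
    """Toy backend: 'matrix' is just a list of diagonal entries."""

    def _solve_impl(self, rhs):
        return [b / d for b, d in zip(rhs, self.matrix)]


solver = DiagonalSolver([2.0, 4.0])
print(solver.solve([2.0, 8.0]))   # -> [1.0, 2.0]
```

Iterative and eigen solvers could follow the same shape, with the operator passed as a LinearOperator instead of an explicit matrix.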
From matthieu.brucher at gmail.com Wed Apr 9 08:33:18 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 14:33:18 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FC6CF0.2000005@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> Message-ID: > > > That's what packages are made for. When the package is in a stable > > state, one can think about modifying files so that the module is > > enhanced. This fix waited far too long. And besides, I think that the > > code I provided in the scikit should be available for people who want > > it (generic framework that you hid in the solvers). > > > In the letter from 09/12/2007 I had proposed you to install your > optimizers to same path OpenOpt is installed, i.e. using from > scikits.genericopt import ... > You had answered: "Do what you think is the best, I don't mind using > from openopt... import optimizers" Yes, I would like to use this, but for several _months_, you did nothing in that matter. > I once started a similar fix and you rolled back everything. > > > I had got to know about those changes when Nils Wagner had informed me > that latest OO from subversion just don't work. Breaking my own code in the process. I didn't say a thing because you were so busy. > I'm sorry, but you are not the only one that spent days, weeks, to > > create the package. I spent hours trying to do something you said you > > would do months ago (this fix). But sometimes, decisions have to be > > made, because other code depend on it. I already waited too long to > > fix this design (this was one of the reasons I did not add new stuff > > to the package, as I had no way of advertizing it to the outside world). > > Do you realize what any project will look like if anyone will implement > his own changes wrt his own needs only, w/o even informing others? 
Perhaps then you should start listening to what I'm saying for several months? > so people who want to use the last stable release can do so. And > > people who want to use the full power of the package can do so now. > > I still don't think some days of my not-committing give rights to those > abstract people you have mentioned to do whatever they want with OO code. They don't do anything with OO code. Please read what I'm saying. They will add the name of their wrapper to the registry themselves (not modifying openopt code). NOTHING ELSE. > And after those several months you can't wait these 2 days before GSoC > results? > I don't know about the deadlines. I'm just worried about code I promised several people to give to the community. And now, with my fixes, everybody is happy, because it doesn't change a thing for you or anyone else using the openopt code. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmitrey.kroshko at scipy.org Wed Apr 9 08:51:11 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 15:51:11 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> Message-ID: <47FCBBBF.2000302@scipy.org> Matthieu Brucher wrote: > > > That's what packages are made for. When the package is in a stable > > state, one can think about modifying files so that the module is > > enhanced. This fix waited far too long. And besides, I think > that the > > code I provided in the scikit should be available for people who > want > > it (generic framework that you hid in the solvers).
> > > In the letter from 09/12/2007 I had proposed you to install your > optimizers to same path OpenOpt is installed, i.e. using from > scikits.genericopt import ... > You had answered: "Do what you think is the best, I don't mind using > from openopt... import optimizers" > > > > Yes, I would like to use this, but for several _months_, you did > nothing in that matter. You should just have informed me that something IS NOT WORKING AT ALL, whereas you had only said that something works in another way than you desire. > > > > I once started a similar fix and you rolled back everything. > > > I had got to know about those changes when Nils Wagner had informed me > that latest OO from subversion just don't work. > > > > Breaking my own code in the process. I didn't say a thing because you > were so busy. Ok, that means that anyone could commit whatever he wants and say "I implemented my own changes to svn and I haven't informed you, because you are too busy!" > > > > I'm sorry, but you are not the only one that spent days, weeks, to > > create the package. I spent hours trying to do something you > said you > > would do months ago (this fix). But sometimes, decisions have to be > > made, because other code depends on it. I already waited too long to > > fix this design (this was one of the reasons I did not add new stuff > > to the package, as I had no way of advertising it to the outside > world). > > Do you realize what any project will look like if anyone will > implement > his own changes wrt his own needs only, w/o even informing others? > > > > Perhaps then you should start listening to what I'm saying for several > months ? there are lots of people telling me different things about what OO should look like. I CAN'T satisfy all those opinions because they are just CONTRADICTORY, and because I have my own point of view what OO should be. > > > > so people who want to use the last stable release can do so. And > > people who want to use the full power of the package can do so now.
> > I still don't think some days of my not-committing give rights to > those > abstract people you have mentioned to do whatever they want with > OO code. > > > > They don't do anything with OO code. Please read what I'm saying. > They will add the name of their wrapper to the registry themselves > (not modifying openopt code). NOTHING ELSE.

Anyone could connect his solver regardless of your changes. There was no need to create any registry; anyone could just put his _oo.py file into the /solvers folder, use his solver, and submit his solver(s) (_oo.py files) to openopt svn.

> > > And after those several months you can't wait these 2 days before GSoC > results? > > > I don't know about the deadlines.

But at least you know (particularly from the scipy mailing lists) how much GSoC means to me and that the results have to be known soon.

> I'm just worried about code I promised several people to give to the > community.

Ok, suppose I promise my dept to change "lil_matrix" to "sparse" - does that mean I can commit my changes w/o asking Nathan's permission? Just because I (+, optionally, "some people", or even "most people") consider these changes to be good?

> And now, with my fixes, everybody is happy,

I'm not, and that alone is enough to make your statement 100% false. D.

From guyer at nist.gov Wed Apr 9 09:05:01 2008 From: guyer at nist.gov (Jonathan Guyer) Date: Wed, 9 Apr 2008 09:05:01 -0400 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FCBBBF.2000302@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> <47FCBBBF.2000302@scipy.org> Message-ID:

Somehow I don't think this is exactly what Alan had in mind when he asked you two to "add some details about the design considerations and seek input from the SciPy community". Any chance you guys could take your little hissy fit elsewhere?
From matthieu.brucher at gmail.com Wed Apr 9 09:15:30 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 15:15:30 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FCBBBF.2000302@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> <47FCBBBF.2000302@scipy.org> Message-ID:

> > > Yes, I would like to use this, but for several _months_, you did > > nothing in that matter. > > You should just inform that something IS NOT WORKING AT ALL, while you > had said that something works in other way than you desire only.

I was not the only one to tell you that your system was broken. What I coded WAS NOT ACCESSIBLE AT ALL, as you had said it would be. I tried it several times, and when I saw that you didn't do the changes you said you would do (09/12/2007), I did them myself, and this time correctly. You just told us that we had to wait for you to have financial support to do so. I agree that I should have been less careless.

> Breaking my own code in the process. I didn't say a thing because you > > were so busy. > > Ok, that means that anyone could commit whatever he want and say "I > implemented my own changes to svn and I haven't informed you, because > you are too busy!"

No, but I'm a maintainer of this package as you are. You broke the import of a part of the scikit, and you didn't bother to fix it, even though you said you would do so.

> Perhaps then you should start listening to what I'm saying for several > > months? > > there are lots of people saying me different things what OO should look > like. I CAN"T satisfy all those opinions because they are just > CONTRADICTORY, and because I have my own point of view what OO > should be.

Well, as I have reminded you, we are both maintainers of this package, and this is not a matter of what OO should look like. This is how OO must look: a clean and correct package that does not mess with people's installation.
> They don't do anything with OO code. Please read what I'm saying. > > They will add the name of their wrapper to the registry themselves > > (not modifying openopt code). NOTHING ELSE. > > Anyone could connect his solver regardless of your > changes. There was no need to create any registry; anyone could > just put his _oo.py file into the /solvers folder, use his solver, and > submit his solver(s) (_oo.py files) to openopt svn.

... Not everyone can do this. Not everyone wants to do this (because they installed openopt in their /usr/lib folder, for instance). Give them an efficient way of contributing to the package. BTW, this is not something I did for fun; it was the only solution available to fix OO's design.

> I'm just worried about code I promised several people to give to the > > community. > > Ok, suppose I promise my dept to change "lil_matrix" to "sparse" - > does that mean I can commit my changes w/o asking Nathan's permission? Just > because I (+, optionally, "some people", or even "most people") > consider these changes to be good?

No, because you would be breaking the API. I did not break your API.

> And now, with my fixes, everybody is happy, > > I'm not, and that alone is enough to make your statement 100% false.

You are not happy because you didn't want anyone to provide the necessary fixes to your design. For you, "happy" means that OO works the way it did before my fixes. Besides, you blame me for mistakes you have made.

I will stop answering you now, because this is useless. I'm also a maintainer of this package and I did not break anything in OO. I only refactored some code that you did not want to refactor for some obscure reason (so that I would not want to contribute more to the package, perhaps?). Now, a major flaw is gone and there are no issues with sys.path anymore.
Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Wed Apr 9 09:19:59 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 15:19:59 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> <47FCBBBF.2000302@scipy.org> Message-ID: 2008/4/9, Jonathan Guyer : > > Somehow I don't think this is exactly what Alan had in mind when he > asked you two to "add some details about the design considerations and > seek input from the SciPy community". > > Any chance you guys could take your little hissy fit elsewhere? > I agree ;) Does any one wants the solvers repository to be publicly advertised ? That's the only question that remains in the discussion. The remaining is over. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmitrey.kroshko at scipy.org Wed Apr 9 09:44:05 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 16:44:05 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> <47FCBBBF.2000302@scipy.org> Message-ID: <47FCC825.3070900@scipy.org> Matthieu Brucher wrote: > > > Yes, I would like to use this, but for several _months_, you did > > nothing in that matter. 
> > You should just inform that something IS NOT WORKING AT ALL, while you > had said that something works in other way than you desire only. > > > > I was not the only one to tell you that your system was broken. What I > coded WAS NOT ACCESSIBLE AT ALL, as you had said it would be. I tried it > several times, and when I saw that you didn't do the changes you said > you would do (09/12/2007), I did them myself, and this time correctly.

I proposed it to you at that time, but you had said you don't care and would work with it your own way - OK, so I switched to other tasks.

> > You just told us that we had to wait for you to have financial support > to do so. I agree that I should have been less careless. > > > Breaking my own code in the process. I didn't say a thing > because you > > were so busy. > > Ok, that means that anyone could commit whatever he want and say "I > implemented my own changes to svn and I haven't informed you, because > you are too busy!" > > > > No, but I'm a maintainer of this package as you are. You broke the > import of a part of the scikit, and you didn't bother to fix it, even > though you said you would do so.

I was urged to include the genericopt code and spread it along with openopt. From the very beginning I wanted to reject my mentors' idea, and finally, as I understand it, merging genericopt into openopt had been agreed PROVIDED genericopt remains a separate package. Alan G Isaac confirmed: genericopt WILL remain a separate package, available for SEPARATE download and usage w/o OpenOpt. But now I see that you (and maybe some other scipy developers?) had understood the situation in another way.

> > > Perhaps then you should start listening to what I'm saying for > several > > months? > > there are lots of people saying me different things what OO should > look > like. I CAN"T satisfy all those opinions because they are just > CONTRADICTORY, and because I have my own point of view what OO > should be.
> > > > Well, as I have reminded you, we are both maintainers of this package,

above

> and this is not a matter of what OO should look like.

So, as I had foreseen and mentioned on the scipy mailing list, this is what sooner or later was bound to happen. W/o strict clarification of code rights there is no single center of decision making.

> This is how OO must look: a clean and correct package that does > not mess with people's installation.

I hadn't received any complaints (except yours, of course) about OO installation for a very long time. And no one told me it lacks "cleanness" or "correctness".

> > > They don't do anything with OO code. Please read what I'm saying. > > They will add the name of their wrapper to the registry themselves > > (not modifying openopt code). NOTHING ELSE. > > Anyone could connect his solver regardless of your > changes. There was no need to create any registry; anyone > could > just put his _oo.py file into the /solvers folder, use his solver, and > submit his solver(s) (_oo.py files) to openopt svn. > > > > ... > Not everyone can do this. Not everyone wants to do this (because they > installed openopt in their /usr/lib folder, for instance). Give them an > efficient way of contributing to the package. BTW, this is not > something I did for fun; it was the only solution available to fix > OO's design.

I don't see any problems with the previous version. Anyone can download OO, add his _oo.py file to a directory under /solvers, run "python install" and select any path that he wants. Also, anyone could add his code to svn/.../solvers; I didn't mind. If that is not enough for someone - for example for you with GenericOpt - you can spread it as a separate scikit, as I had proposed to you (09/12/2007) - btw, at the cost of spreading it along with OO (you could have a separate tarball and/or svn/scikit directory for genericopt).

> > > > I'm just worried about code I promised several people to give to the > > community.
> > Ok, suppose I promise my dept to change "lil_matrix" to > "sparse" - > does that mean I can commit my changes w/o asking Nathan's permission? > Just > because I (+, optionally, "some people", or even "most people") > consider these changes to be good? > > > > No, because you would be breaking the API. I did not break your API.

So all the problems are with the API only? Ok, suppose I had promised some people to use some Fortran code instead of C, or vice versa, to increase sparse lib speed or for any other reason. So does that give me the right to delete some of Nathan's C files and put in my Fortran files, with possible bugs?

> > > > And now, with my fixes, everybody is happy, > > I'm not, and that alone is enough to make your statement 100% false. > > > You are not happy because you didn't want anyone to provide the > necessary fixes to your design. For you, "happy" means that OO works the way it > did before my fixes.

As the situation with ALGENCAN shows, it doesn't.

> Besides, you blame me for mistakes you have made. > > I will stop answering you now, because this is useless. I'm also a > maintainer of this package and I did not break anything in OO. I only > refactored some code that you did not want to refactor for some > obscure reason (so that I would not want to contribute more to the > package, perhaps?)

I don't mind; I have enough of my own intended work to do and maybe bugs to fix, instead of spending time on someone else's. D.
> > As situation with ALGENCAN shows, it doesn't > Just one little note here. Just get a clean checkout of the trunk. There are no bug here (with algencen). Not one. I've just checked the code. The line you mention does not exist. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmitrey.kroshko at scipy.org Wed Apr 9 10:29:05 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 17:29:05 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> <47FCBBBF.2000302@scipy.org> <47FCC825.3070900@scipy.org> Message-ID: <47FCD2B1.40900@scipy.org> Matthieu Brucher wrote: > > > You are not happy because you didn't want anyone to provide the > > necessary fixes to your design. Happy means that OO works the way it > > does before my fixes. > > As situation with ALGENCAN shows, it doesn't > > > Just one little note here. Just get a clean checkout of the trunk. > There are no bug here (with algencen). Not one. I've just checked the > code. The line you mention does not exist. Which one line do you mean? 
I have removed all files in my openopt directory, checked out code from svn and still I have (with p.debug=1, to have more info on the bug instead of "algencan possibly not installed"):

Traceback (most recent call last):
  File "nlp_ALGENCAN.py", line 112, in <module>
    r = p.solve('ALGENCAN')
  File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/BaseProblem.py", line 231, in solve
    return runProbSolver(self, solvers, *args, **kwargs)
  File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/runProbSolver.py", line 67, in runProbSolver
    solverClass = getattr(my_import(available_solvers[solver_str]), solver_str)
  File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/runProbSolver.py", line 50, in my_import
    mod = __import__(name)
  File "/usr/lib/python2.5/site-packages/scikits/openopt/solvers/BrasilOpt/ALGENCAN_oo.py", line 5, in <module>
    from scikits.openopt.setDefaultIterFuncs import SMALL_DF
ImportError: No module named setDefaultIterFuncs

D.

From matthieu.brucher at gmail.com Wed Apr 9 11:03:11 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 17:03:11 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FCD2B1.40900@scipy.org> References: <47FC617C.1080703@scipy.org> <47FC6CF0.2000005@scipy.org> <47FCBBBF.2000302@scipy.org> <47FCC825.3070900@scipy.org> <47FCD2B1.40900@scipy.org> Message-ID:

OK, my bad, there was a "Kernel." missing.

Matthieu

2008/4/9, dmitrey : > > Matthieu Brucher wrote: > > > > > You are not happy because you didn't want anyone to provide the > > > necessary fixes to your design. Happy means that OO works the way > it > > > does before my fixes. > > > > As the situation with ALGENCAN shows, it doesn't > > > > > > Just one little note here. Just get a clean checkout of the trunk. > > There is no bug here (with algencan). Not one. I've just checked the > > code. The line you mention does not exist. > > Which line do you mean?
> > I have removed all files in my openopt directory, checked out code from
> svn and still I have (with p.debug=1, to have more info on bug instead
> of "algencan possibly not installed"):
>
> Traceback (most recent call last):
>   File "nlp_ALGENCAN.py", line 112, in <module>
>     r = p.solve('ALGENCAN')
>   File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/BaseProblem.py", line 231, in solve
>     return runProbSolver(self, solvers, *args, **kwargs)
>   File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/runProbSolver.py", line 67, in runProbSolver
>     solverClass = getattr(my_import(available_solvers[solver_str]), solver_str)
>   File "/usr/lib/python2.5/site-packages/scikits/openopt/Kernel/runProbSolver.py", line 50, in my_import
>     mod = __import__(name)
>   File "/usr/lib/python2.5/site-packages/scikits/openopt/solvers/BrasilOpt/ALGENCAN_oo.py", line 5, in <module>
>     from scikits.openopt.setDefaultIterFuncs import SMALL_DF
> ImportError: No module named setDefaultIterFuncs
>
> D.
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-dev

-- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL:

From aisaac at american.edu Wed Apr 9 12:18:08 2008 From: aisaac at american.edu (Alan Isaac) Date: Wed, 9 Apr 2008 12:18:08 -0400 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FC7026.5050802@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> Message-ID:

OK, it seems that things are largely getting worked out here. It looks to me like Matthieu implemented some useful changes, and one of the good things that happened is that a bug was uncovered.
As far as I can tell, there has been an increase in functionality and ease of use with no loss of functionality or ease of use. Dmitrey, please correct me if I misunderstand (by stating in simple terms what functionality is lost).

I have been encouraging Dmitrey to rely more on test-driven development. This event highlights its importance, and it is one of the things I will be encouraging if I end up mentoring him for the GSoC (as I hope I will).

I think that aside from the design issues, another kind of issue was uncovered that is quite natural. Dmitrey and Matthieu are working on code that sometimes overlaps in terms of needs and tests. But of course it is natural for each to feel protective of his primary area of work, since they are not actually working jointly on a single project. This means that good tests become all the more crucial---tests that should be passed in advance of commits. But in addition, I suggest a 2-stage procedure in areas of design disagreement:

1. discuss with each other any large or medium size changes in design
2. if there are unresolvable disagreements, ask the developers list for feedback on the best resolution

Cheers, Alan Isaac

From matthieu.brucher at gmail.com Wed Apr 9 12:40:35 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 18:40:35 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> Message-ID:

2008/4/9, Alan Isaac : > > OK, it seems that things are largely getting worked out > here. > > It looks to me like Matthieu implemented some useful > changes, and one of the good things that happened is that > a bug was uncovered. As far as I can tell, there has been > an increase in functionality and ease of use with no loss of > functionality or ease of use. Dmitrey, please correct me if > I misunderstand (by stating in simple terms what > functionality is lost).
Well, the only drawback with this implementation is that Dmitrey will have to explicitly add the new wrappers he's writing to the registry. Aside from this "issue", there is only more ease of use and of modularity.

I have been encouraging Dmitrey to rely more on test-driven > development. This event highlights its importance, and it > is one of the things I will be encouraging if I end up > mentoring him for the GSoC (as I hope I will). > > I think that aside from the design issues, another kind of > issue was uncovered that is quite natural. Dmitrey and > Matthieu are working on code that sometimes overlaps in > terms of needs and tests. But of course it is natural for > each to feel protective of his primary area of work, since > they are not actually working jointly on a single project.

We work on different parts of the package, but I have had this issue for several months (not being able to use the generic framework). This change was not discussed here, but was hinted at on several other MLs some months ago (python-list for instance).

This means that good tests become all the more > crucial---tests that should be passed in advance of commits. > But in addition, I suggest a 2-stage procedure in areas of > design disagreement: > > 1. discuss with each other any large or medium size > changes in design > 2. if there are unresolvable disagreements, ask the > developers list for feedback on the best resolution

This seems natural, if everyone plays by the same rules and takes care not to cripple functionality.

Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed...
URL: From aisaac at american.edu Wed Apr 9 12:49:37 2008 From: aisaac at american.edu (Alan Isaac) Date: Wed, 9 Apr 2008 12:49:37 -0400 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org><47FC617C.1080703@scipy.org><47FC7026.5050802@scipy.org> Message-ID: On Wed, 9 Apr 2008, Matthieu Brucher wrote: > the only drawback with this implementation is that Dmitrey > will have to explicitely add the new wrappers he's > writting to the registry. Aside from this "issue", there > is only more ease of use and of modularity. But adding a wrapper to the registry is trivial, right? Or do I misunderstand. Thanks, Alan From matthieu.brucher at gmail.com Wed Apr 9 13:04:37 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 19:04:37 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> Message-ID: 2008/4/9, Alan Isaac : > > On Wed, 9 Apr 2008, Matthieu Brucher wrote: > > the only drawback with this implementation is that Dmitrey > > will have to explicitely add the new wrappers he's > > writting to the registry. Aside from this "issue", there > > is only more ease of use and of modularity. > > > But adding a wrapper to the registry is trivial, right? > Or do I misunderstand. > Yes, it is easy. See http://scipy.org/scipy/scikits/browser/trunk/openopt/scikits/openopt/Kernel/runProbSolver.pyand the current dictionary. But I think that everything must be said about the changes I made ;) Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dmitrey.kroshko at scipy.org Wed Apr 9 13:11:50 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 20:11:50 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> Message-ID: <47FCF8D6.3060001@scipy.org> Matthieu Brucher wrote: > > I have been encouraging Dmitrey to rely more on test-driven > development. This event highlights its importance, and it > is one of the things I will be enouraging if I end up > mentoring him for the GSoC (as I hope I will). > > I think that aside from the design issues, another kind of > issue was uncovered that is quite natural. Dmitrey and > Matthieu are working on code that sometimes overlaps in > terms of needs and tests. But of course it is natural for > each to feel protective of his primary area of work, since > they are not actually working jointly on a single project. > > > > We work on different part of the package, but I have had this issue > for several months (not being able to use the generic framework). This > change was not discussed here, but was hinted on several other ML some > months ago (python-list for instance). I didn't remember that you informed me (at least directly or via scipy mail list) that something from genericopt doesn't work. Would you inform me of problems with GenericOpt in direct letter, I would fix the situation. But I remember only the words from the letter mentioned that you can use genericopt in current implementation; I didn't remember had you ever informed me of your problems with importing genericopt. If you want, I could propose you once again keeping (and/or installing from openopt installation) GenericOpt in separate scikit, so you will do any changes in GenericOpt code w/o any comments from me (provided w/o modifying OO code of course), also, you and/or other genericopt users could access him (more) directly. 
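[Editor's illustration] The registry mechanism discussed in this thread -- a dictionary in runProbSolver.py that maps a solver's name to the module defining a class of the same name, loaded lazily via `my_import`/`__import__` as seen in the traceback above -- can be sketched roughly as follows. The helper names below and the stdlib class registered are illustrative stand-ins, not the actual OpenOpt code:

```python
import importlib

# Hypothetical stand-in for OpenOpt's available_solvers dictionary:
# solver name -> dotted path of the module defining a class of that
# name.  A stdlib class is registered here purely for illustration;
# the real registry would list *_oo.py solver modules.
available_solvers = {
    "OrderedDict": "collections",
}

def get_solver_class(solver_str):
    """Look the name up in the registry, import its module on demand,
    and fetch the class -- the getattr(my_import(...)) pattern from
    the traceback quoted earlier in this thread."""
    mod = importlib.import_module(available_solvers[solver_str])
    return getattr(mod, solver_str)

SolverClass = get_solver_class("OrderedDict")
print(SolverClass.__name__)  # -> OrderedDict
```

Under this scheme a contributor only adds one dictionary entry; the kernel never touches the solver module itself until that solver is requested.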
Also, you could do anything whatever you want (if you'll want anything of course) in openopt/solvers/genericopt, as well as any other one with willing to contribute anything in solvers/. D. From nwagner at iam.uni-stuttgart.de Wed Apr 9 13:41:01 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 09 Apr 2008 19:41:01 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: <85b5c3130804090045pf8703cav41402c43ca0ef620@mail.gmail.com> References: <85b5c3130804071055y2ab3eed9t96adae6767294baa@mail.gmail.com> <85b5c3130804081346k71e95d23ta87d2a5a3aa9ff2a@mail.gmail.com> <85b5c3130804090045pf8703cav41402c43ca0ef620@mail.gmail.com> Message-ID: On Wed, 9 Apr 2008 09:45:28 +0200 "Ondrej Certik" wrote: > On Tue, Apr 8, 2008 at 11:00 PM, Nils Wagner > wrote: >> On Tue, 8 Apr 2008 22:46:34 +0200 >> >> > OK, that sounds very good. I am going to study how to >> >start such a >> > scikit and we are going to move our additional >>solvers >> >we have in >> > sfepy to it (pysparse, soon petsc4py) -- are you ok >>with >> >that Robert? >> > :) Plus umfpack. Plus the eigensolvers, like primme, >> >blzpack and >> > others. >> > >> AFAIK Jacobi-Davidson is part of pysparse. > > Yep. > >> And you probably know slepc >> >> http://www.grycap.upv.es/slepc/ >> >> http://t2.unl.edu/documentation/slepc4py > > Of course. Even the authors of slepc personally. It's a >good project, > but unfortunately, slepc is not a free software, so I >don't want to > depend on it. > I think and I hope Jose (one of the authors) will >release it as > opensource in couple years, but for now it is not. > >> >> And polyeig ... How about that ? > > You mean this? > > http://www-eleves-isia.cma.fr/documentation/matlab/techdoc/ref/polyeig.html > > I don't need it for my problems, so... 
:) > >> >> Cheers, >> >> Nils >> >> For nonlinear eigenvalue problems, see >> >> http://www.mims.manchester.ac.uk/research/numerical-analysis/nlevp.html > > Thanks, I didn't know about this one. > > Ondrej

BTW, it would be great to have solvers for certain (linear) matrix equations, e.g. Lyapunov, Sylvester, Riccati, Stein ... in SciPy / Scikits. Unfortunately slicot is not free software. Any comments? Cheers, Nils

From aisaac at american.edu Wed Apr 9 13:45:40 2008 From: aisaac at american.edu (Alan Isaac) Date: Wed, 9 Apr 2008 13:45:40 -0400 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FCF8D6.3060001@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> <47FCF8D6.3060001@scipy.org> Message-ID:

On Wed, 09 Apr 2008, dmitrey wrote: > If you want, I could propose you once again keeping > (and/or installing from openopt installation) GenericOpt > in separate scikit

We have just experienced some of the *advantages* of the current structure and have not discovered any need to change it, especially since Matthieu's most recent contribution. I suspect in the long run some of Matthieu's work on generic optimizers may migrate to SciPy proper, but right now it is great to have it packaged with OpenOpt, since this will offer substantial functionality even to those who install no other solvers.
Alan Isaac From matthieu.brucher at gmail.com Wed Apr 9 13:51:50 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Apr 2008 19:51:50 +0200 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: <47FCF8D6.3060001@scipy.org> References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> <47FCF8D6.3060001@scipy.org> Message-ID: > > Also, you could do anything whatever > you want (if you'll want anything of course) in > openopt/solvers/genericopt, as well as any other one with willing to > contribute anything in solvers/. > That's something I have to do, because your interface allows the use of many different and specific solvers ;) I agree with Alan on the split in two scikits. Having only one scikit dedicated to optimization is better than having several suboptimal ones. At the beginning, I was more inclined for this solution, but Alan made me change my mind last year ;) Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From wnbell at gmail.com Wed Apr 9 15:52:10 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 9 Apr 2008 14:52:10 -0500 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <47FB6FA1.9000906@ntc.zcu.cz> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> Message-ID: On Tue, Apr 8, 2008 at 8:14 AM, Robert Cimrman wrote: > There could be even proper solver classes (LinearSolver, > EigenvalueSolver, etc.) with a common interface in each solver family. > Below, as an example, is the code we use in sfepy. As you can see, the > solver classes have only the __init__ method (can be used e.g. 
to > pre-factorize a matrix in the case of a linear solver) and the __call__ > method (application of the solver). Would it be interesting to have > something in that spirit in scipy? > > class Solver( Struct ): > def __init__( self, conf, **kwargs ): > Struct.__init__( self, conf = conf, **kwargs ) > def __call__( self, **kwargs ): > print 'called an abstract Solver instance!' > raise ValueError > Robert, I like the idea of having an abstract interface to the various solvers/eigensolvers. I'm currently battling with this in PyAMG for various coarse-grid solvers and relaxation methods. We should definitely pursue this idea further. In the meantime, we should strive to make the arguments to the individual solvers as uniform as possible (e.g. 'tol', 'maxiter', preconditioners, etc). This will make LinearSolver() etc. easier to create. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From nwagner at iam.uni-stuttgart.de Wed Apr 9 16:10:04 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 09 Apr 2008 22:10:04 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: <47FB961D.50609@ntc.zcu.cz> References: <47FB7A20.2020703@ntc.zcu.cz> <47FB961D.50609@ntc.zcu.cz> Message-ID: On Tue, 08 Apr 2008 17:58:21 +0200 Robert Cimrman wrote: > Hi Nils, > > Nils Wagner wrote: >> Oops, I forgot to ask for the possibbility of handling >> Hermitian matrices within lobpcg >> >> http://projects.scipy.org/scipy/scipy/ticket/452 > > Can you write a (failing) test case for this? You could >also help by > detecting the places in lobpcg.py causing it not to >work, if you can. > > cheers, > r. > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev Robert, I have added an example to http://scipy.org/scipy/scipy/ticket/452. However, the test matrix is to large and I can't submit it as an attachment. 
Anyway, you can extract it from http://www.cs.wm.edu/~andreas/software/primme_v1.1.tar.gz Pearu, is there a chance to support io.mmread('*.mtx.bz2') ? So far you can use io.mmread('*.mtx.gz'). The output is /usr/bin/python -i test_hermitian.py [[ 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j ] [ 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j ] [ 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j ] [ 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.08304560-0.00218969j -0.00899514+0.0073746j ] [ 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j -0.08304560+0.00218969j 0.00000000+0.j 0.01788708+0.00906446j] [ 0.00000000+0.j 0.00000000+0.j 0.00000000+0.j 0.00899514-0.0073746j -0.01788708-0.00906446j 0.00000000+0.j ]] Traceback (most recent call last): File "test_hermitian.py", line 17, in ? (eval, blockVectorV, eval_history, resnorm_history) = lobpcg(blockVectorX, A, retLambdaHistory=True, retResidualNormsHistory=True) File "/usr/lib/python2.4/site-packages/scipy/sparse/linalg/eigen/lobpcg/lobpcg.py", line 424, in lobpcg assert nm.allclose( gramA.T, gramA ) AssertionError Cheers, Nils From dmitrey.kroshko at scipy.org Wed Apr 9 14:40:50 2008 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 09 Apr 2008 21:40:50 +0300 Subject: [SciPy-dev] [scikits] openopt SVN instable for the moment In-Reply-To: References: <47FBD0D5.2030208@scipy.org> <47FC617C.1080703@scipy.org> <47FC7026.5050802@scipy.org> <47FCF8D6.3060001@scipy.org> Message-ID: <47FD0DB2.5020703@scipy.org> Alan Isaac wrote: > On Wed, 09 Apr 2008, dmitrey wrote: > >> If you want, I could propose you once again keeping >> (and/or installing from openopt installation) GenericOpt >> in separate scikit >> > > We have just experienced some of the *advantages* of the > current structure and have not discovered any need to change > it, especially since Matthieu's most recent 
contribution. > I suspect in the long-run some of Matthieu's work on generic > optimizers may migrate to SciPy proper, but right now it is > great to have it packaged with OpenOpt, since this will > offer substantial functionality even to those who install > no other solvers. > Afaik Matthieu's optimizers have only some unconstrained NLP solvers (+ maybe some box-bound solvers for single-variable problems like golden section). OO has those as well w/o any additional sources (ralg, lincher; moreover, ralg with all constraints, lincher with box bounds). As for my comment, I just propose to *install* genericopt as a separate scikit, so users will have the choice to use GenericOpt solvers from OO or directly (in the latter case there will be no problems with importing Python modules). It is much better to have the possibility to write "from scikits.genericopt import ..." than to have to write "from scikits.openopt.solvers.optimizers import ..." each time. I still think that the intended audiences of openopt and genericopt are very different + they serve different goals (BTW Matthieu himself had said he does not see his GenericOpt serving a wide audience). I had already said long ago (during GSoC 2007) that I want other solvers to retain their own APIs and to have no strict dependence on OO (so, being OO-independent, problems like this one will not be encountered anymore) - BTW it's better for the solver developer(s) themselves when there is always a choice - to have more or fewer dependencies or, moreover, to remain absolutely independent of side packages. And IIRC you did not reject the idea. D.
From wnbell at gmail.com Wed Apr 9 18:06:17 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 9 Apr 2008 17:06:17 -0500 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: References: <47FB7A20.2020703@ntc.zcu.cz> <47FB961D.50609@ntc.zcu.cz> Message-ID: On Wed, Apr 9, 2008 at 3:10 PM, Nils Wagner wrote: > I have added an example to > http://scipy.org/scipy/scipy/ticket/452. > > However, the test matrix is to large and I can't submit it > as an attachment. Anyway, you can extract it from > http://www.cs.wm.edu/~andreas/software/primme_v1.1.tar.gz > > Pearu, is there a chance to support io.mmread('*.mtx.bz2') > ? > So far you can use io.mmread('*.mtx.gz'). FWIW, saving it to a MATLAB mat file would probably be even smaller (and supported by SciPy). I find that using MATLAB to save the file (as opposed to SciPy) compresses the data better. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From cimrman3 at ntc.zcu.cz Thu Apr 10 04:21:36 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 10 Apr 2008 10:21:36 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: References: <47FB7A20.2020703@ntc.zcu.cz> <47FB961D.50609@ntc.zcu.cz> Message-ID: <47FDCE10.6070706@ntc.zcu.cz> Nils Wagner wrote: > I have added an example to > http://scipy.org/scipy/scipy/ticket/452. > > However, the test matrix is to large and I can't submit it > as an attachment. Anyway, you can extract it from > http://www.cs.wm.edu/~andreas/software/primme_v1.1.tar.gz Thanks, Nils. I will try to look at it as soon as possible. r. 
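The assertion that fails in Nils's traceback -- `assert nm.allclose( gramA.T, gramA )` -- points at the likely culprit: a Hermitian matrix equals its *conjugate* transpose, not its plain transpose, so a plain-transpose symmetry check only holds for real input. A small illustration of the distinction (plain NumPy, not the actual lobpcg code):

```python
import numpy as np

# Build a small complex Hermitian matrix: A == A.conj().T by construction.
B = np.array([[1.0 + 2.0j, 3.0 - 1.0j],
              [0.5 + 0.5j, 2.0 + 0.0j]])
A = B + B.conj().T

# The plain-transpose check (as in lobpcg's assertion) only holds for
# real symmetric input ...
plain_symmetric = np.allclose(A.T, A)
# ... whereas the conjugate-transpose (Hermitian) check holds here.
hermitian = np.allclose(A.conj().T, A)

print(plain_symmetric, hermitian)   # False True
```

If that is indeed the problem, the fix inside lobpcg.py would be to compare against the conjugate transpose -- but this is a guess from the traceback, not a verified patch.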
From cimrman3 at ntc.zcu.cz Thu Apr 10 04:34:15 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 10 Apr 2008 10:34:15 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> Message-ID: <47FDD107.70900@ntc.zcu.cz> Nathan Bell wrote: > On Tue, Apr 8, 2008 at 8:14 AM, Robert Cimrman wrote: >> There could be even proper solver classes (LinearSolver, >> EigenvalueSolver, etc.) with a common interface in each solver family. >> Below, as an example, is the code we use in sfepy. As you can see, the >> solver classes have only the __init__ method (can be used e.g. to >> pre-factorize a matrix in the case of a linear solver) and the __call__ >> method (application of the solver). Would it be interesting to have >> something in that spirit in scipy? >> >> class Solver( Struct ): >> def __init__( self, conf, **kwargs ): >> Struct.__init__( self, conf = conf, **kwargs ) >> def __call__( self, **kwargs ): >> print 'called an abstract Solver instance!' >> raise ValueError >> > > > Robert, I like the idea of having an abstract interface to the various > solvers/eigensolvers. I'm currently battling with this in PyAMG for > various coarse-grid solvers and relaxation methods. > > We should definitely pursue this idea further. In the meantime, we > should strive to make the arguments to the individual solvers as > uniform as possible (e.g. 'tol', 'maxiter', preconditioners, etc). > This will make LinearSolver() etc. easier to create. We could make a proposal on the wiki, let's say http://scipy.org/SolversProposal? If all are ok with this name, I will initiallize the page with what we have in sfepy, to have something to start with. best, r. 
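For the proposed wiki page, one possible shape of such a solver family -- pre-processing in `__init__`, solving in `__call__`, with the uniform 'tol'/'maxiter' keywords Nathan mentions -- might be sketched as follows. All names and arguments here are illustrative, not an agreed API, and `DenseLU` is a toy stand-in for a real factorizing solver:

```python
import numpy as np

class Solver(object):
    """Abstract base: __init__ does any pre-processing, __call__ solves."""
    def __init__(self, mtx, **kwargs):
        self.mtx = mtx
        self.conf = kwargs
    def __call__(self, rhs, **kwargs):
        raise NotImplementedError('called an abstract Solver instance!')

class LinearSolver(Solver):
    """Common keyword arguments for the linear-solver family."""
    def __init__(self, mtx, tol=1e-8, maxiter=1000, **kwargs):
        Solver.__init__(self, mtx, **kwargs)
        self.tol, self.maxiter = tol, maxiter

class DenseLU(LinearSolver):
    """Toy direct solver: the 'pre-factorization' happens in __init__."""
    def __init__(self, mtx, **kwargs):
        LinearSolver.__init__(self, mtx, **kwargs)
        self.inv = np.linalg.inv(mtx)   # stand-in for a real LU factorization
    def __call__(self, rhs):
        return self.inv.dot(rhs)

A = np.array([[4.0, 1.0], [1.0, 3.0]])
s = DenseLU(A)          # factorize once ...
print(s(np.array([1.0, 2.0])))   # ... solve many times
```

Whether configuration-driven construction (as in sfepy's `Solver.anyFromConf`) or direct class instantiation like the above is preferable is exactly the kind of question the wiki page can settle.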
From pearu at cens.ioc.ee Thu Apr 10 05:05:50 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Thu, 10 Apr 2008 11:05:50 +0200 Subject: [SciPy-dev] lobpcg: removed symeig dependence In-Reply-To: References: <47FB7A20.2020703@ntc.zcu.cz> <47FB961D.50609@ntc.zcu.cz> Message-ID: <47FDD86E.6010601@cens.ioc.ee> Nils Wagner wrote: > > Pearu, is there a chance to support io.mmread('*.mtx.bz2')? Yes, I have added necessary hooks to io/mmio.py (r4125) but I have not tested it. So, give it a try and let me know if it does not work. Regards, Pearu From vshah at interactivesupercomputing.com Thu Apr 10 05:21:20 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Thu, 10 Apr 2008 02:21:20 -0700 Subject: [SciPy-dev] the current state and future directions of the sparse solvers Message-ID: Hi Robert, > Hi Nathan, > > Nathan Bell wrote: >>> OK, that sounds very good. I am going to study how to start such a >>> scikit and we are going to move our additional solvers we have in >>> sfepy to it (pysparse, soon petsc4py) -- are you ok with that >>> Robert? >>> :) Plus umfpack. Plus the eigensolvers, like primme, blzpack and >>> others. > > Sure! I agree its good to have more solvers. Tim Davis' SuiteSparse has a whole bunch, and they are the most robust direct sparse solvers out there.. >> I just recently saw a talk by some people that work on providing >> abstract interfaces to the solvers in Trilinos: >> http://trilinos.sandia.gov/packages/docs/r8.0/packages/stratimikos/doc/html/index.html >> http://trilinos.sandia.gov/packages/belos/ >> >> I have no experience with these, but they may provide some guidance. Trilinos is good idea. By itself, its a huge collection. The relevant piece is Amesos for direct solvers. It makes plugging in various sparse direct solvers effortless. Cholmod for sparse cholesky seems to not exist yet. I am wondering how to find out more about the state of iterative solvers in scipy. Trilinos has all the methods and preconditioners. 
However, it should be possible to write the iterative solvers and preconditioners completely in .py. For the standard stuff, perhaps compiled libraries could be used for speed, but the flexibility is nice to have. The last time I had looked, they didn't have support for complex sparse matrices. I don't know if scipy sparse supports complex matrices or not yet. But if you want to solve non-symmetric eigenvalue problems, you need that.. -viral From stefan at sun.ac.za Thu Apr 10 05:44:38 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 10 Apr 2008 11:44:38 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <47FDD107.70900@ntc.zcu.cz> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> <47FDD107.70900@ntc.zcu.cz> Message-ID: <9457e7c80804100244u258c1671j51d17dc05e1d3ddf@mail.gmail.com> On 10/04/2008, Robert Cimrman wrote: > Nathan Bell wrote: > > On Tue, Apr 8, 2008 at 8:14 AM, Robert Cimrman wrote: > >> There could be even proper solver classes (LinearSolver, > >> EigenvalueSolver, etc.) with a common interface in each solver family. > >> Below, as an example, is the code we use in sfepy. As you can see, the > >> solver classes have only the __init__ method (can be used e.g. to > >> pre-factorize a matrix in the case of a linear solver) and the __call__ > >> method (application of the solver). Would it be interesting to have > >> something in that spirit in scipy? > >> > >> class Solver( Struct ): > >> def __init__( self, conf, **kwargs ): > >> Struct.__init__( self, conf = conf, **kwargs ) > >> def __call__( self, **kwargs ): > >> print 'called an abstract Solver instance!' > >> raise ValueError > >> > > > > > > Robert, I like the idea of having an abstract interface to the various > > solvers/eigensolvers. I'm currently battling with this in PyAMG for > > various coarse-grid solvers and relaxation methods. 
> > > > We should definitely pursue this idea further. In the meantime, we > > should strive to make the arguments to the individual solvers as > > uniform as possible (e.g. 'tol', 'maxiter', preconditioners, etc). > > This will make LinearSolver() etc. easier to create. > > > We could make a proposal on the wiki, let's say > http://scipy.org/SolversProposal? If all are ok with this name, I will > initiallize the page with what we have in sfepy, to have something to > start with. Developer's info can also go on the Trac wiki. We haven't discussed where we want to host such proposals, or what their structures should be, but we should do that soon; Numpy Enhancement Proposals were mentioned on the numpy-discussion list just yesterday. Cheers St?fan From ondrej at certik.cz Thu Apr 10 07:31:47 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Apr 2008 13:31:47 +0200 Subject: [SciPy-dev] the current state and future directions of the sparse solvers In-Reply-To: References: Message-ID: <85b5c3130804100431j6bfa0815r37f4ec0d94465d3e@mail.gmail.com> On Thu, Apr 10, 2008 at 11:21 AM, Viral Shah wrote: > Hi Robert, > > > > Hi Nathan, > > > > Nathan Bell wrote: > > >>> OK, that sounds very good. I am going to study how to start such a > >>> scikit and we are going to move our additional solvers we have in > >>> sfepy to it (pysparse, soon petsc4py) -- are you ok with that > >>> Robert? > >>> :) Plus umfpack. Plus the eigensolvers, like primme, blzpack and > >>> others. > > > > Sure! > > I agree its good to have more solvers. Tim Davis' SuiteSparse has a > whole bunch, and they are the most robust direct sparse solvers out > there.. 
> > > >> I just recently saw a talk by some people that work on providing > >> abstract interfaces to the solvers in Trilinos: > >> http://trilinos.sandia.gov/packages/docs/r8.0/packages/stratimikos/doc/html/index.html > >> http://trilinos.sandia.gov/packages/belos/ > >> > >> I have no experience with these, but they may provide some guidance. > > Trilinos is good idea. By itself, its a huge collection. The relevant > piece is Amesos for direct solvers. It makes plugging in various > sparse direct solvers effortless. Cholmod for sparse cholesky seems > to not exist yet. > > I am wondering how to find out more about the state of iterative > solvers in scipy. Trilinos has all the methods and preconditioners. > However, it should be possible to write the iterative solvers and > preconditioners completely in .py. For the standard stuff, perhaps > compiled libraries could be used for speed, but the flexibility is > nice to have. Look into the sources, that's the best thing. It's quite well documented in docstrings and info.py files. And you need to do it anyway, if you want to contribute. :) > The last time I had looked, they didn't have support for complex > sparse matrices. I don't know if scipy sparse supports complex > matrices or not yet. But if you want to solve non-symmetric eigenvalue It does. > problems, you need that.. 
Ondrej From ondrej at certik.cz Thu Apr 10 07:32:51 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Apr 2008 13:32:51 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <47FDD107.70900@ntc.zcu.cz> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> <47FDD107.70900@ntc.zcu.cz> Message-ID: <85b5c3130804100432k273e3dlea57827d0a13bd1f@mail.gmail.com> On Thu, Apr 10, 2008 at 10:34 AM, Robert Cimrman wrote: > Nathan Bell wrote: > > On Tue, Apr 8, 2008 at 8:14 AM, Robert Cimrman wrote: > >> There could be even proper solver classes (LinearSolver, > >> EigenvalueSolver, etc.) with a common interface in each solver family. > >> Below, as an example, is the code we use in sfepy. As you can see, the > >> solver classes have only the __init__ method (can be used e.g. to > >> pre-factorize a matrix in the case of a linear solver) and the __call__ > >> method (application of the solver). Would it be interesting to have > >> something in that spirit in scipy? > >> > >> class Solver( Struct ): > >> def __init__( self, conf, **kwargs ): > >> Struct.__init__( self, conf = conf, **kwargs ) > >> def __call__( self, **kwargs ): > >> print 'called an abstract Solver instance!' > >> raise ValueError > >> > > > > > > Robert, I like the idea of having an abstract interface to the various > > solvers/eigensolvers. I'm currently battling with this in PyAMG for > > various coarse-grid solvers and relaxation methods. > > > > We should definitely pursue this idea further. In the meantime, we > > should strive to make the arguments to the individual solvers as > > uniform as possible (e.g. 'tol', 'maxiter', preconditioners, etc). > > This will make LinearSolver() etc. easier to create. > > We could make a proposal on the wiki, let's say > http://scipy.org/SolversProposal? If all are ok with this name, I will > initiallize the page with what we have in sfepy, to have something to > start with. 
Please start the page. Ondrej From stefan at sun.ac.za Thu Apr 10 08:01:23 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 10 Apr 2008 14:01:23 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- Message-ID: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> Pearu, Why do checkins like these even happen? 04:03 Changeset [4125] by pearu Add bz2 support to io.mmio [not tested] I appreciate that you labelled the test deficiency, but what now? Are you going to add the test, or was the message meant as a reminder to someone else? Regards St?fan From stefan at sun.ac.za Thu Apr 10 08:05:02 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 10 Apr 2008 14:05:02 +0200 Subject: [SciPy-dev] r4121: untested changes Message-ID: <9457e7c80804100505x6cbf939cp594218b6d7aadf65@mail.gmail.com> Referring to http://projects.scipy.org/scipy/scipy/changeset/4121 Please add regression tests whenever functionality is fixed, otherwise someone's going to break the code again sooner or later. From cimrman3 at ntc.zcu.cz Thu Apr 10 08:14:01 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 10 Apr 2008 14:14:01 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <9457e7c80804100244u258c1671j51d17dc05e1d3ddf@mail.gmail.com> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> <47FDD107.70900@ntc.zcu.cz> <9457e7c80804100244u258c1671j51d17dc05e1d3ddf@mail.gmail.com> Message-ID: <47FE0489.9020504@ntc.zcu.cz> St?fan van der Walt wrote: > On 10/04/2008, Robert Cimrman wrote: >> We could make a proposal on the wiki, let's say >> http://scipy.org/SolversProposal? If all are ok with this name, I will >> initiallize the page with what we have in sfepy, to have something to >> start with. > > Developer's info can also go on the Trac wiki. 
We haven't discussed > where we want to host such proposals, or what their structures should > be, but we should do that soon; Numpy Enhancement Proposals were > mentioned on the numpy-discussion list just yesterday. Yes, I agree that a standard way of hosting such proposals would be handy. Anyway, I have created the page, so that we can start discussing now. After the right place is agreed upon we can move it easily. cheers, r. From ondrej at certik.cz Thu Apr 10 09:07:29 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Apr 2008 15:07:29 +0200 Subject: [SciPy-dev] r4121: untested changes In-Reply-To: <9457e7c80804100505x6cbf939cp594218b6d7aadf65@mail.gmail.com> References: <9457e7c80804100505x6cbf939cp594218b6d7aadf65@mail.gmail.com> Message-ID: <85b5c3130804100607x1a0c73d3ndb1b5dfe37128643@mail.gmail.com> On Thu, Apr 10, 2008 at 2:05 PM, St?fan van der Walt wrote: > Referring to > > http://projects.scipy.org/scipy/scipy/changeset/4121 > > Please add regression tests whenever functionality is fixed, otherwise > someone's going to break the code again sooner or later. +1 Ondrej From ondrej at certik.cz Thu Apr 10 09:16:37 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Apr 2008 15:16:37 +0200 Subject: [SciPy-dev] help: wrapping generalized symmetric evp functions In-Reply-To: <47FE0489.9020504@ntc.zcu.cz> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> <47FDD107.70900@ntc.zcu.cz> <9457e7c80804100244u258c1671j51d17dc05e1d3ddf@mail.gmail.com> <47FE0489.9020504@ntc.zcu.cz> Message-ID: <85b5c3130804100616l2d5a8e0k752ec64c747caa1c@mail.gmail.com> On Thu, Apr 10, 2008 at 2:14 PM, Robert Cimrman wrote: > St?fan van der Walt wrote: > > On 10/04/2008, Robert Cimrman wrote: > > >> We could make a proposal on the wiki, let's say > >> http://scipy.org/SolversProposal? If all are ok with this name, I will > >> initiallize the page with what we have in sfepy, to have something to > >> start with. 
> > > > Developer's info can also go on the Trac wiki. We haven't discussed > > where we want to host such proposals, or what their structures should > > be, but we should do that soon; Numpy Enhancement Proposals were > > mentioned on the numpy-discussion list just yesterday. > > Yes, I agree that a standard way of hosting such proposals would be > handy. Anyway, I have created the page, so that we can start discussing > now. After the right place is agreed upon we can move it easily. Thanks, "Now is better than never.". To get the discussion going, here are some comments. Everyone please let us know what you think: 1) > s = Solver.anyFromConf( conf, mtx = A ) # Possible pre-solves by LU. How about this: s = Solver(conf, mtx = A ) and also this (together with the syntax above): s = Solver(kind="ls.scipy_iterative", method="cg", mtx = A ) This is useful for reading the configuration from some file. However, sometimes (a lot of times) I prefer this: 2) how about this: class SciPyIterative(LinearSolver): blabla class CG(SciPyIterative): pass class Umfpack(LinearSolver): pass and people would just use: from scipy.sparse.linalg import CG, Umfpack s = CG(epsA=1e-6, mtx=A) or s = Umfpack(mtx=A) 3) I also prefer to pass the matrix as the first argument, becuase you always need to supply a matrix, only then some optinal default arguments, or preconditioners, i.e.: s = CG(A, epsA=1e-6) or s = Umfpack(A) Ondrej From millman at berkeley.edu Thu Apr 10 11:35:33 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 10 Apr 2008 08:35:33 -0700 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> Message-ID: On Thu, Apr 10, 2008 at 5:01 AM, St?fan van der Walt wrote: > 04:03 Changeset [4125] by pearu > Add bz2 support to io.mmio [not tested] While I am happy for the added functionality, I agree we need to enforce 
our policy of testing all new code! Stefan has done an excellent job of opening trouble tickets for all new non-tested functionality. However, I would rather see Stefan spending his time more productively. If everyone will just commit to adding some unit tests or even doctests when they added new functionality, Stefan and others will be able to focus on adding new functionality as well. I know that writing tests isn't the most fun activity for everyone, but if we all do it for our own code, we will be better off in the long run. This check-in also raises what I think is an important question, which I hope to turn into a larger discussion once NumPy 1.0.5 is released. I would like to see us develop a simple file interface used by all the NumPy/SciPy IO code that would transparently handle things like compression and location. Chris Burns has contributed a prototype of what I am thinking about from the NIPY project: http://projects.scipy.org/scipy/numpy/browser/trunk/numpy/lib/_datasource.py Unfortunately, we haven't had time to publicize this. However, I hope that everyone who is interested will have a chance to take a look at this code over the next few weeks. Once 1.0.5 is out and we start discussing what goes into NumPy 1.1, I hope that Chris will have an opportunity to lead a discussion about what the community thinks this simple file interface should be. My only constraint is that I think it must remain very lean and lightweight. 
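For readers who have not opened `_datasource.py` yet, the core idea can be sketched in a few lines. This is a toy illustration of the extension dispatch only; the actual prototype also handles URLs and download caching, and its API may well differ:

```python
import bz2
import gzip
import os
import tempfile

def datasource_open(path, mode='rb'):
    """Toy sketch of a transparent file interface: dispatch on extension.

    (numpy.lib._datasource is the real prototype; only the compression
    dispatch idea is shown here.)
    """
    openers = {'.gz': gzip.open, '.bz2': bz2.BZ2File}
    ext = os.path.splitext(path)[1]
    return openers.get(ext, open)(path, mode)

# Round-trip the same payload through plain, gzip and bz2 files:
# callers never need to know which compression was used.
contents = []
for ext, writer in [('', open), ('.gz', gzip.open), ('.bz2', bz2.BZ2File)]:
    fname = os.path.join(tempfile.mkdtemp(), 'data' + ext)
    out = writer(fname, 'wb')
    out.write(b'1 2 3\n')
    out.close()
    inp = datasource_open(fname)
    contents.append(inp.read())
    inp.close()

print(contents)
```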
Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From pearu at cens.ioc.ee Thu Apr 10 11:41:54 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Thu, 10 Apr 2008 18:41:54 +0300 (EEST) Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> Message-ID: <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> On Thu, April 10, 2008 3:01 pm, St?fan van der Walt wrote: > Pearu, > > Why do checkins like these even happen? > > 04:03 Changeset [4125] by pearu > Add bz2 support to io.mmio [not tested] > > I appreciate that you labelled the test deficiency, but what now? Are > you going to add the test, or was the message meant as a reminder to > someone else? Have you followed the thread http://projects.scipy.org/pipermail/scipy-dev/2008-April/008804.html ? The changeset is there because a user asked a feature. Adding this feature was trivial, testing it is not so because it requires certain input that I don't have at hand. There is a request for the user to check if the feature works. I don't see harm in such patches, it does not break anything, and there is potential user who can confirm if the feature works or not. 
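For anyone who wants to confirm the new hook, the round trip is easy to check by hand: write a small Matrix Market file through `bz2.BZ2File` and read it back. The sketch below uses a minimal hand-rolled coordinate-format reader as a stand-in for `scipy.io.mmread`, so it runs without a scipy checkout:

```python
import bz2
import os
import tempfile

# A tiny coordinate-format Matrix Market file, written bz2-compressed.
mm_text = b"""%%MatrixMarket matrix coordinate real general
2 2 2
1 1 4.0
2 2 3.0
"""
fname = os.path.join(tempfile.mkdtemp(), 'small.mtx.bz2')
out = bz2.BZ2File(fname, 'wb')
out.write(mm_text)
out.close()

# Minimal coordinate-format reader -- a stand-in for scipy.io.mmread,
# only to show that opening via bz2.BZ2File is all the new hook needs.
inp = bz2.BZ2File(fname, 'rb')
lines = [l for l in inp.read().decode().splitlines() if not l.startswith('%')]
inp.close()
rows, cols, nnz = map(int, lines[0].split())
entries = [(int(i), int(j), float(v))
           for i, j, v in (l.split() for l in lines[1:])]
print(rows, cols, nnz, entries)
```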
Regards, Pearu From ondrej at certik.cz Thu Apr 10 12:58:30 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Apr 2008 18:58:30 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> Message-ID: <85b5c3130804100958mea29379mb4f8089efc9763b9@mail.gmail.com> On Thu, Apr 10, 2008 at 5:41 PM, Pearu Peterson wrote: > On Thu, April 10, 2008 3:01 pm, St?fan van der Walt wrote: > > Pearu, > > > > Why do checkins like these even happen? > > > > 04:03 Changeset [4125] by pearu > > Add bz2 support to io.mmio [not tested] > > > > I appreciate that you labelled the test deficiency, but what now? Are > > you going to add the test, or was the message meant as a reminder to > > someone else? > > Have you followed the thread > > http://projects.scipy.org/pipermail/scipy-dev/2008-April/008804.html > > ? The changeset is there because a user asked a feature. Adding > this feature was trivial, testing it is not so because it requires certain > input that I don't have at hand. There is a request for the user to check > if the feature works. > > I don't see harm in such patches, it does not break anything, > and there is potential user who can confirm if the feature works or not. One should follow the Zen of Python: Now is better than never. Although never is often better than *right* now. So in this particular case, I think Pearu is right, because he fixed that now instead of never and in this case I think never is not better than *right* now, considering that he would contribute a test once Robert says if it works or not. However, as to running tests -- are you against fixing the tests or scipy, so that it doesn't print such a garbage, like this? 
...cswap:n=4 ..cswap:n=3 .....daxpy:n=4 ..daxpy:n=3 ....dcopy:n=4 ..dcopy:n=3 .............dscal:n=4 ....dswap:n=4 ..dswap:n=3 If not, I'll keep it in mind if I find some free afternoon. Ondrej From stefan at sun.ac.za Thu Apr 10 13:07:46 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 10 Apr 2008 19:07:46 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> Message-ID: <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> On 10/04/2008, Pearu Peterson wrote: > On Thu, April 10, 2008 3:01 pm, St?fan van der Walt wrote: > > Pearu, > > > > Why do checkins like these even happen? > > > > 04:03 Changeset [4125] by pearu > > Add bz2 support to io.mmio [not tested] > > > > I appreciate that you labelled the test deficiency, but what now? Are > > you going to add the test, or was the message meant as a reminder to > > someone else? > > > Have you followed the thread > > http://projects.scipy.org/pipermail/scipy-dev/2008-April/008804.html > > ? The changeset is there because a user asked a feature. Adding > this feature was trivial, testing it is not so because it requires certain > input that I don't have at hand. There is a request for the user to check > if the feature works. While waiting for input, why not attach the change as a patch to the ticket? That way, once the user leaves feedback, a test can be constructed and the patch checked in. > I don't see harm in such patches, it does not break anything, > and there is potential user who can confirm if the feature works or not. The problem is that, while I trust your coding skill, I have no guarantee that your code is working right now on my machine, or that it won't stop working at some point in the future. 
Here's an excerpt from a testing guideline I am busy writing, which describes my point of view: """ Why should each changeset be accompanied by a test? --------------------------------------------------- A changeset addresses some deficiency (e.g. missing functionality) or breakage (e.g. incorrect results, or failure to execute). Breakages should ideally be reported automatically by the test framework. In order to accomplish this, new functionality should have test coverage, to ensure that our software produces valid and accurate output, now and in the future, on any of a multitude of platforms and build systems. Are tests useful for anything other than verifying results? ----------------------------------------------------------- Mainly, there are three groups who benefit from tests: the documentation team, developers and users. To the documentation team, tests are invaluable since they illustrate function calling conventions. Developers also rely on tests to highlight corner cases, of which they need to be aware when rewriting functionality. The most important group, the end-users, benefits from tests in the form of doctests; these inline examples illustrate how to use our libraries. """ I hope I caused no offense, I have a lot of respect for the work you and others did in SciPy. I just hope that we can strive to build a higher-quality codebase in the future.
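As an illustration of that last point, a doctest is just an interactive example embedded in a docstring that the test framework can also execute and verify (the function here is hypothetical):

```python
import doctest
import math

def unit_norm(x):
    """Scale a sequence to unit Euclidean norm.

    The example below is documentation that doctest can execute:

    >>> unit_norm([3.0, 4.0])
    [0.6, 0.8]
    """
    n = math.sqrt(sum(v * v for v in x))
    return [v / n for v in x]

# Run every doctest found in this module and report failures/attempts.
results = doctest.testmod()
print(results.failed, results.attempted)
```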
Regards Stéfan From nwagner at iam.uni-stuttgart.de Thu Apr 10 13:17:30 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 10 Apr 2008 19:17:30 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> Message-ID: On Thu, 10 Apr 2008 18:41:54 +0300 (EEST) "Pearu Peterson" wrote: > On Thu, April 10, 2008 3:01 pm, Stéfan van der Walt > wrote: >> Pearu, >> >> Why do checkins like these even happen? >> >> 04:03 Changeset [4125] by pearu >> Add bz2 support to io.mmio [not tested] >> >> I appreciate that you labelled the test deficiency, but >> what now? Are >> you going to add the test, or was the message meant as a >> reminder to >> someone else? > > Have you followed the thread > > http://projects.scipy.org/pipermail/scipy-dev/2008-April/008804.html > > ? The changeset is there because a user asked a feature. > Adding > this feature was trivial, testing it is not so because > it requires certain > input that I don't have at hand. There is a request for > the user to check > if the feature works. > > I don't see harm in such patches, it does not break > anything, > and there is potential user who can confirm if the > feature works or not. > > Regards, > Pearu > Hi Pearu, Thank you very much for your effort! It works like a charm. Just one comment: the new option is not mentioned in the docstring, and it should be .mtx.gz instead of .mtz.gz ;-) def mmread(source): """ Reads the contents of a Matrix Market file 'filename' into a matrix. Inputs: source - Matrix Market filename (extensions .mtx, .mtz.gz) or open file object.
Thanks again Nils From stefan at sun.ac.za Thu Apr 10 13:53:24 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 10 Apr 2008 19:53:24 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <85b5c3130804100958mea29379mb4f8089efc9763b9@mail.gmail.com> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <85b5c3130804100958mea29379mb4f8089efc9763b9@mail.gmail.com> Message-ID: <9457e7c80804101053o150b5641lef9405c83d4b5372@mail.gmail.com> On 10/04/2008, Ondrej Certik wrote: > However, as to running tests -- are you against fixing the tests or > scipy, so that it doesn't print such a garbage, like this? > > ...cswap:n=4 > ..cswap:n=3 > .....daxpy:n=4 > ..daxpy:n=3 > ....dcopy:n=4 > ..dcopy:n=3 > .............dscal:n=4 > ....dswap:n=4 > ..dswap:n=3 > > If not, I'll keep it in mind if I find some free afternoon. Please do! There should be a standard way of turning on and off debug info -- does Nose provide this, or do we have to use the logging module? It would be great if you could investigate. Thanks, St?fan From millman at berkeley.edu Thu Apr 10 13:45:36 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 10 Apr 2008 10:45:36 -0700 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> Message-ID: On Thu, Apr 10, 2008 at 8:41 AM, Pearu Peterson wrote: > > I appreciate that you labelled the test deficiency, but what now? Are > > you going to add the test, or was the message meant as a reminder to > > someone else? > > Have you followed the thread > > http://projects.scipy.org/pipermail/scipy-dev/2008-April/008804.html > > ? The changeset is there because a user asked a feature. 
Adding > this feature was trivial, testing it is not so because it requires certain > input that I don't have at hand. There is a request for the user to check > if the feature works. > > I don't see harm in such patches, it does not break anything, > and there is a potential user who can confirm if the feature works or not. I haven't talked with Stefan about this, but I believe that his point (or at least this is my point) is not whether this code is fully tested in the sense that the code is correct. I think the issue is that we are requiring unittests; that is, there should be a short little unittest added here: http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/io/tests/test_mmio.py I also agree that adding simple features such as this to satisfy user requests is very good. I spoke up about this because I think this is a really important general discussion and isn't exclusively related to this specific check-in. Obviously, adding features will often require having users test these features to ensure that they perform as specified. But it is essential that we have extensive unittests so that when we edit and change code, there is some mechanism available to determine whether old code is broken (hopefully, with a semi-automated mechanism). The solution to this problem shouldn't be that Stefan monitors all new code to determine whether it comes with unittests or not. He shouldn't have to write the unittests himself and he shouldn't have to create a ticket every time someone checks in un(unit)tested code.
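The "short little unittest" being asked for might look something like the sketch below. To stay self-contained it round-trips raw Matrix Market text through the standard-library compressed-file openers only; the real test in test_mmio.py would of course go through scipy.io.mmio's mmread/mmwrite instead.

```python
import bz2
import gzip
import os
import tempfile
import unittest

# Minimal Matrix Market content to round-trip through the compressed formats.
MM_TEXT = ("%%MatrixMarket matrix coordinate real general\n"
           "2 2 2\n"
           "1 1 1.0\n"
           "2 2 2.0\n")

class TestCompressedRoundTrip(unittest.TestCase):
    def _roundtrip(self, opener, suffix):
        fd, path = tempfile.mkstemp(suffix=suffix)
        os.close(fd)
        try:
            with opener(path, "wt") as fh:
                fh.write(MM_TEXT)
            with opener(path, "rt") as fh:
                self.assertEqual(fh.read(), MM_TEXT)
        finally:
            os.remove(path)

    def test_gzip(self):
        self._roundtrip(gzip.open, ".mtx.gz")

    def test_bz2(self):
        self._roundtrip(bz2.open, ".mtx.bz2")

# Run the two cases quietly and keep the result for inspection.
with open(os.devnull, "w") as devnull:
    result = unittest.TextTestRunner(stream=devnull).run(
        unittest.defaultTestLoader.loadTestsFromTestCase(TestCompressedRoundTrip))
```

Even a five-line test like each of these is enough to catch, say, a future change in the bz2 module's calling convention.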
Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From ondrej at certik.cz Thu Apr 10 14:21:45 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 10 Apr 2008 20:21:45 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <9457e7c80804101053o150b5641lef9405c83d4b5372@mail.gmail.com> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <85b5c3130804100958mea29379mb4f8089efc9763b9@mail.gmail.com> <9457e7c80804101053o150b5641lef9405c83d4b5372@mail.gmail.com> Message-ID: <85b5c3130804101121w47ec8852y6cefa04e141afdfd@mail.gmail.com> On Thu, Apr 10, 2008 at 7:53 PM, Stéfan van der Walt wrote: > On 10/04/2008, Ondrej Certik wrote: > > > However, as to running tests -- are you against fixing the tests or > > scipy, so that it doesn't print such garbage, like this? > > > > ...cswap:n=4 > > ..cswap:n=3 > > .....daxpy:n=4 > > ..daxpy:n=3 > > ....dcopy:n=4 > > ..dcopy:n=3 > > .............dscal:n=4 > > ....dswap:n=4 > > ..dswap:n=3 > > > > If not, I'll keep it in mind if I find some free afternoon. > > Please do! There should be a standard way of turning on and off debug > info -- does Nose provide this, or do we have to use the logging > module? It would be great if you could investigate. Yeah. Unfortunately I am really busy, so I don't promise anything. But the tests should just print "." and that's it (no warnings, no nothing). Then you can easily spot all warnings or errors if something goes wrong.
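One stdlib answer to the on/off question is the logging module: route the chatty "cswap:n=4" style prints through a logger so the test runner can silence them by default and enable them on demand. A sketch (the logger name and wrapper function are illustrative, not scipy's actual layout):

```python
import logging

log = logging.getLogger("scipy.tests.debug")
emitted = []

class ListHandler(logging.Handler):
    """Collect formatted messages so the effect of the level is visible."""
    def emit(self, record):
        emitted.append(record.getMessage())

log.addHandler(ListHandler())

def run_blas_case(name, n):
    # was an unconditional print of '%s:n=%s' in the generated tests
    log.debug("%s:n=%d", name, n)

log.setLevel(logging.WARNING)   # default run: debug chatter suppressed
run_blas_case("cswap", 4)       # emits nothing

log.setLevel(logging.DEBUG)     # opt-in verbose run
run_blas_case("dswap", 3)       # now captured: "dswap:n=3"
```

A runner flag (whether Nose's or a home-grown one) then only has to flip the logger's level in one place.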
Ondrej From pearu at cens.ioc.ee Thu Apr 10 14:49:08 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Thu, 10 Apr 2008 21:49:08 +0300 (EEST) Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> Message-ID: <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> On Thu, April 10, 2008 8:07 pm, Stéfan van der Walt wrote: ... >> ? The changeset is there because a user asked for a feature. Adding >> this feature was trivial, testing it is not so because it requires >> certain >> input that I don't have at hand. There is a request for the user to >> check >> if the feature works. > > While waiting for input, why not attach the change as a patch to the > ticket? That way, once the user leaves feedback, a test can be > constructed and the patch checked in. I think in this particular case that would have been overkill. We are all busy. When I looked at what was needed to implement the feature, I saw that only a simple patch was required that I could do in 5 minutes. So, I did it. Otherwise I would have postponed it and asked for a feature request to be submitted. >> I don't see harm in such patches, it does not break anything, >> and there is a potential user who can confirm if the feature works or >> not. > > The problem is that, while I trust your coding skill, I have no > guarantee that your code is working right now on my machine, or that > it won't stop working at some point in the future. Here's an abstract > from a testing guideline I am busy writing, that describes my point of > view: While I strongly agree with the importance of unittests, we should not be too bureaucratic either, imho, because that can crush creativity, which is also important. So, we need to move forward along a middle ground.
In this particular case, I'd say, you have been looking for a problem that was not there. :) > I hope I caused no offense, I have a lot of respect for the work you > and others did in SciPy. I just hope that we can strive to build a higher quality codebase in the future. No offense taken :) Though I have been inactive for some time now, I have still been following the list and overall scipy/numpy development. But since you are using past tense, I gather I have been written off (I hope this is the right term) :) Regards, Pearu From pgmdevlist at gmail.com Thu Apr 10 15:09:44 2008 From: pgmdevlist at gmail.com (Pierre GM) Date: Thu, 10 Apr 2008 15:09:44 -0400 Subject: [SciPy-dev] numpy.ma.mstats in scipy ? Message-ID: <200804101509.45434.pgmdevlist@gmail.com> All, I've been working on the adaptation of scipy.stats.stats to support MaskedArrays. This is mostly an extension of numpy.ma.mstats, with most of the functions of scipy.stats.stats and a couple of corrections (ties correction in friedmanchisquare, tweaking ks_2samp to match results with R). I was considering adding the package to scipy: where should I put it? As an additional module in scipy.stats (it would then be called scipy.stats.mstats)? As its own package (scipy.mstats)? Once the module is in scipy, we could then get rid of numpy.ma.mstats and numpy.ma.morestats, which don't look like they fit in numpy. Thanks a lot in advance. P. From robert.kern at gmail.com Thu Apr 10 15:31:57 2008 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 10 Apr 2008 14:31:57 -0500 Subject: [SciPy-dev] numpy.ma.mstats in scipy ? In-Reply-To: <200804101509.45434.pgmdevlist@gmail.com> References: <200804101509.45434.pgmdevlist@gmail.com> Message-ID: <3d375d730804101231t6cf0a268t98f67ba75fa8ba23@mail.gmail.com> On Thu, Apr 10, 2008 at 2:09 PM, Pierre GM wrote: > All, > > I've been working on the adaptation of scipy.stats.stats to support > MaskedArrays.
This is mostly an extension of numpy.ma.mstats, with most of > the functions of scipy.stats.stats and a couple of corrections (ties > correction in friedmanchisquare, tweaking ks_2samp to match results with R). > > I was considering adding the package to scipy: where should I put it? As an > additional module in scipy.stats (it would then be called > scipy.stats.mstats)? As its own package (scipy.mstats)? Once the module is > in scipy, we could then get rid of numpy.ma.mstats and numpy.ma.morestats, > which don't look like they fit in numpy. scipy.stats.mstats would be fine. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From stefan at sun.ac.za Thu Apr 10 15:57:18 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 10 Apr 2008 21:57:18 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> Message-ID: <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> On 10/04/2008, Pearu Peterson wrote: > I think in this particular case that would have been overkill. We > are all busy. Therefore, we should be jealous of our time and write the tests now, instead of postponing them into the future (where I can guarantee that a) they'd take longer to write and b) they won't be written). A test in time saves nine, I've heard. > When I looked at what was needed to implement the feature, > I saw that only a simple patch was required that I could do in 5 > minutes. So, I did it. Otherwise I would have postponed it and asked > for a feature request to be submitted.
I'm very happy you wrote the code, and I certainly don't want to dissuade you from doing so in the future. I do think that we need a better way of tracking missing tests than adding "[not tested]" to the change log, though. I didn't see an open ticket on the original feature request (maybe I missed it?), but until the test is in place, there should be a corresponding open ticket (even if just to help me out, next time I want to improve the test coverage). > While I strongly agree with the importance of unittests, we should not > be too bureaucratic too, imho, because it can crush down creativity > which is also important. So, we need to move forward along a middle > ground. > > In this particular case, I'd say, you have been looking for a problem > that was not there. :) I can't really go along with any middle ground on testing, I'm afraid (do I sound like a broken record yet?). To me that reeks of bad software development practice. If it were up to me, there wouldn't be a single line of untested code in NumPy. More and more people depend on NumPy to do their work and research, and we have a duty towards them to produce reliable, high quality software. Your change consists of only 5 lines of code, but what happens when the bz2 module changes its calling convention in Python 3k? We'd happily ship SciPy without even knowing that the compressed matrix market loading was completely unusable. I sincerely hope that I am overreacting, and that there is nothing to be concerned about, but as it stands I see a very real problem here. > No offense taken:) Though I have been inactive for some time now, I have > been still following the list and overall scipy/numpy development. But > since you are using past tense, I gather I have been writeoff (I hope > this is the right term) :) Not at all -- but I can't very well comment on code (and tests!) 
that haven't been written yet :) Regards Stéfan From travis at enthought.com Thu Apr 10 16:13:08 2008 From: travis at enthought.com (Travis Vaught) Date: Thu, 10 Apr 2008 15:13:08 -0500 Subject: [SciPy-dev] [ANN] EuroSciPy Registration now open Message-ID: <238028CC-F3FA-4716-9027-A0C40EC5083D@enthought.com> Greetings, I'm pleased to announce that the registration for the first-annual EuroSciPy Conference is now open. http://scipy.org/EuroSciPy2008 Please take advantage of the early-bird rate and register soon. We'd love to have an early idea of attendance so that we can scale the venue appropriately (the available room is flexible in this regard). The EuroSciPy Conference will be held July 26-27, 2008 in Leipzig, Germany. About EuroSciPy --------------- EuroSciPy is designed to complement the popular SciPy Conferences which have been held for the last 7 years at Caltech (the 2008 SciPy Conference in the U.S. will be held the week of August 19-24). Similarly, the EuroSciPy Conference provides a unique opportunity to learn and affect what is happening in the realm of scientific computing with Python. Attendees will have the opportunity to review the available tools and how they apply to specific problems. By providing a forum for developers to share their Python expertise with the wider commercial, academic, and research communities, this conference fosters collaboration and facilitates the sharing of software components, techniques and a vision for high level language use in scientific computing. Typical presentations include general python use in the sciences, as well as NumPy and SciPy usage for general problem solving. Beyond the excellent talks, there are inter-session discussions that prove stimulating and helpful. Registration ------------ The direct link to the registration site is here: http://www.python-academy.com/euroscipy/index.html The registration fee will be 100.00 € for early registrants and will increase to 150.00 €
for late registration (after June 15). Registration will include breakfast, snacks and lunch for Saturday and Sunday. Call for Participation ---------------------- If you are interested in presenting at the EuroSciPy Conference you may submit an abstract in Plain Text, PDF or MS Word formats to euroabstracts at scipy.org. The deadline for abstract submission is April 30, 2008. Papers and/or presentation slides are acceptable and are due by June 15, 2008. Presentations will be allotted 30 minutes. Please pass this announcement along to any other relevant contacts. Many Thanks, Travis N. Vaught From pearu at cens.ioc.ee Thu Apr 10 16:33:07 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Thu, 10 Apr 2008 23:33:07 +0300 (EEST) Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> Message-ID: <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> On Thu, April 10, 2008 10:57 pm, Stéfan van der Walt wrote: .... >> While I strongly agree with the importance of unittests, we should not >> be too bureaucratic either, imho, because that can crush creativity, >> which is also important. So, we need to move forward along a middle >> ground. >> >> In this particular case, I'd say, you have been looking for a problem >> that was not there. :) > > I can't really go along with any middle ground on testing, I'm afraid > (do I sound like a broken record yet?). To me that reeks of bad > software development practice. If it were up to me, there wouldn't be > a single line of untested code in NumPy.
> > More and more people depend on NumPy to do their work and research, > and we have a duty towards them to produce reliable, high quality > software. > > Your change consists of only 5 lines of code, but what happens when > the bz2 module changes its calling convention in Python 3k? We'd > happily ship SciPy without even knowing that the compressed matrix > market loading was completely unusable. That's a convincing point for me. > I sincerely hope that I am overreacting, and that there is nothing to > be concerned about, but as it stands I see a very real problem here. You are right, of course. Please keep hammering us to write unittests for any new code. Thanks, Pearu From wnbell at gmail.com Thu Apr 10 17:25:37 2008 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 10 Apr 2008 16:25:37 -0500 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> Message-ID: On Thu, Apr 10, 2008 at 3:33 PM, Pearu Peterson wrote: > > Your change consists of only 5 lines of code, but what happens when > > the bz2 module changes its calling convention in Python 3k? We'd > > happily ship SciPy without even knowing that the compressed matrix > > market loading was completely unusable. > > That's a convincing point for me. In my experience the main benefit of the unittests is not to determine whether a newly introduced feature has an error, but rather to keep subsequent contributors (with less expertise/knowledge of the code) from breaking it. I tend to think of it as a contract between myself and future developers. 
For instance, when making substantial changes to scipy.sparse I found the unittests to be immensely helpful in expressing the intentions of the previous authors. > > I sincerely hope that I am overreacting, and that there is nothing to > > be concerned about, but as it stands I see a very real problem here. > > You are right, of course. Please keep hammering us to write unittests > for any new code. I can say from personal experience that you're not the first to be reprimanded by Stefan about unittests :) -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From guyer at nist.gov Thu Apr 10 19:15:42 2008 From: guyer at nist.gov (Jonathan Guyer) Date: Thu, 10 Apr 2008 19:15:42 -0400 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> Message-ID: On Apr 10, 2008, at 5:25 PM, Nathan Bell wrote: > In my experience the main benefit of the unittests is not to determine > whether a newly introduced feature has an error, but rather to keep > subsequent contributors (with less expertise/knowledge of the code) > from breaking it. That subsequent contributor might even be you. FiPy's pervasive (but certainly imperfect) unittests have enabled us more than once to radically refactor the code. Without the tests, we'd have never even tried. 
From peridot.faceted at gmail.com Fri Apr 11 02:49:15 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Fri, 11 Apr 2008 02:49:15 -0400 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> Message-ID: On 10/04/2008, Nathan Bell wrote: > In my experience the main benefit of the unittests is not to determine > whether a newly introduced feature has an error, but rather to keep > subsequent contributors (with less expertise/knowledge of the code) > from breaking it. I tend to think of it as a contract between myself > and future developers. For instance, when making substantial changes > to scipy.sparse I found the unittests to be immensely helpful in > expressing the intentions of the previous authors. I have to agree with this - as a relatively new contributor to numpy/scipy, I am never sure I know where and how a function I want to modify is used. It would be easy for me to make some apparently-sensible change that breaks all sorts of code, within numpy/scipy and in users' applications. The fact that there are extensive unit tests means that I can more or less go ahead and make the change; if it passes all tests I'm pretty sure I didn't break anything old. In short, I find that having extensive unit testing makes it easier for new contributors to add code. The effort required to write unittests is also fairly minimal: I always need to test my code anyway to see if it works, so it's fairly easy to just put my tests in numpy's test harness framework. (My only complaint is that it's not always as easy as it might be to tell whether my tests are actually being run.) 
Anne From millman at berkeley.edu Fri Apr 11 02:58:19 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 10 Apr 2008 23:58:19 -0700 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> Message-ID: On Thu, Apr 10, 2008 at 11:49 PM, Anne Archibald wrote: > The effort required to write unittests is also fairly minimal: I > always need to test my code anyway to see if it works, so it's fairly > easy to just put my tests in numpy's test harness framework. (My only > complaint is that it's not always as easy as it might be to tell > whether my tests are actually being run.) Hopefully, the switch to nose (http://somethingaboutorange.com/mrl/projects/nose/) will help solve the problem with making sure all the tests are actually being run. 
-- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From cburns at berkeley.edu Fri Apr 11 04:16:52 2008 From: cburns at berkeley.edu (Christopher Burns) Date: Fri, 11 Apr 2008 01:16:52 -0700 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> Message-ID: <764e38540804110116r75740c83h3a90150064e9f20a@mail.gmail.com> On Thu, Apr 10, 2008 at 11:49 AM, Pearu Peterson wrote: > > While I strongly agree with the importance of unittests, we should not > be too bureaucratic too, imho, because it can crush down creativity > which is also important. So, we need to move forward along a middle > ground. > > Or move forward on a Distributed VCS ground? Seems to me like this is a case where Pearu could branch the code and work with Nils in there until the code met the "Scipy Standard" including documentation and tests, and then merge and push to the main trunk. Pearu is correct in that we want this sort of creativity/development to happen, but we lack an appropriate mechanism for it to happen in. The main scipy trunk could contain only tested, documented, and reviewed code, but exploratory code could still happen in branches. (A sandbox region that doesn't clutter up the user-space.) I'm new to bzr, am I correct that this would be easier to do in bzr (or hg)? Obviously, this is a trivial example, 5 lines of code (maybe 30 if you include additional docs and tests), but the sort of thing that will continue to happen with larger code. Chris -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cburns at berkeley.edu Fri Apr 11 04:28:01 2008 From: cburns at berkeley.edu (Christopher Burns) Date: Fri, 11 Apr 2008 01:28:01 -0700 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> Message-ID: <764e38540804110128r404cdc76s4822a31ad6bbff11@mail.gmail.com> On Thu, Apr 10, 2008 at 11:49 PM, Anne Archibald wrote: I have to agree with this - as a relatively new contributor to > numpy/scipy, I am never sure I know where and how a function I want to > modify is used. It would be easy for me to make some > apparently-sensible change that breaks all sorts of code, within > numpy/scipy and in users' applications. The fact that there are > extensive unit tests means that I can more or less go ahead and make > the change; if it passes all tests I'm pretty sure I didn't break > anything old. In short, I find that having extensive unit testing > makes it easier for new contributors to add code. > +1 I'd add, extensive unit testing is great, but ANY testing is better than none. -- Christopher Burns Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cimrman3 at ntc.zcu.cz Fri Apr 11 05:08:08 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Fri, 11 Apr 2008 11:08:08 +0200 Subject: [SciPy-dev] solver classes (was: help: wrapping generalized symmetric evp functions) In-Reply-To: <85b5c3130804100616l2d5a8e0k752ec64c747caa1c@mail.gmail.com> References: <47FB5C0E.6010603@ntc.zcu.cz> <47FB63A3.2090906@cens.ioc.ee> <47FB6FA1.9000906@ntc.zcu.cz> <47FDD107.70900@ntc.zcu.cz> <9457e7c80804100244u258c1671j51d17dc05e1d3ddf@mail.gmail.com> <47FE0489.9020504@ntc.zcu.cz> <85b5c3130804100616l2d5a8e0k752ec64c747caa1c@mail.gmail.com> Message-ID: <47FF2A78.7070305@ntc.zcu.cz> Hi, Ondrej Certik wrote: > > To get the discussion going, here are some comments. Everyone please > let us know what you think: > > 1) > >> s = Solver.anyFromConf( conf, mtx = A ) # Possible pre-solves by LU. > > How about this: > > s = Solver(conf, mtx = A ) This certainly could be done, but it looks confusing to me - instantiating a generic Solver and getting an instance of something else (a particular solver). I slightly prefer a static method here, with some better name, e.g. Solver.create_any() or whatever. > and also this (together with the syntax above): > > s = Solver(kind="ls.scipy_iterative", method="cg", mtx = A ) The same here. Also note that in some cases the configuration dictionary can be quite large, so having one argument name reserved for 'conf' or 'config' or ... seems ok to me. Anyway, this way of constructing a solver best suits some general framework where the code does not know in advance which particular solver a user might need (like it is e.g. in sfepy - it just needs to know the type of a solver (linear, nonlinear, ...)). > This is useful for reading the configuration from some file.
However, > sometimes (a lot of times) I prefer this: > > 2) how about this: > > class SciPyIterative(LinearSolver): > blabla > > class CG(SciPyIterative): > pass > > class Umfpack(LinearSolver): > pass > > and people would just use: > > from scipy.sparse.linalg import CG, Umfpack > > s = CG(epsA=1e-6, mtx=A) > > or > > s = Umfpack(mtx=A) Yes, this is what I like to have, too. If you solve a particular problem, you construct a particular solver directly. (But note that you can already do that with the classes as they are now in sfepy... Again, the Solver.anyFromConf()-like syntax is more useful for an abstract framework than for use in small scripts.) > 3) I also prefer to pass the matrix as the first argument, because you > always need to supply a matrix, and only then some optional default > arguments, or preconditioners, i.e.: > > s = CG(A, epsA=1e-6) > > or > > s = Umfpack(A) In linear solvers, yes. Other types of solvers might not need a matrix at all. All solvers have a configuration, though. So there is some logic in it as it is, but 'conf' can be made a keyword argument, then why not. We should also use the something_and_something convention instead of somethingAndSomething for the argument names, as is usual in SciPy. thanks for the comments, r.
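To make point 3 concrete, here is a toy pure-Python sketch of that interface -- matrix first, tolerances as keyword configuration. The class names follow the proposal above, but the implementation is illustrative only, not sfepy or scipy code:

```python
class LinearSolver:
    """Base class: matrix first, everything else as keyword configuration."""
    def __init__(self, mtx, **conf):
        self.mtx = mtx          # dense list-of-lists, for illustration only
        self.conf = conf

    def __call__(self, rhs):
        raise NotImplementedError

class CG(LinearSolver):
    """Plain conjugate gradients for a symmetric positive definite matrix."""
    def __call__(self, rhs):
        eps = self.conf.get("eps_a", 1e-12)   # tolerance on squared residual
        n = len(rhs)
        x = [0.0] * n
        r = list(rhs)                         # residual b - A x (x = 0)
        p = list(r)
        rs = sum(ri * ri for ri in r)
        for _ in range(10 * n):
            Ap = [sum(self.mtx[i][j] * p[j] for j in range(n)) for i in range(n)]
            alpha = rs / sum(p[i] * Ap[i] for i in range(n))
            for i in range(n):
                x[i] += alpha * p[i]
                r[i] -= alpha * Ap[i]
            rs_new = sum(ri * ri for ri in r)
            if rs_new < eps:
                break
            beta = rs_new / rs
            p = [r[i] + beta * p[i] for i in range(n)]
            rs = rs_new
        return x

# usage in the proposed style: construct the particular solver directly
s = CG([[4.0, 1.0], [1.0, 3.0]], eps_a=1e-12)
x = s([1.0, 2.0])   # approximately [1/11, 7/11]
```

The 'conf' bag in the base class is what would let an abstract framework feed in a large configuration dictionary, while small scripts just pass one or two keywords.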
From ondrej at certik.cz Fri Apr 11 05:25:37 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Fri, 11 Apr 2008 11:25:37 +0200 Subject: [SciPy-dev] Add bz2 support to io.mmio [not tested] <-- In-Reply-To: References: <9457e7c80804100501l1e6dae39l2903a428d72c01ad@mail.gmail.com> <54932.88.89.195.89.1207842114.squirrel@cens.ioc.ee> <9457e7c80804101007ye5bfc4l5d963da1ad405b06@mail.gmail.com> <51953.88.89.195.89.1207853348.squirrel@cens.ioc.ee> <9457e7c80804101257t5f4656e7m3f2770f27dbe5089@mail.gmail.com> <51831.88.89.195.89.1207859587.squirrel@cens.ioc.ee> Message-ID: <85b5c3130804110225j4ba37852v4d5024951ccff2f3@mail.gmail.com> On Fri, Apr 11, 2008 at 8:58 AM, Jarrod Millman wrote: > On Thu, Apr 10, 2008 at 11:49 PM, Anne Archibald > wrote: > > The effort required to write unittests is also fairly minimal: I > > always need to test my code anyway to see if it works, so it's fairly > > easy to just put my tests in numpy's test harness framework. (My only > > complaint is that it's not always as easy as it might be to tell > > whether my tests are actually being run.) > > Hopefully, the switch to nose > (http://somethingaboutorange.com/mrl/projects/nose/) will help solve > the problem with making sure all the tests are actually being run. I am not sure anyone is sure all the tests are actually being run, and let me say -- at least in Debian, there have always been failures (or at least some weird warnings) with the last two released versions of scipy. That is of course very bad. But we all know that, so hopefully we'll improve in the future. Ondrej From pearu at cens.ioc.ee Fri Apr 11 07:07:52 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Fri, 11 Apr 2008 13:07:52 +0200 Subject: [SciPy-dev] F2PY future Message-ID: <47FF4688.8090005@cens.ioc.ee> Hi, I am in the process of writing a scientific paper about F2PY that will provide an automatic solution to the Python and Fortran connection problem.
While writing it, I also need to decide what the future of F2PY will be. In particular, I have the following main questions, to which I am looking for suggestions: 1) where the future users of F2PY should find it, 2) how the users can get support (documentation, mailing lists, etc). 3) where to continue the development of F2PY. Currently, F2PY has three "home pages": 1) http://cens.ioc.ee/projects/f2py2e/ - this has the old f2py. The old f2py is unique in that it covers Numeric and numarray support, but is not being developed anymore. 2) http://www.scipy.org/F2py - this covers the current f2py included in NumPy. f2py in numpy is rather stable and is being maintained. There are no plans to add new functionality (like F90 derived type support) to the numpy f2py. 3) http://projects.scipy.org/scipy/numpy/wiki/G3F2PY - this is a wiki page for the third generation of f2py. It aims at adding full Fortran 90/../2003 support to the f2py tool, including F90 derived types as well as POINTER arguments. It should replace numpy f2py in the future. Obviously, three "home pages" for f2py are too many, even when they cover three different code sets. So, now I am looking to unify these places into one site that will cover all three code sets with software, documentation, and support. Currently I can think of the following options: Use Google Code. Pros: it provides the necessary infrastructure to develop software projects and I am used to it. Cons: in my experience Google Code has been broken too many times (at least three times in half a year), though this may improve in the future. Also, Google Code provides only SVN, no hg. Since f2py will be an important tool for numpy/scipy users, it would be natural to continue developing f2py under these projects. However, there are rumours of removing extension generation tools from scipy/numpy, and so in the long term, f2py may need to look for another home. So, I wonder if hosting could be provided for f2py?
Say, in the form of f2py.scipy.org or www.f2py.org? I am rather ignorant about these matters, so any help will be appreciated. Thanks, Pearu From wnbell at gmail.com Fri Apr 11 12:36:35 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 11 Apr 2008 11:36:35 -0500 Subject: [SciPy-dev] F2PY future In-Reply-To: <47FF4688.8090005@cens.ioc.ee> References: <47FF4688.8090005@cens.ioc.ee> Message-ID: On Fri, Apr 11, 2008 at 6:07 AM, Pearu Peterson wrote: > However, there are rumours of cleaning up scipy/numpy from extension > generation tools and so in long term, f2py may need to look for another > home. Do you mean that NumPy/SciPy would no longer include f2py source code or that NumPy/SciPy would no longer use extension generation tools at all? > So, I wonder if a hosting could be provided for f2py? Say, in a > form of f2py.scipy.org or www.f2py.org? I am rather ignorant about these > matters, so any help will be appreciated. If you haven't done so already, register f2py.org immediately (and possibly the .com as well). Many registration services (e.g. godaddy.com) allow you to forward the URL to another URL. For instance, http://pyamg.org currently redirects to http://code.google.com/p/pyamg/. If you choose to run your own host in the future, then you can just point the DNS record for your domain to the host's IP. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From strawman at astraw.com Fri Apr 11 12:48:21 2008 From: strawman at astraw.com (Andrew Straw) Date: Fri, 11 Apr 2008 09:48:21 -0700 Subject: [SciPy-dev] [Numpy-discussion] F2PY future In-Reply-To: <47FF4688.8090005@cens.ioc.ee> References: <47FF4688.8090005@cens.ioc.ee> Message-ID: <47FF9655.1090407@astraw.com> Pearu Peterson wrote: > Use Google Code. Pros: it provides necessary infrastructure to develop > software projects and I am used to it. Cons: in my experience Google > Code has been too many times broken (at least three times in half a > year), though this may improve in future.
Also, Google Code provides > only SVN, no hg. > Another option: the IPython people have been using launchpad.net ( https://launchpad.net/ipython ) -- it supports bzr. I'm not sure how happy they are with it, but I think happy enough to stick with it rather than attempt to get a server with hg set up. IIRC, they did initially marginally prefer hg over bzr but the immediate availability of bzr hosting swung them over. -Andrew From oliphant at enthought.com Fri Apr 11 13:23:34 2008 From: oliphant at enthought.com (Travis E. Oliphant) Date: Fri, 11 Apr 2008 12:23:34 -0500 Subject: [SciPy-dev] SciPy to be down intermittently today Message-ID: <47FF9E96.4030202@enthought.com> Hey everyone, The scipy.org site will be down intermittently today. We are trying to upgrade its memory to improve performance. Thank you, -Travis O. From ellisonbg.net at gmail.com Fri Apr 11 13:28:19 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 11 Apr 2008 11:28:19 -0600 Subject: [SciPy-dev] [Numpy-discussion] F2PY future In-Reply-To: <47FF9655.1090407@astraw.com> References: <47FF4688.8090005@cens.ioc.ee> <47FF9655.1090407@astraw.com> Message-ID: <6ce0ac130804111028p64b56a0fyd49d3cb036641eb0@mail.gmail.com> > Another option: the IPython people have been using launchpad.net ( > https://launchpad.net/ipython ) -- it supports bzr. I'm not sure how > happy they are with it, but I think happy enough to stick with it rather > than attempt to get a server with hg set up. IIRC, they did initially > marginally prefer hg over bzr but the immediate availability of bzr > hosting swung them over. I can't speak for the rest of the ipython dev team, but overall, we have really liked the bzr+launchpad combo. We haven't made a final decision to stick with it, but I imagine we will (everyone has been very happy so far with it). Bottom line: we have been better able to collaborate and the admin side of things has been essentially 0. 
We all like hg as well, but having the supernice hosting on launchpad really makes a difference. Brian > -Andrew > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From fperez.net at gmail.com Fri Apr 11 13:34:57 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 11 Apr 2008 10:34:57 -0700 Subject: [SciPy-dev] [Numpy-discussion] F2PY future In-Reply-To: <47FF9655.1090407@astraw.com> References: <47FF4688.8090005@cens.ioc.ee> <47FF9655.1090407@astraw.com> Message-ID: On Fri, Apr 11, 2008 at 9:48 AM, Andrew Straw wrote: > Pearu Peterson wrote: > > Use Google Code. Pros: it provides necessary infrastructure to develop > > software projects and I am used to it. Cons: in my experience Google > > Code has been too many times broken (at least three times in half a > > year), though this may improve in future. Also, Google Code provides > > only SVN, no hg. > > > Another option: the IPython people have been using launchpad.net ( > https://launchpad.net/ipython ) -- it supports bzr. I'm not sure how > happy they are with it, but I think happy enough to stick with it rather > than attempt to get a server with hg set up. IIRC, they did initially > marginally prefer hg over bzr but the immediate availability of bzr > hosting swung them over. Pretty happy, for the most part. The launchpad site is a bit messy to navigate, some things aren't totally obvious (which should be) and we've all made a few mistakes with the transition to a bzr workflow, but we'll likely stick to it. Over the next couple of weeks, in between grant deadlines, I hope to push the process of moving 'for real' our code hosting over to lp, which will require doing a real import of the SVN history and something with our Trac bugs. It's important to note that we are NOT abandoning ipython.scipy.org: we'll keep the Moin site, mailing lists, etc as it is. 
But unless something weird comes up in a final "let's do this" discussion with ipython-dev, I think the decision to make a permanent transition is more or less done. I just don't see anything better out there that exists *right now*. From a technical perspective I think the hg/bzr discussion is nearly a toss-up, and I'm not interested in spending weeks on the fine points that may lie there. Bzr does the job, more than well enough for us. The real kicker was the vast amount of infrastructure provided by launchpad, as you point out above. While it's not perfect, it does a lot of very good things, and it's widely used by many more projects, thus likely to continue improving. As a note, we've also just switched neuroimaging.scipy.org to launchpad (again, just the code hosting part): https://launchpad.net/nipy If others here are interested, I can give more details on the benefits we've already reaped from the launchpad transition on both ipython and nipy. Regards, f From ellisonbg.net at gmail.com Fri Apr 11 13:47:51 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 11 Apr 2008 11:47:51 -0600 Subject: [SciPy-dev] Ideas for scipy.sparse? Message-ID: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> Hi, Just because there haven't been any interesting threads lately.... So, I am currently implementing a distributed memory array package for python: http://projects.scipy.org/ipython/ipython/browser/ipythondistarray The goal is to have distributed/parallel arrays that look and feel just like numpy arrays. Here is an example: import ipythondistarray as ipda a = ipda.random.rand((10,100,100), dist=(None,'b','c')) b = ipda.random.rand((10,100,100), dist=(None,'b','c')) c = 0.5*ipda.sin(a) + 0.5*ipda.cos(b) print c.sum(), c.mean(), c.std(), c.var() This works today on multiple processors. Don't get too excited though, there is still _tons_ of work to be done.... Here is the main issue: I am running into a need for sparse arrays.
There are two places I am running into this: 1) I want to implement sparse distributed arrays and need sparse local arrays for this. 2) There are other places in the implementation where sparse arrays are needed. Obviously, my first thought was scipy.sparse. I am _really_ excited about the massive improvements that have been happening in this area recently. Here are the problems I am running into: 1) I need N-dimensional sparse arrays. Some of the storage formats in scipy.sparse (dok, coo, maybe lil) could be generalized to N-dimensions, but some work would have to be done. 2) I need these things to be in numpy. I hate to start another "should this go into numpy or scipy" thread, but I actually do think there is a decent case for moving the core sparse arrays into numpy (not the solvers though). Please hear me out: a) Numpy at its core is about arrays. Conceptually, sparse arrays fit into this narrow vision of Numpy. b) Sparse arrays are just as foundational as dense arrays in many areas of computing/science (I would argue that they are more foundational than ffts and random numbers). c) Moving the core sparse arrays into numpy would increase their visibility and encourage other projects to rely on them. d) It would not make numpy more difficult to build. e) It is currently somewhat confusing that they are not in numpy (remember Numpy = arrays). 3) I need sparse arrays that are implemented more in C. What do I mean by this? I am using cython for the performance critical parts of my package and there are certain things (indexing in tight loops for example) that I need to do in c. Because the current sparse array classes are written in pure python (with a few c++ routines underneath for format conversions), this is difficult. So... I think it would be a very good idea to begin moving the sparse array classes to cython code. This would be a very nice approach because it could be done gradually, without breaking any of the API.
The benefit is that we could improve the performance of the sparse array classes dramatically, while keeping things very maintainable. In summary, I am proposing: 1) That we move the core sparse array classes from scipy.sparse to a new package numpy.sparse 2) That we extend some sparse array classes to be fully N-dimensional. 3) That we begin to move their implementation to using Cython (as an aside, cython does play very well with templated C++ code). This could provide a much nicer way of tying into the c++ code than using swig. Alright, fire away :) Brian From pearu at cens.ioc.ee Fri Apr 11 14:53:23 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Fri, 11 Apr 2008 21:53:23 +0300 (EEST) Subject: [SciPy-dev] F2PY future In-Reply-To: References: <47FF4688.8090005@cens.ioc.ee> Message-ID: <60131.88.89.195.89.1207940003.squirrel@cens.ioc.ee> On Fri, April 11, 2008 7:36 pm, Nathan Bell wrote: > On Fri, Apr 11, 2008 at 6:07 AM, Pearu Peterson wrote: > >> However, there are rumours of cleaning up scipy/numpy from extension >> generation tools and so in long term, f2py may need to look for another >> home. > > Do you mean that NumPy/SciPy would no longer include f2py source code yes, but that is just a rumour:), one should not expect that to happen in the near future. I think first we need g3 f2py. > or that NumPy/SciPy would no longer use extension generation tools at > all? they should certainly do that as extension generation tools guarantee higher quality of the extension modules than writing hundreds of extension modules by hand. >> So, I wonder if a hosting could be provided for f2py? Say, in a >> form of f2py.scipy.org or www.f2py.org? I am rather ignorant about >> these >> matters, so any help will be appreciated. > > If you haven't done so already, register f2py.org immediately (and > possibly the .com as well). Many registration services (e.g. > godaddy.com) allow you to forward the URL to another URL.
For > instance, http://pyamg.org currently redirects to > http://code.google.com/p/pyamg/. If you chose to run your own host in > the future, then you can just point the DNS record for your domain to > the host's IP. Thanks for the hints, they were very useful. I have registered f2py.org and f2py.com domains and the redirections work fine. This solves one problem: now one can advertise f2py via www.f2py.com (or www.f2py.org) and it should be stable at least for the next 10 years:). Thanks, Pearu From wnbell at gmail.com Fri Apr 11 15:26:09 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 11 Apr 2008 14:26:09 -0500 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> Message-ID: On Fri, Apr 11, 2008 at 12:47 PM, Brian Granger wrote: > 1) I need N-dimensional sparse arrays. Some of the storage formats in > scipy.sparse (dok, coo, maybe lil) could be generalized to > N-dimensions, but some work would have to be done. To make this efficient, you'd probably need a lower-level implementation of a hash-based container like DOK. BTW, which applications use sparse N-dimensional arrays? > 2) I need these things to be in numpy. I hate to start another > "should this go into numpy or scipy" thread, but I actually do think > there is a decent case for moving the core sparse arrays into numpy > (not the solvers though). Please hear me out: I don't see the need for or utility of sparse arrays in numpy. Numpy does low-level manipulation of dense arrays/memory buffers. > 3) I need sparse arrays that are implemented more in C. What do I > mean by this. I am using cython for the performance critical parts of > my package and there are certain things (indexing in tight loops for > example) that I need to do in c. 
Because the current sparse array > classes are written in pure python (with a few c++ routines underneath > for format conversions), this is difficult. So... All of the costly operations on CSR/CSC/COO matrices are done in C++. Only lil_matrix and dok_matrix are pure-Python implementations. Indexing sparse arrays inside a Python loop *is* slow, but there's not much that can be done about it. > I think it would be a very good idea to begin moving the sparse array > classes to cython code. This would be a very nice approach because it > could be done gradually, without breaking any of the API. The benefit > is that we could improve the performance of the sparse array classes > drammatically, while keeping things very maintainable. I'm not aware of any performance problems with the existing backend to scipy.sparse. What would you implement in cython that's not already implemented in scipy.sparse.sparsetools? > 3) That we begin to move their implementation to using Cython (as an > aside, cython does play very well with templated C++ code). This > could provide a much nicer way of tying into the c++ code than using > swig. In the context of scipy.sparse, what's wrong with SWIG and C++? What would be improved by using Cython? > Alright, fire away :) :) -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From robert.kern at gmail.com Fri Apr 11 17:30:21 2008 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 11 Apr 2008 16:30:21 -0500 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> Message-ID: <3d375d730804111430s15e801bcs51f9fd1943848041@mail.gmail.com> On Fri, Apr 11, 2008 at 12:47 PM, Brian Granger wrote: > 2) I need these things to be in numpy. 
I hate to start another > "should this go into numpy or scipy" thread, but I actually do think > there is a decent case for moving the core sparse arrays into numpy > (not the solvers though). Please hear me out: > > a) Numpy at its core is about arrays. Conceptually, sparse arrays fit > into this narrow vision of Numpy. Previously, Travis has stated a desire to get some form of standard sparse array (or possibly just matrix) support into numpy 1.1 for precisely this reason. I happen to agree. However, I have to address the following points. > b) Sparse arrays are just as foundational as dense arrays in many > areas of computing/science (I would argue, that they are more > foundational than ffts and random numbers). The parenthetical is not a relevant argument. numpy.{linalg,fft,random} exist because of history, not design. In order to convince people to move from Numeric to numpy, we *had* to support a transition from the LinearAlgebra, FFT, and RandomArray modules that were distributed with Numeric. > d) It would not make numpy more difficult to build. A fair amount of the current sparse code uses C++ templates, so I will have to say that this statement needs qualification. The impact may be low, but it is not negligible. If we rewrite it not to use C++, then we don't know how much more difficult it will be. We will need to evaluate that when the code is written. The only thing that we can be sure won't make numpy more difficult to build is pure Python code. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From ellisonbg.net at gmail.com Fri Apr 11 17:35:19 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 11 Apr 2008 15:35:19 -0600 Subject: [SciPy-dev] Ideas for scipy.sparse? 
In-Reply-To: References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> Message-ID: <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> > > 1) I need N-dimensional sparse arrays. Some of the storage formats in > > scipy.sparse (dok, coo, maybe lil) could be generalized to > > N-dimensions, but some work would have to be done. > > To make this efficient, you'd probably need a lower-level > implementation of a hash-based container like DOK. > > BTW, which applications use sparse N-dimensional arrays? The big place that I need them right now is for handling so-called ghost cells. If you are not familiar with this idea, ghost cells are parts of a distributed array that are not on your local processor (they are on another one). When we fetch the ghost cells we need a structure to store them in and a sparse N-dim array is the best option. In terms of true applications for N-dim sparse arrays, I am not sure. But the higher-dimensional an array is, the more important sparsity becomes. Sure numpy can handle 6-dimensional arrays, but if they are dense, they will likely be too big. So I guess a better question is: what are the applications for N-dimensional dense arrays -> if you find an application for N>3, then you should probably consider using sparse arrays. > > 2) I need these things to be in numpy. I hate to start another > "should this go into numpy or scipy" thread, but I actually do think > there is a decent case for moving the core sparse arrays into numpy > (not the solvers though). Please hear me out: > > I don't see the need for or utility of sparse arrays in numpy. Numpy > does low-level manipulation of dense arrays/memory buffers. True, well sort of. That is what Numpy does today. But, I think there are good reasons to extend that reach more broadly to other types of arrays. My understanding of the scope of numpy is that it is the "base layer." I feel that sparse arrays are (conceptually) a part of that base layer.
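For illustration, the dok idea generalizes to N dimensions almost for free if you key a dict on coordinate tuples. Below is a minimal pure-Python sketch of the kind of container that could hold fetched ghost cells; the `NDSparse` name and interface are hypothetical, not part of scipy.sparse:

```python
class NDSparse(object):
    """Minimal N-dimensional dict-of-keys (dok-style) sparse array."""

    def __init__(self, shape, default=0.0):
        self.shape = shape
        self.default = default
        self._data = {}  # maps coordinate tuples to non-default values

    def __setitem__(self, index, value):
        if len(index) != len(self.shape):
            raise IndexError("expected %d indices, got %d"
                             % (len(self.shape), len(index)))
        if value == self.default:
            self._data.pop(index, None)  # never store explicit defaults
        else:
            self._data[index] = value

    def __getitem__(self, index):
        # Unset entries read back as the default value.
        return self._data.get(index, self.default)

    @property
    def nnz(self):
        return len(self._data)


# Storing a handful of ghost cells of a (10, 100, 100) global array:
ghosts = NDSparse((10, 100, 100))
ghosts[0, 5, 99] = 2.5
ghosts[3, 0, 1] = -1.0
```

Memory here is proportional to the number of stored entries rather than to the product of the dimensions, which is the property that matters as N grows.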
> > > 3) I need sparse arrays that are implemented more in C. What do I > > mean by this. I am using cython for the performance critical parts of > > my package and there are certain things (indexing in tight loops for > > example) that I need to do in c. Because the current sparse array > > classes are written in pure python (with a few c++ routines underneath > > for format conversions), this is difficult. So... > > All of the costly operations on CSR/CSC/COO matrices are done in C++. > Only lil_matrix and dok_matrix are pure-Python implementations. Yes, but to use the fast c++ code, I have to go through a slow python layer - the actual csr/csc/coo classes and slow+fast = slow in my book. Also, the dok/lil formats are some of the most important and they should be optimized. > Indexing sparse arrays inside a Python loop *is* slow, but there's not > much that can be done about it. I will write a simple dok format sparse matrix using cython and we can compare the performance. > > > I think it would be a very good idea to begin moving the sparse array > > classes to cython code. This would be a very nice approach because it > > could be done gradually, without breaking any of the API. The benefit > > is that we could improve the performance of the sparse array classes > > dramatically, while keeping things very maintainable. > > I'm not aware of any performance problems with the existing backend to > scipy.sparse. What would you implement in cython that's not already > implemented in scipy.sparse.sparsetools? This is absolutely true. Don't misunderstand me. I think that the c++ backend in sparsetools is _really_ nice and I am not suggesting changing that to anything else. The only thing I would implement in cython is the higher-level sparse classes. These are the slow part (they are pure python) and cython would help a lot there.
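To make the "slow python layer" point concrete, here is a small timing sketch. It needs no scipy; the `DokLike` class is a hypothetical stand-in for a pure-Python dok-style container, so the numbers only illustrate the Python-level call overhead, not dok_matrix itself. The exact ratio will vary by machine:

```python
import timeit

class DokLike(object):
    """Stand-in for a pure-Python dok-style class: every element access
    goes through a Python-level __getitem__/__setitem__."""

    def __init__(self):
        self._data = {}

    def __setitem__(self, key, value):
        self._data[key] = value

    def __getitem__(self, key):
        # Missing entries read back as zero, as in a sparse format.
        return self._data.get(key, 0.0)

raw = {}
wrapped = DokLike()
for i in range(1000):
    raw[(i, i)] = float(i)
    wrapped[i, i] = float(i)

# Time repeated single-element reads through each path.
t_raw = timeit.timeit(lambda: raw[(500, 500)], number=200000)
t_wrapped = timeit.timeit(lambda: wrapped[500, 500], number=200000)
print("wrapped access / raw dict access: %.1fx" % (t_wrapped / t_raw))
```

Moving that class layer itself into compiled code is what would remove the per-access overhead; the O(nnz) C++ kernels in sparsetools are untouched either way.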
Again, I _really_ like the current design and interfaces of these classes and I am not suggesting changing that (other than to support N-dim arrays for certain formats). > > > 3) That we begin to move their implementation to using Cython (as an > > aside, cython does play very well with templated C++ code). This > > could provide a much nicer way of tying into the c++ code than using > > swig. > > In the context of scipy.sparse, what's wrong with SWIG and C++? What > would be improved by using Cython? Think of it in terms of layers (I will use the csr format as an example): sparse/csr.py => top level python class that wraps the lower layers. sparsetools/csr.py => thin swig generated python wrapper that introduces overhead sparsetools/csr.h => fast c++ code Using swig forces me to deal with two layers of python code before I can talk to the fast c++ code. The problem with this is that I want to call the top level layer from C! This would be like NumPy not having a C API! The cython stack would look like this sparse/csr.pyx => top level python class that is a C extension type and can be called from C or python sparsetools/csr.h => fast c++ code In that case, I could write my extension code in C and talk directly to the C interface without any overhead of multiple layers of Python. To me that is a huge difference. But again, the existing c++ code could be used with probably no modifications. Brian > > Alright, fire away :) > > :) > > -- > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From ellisonbg.net at gmail.com Fri Apr 11 17:41:36 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 11 Apr 2008 15:41:36 -0600 Subject: [SciPy-dev] Ideas for scipy.sparse? 
In-Reply-To: <3d375d730804111430s15e801bcs51f9fd1943848041@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <3d375d730804111430s15e801bcs51f9fd1943848041@mail.gmail.com> Message-ID: <6ce0ac130804111441j48357ab3q7459cbd9c2bdbfa2@mail.gmail.com> > > 2) I need these things to be in numpy. I hate to start another > > "should this go into numpy or scipy" thread, but I actually do think > > there is a decent case for moving the core sparse arrays into numpy > > (not the solvers though). Please hear me out: > > > > a) Numpy at its core is about arrays. Conceptually, sparse arrays fit > > into this narrow vision of Numpy. > > Previously, Travis has stated a desire to get some form of standard > sparse array (or possibly just matrix) support into numpy 1.1 for > precisely this reason. I happen to agree. However, I have to address > the following points. I did not know that. It would definitely have to wait until 1.1. > > > b) Sparse arrays are just as foundational as dense arrays in many > > areas of computing/science (I would argue, that they are more > > foundational than ffts and random numbers). > > The parenthetical is not a relevant argument. > numpy.{linalg,fft,random} exist because of history, not design. In > order to convince people to move from Numeric to numpy, we *had* to > support a transition from the LinearAlgebra, FFT, and RandomArray > modules that were distributed with Numeric. I understand this, but that doesn't mean we can't make future decisions about what goes into NumPy based on more principled ideas. > > > d) It would not make numpy more difficult to build. > > A fair amount of the current sparse code uses C++ templates, so I will > have to say that this statement needs qualification. The impact may be > low, but it is not negligible. If we rewrite it not to use C++, then > we don't know how much more difficult it will be. We will need to > evaluate that when the code is written.
The only thing that we can be > sure won't make numpy more difficult to build is pure Python code. At some level this is true. C++ templates are nice in principle, but not always so in practice because of build issues. We would just have to see, but I am hopeful. On the other hand, if the build problems were really that much of a problem, they would be equally problematic in scipy. Brian > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma that is made terrible by our own mad attempt to interpret it as > though it had an underlying truth." > -- Umberto Eco > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From robert.kern at gmail.com Fri Apr 11 17:50:22 2008 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 11 Apr 2008 16:50:22 -0500 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: <6ce0ac130804111441j48357ab3q7459cbd9c2bdbfa2@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <3d375d730804111430s15e801bcs51f9fd1943848041@mail.gmail.com> <6ce0ac130804111441j48357ab3q7459cbd9c2bdbfa2@mail.gmail.com> Message-ID: <3d375d730804111450t26181d59u518359b5e63116ab@mail.gmail.com> On Fri, Apr 11, 2008 at 4:41 PM, Brian Granger wrote: > > > 2) I need these things to be in numpy. I hate to start another > > > "should this go into numpy or scipy" thread, but I actually do think > > > there is a decent case for moving the core sparse arrays into numpy > > > (not the solvers though). Please hear me out: > > > > > > a) Numpy at its core is about arrays. Conceptually, sparse arrays fit > > > into this narrow vision of Numpy. > > > > Previously, Travis has stated a desire to get some form of standard > > sparse array (or possibly just matrix) support into numpy 1.1 for > > precisely this reason. I happen to agree. However, I have to address > > the following points.
> > I did not know that. It would definitely have to wait until 1.1. > > > > b) Sparse arrays are just as foundational as dense arrays in many > > > areas of computing/science (I would argue, that they are more > > > foundational than ffts and random numbers). > > > > The parenthetical is not a relevant argument. > > numpy.{linalg,fft,random} exist because of history, not design. In > > order to convince people to move from Numeric to numpy, we *had* to > > support a transition from the LinearAlgebra, FFT, and RandomArray > > modules that were distributed with Numeric. > > I understand this, but that doesn't mean we can't make future > decisions about what goes into NumnPy based on more principled ideas. My point *was* that the decision to include them wasn't based on a principled stance. Their presence is not evidence for any particular principle about what should be in numpy. You cannot use them as a reference point. > > > d) It would not make numpy more difficult to build. > > > > A fair amount of the current sparse code uses C++ templates, so I will > > have to say that this statement needs qualification. The impact may be > > low, but it is not negligible. If we rewrite it not to use C++, then > > we don't know how much more difficult it will be. We will need to > > evaluate that when the code is written. The only thing that we can be > > sure won't make numpy more difficult to build is pure Python code. > > At some level this is true. C++ template are nice in principle, but > not always so in practice because of build issues. We would just have > to see, but I am hopeful. On the other hand, if the build problems > were really that much of a problem, they would be equally problematic > in scipy. C++ alone makes it more difficult. For a long time we have held firm on only including C in numpy. Yes, the C++ does make scipy more difficult to build, but the standards for acceptable difficulty are different for scipy than numpy. 
-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From wnbell at gmail.com Sat Apr 12 00:45:23 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 11 Apr 2008 23:45:23 -0500 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> Message-ID: On Fri, Apr 11, 2008 at 4:35 PM, Brian Granger wrote: > > All of the costly operations on CSR/CSC/COO matrices are done in C++. > > Only lil_matrix and dok_matrix are pure-Python implementations. > > Yes, but to use the fast c++ code, I have to go through a slow python > layer - the actual csr/csc/coo classes and slow+fast = slow in my > book. Also, the dok/lil formats are some of the most important and > they should be optimized. Can you demonstrate a real-world scenario where the overhead is noticeable? In all cases that I'm aware of the number of "slow" operations is O(1). OTOH the "fast" code always does at least O(nnz) operations. You are right about the lil and dok formats. They are currently too slow for large-scale problems. You can have your way with them :) > Think of it in terms of layers (I will use the csr format as an example): > > sparse/csr.py => top level python class that wraps the lower layers. > sparsetools/csr.py => thin swig generated python wrapper that > introduces overhead > sparsetools/csr.h => fast c++ code > > Using swig forces me to deal with two layers of python code before I > can talk to the fast c++ code. The problem with this is that I want > to call the top level layer from C! This would be like NumPy not > having a C API! 
> > The cython stack would look like this > > sparse/csr.pyx => top level python class that is a C extension > type and can be called from C or python > sparsetools/csr.h => fast c++ code > > In that case, I could write my extension code in C and talk directly > to the C interface without any overhead of multiple layers of Python. I understand your point, but it's not immediately clear to me that the SWIG-induced overhead is actually troublesome. For frequent calls to simple operations I see the purpose. However, each of the functions in sparsetools does a substantial amount of work per call. Also, what code that would want to interface directly with csr.h wouldn't live in sparsetools itself? -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From peridot.faceted at gmail.com Sat Apr 12 00:53:59 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Sat, 12 Apr 2008 00:53:59 -0400 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> Message-ID: On 11/04/2008, Brian Granger wrote: > > > 1) I need N-dimensional sparse arrays. Some of the storage formats in > > > scipy.sparse (dok, coo, maybe lil) could be generalized to > > > N-dimensions, but some work would have to be done. > > > > To make this efficient, you'd probably need a lower-level > > implementation of a hash-based container like DOK. > > > > BTW, which applications use sparse N-dimensional arrays? > > The big place that I need them right now is for handling so-called > ghost cells. If you are not familiar with this idea, ghost cells are > parts of a distributed array that are not on your local processor > (they are on another one). When we fetch the ghost cells we need a > structure to store them in and a sparse N-dim array is the best > option. Hmm.
It seems to me that, in the interests of efficiency, it will not often be a good idea to allocate data to processors on an element-by-element basis; instead one will often want to allocate blocks of elements to each processor. This will be very inefficiently represented by ordinary sparse matrices, which take no advantage of ranges or higher-dimensional blocks of nonzero entries. What's more, an opaque sparse matrix library would be very frustrating in this context, since you want to distinguish unavailable entries from entries that are actually zero. It seems like what you need is some sort of proxy object that keeps track of some chunks of an array and serves them up as requested - possibly even as views of underlying numpy arrays - if available, and calls some network message-passing code if they're not available. How do existing distributed-array toolkits handle the problem? Anne From ellisonbg.net at gmail.com Sat Apr 12 11:37:31 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Sat, 12 Apr 2008 09:37:31 -0600 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> Message-ID: <6ce0ac130804120837o49713c24m380dac24a7cef479@mail.gmail.com> > It seems to me that, in the interests of efficiency, it will not often > be a good idea to allocate data to processors on an element-by-element > basis; instead one will often want to allocate blocks of elements to > each processor. This will be very inefficiently represented by > ordinary sparse matrices, which take no advantage of ranges or > higher-dimensional blocks of nonzero entries. What's more, an opaque > sparse matrix library would be very frustrating in this context, since > you want to distinguish unavailable entries from entries that are > actually zero. You are absolutely correct. 
But the actual partitioning that is optimal is determined by the particular applications. We are trying to write a distributed array library that is application neutral. Thus, the partitioning algorithms are essentially arbitrary and we have to design the rest of our library to work with these arbitrary distributions. We provide block, cyclic and block-cyclic distributions by default, but the user can write their own partitioners with very little constraint. The only constraint we have is that the data distributions are Cartesian products along the different axes. Of course, we select reasonable defaults for the user - the default distribution is block distributed along the first axis. > It seems like what you need is some sort of proxy object that keeps > track of some chunks of an array and serves them up as requested - > possibly even as views of underlying numpy arrays - if available, and > calls some network message-passing code if they're not available. Yes, the way we are designing our interface is that when a particular processor needs certain global elements, it has to prefetch them before actually using them - this is very much like the proxying you are suggesting. We will have block-oriented prefetchers that fetch blocks and store them as local dense numpy arrays, but we will also have single-element prefetchers - and for those sparse is the only way to go. The situation is similar to using numpy arrays - there are slow and fast ways of using numpy arrays. It is up to the user to choose for their particular problem how to use numpy arrays. But numpy arrays still work as expected in either case (even if you write inefficient code). > How do existing distributed-array toolkits handle the problem?
They provide horribly complex interfaces and make way too many assumptions (only vectors and matrices, only certain distributions) to be useful to your average scientist :) Cheers, Brian > Anne > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From ellisonbg.net at gmail.com Sat Apr 12 13:21:39 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Sat, 12 Apr 2008 11:21:39 -0600 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> Message-ID: <6ce0ac130804121021r5cfbd5bctf93213c517a82c84@mail.gmail.com> > Can you demonstrate a real-world scenario where the overhead is > noticeable? In all cases that I'm aware of the number of "slow" > operations is O(1). OTOH the "fast" code always does at least O(nnz) > operations. Sure. Indexing (getting items) is O(1), but if I do it N=some big number times, it is O(N*1) = O(N). If I have to do all the indexing through python (__getitem__) I have the overhead of calling a python function all N times. This is the same reason we all encourage people to try to avoid writing for i in range(N) style loops to do things with numpy arrays. > You are right about the lil and dok formats. They are currently too > slow for large-scale problems. You can have your way with them :) Thanks, I will probably use them as templates to begin seeing if I can make things faster with cython. The big thing I need right now is fast get/set for individual items (random access). > > Think of it in terms of layers (I will use the csr format as an example): > > > > sparse/csr.py => top level python class that wraps the lower layers.
> > sparsetools/csr.py => thin swig generated python wrapper that > > introduces overhead > > sparsetools/csr.h => fast c++ code > > > > Using swig forces me to deal with two layers of python code before I > > can talk to the fast c++ code. The problem with this is that I want > > to call the top level layer from C! This would be like NumPy not > > having a C API! > > > > The cython stack would look like this > > > > sparse/csr.pyx => top level python class that is a C extension > > type and can be called from C or python > > sparsetools/csr.h => fast c++ code > > > > In that case, I could write my extension code in C and talk directly > > to the C interface without any overhead of multiple layers of Python. > > I understand your point, but it's not immediately clear to me that the > SWIG-induced overhead is actually troublesome. For frequent calls to > simple operations I see the purpose. However, each of functions in > sparsetools does a substantial amount of work per call. Yep, it is really just the small stuff that is not being done in sparsetools, but that might need to be done many times that could be optimized. > Also, what code that would want to interface directly with csr.h > wouldn't live in sparsetools itself? Basically, I am thinking of a public C-API for doing things with the sparse arrays. A true C extension type for the sparse array class. That could go into sparsetools, but I think it will be easier to write/maintain in cython. Cheers, Brian > -- > > > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From wnbell at gmail.com Sat Apr 12 21:29:26 2008 From: wnbell at gmail.com (Nathan Bell) Date: Sat, 12 Apr 2008 20:29:26 -0500 Subject: [SciPy-dev] Ideas for scipy.sparse? 
In-Reply-To: <6ce0ac130804121021r5cfbd5bctf93213c517a82c84@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> <6ce0ac130804121021r5cfbd5bctf93213c517a82c84@mail.gmail.com> Message-ID: On Sat, Apr 12, 2008 at 12:21 PM, Brian Granger wrote: > Sure. Indexing (getting items) is O(1), but if I do it N=some big > number times, it is O(N*1) = O(N). If I have to do all the indexing > through python (__getitem__) I have the overhead of calling a python > function all N times. This is the same reason we all encourage people > to try to avoid writing for i in range(N) style loops to do things > with numpy arrays. This is better accomplished by supporting indexing of the form A[[1,2,3],[1,2,3]] in sparsetools. The other forms of indexing are already handled efficiently. > Yep, it is really just the small stuff that is not being done in > sparsetools, but that might need to be done many times that could be > optimized. > Basically, I am thinking of a public C-API for doing things with the > sparse arrays. A true C extension type for the sparse array class. > That could go into sparsetools, but I think it will be easier to > write/maintain in cython. Why would you want to do that? It would make it more difficult to change the backend in the future for no apparent benefit. IMO you're obligated to show a real-world shortcoming or performance problem with the existing code before proposing to replace Python with something more convoluted. I agree that lil_matrix and dok_matrix are slow pure-Python implementations. OTOH I don't believe that csr_matrix and friends suffer the same inefficiencies. Adding vectorized indexing for CSR/CSC to sparsetools is straightforward and addresses the only problem you've pointed out. 
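The vectorized indexing Nathan describes (pushing the whole A[[1,2,3],[1,2,3]] lookup below the Python-call boundary instead of calling __getitem__ N times) can be sketched outside scipy with a toy dict-of-keys container. This is only an illustrative sketch, not scipy.sparse's actual implementation; the class and method names are invented:

```python
# Toy dict-of-keys ("DOK") sparse container.  Scalar __getitem__ pays a
# Python function call per element; take() crosses that boundary once
# for a whole batch of indices, which is the point being argued above.

class ToyDok:
    def __init__(self, shape):
        self.shape = shape
        self.data = {}          # maps index tuples (i, j, ...) to values

    def __setitem__(self, idx, value):
        if value != 0:
            self.data[idx] = value
        else:
            self.data.pop(idx, None)   # storing a zero removes the entry

    def __getitem__(self, idx):
        # O(1) per call, but each call carries Python call overhead
        return self.data.get(idx, 0)

    def take(self, rows, cols):
        # Vectorized form of A[[1,2,3],[1,2,3]]: one call, N lookups
        get = self.data.get
        return [get((i, j), 0) for i, j in zip(rows, cols)]

A = ToyDok((4, 4))
A[1, 1] = 5
A[2, 3] = 7
print(A.take([1, 2, 3], [1, 3, 3]))   # [5, 7, 0]
```

Because the keys are plain tuples, the same structure also generalizes to the N-dimensional case Brian asks about (a (2, 2, 2) array would simply use 3-tuples as keys); a fast version would move the take() loop into C or Cython.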
/doesn't drink the Cython Kool-Aid -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From millman at berkeley.edu Sun Apr 13 06:01:09 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Sun, 13 Apr 2008 03:01:09 -0700 Subject: [SciPy-dev] Fwd: [matplotlib-devel] standard abbreviation for pyplot or pylab? In-Reply-To: References: <48010B4B.9070205@hawaii.edu> <88e473830804121524q5665af2bn4ddb307d2c1e7ec5@mail.gmail.com> Message-ID: I just realized that this discussion was only on the matplotlib mailing list. Basically, they were clarifying that the import statement for pyplot should be: import matplotlib.pyplot as plt Sorry for the confusion, ---------- Forwarded message ---------- From: Jarrod Millman Date: Sun, Apr 13, 2008 at 2:58 AM Subject: Re: [matplotlib-devel] standard abbreviation for pyplot or pylab? To: John Hunter Cc: Eric Firing , matplotlib development list On Sat, Apr 12, 2008 at 3:24 PM, John Hunter wrote: > On Sat, Apr 12, 2008 at 2:19 PM, Eric Firing wrote: > > and then this was added, as something else that had been agreed at the > > same sprint: > > import pylab as plt > > I think this is a mistake, and it should have been > > > > import matplotlib.pyplot as plt > > This is what we agreed to (import matplotlib.pyplot as plt) during the > numpy sprint (I was on google chat remotely but participated in the > discussion). I agree that pylab as plt would just add to the > confusion. We agreed on promoting (and I currently use) > > > import numpy as np > import scipy as sp > > import matplotlib.pyplot as plt Sorry, that was my mistake. Thanks for clearing it up. 
-- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From stefan at sun.ac.za Sun Apr 13 14:24:17 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Sun, 13 Apr 2008 20:24:17 +0200 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <6ce0ac130804111435t22529109t45c91389abd71b3f@mail.gmail.com> <6ce0ac130804121021r5cfbd5bctf93213c517a82c84@mail.gmail.com> Message-ID: <9457e7c80804131124s70ad4424u8f5a0221661e44ab@mail.gmail.com> On 13/04/2008, Nathan Bell wrote: > /doesn't drink the Cython Kool-Aid Let's not associate Cython with Kool-Aid. Like you mentioned, when it is warranted (and only then), C or Cython code provides significant speedup. I'd be sad to see people avoiding Cython after reading statements like the above. Regards Stéfan From ondrej at certik.cz Mon Apr 14 01:59:11 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 07:59:11 +0200 Subject: [SciPy-dev] F2PY future In-Reply-To: <60131.88.89.195.89.1207940003.squirrel@cens.ioc.ee> References: <47FF4688.8090005@cens.ioc.ee> <60131.88.89.195.89.1207940003.squirrel@cens.ioc.ee> Message-ID: <85b5c3130804132259m5fb1f313jadaf2c334187774d@mail.gmail.com> On Fri, Apr 11, 2008 at 8:53 PM, Pearu Peterson wrote: > On Fri, April 11, 2008 7:36 pm, Nathan Bell wrote: > > On Fri, Apr 11, 2008 at 6:07 AM, Pearu Peterson wrote: > > > >> However, there are rumours of cleaning up scipy/numpy from extension > >> generation tools and so in long term, f2py may need to look for another > >> home. > > > > Do you mean that NumPy/SciPy would no longer include f2py source code > > yes, but that is just a rumour:), one should not expect that to > happen in near future.
I think first we need g3 f2py. > > > > or that NumPy/SciPy would no longer use extension generation tools at > > all? > > they should certainly do that as extension generation tools > guarantee higher quality of the extension modules than writing > hundreds of extension modules by hand. > > > >> So, I wonder if a hosting could be provided for f2py? Say, in a > >> form of f2py.scipy.org or www.f2py.org? I am rather ignorant about > >> these > >> matters, so any help will be appreciated. > > > > If you haven't done so already, register f2py.org immediately (and > > possibly the .com as well). Many registration services (e.g. > > godaddy.com) allow you to forward the URL to another URL. For > > instance, http://pyamg.org currently redirects to > > http://code.google.com/p/pyamg/. If you chose to run your own host in > > the future, then you can just point the DNS record for your domain to > > the host's IP. > > Thanks for the hints, they were very useful. I have registered > f2py.org and f2py.com domains and the redirections work > fine. This solves one problem: now one can advertise f2py via > www.f2py.com (or www.f2py.org) and it should be stable at > least for the next 10 years:). Excellent. Let me add that, as a user of f2py and currently one of the maintainers of the numpy/scipy packages in Debian, I find it really bad if the tools move around from package to package so often. It makes things really hard to package and follow. As an example, Debian sarge, Debian etch and Debian lenny all have different ways to install and work with numpy, scipy and f2py. This is bad, because everyone has to relearn all the time, and also if you, say, install scipy from sources (because the released version is too old), it's freaking hard, because I cannot just use the same tricks as in unstable, because things have moved around since etch. So my suggestion is either choose numpy, or your own f2py package, but stick to it forever.
:) Also, how are you going to handle and package this g3 f2py? Cannot it be just part of the regular f2py? Ondrej From cimrman3 at ntc.zcu.cz Mon Apr 14 04:11:32 2008 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 14 Apr 2008 10:11:32 +0200 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> Message-ID: <480311B4.9020301@ntc.zcu.cz> Hi Brian, Brian Granger wrote: > So, I am currently implementing a distributed memory array package for python: > > http://projects.scipy.org/ipython/ipython/browser/ipythondistarray > > The goal is to have distributed/parallel arrays that look and feel > just like numpy arrays. Here is an example: Very nice! > Obviously, my first though was scipy.sparse. I am _really_ excited > about the massive improvements that have been happening in this area > recently. Here are the problems I am running into: > > 1) I need N-dimensional sparse arrays. Some of the storage formats in > scipy.sparse (dok, coo, maybe lil) could be generalized to > N-dimensions, but some work would have to be done. > > 2) I need these things to be in numpy. I hate to start another > "should this go into numpy or scipy" thread, but I actually do think > there is a decent case for moving the core sparse arrays into numpy > (not the solvers though). Please hear me out: > > a) Numpy at its core is about arrays. Conceptually, sparse arrays fit > into this narrow vision of Numpy. > > b) Sparse arrays are just as foundational as dense arrays in many > areas of computing/science (I would argue, that they are more > foundational than ffts and random numbers). > > c) Moving the core sparse arrays into numpy would increase their > visibility and encourage other projects to rely on them. > > d) It would not make numpy more difficult to build. 
> > e) It is currently somewhat confusing that they are not in numpy > (remember Numpy = arrays). You can add f) Having sparse arrays in numpy would greatly improve unifying the two worlds. There have been tons of questions why this or that numpy function does not work for a sparse matrix, when in <popular commercial system> it works seamlessly. Definitely +1! r. From ondrej at certik.cz Mon Apr 14 04:18:44 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 10:18:44 +0200 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: <480311B4.9020301@ntc.zcu.cz> References: <6ce0ac130804111047p780e0298uc8e0dd57d1089391@mail.gmail.com> <480311B4.9020301@ntc.zcu.cz> Message-ID: <85b5c3130804140118v185b814chfb7cb3d16706dae9@mail.gmail.com> On Mon, Apr 14, 2008 at 10:11 AM, Robert Cimrman wrote: > Hi Brian, > > > Brian Granger wrote: > > So, I am currently implementing a distributed memory array package for python: > > > > http://projects.scipy.org/ipython/ipython/browser/ipythondistarray > > > > The goal is to have distributed/parallel arrays that look and feel > > just like numpy arrays. Here is an example: > > Very nice! > > > > Obviously, my first though was scipy.sparse. I am _really_ excited > > about the massive improvements that have been happening in this area > > recently. Here are the problems I am running into: > > > > 1) I need N-dimensional sparse arrays. Some of the storage formats in > > scipy.sparse (dok, coo, maybe lil) could be generalized to > > N-dimensions, but some work would have to be done. > > > > 2) I need these things to be in numpy. I hate to start another > > "should this go into numpy or scipy" thread, but I actually do think > > there is a decent case for moving the core sparse arrays into numpy > > (not the solvers though). Please hear me out: > > > > a) Numpy at its core is about arrays. Conceptually, sparse arrays fit > > into this narrow vision of Numpy.
> > > > b) Sparse arrays are just as foundational as dense arrays in many > > areas of computing/science (I would argue, that they are more > > foundational than ffts and random numbers). > > > > c) Moving the core sparse arrays into numpy would increase their > > visibility and encourage other projects to rely on them. > > > > d) It would not make numpy more difficult to build. > > > > e) It is currently somewhat confusing that they are not in numpy > > (remember Numpy = arrays). > > You can add f) Having sparse arrays in numpy would greatly improve > unifying the two worlds. There have been tons of questions why this or > that numpy function does not work for a sparse matrix, when in <popular commercial system> it works seamlessly. I think we should get them working at least in scipy + scikits first. Ondrej From pearu at cens.ioc.ee Mon Apr 14 04:59:24 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Mon, 14 Apr 2008 10:59:24 +0200 Subject: [SciPy-dev] F2PY future In-Reply-To: <85b5c3130804132259m5fb1f313jadaf2c334187774d@mail.gmail.com> References: <47FF4688.8090005@cens.ioc.ee> <60131.88.89.195.89.1207940003.squirrel@cens.ioc.ee> <85b5c3130804132259m5fb1f313jadaf2c334187774d@mail.gmail.com> Message-ID: <48031CEC.1000709@cens.ioc.ee> Ondrej Certik wrote: > Let me add, that I, as the user of f2py and also right now one of the > maintainers of the numpy/scipy packages in Debian, I find it really > bad, if the tools > move around from a package to package so often. It makes thing really > hard to package and follow. > > As an example, Debian sarge, Debian etch and Debian lenny all have > different ways how to install and work with numpy, scipy and f2py. > This is bad, because everyone has to relearn all the time and also if > you, say, install scipy from sources (because the released version is > too old), it's freaking hard, because I cannot just take use the same > tricks as in unstable, because things has moved around since etch.
> > So my suggestion is either choose numpy, or your own f2py package, but > stick to it forever. :) I agree with you. > Also, how are you going to handle and package this g3 f2py? Cannot it > be just part of the regular f2py? In the short run, g3 f2py will be developed separately from numpy because g3 f2py requires lots of work and I don't want to disturb the stability of numpy svn. When g3 f2py becomes usable, its code can be copied to the numpy tree as it is now. So, g3 f2py can be distributed with numpy together with the stable f2py, and for users it is a matter of using the f2py command line flags --2d-{numeric,numpy,numarray} --g3-numpy to choose which implementation of f2py will be used for wrapping. In the long run, the future of f2py packaging depends on what numpy's future policy will be regarding various extension generation and other non-array related tools. It is clear that when f2py (and numpy.distutils, etc) is shipped with numpy, supporting scipy is considerably easier. That is why f2py, scipy_distutils, and scipy_testing were included in numpy in the first place. On the other hand, cleaning up numpy from non-array stuff has the advantage of making numpy easier for python developers to accept as a part of the std library. So, I am not sure that I alone can give a definite promise that the packaging of f2py will never change in the future. If the numpy cleanup decision is made, it must take into account all possible backward compatibility issues, including packaging.
Regards, Pearu From ondrej at certik.cz Mon Apr 14 05:10:57 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 11:10:57 +0200 Subject: [SciPy-dev] F2PY future In-Reply-To: <48031CEC.1000709@cens.ioc.ee> References: <47FF4688.8090005@cens.ioc.ee> <60131.88.89.195.89.1207940003.squirrel@cens.ioc.ee> <85b5c3130804132259m5fb1f313jadaf2c334187774d@mail.gmail.com> <48031CEC.1000709@cens.ioc.ee> Message-ID: <85b5c3130804140210h3dbb9371ud6058e207b9bfdc@mail.gmail.com> On Mon, Apr 14, 2008 at 10:59 AM, Pearu Peterson wrote: > > Ondrej Certik wrote: > > > Let me add, that I, as the user of f2py and also right now one of the > > maintainers of the numpy/scipy packages in Debian, I find it really > > bad, if the tools > > move around from a package to package so often. It makes thing really > > hard to package and follow. > > > > As an example, Debian sarge, Debian etch and Debian lenny all have > > different ways how to install and work with numpy, scipy and f2py. > > This is bad, because everyone has to relearn all the time and also if > > you, say, install scipy from sources (because the released version is > > too old), it's freaking hard, because I cannot just take use the same > > tricks as in unstable, because things has moved around since etch. > > > > So my suggestion is either choose numpy, or your own f2py package, but > > stick to it forever. :) > > I agree with you. > > > > Also, how are you going to handle and package this g3 f2py? Cannot it > > be just part of the regular f2py? > > In short run, g3 f2py will be developed separately from numpy because > g3 f2py requires lots of work and I don't want to disturb the stability > of numpy svn. When g3 f2py will become usable, it's code can be copied > to numpy tree as it is now. So, g3 f2py can be distributed with numpy > together with the stable f2py and for users it is a matter of using > f2py command line flags --2d-{numeric, numpy. 
numarray} --g3-numpy > to choose which implementation of f2py will be used for wrapping. > > In long run, the future of f2py packaging depends on what will be > numpy future policy regarding various extension generation > and other non-array related tools. It is clear > that when f2py (and numpy.distutils, etc) is shipped with numpy, > supporting scipy is considerably easier. That is why f2py, > scipy_distutils, and scipy_testing were included to numpy in > the first place. > > On the other hand, cleaning up numpy from non-array stuff > has an advantage of making numpy easier for python developers to accept > it as a part of std library. So, I am not sure that I alone can give a > definite promise that the packaging of f2py will never change in future. > If the numpy cleanup decision will be made, it must take into account > all possible backward compatibility issues, including packaging. Ah I see, I got the full picture now. Thanks for the explanation. Well, I think it's not a problem -- if numpy decides to kick out non-array things, you'll simply package f2py as a standalone package and that's it. That package, whether it is in numpy or standalone will include both f2py and g3 f2py, so no problem with packaging should occur. 
Ondrej From ondrej at certik.cz Mon Apr 14 05:58:41 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 11:58:41 +0200 Subject: [SciPy-dev] FTBFS on Debian Message-ID: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> Hi, for some reason, the scipy package 0.6.0 stopped building on Debian, both i386 and amd64: build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: In function `PyFortranObject_New': /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:41: undefined reference to `_PyObject_New' /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:42: undefined reference to `PyDict_New' /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:66: undefined reference to `PyDict_SetItemString' build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: In function `F2PyDict_SetItemString': /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:21: undefined reference to `PyErr_Occurred' /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:22: undefined reference to `PyErr_Print' /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:23: undefined reference to `PyErr_Clear' build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: In function `fortran_dealloc': /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:91: undefined reference to `PyObject_Free' /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:91: undefined reference to `PyObject_Free' build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: In function `F2PyDict_SetItemString': /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:27: undefined reference to `PyDict_SetItemString' /usr/lib/libgfortranbegin.a(fmain.o): In function `main': (.text+0x28): undefined reference to `MAIN__' collect2: ld returned 1 exit status error: Command 
"/usr/bin/gfortran -Wall build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/scipy/fftpack/_fftpackmodule.o build/temp.linux-x86_64-2.4/scipy/fftpack/src/zfft.o build/temp.linux-x86_64-2.4/scipy/fftpack/src/drfft.o build/temp.linux-x86_64-2.4/scipy/fftpack/src/zrfft.o build/temp.linux-x86_64-2.4/scipy/fftpack/src/zfftnd.o build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o -L/usr/lib -Lbuild/temp.linux-x86_64-2.4 -ldfftpack -lfftw3 -lgfortran -o build/lib.linux-x86_64-2.4/scipy/fftpack/_fftpack.so" failed with exit status 1 make: *** [install] Error 1 dpkg-buildpackage: failure: fakeroot debian/rules binary gave error exit status 2 pbuilder: Failed autobuilding of package -> Aborting with an error -> unmounting dev/pts filesystem -> unmounting proc filesystem -> cleaning the build env -> removing directory /var/cache/pbuilder/build//30719 and its subdirectories Any ideas what went wrong? Ondrej From ondrej at certik.cz Mon Apr 14 06:03:43 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 12:03:43 +0200 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> Message-ID: <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> On Mon, Apr 14, 2008 at 11:58 AM, Ondrej Certik wrote: > Hi, > > for some reason, the scipy package 0.6.0 stopped building on Debian, > both i386 and amd64: > > build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: > In function `PyFortranObject_New': > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:41: > undefined reference to `_PyObject_New' > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:42: > undefined reference to `PyDict_New' > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:66: > undefined reference to `PyDict_SetItemString' > 
build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: > In function `F2PyDict_SetItemString': > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:21: > undefined reference to `PyErr_Occurred' > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:22: > undefined reference to `PyErr_Print' > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:23: > undefined reference to `PyErr_Clear' > build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: > In function `fortran_dealloc': > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:91: > undefined reference to `PyObject_Free' > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:91: > undefined reference to `PyObject_Free' > build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: > In function `F2PyDict_SetItemString': > /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:27: > undefined reference to `PyDict_SetItemString' > /usr/lib/libgfortranbegin.a(fmain.o): In function `main': > (.text+0x28): undefined reference to `MAIN__' > collect2: ld returned 1 exit status > error: Command "/usr/bin/gfortran -Wall > build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/scipy/fftpack/_fftpackmodule.o > build/temp.linux-x86_64-2.4/scipy/fftpack/src/zfft.o > build/temp.linux-x86_64-2.4/scipy/fftpack/src/drfft.o > build/temp.linux-x86_64-2.4/scipy/fftpack/src/zrfft.o > build/temp.linux-x86_64-2.4/scipy/fftpack/src/zfftnd.o > build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o > -L/usr/lib -Lbuild/temp.linux-x86_64-2.4 -ldfftpack -lfftw3 -lgfortran > -o build/lib.linux-x86_64-2.4/scipy/fftpack/_fftpack.so" failed with > exit status 1 > make: *** [install] Error 1 > dpkg-buildpackage: failure: fakeroot debian/rules binary gave error > exit status 2 > pbuilder: Failed autobuilding of package > -> Aborting with an error > -> unmounting 
dev/pts filesystem > -> unmounting proc filesystem > -> cleaning the build env > -> removing directory /var/cache/pbuilder/build//30719 and its > subdirectories > > > Any ideas what went wrong? The original bug report is here: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=475990 If no one knows, I'll have to dig into this myself. :) Ondrej From david at ar.media.kyoto-u.ac.jp Mon Apr 14 06:01:42 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 14 Apr 2008 19:01:42 +0900 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> Message-ID: <48032B86.7010804@ar.media.kyoto-u.ac.jp> Ondrej Certik wrote: > > The original bug report is here: > > http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=475990 > > If no one knows, I'll have to dig into this myself. :) > Could you send the full build log ? How is setup.py invoked in the rules file ? 
cheers, David From david at ar.media.kyoto-u.ac.jp Mon Apr 14 07:04:06 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 14 Apr 2008 20:04:06 +0900 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> Message-ID: <48033A26.2050608@ar.media.kyoto-u.ac.jp> Ondrej Certik wrote: > On Mon, Apr 14, 2008 at 11:58 AM, Ondrej Certik wrote: > >> Hi, >> >> for some reason, the scipy package 0.6.0 stopped building on Debian, >> both i386 and amd64: >> >> build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: >> In function `PyFortranObject_New': >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:41: >> undefined reference to `_PyObject_New' >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:42: >> undefined reference to `PyDict_New' >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:66: >> undefined reference to `PyDict_SetItemString' >> build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: >> In function `F2PyDict_SetItemString': >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:21: >> undefined reference to `PyErr_Occurred' >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:22: >> undefined reference to `PyErr_Print' >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:23: >> undefined reference to `PyErr_Clear' >> build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: >> In function `fortran_dealloc': >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:91: >> undefined reference to `PyObject_Free' >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:91: >> undefined reference to `PyObject_Free' >> 
build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o: >> In function `F2PyDict_SetItemString': >> /tmp/buildd/python-scipy-0.6.0/build/src.linux-x86_64-2.4/fortranobject.c:27: >> undefined reference to `PyDict_SetItemString' >> /usr/lib/libgfortranbegin.a(fmain.o): In function `main': >> (.text+0x28): undefined reference to `MAIN__' >> collect2: ld returned 1 exit status >> error: Command "/usr/bin/gfortran -Wall >> build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/scipy/fftpack/_fftpackmodule.o >> build/temp.linux-x86_64-2.4/scipy/fftpack/src/zfft.o >> build/temp.linux-x86_64-2.4/scipy/fftpack/src/drfft.o >> build/temp.linux-x86_64-2.4/scipy/fftpack/src/zrfft.o >> build/temp.linux-x86_64-2.4/scipy/fftpack/src/zfftnd.o >> build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o >> -L/usr/lib -Lbuild/temp.linux-x86_64-2.4 -ldfftpack -lfftw3 -lgfortran >> -o build/lib.linux-x86_64-2.4/scipy/fftpack/_fftpack.so" failed with >> For what it worth, the problem is right there: gfortran command builds a program, not a library (the -shared is not there), which of course cannot work for a python extension. The problem is to understand why the -shared flag disappear from the link command. David From robert.kern at gmail.com Mon Apr 14 07:29:12 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 14 Apr 2008 06:29:12 -0500 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <48033A26.2050608@ar.media.kyoto-u.ac.jp> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> Message-ID: <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> On Mon, Apr 14, 2008 at 6:04 AM, David Cournapeau wrote: > For what it worth, the problem is right there: gfortran command builds a > program, not a library (the -shared is not there), which of course > cannot work for a python extension. 
The problem is to understand why the > -shared flag disappear from the link command. This problem is almost always caused by LDFLAGS being set. LDFLAGS *overrides* all of the Fortran link flags. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From david at ar.media.kyoto-u.ac.jp Mon Apr 14 07:22:18 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 14 Apr 2008 20:22:18 +0900 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> Message-ID: <48033E6A.3050302@ar.media.kyoto-u.ac.jp> Robert Kern wrote: > > This problem is almost always caused by LDFLAGS being set. LDFLAGS > *overrides* all of the Fortran link flags. > > That's what I thought first, but then why gcc still uses -shared ? Why only gfortran ? cheers, David From robert.kern at gmail.com Mon Apr 14 07:36:43 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 14 Apr 2008 06:36:43 -0500 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <48033E6A.3050302@ar.media.kyoto-u.ac.jp> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> Message-ID: <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> On Mon, Apr 14, 2008 at 6:22 AM, David Cournapeau wrote: > Robert Kern wrote: > > > > This problem is almost always caused by LDFLAGS being set. LDFLAGS > > *overrides* all of the Fortran link flags. 
> > > > > > That's what I thought first, but then why gcc still uses -shared ? Why > only gfortran ? As we've discussed before, this overriding behavior only exists because we need some way to override Fortran compiler flags since the code will always be a little bit out of date with respect to the various new versions of the Fortran compilers that we support. C extensions are left alone because the link flags are almost always correct because the C compiler is almost always the one used to build Python itself. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From ondrej at certik.cz Mon Apr 14 08:48:15 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 14:48:15 +0200 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> Message-ID: <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> On Mon, Apr 14, 2008 at 1:36 PM, Robert Kern wrote: > On Mon, Apr 14, 2008 at 6:22 AM, David Cournapeau > > wrote: > > > Robert Kern wrote: > > > > > > This problem is almost always caused by LDFLAGS being set. LDFLAGS > > > *overrides* all of the Fortran link flags. > > > > > > > > > > That's what I thought first, but then why gcc still uses -shared ? Why > > only gfortran ? 
> > As we've discussed before, this overriding behavior only exists > because we need some way to override Fortran compiler flags since the > code will always be a little bit out of date with respect to the > various new versions of the Fortran compilers that we support. C > extensions are left alone because the link flags are almost always > correct because the C compiler is almost always the one used to build > Python itself. Wow, indeed, this solves the problem: $ svn di Index: debian/rules =================================================================== --- debian/rules (revision 5013) +++ debian/rules (working copy) @@ -8,6 +8,8 @@ # Uncomment this to turn on verbose mode. #export DH_VERBOSE=1 +unexport LDFLAGS + PYVERS:= $(shell pyversions -v -r debian/control) BASE=$(shell pwd)/debian #debian_patches= umfpack signals viewer sandbox #fft python23 amd64 Thanks a lot Robert! This would take me hours until I'd figure it out... Ondrej From ondrej at certik.cz Mon Apr 14 09:19:57 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 15:19:57 +0200 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> Message-ID: <85b5c3130804140619l6fdfe50at5c55da64ae881196@mail.gmail.com> On Mon, Apr 14, 2008 at 2:48 PM, Ondrej Certik wrote: > On Mon, Apr 14, 2008 at 1:36 PM, Robert Kern wrote: > > On Mon, Apr 14, 2008 at 6:22 AM, David Cournapeau > > > > wrote: > > > > > Robert Kern wrote: > > > > > > > > This problem is almost always caused by LDFLAGS being set. 
LDFLAGS > > > > *overrides* all of the Fortran link flags. > > > > > > > > > > > > > > That's what I thought first, but then why gcc still uses -shared ? Why > > > only gfortran ? > > > > As we've discussed before, this overriding behavior only exists > > because we need some way to override Fortran compiler flags since the > > code will always be a little bit out of date with respect to the > > various new versions of the Fortran compilers that we support. C > > extensions are left alone because the link flags are almost always > > correct because the C compiler is almost always the one used to build > > Python itself. > > Wow, indeed, this solves the problem: > > $ svn di > Index: debian/rules > =================================================================== > --- debian/rules (revision 5013) > +++ debian/rules (working copy) > @@ -8,6 +8,8 @@ > # Uncomment this to turn on verbose mode. > #export DH_VERBOSE=1 > > +unexport LDFLAGS > + > PYVERS:= $(shell pyversions -v -r debian/control) > BASE=$(shell pwd)/debian > #debian_patches= umfpack signals viewer sandbox #fft python23 amd64 > > Thanks a lot Robert! > > This would take me hours until I'd figure it out... Actually, one more problem. 
The above patch fixes this on i386, but for amd64, I get this error now: /usr/bin/gfortran -Wall -Wall -shared build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/scipy/fftpack/convolvemodule.o build/temp.linux-x86_64-2.4/scipy/fftpack/src/convolve.o build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o -L/usr/lib -Lbuild/temp.linux-x86_64-2.4 -ldfftpack -lfftw3 -lgfortran -o build/lib.linux-x86_64-2.4/scipy/fftpack/convolve.so /usr/bin/ld: build/temp.linux-x86_64-2.4/libdfftpack.a(dffti.o): relocation R_X86_64_32 against `a local symbol' can not be used when making a shared object; recompile with -fPIC build/temp.linux-x86_64-2.4/libdfftpack.a: could not read symbols: Bad value collect2: ld returned 1 exit status /usr/bin/ld: build/temp.linux-x86_64-2.4/libdfftpack.a(dffti.o): relocation R_X86_64_32 against `a local symbol' can not be used when making a shared object; recompile with -fPIC build/temp.linux-x86_64-2.4/libdfftpack.a: could not read symbols: Bad value collect2: ld returned 1 exit status error: Command "/usr/bin/gfortran -Wall -Wall -shared build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/scipy/fftpack/convolvemodule.o build/temp.linux-x86_64-2.4/scipy/fftpack/src/convolve.o build/temp.linux-x86_64-2.4/build/src.linux-x86_64-2.4/fortranobject.o -L/usr/lib -Lbuild/temp.linux-x86_64-2.4 -ldfftpack -lfftw3 -lgfortran -o build/lib.linux-x86_64-2.4/scipy/fftpack/convolve.so" failed with exit status 1 make: *** [install] Error 1 Now I am trying to figure out how to set the -fPIC for the gfortran. Probably in some setup.py. I have no idea why these errors suddenly started to pop up, but anyway, here we are. 
Ondrej From robert.kern at gmail.com Mon Apr 14 10:19:23 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 14 Apr 2008 09:19:23 -0500 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <85b5c3130804140619l6fdfe50at5c55da64ae881196@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> <85b5c3130804140619l6fdfe50at5c55da64ae881196@mail.gmail.com> Message-ID: <3d375d730804140719q5b935e30jfe51a5791e5c1d32@mail.gmail.com> On Mon, Apr 14, 2008 at 8:19 AM, Ondrej Certik wrote: > Now I am trying to figure out how to set the -fPIC for the gfortran. > Probably in some setup.py. No, this flag is coded into the FCompiler implementation for each compiler. Check FFLAGS. This will override the compilation flags just like LDFLAGS overrides the link flags. > I have no idea why these errors suddenly started to pop up, but > anyway, here we are. Environment variables, usually. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From ondrej at certik.cz Mon Apr 14 10:43:02 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 16:43:02 +0200 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <3d375d730804140719q5b935e30jfe51a5791e5c1d32@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> <85b5c3130804140619l6fdfe50at5c55da64ae881196@mail.gmail.com> <3d375d730804140719q5b935e30jfe51a5791e5c1d32@mail.gmail.com> Message-ID: <85b5c3130804140743r267fe286i3bd26dea243ffd32@mail.gmail.com> On Mon, Apr 14, 2008 at 4:19 PM, Robert Kern wrote: > On Mon, Apr 14, 2008 at 8:19 AM, Ondrej Certik wrote: > > Now I am trying to figure out how to set the -fPIC for the gfortran. > > Probably in some setup.py. > > No, this flag is coded into the FCompiler implementation for each > compiler. Check FFLAGS. This will override the compilation flags just > like LDFLAGS overrides the link flags. Thanks for the tip. Indeed the following patch fixes the problem on amd64 too: $ svn di Index: debian/rules =================================================================== --- debian/rules (revision 5014) +++ debian/rules (working copy) @@ -9,6 +9,7 @@ #export DH_VERBOSE=1 unexport LDFLAGS +export FFLAGS="-fPIC" PYVERS:= $(shell pyversions -v -r debian/control) BASE=$(shell pwd)/debian > > I have no idea why these errors suddenly started to pop up, but > > anyway, here we are. > > Environment variables, usually. Right. Those packages are built in a clean environment (always starting from a base Debian install and then installing all necessary packages), so this means some other package changed that. 
Thanks again for pointing me to the right direction. Ondrej From robert.kern at gmail.com Mon Apr 14 11:58:52 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 14 Apr 2008 10:58:52 -0500 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <85b5c3130804140743r267fe286i3bd26dea243ffd32@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> <85b5c3130804140619l6fdfe50at5c55da64ae881196@mail.gmail.com> <3d375d730804140719q5b935e30jfe51a5791e5c1d32@mail.gmail.com> <85b5c3130804140743r267fe286i3bd26dea243ffd32@mail.gmail.com> Message-ID: <3d375d730804140858l40b39d8fpf723af8f8f3ea46a@mail.gmail.com> On Mon, Apr 14, 2008 at 9:43 AM, Ondrej Certik wrote: > On Mon, Apr 14, 2008 at 4:19 PM, Robert Kern wrote: > > On Mon, Apr 14, 2008 at 8:19 AM, Ondrej Certik wrote: > > > Now I am trying to figure out how to set the -fPIC for the gfortran. > > > Probably in some setup.py. > > > > No, this flag is coded into the FCompiler implementation for each > > compiler. Check FFLAGS. This will override the compilation flags just > > like LDFLAGS overrides the link flags. > > Thanks for the tip. Indeed the following patch fixes the problem on amd64 too: > > > $ svn di > Index: debian/rules > =================================================================== > --- debian/rules (revision 5014) > +++ debian/rules (working copy) > @@ -9,6 +9,7 @@ > > #export DH_VERBOSE=1 > > unexport LDFLAGS > +export FFLAGS="-fPIC" No, no. -fPIC *should* be there out-of-box. We need to figure out the actual problem rather than pasting it over like this. 
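To make the replace-not-append rule above concrete, here is a small self-contained Python sketch. This is purely illustrative -- it is not the actual numpy.distutils code, and the default flag list is made up -- but it shows why setting LDFLAGS makes -shared silently disappear from the link line:

```python
# Illustrative model of the override rule discussed above: when LDFLAGS
# is set in the environment, numpy.distutils *replaces* its Fortran link
# flags with it rather than appending.  The flag values here are made up;
# this is not the real numpy.distutils implementation.

DEFAULT_LINK_FLAGS = ["-Wall", "-shared"]  # hypothetical FCompiler defaults

def effective_link_flags(env):
    """Return the flags the linker would actually see (sketch only)."""
    if "LDFLAGS" in env:
        # Replacement, not extension: -shared silently disappears.
        return env["LDFLAGS"].split()
    return list(DEFAULT_LINK_FLAGS)

print(effective_link_flags({}))                         # ['-Wall', '-shared']
print(effective_link_flags({"LDFLAGS": "-L/usr/lib"}))  # ['-L/usr/lib']
```

The second call shows the failure mode from the build log: once the environment variable is consulted, nothing puts -shared back, so gfortran links a program instead of a shared object.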
-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From ondrej at certik.cz Mon Apr 14 12:27:01 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Mon, 14 Apr 2008 18:27:01 +0200 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <3d375d730804140858l40b39d8fpf723af8f8f3ea46a@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> <85b5c3130804140619l6fdfe50at5c55da64ae881196@mail.gmail.com> <3d375d730804140719q5b935e30jfe51a5791e5c1d32@mail.gmail.com> <85b5c3130804140743r267fe286i3bd26dea243ffd32@mail.gmail.com> <3d375d730804140858l40b39d8fpf723af8f8f3ea46a@mail.gmail.com> Message-ID: <85b5c3130804140927t1d7de273x67fc6c3e76842b67@mail.gmail.com> On Mon, Apr 14, 2008 at 5:58 PM, Robert Kern wrote: > On Mon, Apr 14, 2008 at 9:43 AM, Ondrej Certik wrote: > > On Mon, Apr 14, 2008 at 4:19 PM, Robert Kern wrote: > > > On Mon, Apr 14, 2008 at 8:19 AM, Ondrej Certik wrote: > > > > Now I am trying to figure out how to set the -fPIC for the gfortran. > > > > Probably in some setup.py. > > > > > > No, this flag is coded into the FCompiler implementation for each > > > compiler. Check FFLAGS. This will override the compilation flags just > > > like LDFLAGS overrides the link flags. > > > > Thanks for the tip. 
Indeed the following patch fixes the problem on amd64 too: > > > > > > $ svn di > > Index: debian/rules > > =================================================================== > > --- debian/rules (revision 5014) > > +++ debian/rules (working copy) > > @@ -9,6 +9,7 @@ > > > > #export DH_VERBOSE=1 > > > > unexport LDFLAGS > > +export FFLAGS="-fPIC" > > No, no. -fPIC *should* be there out-of-box. We need to figure out the > actual problem rather than pasting it over like this. Sure -- now when it builds, let's find the real problem. If you want, I can create you an account on an amd64 Debian box. Or let me know what I should try. Ondrej From vshah at interactivesupercomputing.com Mon Apr 14 14:36:31 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Mon, 14 Apr 2008 11:36:31 -0700 Subject: [SciPy-dev] Ideas for scipy.sparse? Message-ID: Hi Brian, and others, I have been following this thread for the past few days, and it's been quite interesting. Some of my comments interspersed below: > So, I am currently implementing a distributed memory array package > for python: > > http://projects.scipy.org/ipython/ipython/browser/ipythondistarray > > The goal is to have distributed/parallel arrays that look and feel > just like numpy arrays. Here is an example: > > import ipythondistarray as ipda > > a = ipda.random.rand((10,100,100), dist=(None,'b','c')) > b = ipda.random.rand((10,100,100), dist=(None,'b','c')) > c = 0.5*ipda.sin(a) + 0.5*ipda.cos(b) > print c.sum(), c.mean(), c.std(), c.var() > > This works today on multiple processors. Don't get too excited > though, there is still _tons_ of work to be done.... This looks really exciting. I looked at the project, and it seems like you are using MPI as the basic infrastructure - but I also see references to GASNet. My first question looking at this was, why not just use Global Arrays for the basic array infrastructure ?
It has a bunch of stuff that would let you get quite far quickly: http://www.emsl.pnl.gov/docs/global/ I believe, looking at the project organization, that you plan to add fft, sparse arrays, the works. I must mention that I work with Star-P (www.interactivesupercomputing.com ), and we have plugged in our parallel runtime to overload parts of numpy and scipy. We have sparse support for M (matlab), but not for python yet. I am guessing that your framework allows one to run several instances of python scripts as well on each node ? That way, one could solve several ODEs across nodes. You'd need some dynamic scheduling for workload balancing, perhaps. Have you thought about integrating this with the stackless python project, for a nice programming model that abstracts away from MPI and allows high performance parallel programs to be written entirely in python ? > Here is the main issue: I am running into a need for sparse arrays. > There are two places I am running into this: > > 1) I want to implement sparse distributed arrays and need sparse local > arrays for this. > > 2) There are other places in the implementation where sparse arrays > are needed. > > Obviously, my first though was scipy.sparse. I am _really_ excited > about the massive improvements that have been happening in this area > recently. Here are the problems I am running into: Not having followed the numpy/scipy debate for too long, I must say that I was a little surprised to find sparse array support in scipy rather than numpy. > 1) I need N-dimensional sparse arrays. Some of the storage formats in > scipy.sparse (dok, coo, maybe lil) could be generalized to > N-dimensions, but some work would have to be done. From the discussion, it seemed to me that these formats are written in python itself. In general, I like the idea of writing data structures in the same language, and as much in higher level languages. Some people pointed out that this tends to be slow when called in for loops. 
I couldn't figure out what the cause of this was. Is it that for loops in python are generally slow, or is it that indexing individual elements in sparse data structures is slow or are python's data structures slow, or some combination of all three ? I can say something about Matlab's sparse matrices, which I am familiar with. It's not that Matlab's for loops are the cause of slowness (they may be slow), but that indexing single elements in a sparse matrix is inherently slow. Every instance of A(i, j) has to perform a binary search. Insertion is even more expensive because it requires a memmove for CSC/CSR formats. Thus you almost never work with for loops and single element indexing for sparse matrices in Matlab. That said, one could design a data structure that partially solves some of these problems. I thought initially that dok may solve this for some kinds of sparse matrix problems, but it seems that its too slow for large problems. Perhaps a little benchmark must be setup. Brian, For N-d are you thinking of something along the lines of Tammy Kolda's tensor toolbox for Matlab ? That could be accomplished with just distributed dense vectors - I'm not sure what the performance would be like. It's the same as using a distributed coo format. http://csmr.ca.sandia.gov/~tgkolda/TensorToolbox/ http://www.models.kvl.dk/source/nwaytoolbox/index.asp > > 3) That we begin to move their implementation to using Cython (as an > aside, cython does play very well with templated C++ code). This > could provide a much nicer way of tying into the c++ code than using > swig. Not being intimately familiar with cython or swig, I did take a look at the cython homepage and it sounded like a good fit. It does seem that some people have strong opposition to it. Support for templated C++ code would be nice to have though !
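The single-element-indexing cost described above is easy to demonstrate with scipy.sparse itself. A hypothetical sketch (sizes and values are made up) contrasting element-by-element LIL assignment with triplet-based COO construction followed by conversion to CSR:

```python
# Two ways to build the same sparse diagonal matrix (illustrative sizes).
import numpy as np
from scipy import sparse

n = 200

# (a) Element-by-element assignment: every A[i, i] = v is a Python-level
# indexing operation, which is what makes tight loops slow.
A = sparse.lil_matrix((n, n))
for i in range(n):
    A[i, i] = 2.0

# (b) Build (row, col, value) triplet arrays once, construct COO, then
# convert to CSR for fast arithmetic -- no per-element Python indexing.
rows = np.arange(n)
cols = np.arange(n)
vals = np.full(n, 2.0)
B = sparse.coo_matrix((vals, (rows, cols)), shape=(n, n)).tocsr()

print((A.tocsr() - B).nnz)  # 0: both routes produce the same matrix
```

For large matrices the difference is dramatic, since route (b) does all the heavy lifting in vectorized array operations.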
-viral From vshah at interactivesupercomputing.com Mon Apr 14 14:45:15 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Mon, 14 Apr 2008 11:45:15 -0700 Subject: [SciPy-dev] sparse array indexing. Message-ID: Nathan, and others, A few more questions about the sparse array manipulation. Also, is there a sparse document in the works other than the one at http://www.scipy.org/SciPy_Tutorial ? I would be more than happy to put all that I am finding out in a document. But if one is in the works, I can edit that instead. 1. I notice that A[I, J] behaves differently than in Matlab. This is documented. My question is, if one wants Matlab like behaviour, is there a better way to do this than A[I, :], A[:, J] ? 2. Do the sparse data structures support deletion ? I got the svn, and it seems that the delete function that works on numpy arrays does not work on sparse arrays, yet. Until I figure this out, I am doing deletion with:

def deleterowcol(self, A, delrow, delcol):
    A = A.tocsc()
    m = A.shape[0]
    n = A.shape[1]
    keep = delete(arange(0, m), delrow)
    A = A[keep, :]
    keep = delete(arange(0, n), delcol)
    A = A[:, keep]
    return A

3. How do I get the rows and columns as vectors out of the coo format, so that I can do the equivalent of find() in MATLAB ? 4. I forwarded the SuperLU bug to the upstream author, Sherry Li. Perhaps she will fix it in the next release. In general, I believe that Sherry does not spend much time maintaining the sequential SuperLU, but Tim Davis does support UMFPACK more actively. But UMFPACK does not have single precision.. So the solution for the default sparse direct solver is not as obvious. I would vote in favour of UMFPACK being the default solver for double precision, if its possible to put it in the scipy tree.
-viral From pgmdevlist at gmail.com Mon Apr 14 14:55:09 2008 From: pgmdevlist at gmail.com (Pierre GM) Date: Mon, 14 Apr 2008 14:55:09 -0400 Subject: [SciPy-dev] stats.mstats & stats.mmorestats Message-ID: <200804141455.10414.pgmdevlist@gmail.com> All, I just committed two modules (and the corresponding unittests) to the scipy.stats package. Feel free to test and send some feedback. These packages are meant to replace numpy.ma.mstats and numpy.ma.morestats: may I go and erase those packages from the numpy SVN ? Thanks a lot, P. From robert.kern at gmail.com Mon Apr 14 15:26:26 2008 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 14 Apr 2008 14:26:26 -0500 Subject: [SciPy-dev] sparse array indexing. In-Reply-To: References: Message-ID: <3d375d730804141226s65ce9530n42639d088ac1a9a9@mail.gmail.com> On Mon, Apr 14, 2008 at 1:45 PM, Viral Shah wrote: > I would vote in favour of > UMFPACK being the default solver for double precision, if its possible > to put it in the scipy tree. Since it is GPLed, it *can* not be the default and *should* not move back into the scipy tree even as an option. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From wnbell at gmail.com Mon Apr 14 19:45:25 2008 From: wnbell at gmail.com (Nathan Bell) Date: Mon, 14 Apr 2008 18:45:25 -0500 Subject: [SciPy-dev] sparse array indexing. In-Reply-To: References: Message-ID: On Mon, Apr 14, 2008 at 1:45 PM, Viral Shah wrote: > Nathan, and others, > > A few more questions about the sparse array manipulation. Also, is > there a sparse document in the works other than the one at http://www.scipy.org/SciPy_Tutorial > ? I would be more than happy to put all that I am finding out in a > document. But if one is in the works, I can edit that instead.
There are a small number of updates in this document: http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/sparse/info.py I don't know of any other sparse documentation. > 1. I notice that A[I, J] behaves differently than in Matlab. This is > documented. My question is, if one wants Matlab like behaviour, is > there a better way to do this than A[I, :], A[:, J] ? Like dense arrays, the CSR and CSC formats now support indexing of the form A[ [[0,1]], [0,1] ]. This ought to be equivalent to A[ [0,1], :][ :, [0,1] ]. Furthermore, it should always perform the "right" indexing first (i.e. row first for CSR). > 2. Do the sparse data structures support deletion ? Not to my knowledge. IIRC some formats like lil_matrix might remove explicit zeros. I don't believe CSR/CSC do however. > 3. How do I get the rows and columns as vectors out of the coo format, > so that I can do the equivalent of find() in MATLAB ?

A = coo_matrix(A)
I, J, V = A.row, A.col, A.data

> 4. I forwarded the SuperLU bug to the upstream author, Sherry Li. > Perhaps she will fix it in the next release. In general, I believe > that Sherry does not spend much time maintaining the sequential > SuperLU, but Tim Davis does support UMFPACK more actively. But UMFPACK > does not have single precision.. So the solution for the default > sparse direct solver is not as obvious. I would vote in favour of > UMFPACK being the default solver for double precision, if its possible > to put it in the scipy tree. Unfortunately, as Robert K. pointed out, this is not possible. UMFPACK is great software, but I think the recent move from LGPL to GPL suggests that a relicensing under the BSD is unlikely. Are you sure the bug exists in the most recent version of SuperLU? Are you sure it's a SuperLU bug instead of a wrapper bug?
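The COO recipe above, together with the row-then-column indexing mentioned earlier in this message, can be put together in a small runnable sketch (the example matrix is made up):

```python
# find()-style triplet extraction and A(I, J)-style submatrix selection,
# following the recipes in this message (the example matrix is illustrative).
import numpy as np
from scipy import sparse

A = sparse.csr_matrix(np.array([[1.0, 0.0, 2.0],
                                [0.0, 0.0, 3.0],
                                [4.0, 5.0, 0.0]]))

# Equivalent of MATLAB's [I, J, V] = find(A): go through COO and read
# the three parallel triplet arrays.
C = A.tocoo()
I, J, V = C.row, C.col, C.data
print(sorted(zip(I.tolist(), J.tolist(), V.tolist())))

# Equivalent of MATLAB's A(I, J): select rows first, then columns.
S = A[[0, 2], :][:, [0, 2]]
print(S.toarray())
```

The first print lists one (row, col, value) tuple per stored entry; the second shows the 2x2 submatrix picked out by rows [0, 2] and columns [0, 2].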
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Mon Apr 14 19:55:19 2008 From: wnbell at gmail.com (Nathan Bell) Date: Mon, 14 Apr 2008 18:55:19 -0500 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: References: Message-ID: On Mon, Apr 14, 2008 at 1:36 PM, Viral Shah wrote: > Some people pointed out that this tends to be slow when called in for > loops. I couldn't figure out what the cause of this was. Is it that > for loops in python are generally slow, or is it that indexing > individual elements in sparse data structures is slow or are python's > data structures slow, or some combination of all three ? Currently all access of the form A[1,2] are implemented in pure Python (even CSR/CSC). Looping in Python comes with a certain amount of overhead itself, however, I'd wager most time is spent doing the actual indexing. We can speed up the case A[[1,2,3],[5,6,7]] by sending the index arrays over to sparsetools. However, as you point out, individual lookups will always be slow. > I thought initially that dok may solve this for some kinds of sparse > matrix problems, but it seems that its too slow for large problems. > Perhaps a little benchmark must be setup. In my experience DOK/LIL are too slow for large matrices. This is a shame, because new users will want to construct matrices with one of these classes. More advanced users would probably build a COO initially and then convert to CSR/CSC. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From ondrej at certik.cz Tue Apr 15 05:43:01 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Tue, 15 Apr 2008 11:43:01 +0200 Subject: [SciPy-dev] Ideas for scipy.sparse? 
In-Reply-To: References: Message-ID: <85b5c3130804150243r3f1f1121v226394186f51245@mail.gmail.com> On Tue, Apr 15, 2008 at 1:55 AM, Nathan Bell wrote: > On Mon, Apr 14, 2008 at 1:36 PM, Viral Shah > wrote: > > Some people pointed out that this tends to be slow when called in for > > loops. I couldn't figure out what the cause of this was. Is it that > > for loops in python are generally slow, or is it that indexing > > individual elements in sparse data structures is slow or are python's > > data structures slow, or some combination of all three ? > > Currently all access of the form A[1,2] are implemented in pure Python > (even CSR/CSC). Looping in Python comes with a certain amount of > overhead itself, however, I'd wager most time is spent doing the > actual indexing. > > We can speed up the case A[[1,2,3],[5,6,7]] by sending the index > arrays over to sparsetools. However, as you point out, individual > lookups will always be slow. > > > > I thought initially that dok may solve this for some kinds of sparse > > matrix problems, but it seems that its too slow for large problems. > > Perhaps a little benchmark must be setup. > > In my experience DOK/LIL are too slow for large matrices. This is a > shame, because new users will want to construct matrices with one of > these classes. More advanced users would probably build a COO > initially and then convert to CSR/CSC. All formats are nicely documented in the sources (see the docstrings for usage examples). Ondrej From ondrej at certik.cz Tue Apr 15 06:02:30 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Tue, 15 Apr 2008 12:02:30 +0200 Subject: [SciPy-dev] removing "print z" from basic.py Message-ID: <85b5c3130804150302o5f13d526t87c3c18428b8e9e9@mail.gmail.com> Hi, we just got this bug report in Debian: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=476235 and it still applies to the current svn, file: scipy/scipy/special/basic.py, line 235. Please someone remove the "print z" statement.
Ondrej From ellisonbg.net at gmail.com Tue Apr 15 12:37:55 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 15 Apr 2008 10:37:55 -0600 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: References: Message-ID: <6ce0ac130804150937q3b78a030t56dec9f2d45ffc66@mail.gmail.com> > This looks really exciting. I looked at the project, and it seems like > you are using MPI as the basic infrastructure - but I also see > references to GASNet. My first question looking at this was, why not > just use Global Arrays for the basic array infrastructure ? It has a > bunch of stuff that would let you get quite far quickly: Yes, we are using MPI. The GASNet stuff was just playing around. I looked at global arrays as well, but (for now) have stuck with MPI for the following reasons:

1) There are really stable, easy to install Python bindings for MPI (mpi4py.scipy.org)
2) MPI is more available and easier to install. OS X comes with the standard and there are Linux packages for it. This makes life easier for our users who need to install it.
3) Everybody else is doing it
4) I tried installing GA on my Mac and the build failed. I am more than willing to debug an install process, but my users won't be.
5) There are not any good Python bindings for GA. I know Robert Harrison had some Python bindings, but I can't find them and they probably are not being maintained.

That said, I would very much like to be able to use GA/GASNet stuff in the future. > http://www.emsl.pnl.gov/docs/global/ > > I believe, looking at the project organization, that you plan to add > fft, sparse arrays, the works. Ideally yes. > I must mention that I work with Star-P (www.interactivesupercomputing.com > ), and we have plugged in our parallel runtime to overload parts of > numpy and scipy. We have sparse support for M (matlab), but not for > python yet. Were you at PyCon 2007? This explains your interest in scipy.sparse.
I guess it would be very good for you if the stuff in scipy.sparse moved into numpy? Also, having that code base settle down would also probably help. What are your plans? > I am guessing that your framework allows one to run several instances > of python scripts as well on each node ? That way, one could solve > several ODEs across nodes. You'd need some dynamic scheduling for > workload balancing, perhaps. Our "framework" is IPython1: http://ipython.scipy.org/moin/IPython1 Yes, IPython1 enables Python scripts to be distributed across a set of nodes. We have both static and dynamic load balancing. Compute nodes can be started with MPI, but that is optional. This allows compute nodes to come and go anytime, and the dynamic load balancer has fault tolerance built in. > Have you thought about integrating this with the stackless python > project, for a nice programming model that abstracts away from MPI and > allows high performance parallel programs to be written entirely in > python ? Yes, but I am really hesitant to require people to install a custom version of Python. As I am sure you know, if users can't easily install the tools you are building (especially if they are parallel/distributed), it is a no go. That kills stackless for us, even though I really like the ideas behind stackless. Another option is greenlets. > > Here is the main issue: I am running into a need for sparse arrays. > > There are two places I am running into this: > > > > 1) I want to implement sparse distributed arrays and need sparse local > > arrays for this. > > > > 2) There are other places in the implementation where sparse arrays > > are needed. > > > > > Obviously, my first thought was scipy.sparse. I am _really_ excited > > about the massive improvements that have been happening in this area > > recently.
Here are the problems I am running into: > Not having followed the numpy/scipy debate for too long, I must say > that I was a little surprised to find sparse array support in scipy > rather than numpy. Yep, the work that people have done on scipy.sparse (especially Nathan) is fantastic. But, I do think the basic sparse arrays should be in numpy. > > 1) I need N-dimensional sparse arrays. Some of the storage formats in > > scipy.sparse (dok, coo, maybe lil) could be generalized to > > N-dimensions, but some work would have to be done. > From the discussion, it seemed to me that these formats are written > in python itself. In general, I like the idea of writing data > structures in the same language, and as much in higher level languages. > > Some people pointed out that this tends to be slow when called in for > loops. I couldn't figure out what the cause of this was. Is it that > for loops in python are generally slow, or is it that indexing > individual elements in sparse data structures is slow or are python's > data structures slow, or some combination of all three ? Compared to loops in C, Python loops are quite slow - especially if you are calling Python functions at each iteration. > I can say something about Matlab's sparse matrices, which I am > familiar with. It's not that Matlab's for loops are the cause of > slowness (they may be slow), but that indexing single elements in a > sparse matrix is inherently slow. Every instance of A(i, j) has to > perform a binary search. Insertion is even more expensive because it > requires a memmove for CSC/CSR formats. Thus you almost never work > with for loops and single element indexing for sparse matrices in > Matlab. That said, one could design a data structure that partially > solves some of these problems. True, there is the algorithmic complexity of sparse array indexing, but that is on top of the overhead that Python (and likely Matlab) has.
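To make that concrete, here is a quick sketch (the sizes and values are arbitrary, and this is just an illustration, not code from scipy) comparing entry-at-a-time construction through Python-level indexing against batching the triplets into a COO matrix and converting once:

```python
import numpy as np
from scipy import sparse

rng = np.random.RandomState(0)
n = 200
rows = rng.randint(0, n, size=3 * n)
cols = rng.randint(0, n, size=3 * n)
vals = rng.rand(3 * n)

# slow path: every A[i, j] access goes through Python-level indexing
A = sparse.lil_matrix((n, n))
for i, j, v in zip(rows, cols, vals):
    A[i, j] += v

# fast path: collect the triplets, build a COO in one shot, convert once;
# duplicate (i, j) entries are summed during the conversion to CSR
B = sparse.coo_matrix((vals, (rows, cols)), shape=(n, n)).tocsr()
```

Timing the two with the timeit module shows the batch construction winning by a large margin as n grows, which is exactly why "build a COO initially and then convert to CSR/CSC" is the recommended pattern.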
> I thought initially that dok may solve this for some kinds of sparse > matrix problems, but it seems that its too slow for large problems. > Perhaps a little benchmark must be setup. I think the dok implementation in scipy.sparse could be optimized, but I don't have time to pursue this right now. > Brian, > > For N-d are you thinking of something along the lines of Tammy Kolda's > tensor toolbox for Matlab ? That could be accomplished with just > distributed dense vectors - I'm not sure what the performance would be > like. Its the same as using a distributed coo format. > > http://csmr.ca.sandia.gov/~tgkolda/TensorToolbox/ > http://www.models.kvl.dk/source/nwaytoolbox/index.asp I will have a look at this. > > > > > > 3) That we begin to move their implementation to using Cython (as an > > aside, cython does play very well with templated C++ code). This > > could provide a much nicer way of tying into the c++ code than using > > swig. > Not being intimately familiar with cython or swig, I did take a look > at the cython homepage and it sounded like a good fit. It does seem > that some people have strong oppositions to it. Support for templated > C++ code would be nice to have though ! While it is sort of a hack, you can handle templates in cython: http://wiki.cython.org/WrappingCPlusPlus Even though this is a hack and it takes a while to get used to, it works really well and people are starting to use it. The big advantage of this approach is that the generated C++ code underneath the hood is _really_ fast. There is almost no overhead that cython adds. This is much better than swig, which adds too many layers of cruft and forces you to call things from Python. In terms of opposition to cython, there is very little on the whole. The overall pattern I am seeing is that _tons_ of people are beginning to use it (mpi4py, ipython, numpy/scipy, nipy, sage, etc.).
Cheers, > -viral > > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From vshah at interactivesupercomputing.com Tue Apr 15 12:59:43 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Tue, 15 Apr 2008 09:59:43 -0700 Subject: [SciPy-dev] Ideas for scipy.sparse? Message-ID: > On Tue, Apr 15, 2008 at 1:55 AM, Nathan Bell wrote: >> On Mon, Apr 14, 2008 at 1:36 PM, Viral Shah >>> I thought initially that dok may solve this for some kinds of sparse >>> matrix problems, but it seems that its too slow for large problems. >>> Perhaps a little benchmark must be setup. >> >> In my experience DOK/LIL are too slow for large matrices. This is a >> shame, because new users will want to construct matrices with one of >> these classes. More advanced users would probably build a COO >> initially and then convert to CSR/CSC. > > All formats are nicely documented in the sources (see the docstrings > for usage examples). Yes, in general they are well documented. Perhaps it is a good idea to document the reverse process of extracting indices from coo_matrix as well ? Currently, it only provides a way to construct a matrix from indices and not the other way around, which is equally useful, like Nathan said in the last email.

A = coo_matrix(A)
I, J, V = A.row, A.col, A.data

-viral From cookedm at physics.mcmaster.ca Tue Apr 15 16:48:39 2008 From: cookedm at physics.mcmaster.ca (David M.
Cooke) Date: Tue, 15 Apr 2008 16:48:39 -0400 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: (Jarrod Millman's message of "Mon, 7 Apr 2008 03:00:44 -0700") References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca> <200712240856.08804.faltet@carabos.com> Message-ID: "Jarrod Millman" writes: > Hello, > > I would like to remove numexpr from the scipy sandbox now that it is a > stand alone project: > http://code.google.com/p/numexpr/ > > If there are any problems with this, please let me know in the next > couple of days. Otherwise I am going to go ahead and clean it up. > > Thanks, No problem, I was going to do it myself as soon as I got a release of numexpr out based on this code. I don't think there are any hold-ups, so I'll do that soon. -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cookedm at physics.mcmaster.ca Tue Apr 15 16:55:00 2008 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Tue, 15 Apr 2008 16:55:00 -0400 Subject: [SciPy-dev] Moving numexpr to its own project In-Reply-To: <47394214-82AD-4221-A59F-BB641B5DD515@nist.gov> (Jonathan Guyer's message of "Mon, 7 Apr 2008 13:07:20 -0400") References: <6F89103E-9007-49C9-8B21-F0B4528EE9A3@physics.mcmaster.ca> <200712240856.08804.faltet@carabos.com> <47394214-82AD-4221-A59F-BB641B5DD515@nist.gov> Message-ID: Jonathan Guyer writes: > On Apr 7, 2008, at 12:36 PM, Jarrod Millman wrote: > >> I can't really speak for the authors, but I would guess they were >> looking for a large audience. Rather than numexpr being rejected from >> the scikit's namespace as "not 'sciency' enough", the authors wanted >> their project not to appear to be restricted to scientific uses. > > This is just a guess, but they also might not want to have appeared to > require SciPy, specifically.
Speaking from our own experience, > although FiPy uses and recommends SciPy, we don't require it because > it's frankly a headache to install and we know from experience that > the support requests from our users are going to come to us, not to > the SciPy developers. Numexpr, even as part of scipy.sandbox, didn't require other parts of SciPy, so scipy.sandbox really was just a convenient place to park it. (It's much like weave in that respect.) Heck, it required Python 2.4 (scipy requires 2.3). So, yes, I see removing the association with SciPy as a plus. It's also used by PyTables (although currently embedded, not as a dependency), which is another numpy-using, non-scipy-requiring project. (As to a large audience, well, if there are other users out there of numexpr, I'd love to hear from you! I'll do a 1.0 release soon.) -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ondrej at certik.cz Wed Apr 16 06:10:16 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Wed, 16 Apr 2008 12:10:16 +0200 Subject: [SciPy-dev] FTBFS on Debian In-Reply-To: <85b5c3130804140743r267fe286i3bd26dea243ffd32@mail.gmail.com> References: <85b5c3130804140258l4892b922j68db729ce2d511a0@mail.gmail.com> <85b5c3130804140303u4cb6e532ofaf1f23ed771b119@mail.gmail.com> <48033A26.2050608@ar.media.kyoto-u.ac.jp> <3d375d730804140429r7fbde7bcp89dda18014f58561@mail.gmail.com> <48033E6A.3050302@ar.media.kyoto-u.ac.jp> <3d375d730804140436i1ad0beabt6f9941db6d329f3a@mail.gmail.com> <85b5c3130804140548m745858bfo4a6a1e2865cdd592@mail.gmail.com> <85b5c3130804140619l6fdfe50at5c55da64ae881196@mail.gmail.com> <3d375d730804140719q5b935e30jfe51a5791e5c1d32@mail.gmail.com> <85b5c3130804140743r267fe286i3bd26dea243ffd32@mail.gmail.com> Message-ID: <85b5c3130804160310o6153bc0dq1dbc262a75d6f85e@mail.gmail.com> > > Environment variables, usually. > > Right.
Those packages are built in a clean environment (always > starting from a base Debian install and then installing all necessary > packages), so this means some other package changed that. > > Thanks again for pointing me to the right direction. BTW, you were right again, the problem was caused by dpkg defining some environment variables: > Since dpkg 1.14.17, dpkg-buildpackage will define the environment > variables CFLAGS, CXXFLAGS, CPPFLAGS, LDFLAGS and FFLAGS. The goal is to > be able to easily recompile packages with supplementary compilation flags > and to simplify the debian/rules files since CFLAGS has the right default > value (no need to special case for DEB_BUILD_OPTIONS=noopt). Ondrej From vshah at interactivesupercomputing.com Wed Apr 16 14:16:52 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Wed, 16 Apr 2008 11:16:52 -0700 Subject: [SciPy-dev] sparse deletion and sparse random matrices. Message-ID: Thanks to all the help I got on this list, I have been able to figure out how to get a bunch of our code working with sparse matrices in scipy. The two things I believe could be immediately useful are some simple routines for deleting rows and columns, and routines to create sparse random matrices. The latter would be greatly useful for examples, exploration, debugging etc. The routines might look something like below. If there's interest, I can help with whatever needs to be done to put them in the tree:

# sparse random matrices..
def sprand (self, m, n, density):
    nnz = m*n*density
    ij = fix(nnz*rand(2, nnz))
    data = rand(nnz)
    s = sparse.coo_matrix((data, ij), shape=(m, n))
    return s

# Deletion. Performance should be comparable to native code..
def deleterowcol(self, A, delrow, delcol):
    A = A.tocsc()
    m = A.shape[0]
    n = A.shape[1]
    keep = delete(arange(0, m), delrow)
    A = A[keep, :]
    keep = delete(arange(0, n), delcol)
    A = A[:, keep]
    return A

-viral From nwagner at iam.uni-stuttgart.de Wed Apr 16 16:22:56 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 16 Apr 2008 22:22:56 +0200 Subject: [SciPy-dev] The Matrix Function Toolbox Message-ID: Hi all, To all whom it may concern http://www.maths.manchester.ac.uk/~higham/mftoolbox/ Nils From cookedm at physics.mcmaster.ca Thu Apr 17 18:02:43 2008 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Thu, 17 Apr 2008 18:02:43 -0400 Subject: [SciPy-dev] Scipy tests can't be run when scipy installed as an egg Message-ID: <6D17D64C-E4EE-491B-8557-32E7621DBFA7@physics.mcmaster.ca> I usually install scipy as an egg, and I noticed[1] that you can't run the test suite when installed that way:

dave at saturn% py -c 'import scipy; scipy.test()'
/Users/dave/Library/Python/2.5/site-packages/scipy-0.7.0.dev4146-py2.5-macosx-10.3-fat.egg/scipy/sparse/linalg/dsolve/linsolve.py:20: DeprecationWarning: scipy.sparse.linalg.dsolve.umfpack will be removed, install scikits.umfpack instead
  ' install scikits.umfpack instead', DeprecationWarning )
/Users/dave/Library/Python/2.5/site-packages/scipy-0.7.0.dev4146-py2.5-macosx-10.3-fat.egg/scipy/linsolve/__init__.py:4: DeprecationWarning: scipy.linsolve has moved to scipy.sparse.linalg.dsolve
  warn('scipy.linsolve has moved to scipy.sparse.linalg.dsolve', DeprecationWarning)
/Users/dave/Library/Python/2.5/site-packages/scipy-0.7.0.dev4146-py2.5-macosx-10.3-fat.egg/scipy/splinalg/__init__.py:3: DeprecationWarning: scipy.splinalg has moved to scipy.sparse.linalg
  warn('scipy.splinalg has moved to scipy.sparse.linalg', DeprecationWarning)
----------------------------------------------------------------------
Ran 0 tests in 0.049s

OK

(Same results using the `nosetests` program.)
This is probably a problem with nose (or our use of it). Tests run fine when scipy is installed as normal. Nose and numpy are the latest versions. Any thoughts? [1] would've noticed earlier, but I've been pretty busy. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From rob.clewley at gmail.com Thu Apr 17 22:35:11 2008 From: rob.clewley at gmail.com (Rob Clewley) Date: Thu, 17 Apr 2008 22:35:11 -0400 Subject: [SciPy-dev] Help and input needed for adapting pydstool code for a scikit or scipy Message-ID: Hi, I do not wish to cross-post our long messages, but I realized that the discussion of short and long-term dynamical modeling support needs to get the attention of SciPy developers. I know people are busy with the upcoming numpy release, but I would like to draw your attention to the two related threads that I started on the users list. Briefly, we want to get your input on adapting and porting some aspects of PyDSTool (not necessarily *just* the ODE integrators) into scipy -- at least as a scikit -- and also to seek help to do so from interested parties. There are lots of technical issues arising, and they may never get solved unless someone expresses willingness to get involved. I think it was Travis who, a year or two ago, expressed interest in getting help in re-writing ODE support for scipy, and this is our much belated response to that. We also have technical questions about using distutils that maybe someone on this list can help us with. Thanks, Rob Clewley Erik Sherwood From tournesol33 at gmail.com Thu Apr 17 23:23:57 2008 From: tournesol33 at gmail.com (tournesol) Date: Fri, 18 Apr 2008 12:23:57 +0900 Subject: [SciPy-dev] Input value from data file to 2D array Message-ID: <4808144D.2020204@gmail.com> Hi All. I'm a newbie to python, scipy and numpy, and am trying to rewrite some Fortran77 code in Python. First, here is my Fortran code.
DIMENSION A(10,10), B(10,10)
OPEN(21,FILE='aaa.dat')
DO 10 K=1,2
READ(21,1602) ((A(I,J),J=1,3),I=1,2)
READ(21,1602) ((B(I,J),J=1,3),I=1,2)
10 CONTINUE
WRITE(6,1602) ((A(I,J),J=1,3),I=1,2)
WRITE(6,1602) ((B(I,J),J=1,3),I=1,2)
1602 FORMAT(3I4)
END

and aaa.dat file is

0 1 1
1 1 1
2 1 1
; ; ;
; ; ;
18 1 1
19 1 1

The code is going to read each value of the data file (aaa.dat) and input the value to the arrays A and B. According to the "DO 10 K=1,2", which is a Fortran loop, it is going to try to read the 1st 2x3 array and set it to A, and also read the 2nd 2x3 array to B. The two "WRITE(6,1602)"s mean that the final values of arrays A and B are the 3rd 2x3 array and the 4th 2x3 array:

4 1 1
5 1 1
6 1 1
7 1 1

Question 1: How to input values from a data file into a 2D array? Can I write code like the following?

a=zeros([2,3], Float)
for i in range(1,2):
    for j in range(1,3):
        a[i][j]=i,j

(If the size of the data file is very big, using read_array() or fromfile() will be very hard.)

Question 2: For something like "DO 10 K=1,1500", I just read a part of the data file (<= very big file). How can I write this with Python?

Any advice please! From ggellner at uoguelph.ca Fri Apr 18 01:51:39 2008 From: ggellner at uoguelph.ca (Gabriel Gellner) Date: Fri, 18 Apr 2008 01:51:39 -0400 Subject: [SciPy-dev] Input value from data file to 2D array In-Reply-To: <4808144D.2020204@gmail.com> References: <4808144D.2020204@gmail.com> Message-ID: <20080418055139.GA3515@basestar> > First, here is my Fortran code.
>
> DIMENSION A(10,10), B(10,10)
> OPEN(21,FILE='aaa.dat')
> DO 10 K=1,2
> READ(21,1602) ((A(I,J),J=1,3),I=1,2)
> READ(21,1602) ((B(I,J),J=1,3),I=1,2)
> 10 CONTINUE
> WRITE(6,1602) ((A(I,J),J=1,3),I=1,2)
> WRITE(6,1602) ((B(I,J),J=1,3),I=1,2)
> 1602 FORMAT(3I4)
> END
>
> and aaa.dat file is
>
> 0 1 1
> 1 1 1
> 2 1 1
> ; ; ;
> ; ; ;
> 18 1 1
> 19 1 1
>
> The code is going to read each value of the data file (aaa.dat)
> and input the value to the arrays A and B.
> According to the "DO 10 K=1,2", which is a Fortran loop, it is going to try to read the 1st 2x3 array > and set it to A, and also read the 2nd 2x3 array to B. > The two "WRITE(6,1602)"s mean that the final values of arrays A and B are > the 3rd 2x3 array and the 4th 2x3 array:
>
> 4 1 1
> 5 1 1
> 6 1 1
> 7 1 1
>
> Question 1: How to input values from a data file into a 2D array? > Can I write code like the following?
>
> a=zeros([2,3], Float)

First, you don't need to use Float unless you are using a really old version of numpy; just write

a = zeros((2, 3))

Also, it is good to pass a tuple, not a list, as passing a mutable argument is asking for trouble, even though in this case it is safe.

> for i in range(1,2):
>     for j in range(1,3):
>         a[i][j]=i,j

Remember that python has 0 based arrays, so you would want to write

for i in range(2):
    for j in range(3):
        a[i, j] = ...

so that range(2) returns the indices 0, 1 and range(3) returns the indices 0, 1, 2; otherwise you are asking for out of bounds exceptions. Also notice that numpy arrays are indexed using commas to separate the dimensions, unlike nested lists in regular python or C.

> (If the size of the data file is very big, using > read_array() or fromfile() will be very hard.)

Maybe not, as most file objects return iterators so they don't read the entire file into memory; try it the ugly way first, you might be surprised. That is, you can usually just iterate over the file like

for line in open('aaa.dat'):
    print line.split()

to print out each line split by spaces, and the open statement will just return an iterator, not read the entire file into memory. To read the entire file into memory you would do (not that you really want to do this):

for line in open('aaa.dat').readlines():
    print line.split()

> Question 2: For something like "DO 10 K=1,1500", I just read a part of the > data file (<= very big file). How can I write this with Python?
> Using the above iterator method I would do

for lnum, line in enumerate(open('aaa.dat')):
    if lnum >= 1500:
        break
    print line.split()

or if you don't like the break

file = open('aaa.dat')
for k in range(1500):
    line = file.readline()

Okay, so with all this information let's try doing your problem.

import numpy

a = numpy.zeros((10, 10))
b = numpy.zeros((10, 10))

file = open('aaa.dat')
for k in range(2):
    for i in range(2):
        a[k + i, :3] = [int(line) for line in file.readline().strip().split()]
    for i in range(2):
        b[k + i, :3] = [int(line) for line in file.readline().strip().split()]

Which is, from the best I can make out, the porting of your fortran code (the reading part; the writing is similar), though it seems like A, B are too big, that is, you make the dimensions of the arrays much larger than you use, in both the number of rows and columns. Maybe I misunderstand your code. Hopefully this helps a little. And finally welcome to the wonderful world of python! Gabriel From wnbell at gmail.com Fri Apr 18 13:49:23 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 18 Apr 2008 12:49:23 -0500 Subject: [SciPy-dev] sparse deletion and sparse random matrices. In-Reply-To: References: Message-ID: On Wed, Apr 16, 2008 at 1:16 PM, Viral Shah wrote: > The two things I believe could be immediately useful are some simple > routines for deleting rows and columns, and routines to create sparse > random matrices. The latter would be greatly useful for examples, > exploration, debugging etc. > > The routines might look something like below. If there's interest, I > can help with whatever needs to be done to put them in the tree: sprand() would be useful; however, deleterowcol() is too special purpose IMO. Two comments on your sprand(): 1) leads to values outside [0,1) when an (i,j) entry occurred more than once.
2) fix(rand()) may be better implemented with numpy.random.random_integers() With a docstring + unittests this would make a nice addition to sparse/construct.py -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From fperez.net at gmail.com Fri Apr 18 18:14:02 2008 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 18 Apr 2008 15:14:02 -0700 Subject: [SciPy-dev] [OT - IPython] Old 'broken terminal' bug finally fixed Message-ID: [ Sorry for the cross-post, but I know this is something that has hit quite a few people on this list. If you have any questions on it, please ask on the ipython list, this is just an FYI ] Hi all, there's a very old, *extremely* annoying bug that multiple people have asked about (on list and in person) that is finally just fixed. The behavior was that at some point during a normal session, after a call to 'foo?', your terminal would be totally messed up, with no displayed input. You could (blindly) type !reset to issue the system terminal reset command, but that would only help until the next time foo? was called, and the problem would then return. Most of us would end up just quitting ipython and restarting, often losing useful session state. The problem with this is that we never knew how to reliably reproduce it... Anyway, it's fixed now in current bzr: http://bazaar.launchpad.net/~ipython/ipython/stable-dev/revision/79 I don't actually know how to trigger it, but it hit me during an important session where I really couldn't afford to lose what I was working on, and I managed to track it down to what I'm pretty sure is a curses bug. Basically curses.initscr() fails to correctly initialize the terminal sometimes (I still don't have a clue why), and after that it's all lost. But it turns out that via termios one can in fact reset the terminal state reliably, so now we unconditionally do that.
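For the curious, the termios save/restore dance looks roughly like the sketch below. This is NOT the actual IPython code (that lives in the revision linked above); the function names are made up for illustration, and the sketch degrades gracefully when stdin isn't a real terminal:

```python
import sys
import termios

def save_term_state(fd=None):
    """Snapshot the terminal attributes, or return None when not a tty."""
    try:
        if fd is None:
            fd = sys.stdin.fileno()
        return termios.tcgetattr(fd)
    except (termios.error, OSError, ValueError):
        return None  # stdin is not attached to a real terminal

def restore_term_state(state, fd=None):
    """Unconditionally restore a previously saved terminal state."""
    if state is None:
        return
    if fd is None:
        fd = sys.stdin.fileno()
    termios.tcsetattr(fd, termios.TCSANOW, state)

# snapshot once at startup...
state = save_term_state()
# ...do curses work that may wedge the terminal...
# ...then put the saved attributes back no matter what happened in between
restore_term_state(state)
```

The key point is that tcsetattr restores a known-good state regardless of what curses.initscr() did, which is why doing it unconditionally is safe.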
Anyway, I figured this would be worth mentioning here, since I know the problem is one that quite often bites people in the middle of work sessions and it can be very, very annoying. Cheers, f back to what I was busy doing, with my terminal now fully functional again... :) From wnbell at gmail.com Fri Apr 18 20:26:59 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 18 Apr 2008 19:26:59 -0500 Subject: [SciPy-dev] Ideas for scipy.sparse? In-Reply-To: References: Message-ID: On Tue, Apr 15, 2008 at 11:59 AM, Viral Shah wrote: > Perhaps it is a good idea to document the reverse process of > extracting indices from coo_matrix as well ? Currently, it only > provides a way to construct a matrix from indices and not the other > way around, which is equally useful, like Nathan said in the last email.
>
> A = coo_matrix(A)
> I, J, V = A.row, A.col, A.data

We could make a function scipy.sparse.find() to match MATLAB's find():

def find(A):
    A = coo_matrix(A, copy=True)
    return A.row, A.col, A.data

The copy=True ensures that the result arrays are distinct from A's data when A is a coo_matrix. We could then add sparse.find() to the table of MATLAB equivalences. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From vshah at interactivesupercomputing.com Sat Apr 19 13:31:52 2008 From: vshah at interactivesupercomputing.com (Viral Shah) Date: Sat, 19 Apr 2008 10:31:52 -0700 Subject: [SciPy-dev] sparse deletion and sparse random matrices. Message-ID:
I meant that something like deleterowcol() could be used to implement the delete() routines for sparse matrices. Currently, you can't delete rows/cols from sparse matrices. I believe it is basic functionality that would be nice to have, even if early implementations were not the best. > Two comments on your sprand() > 1) leads to values outside [0,1) when an (i,j) entry occured more > than once. You are right. I did find that bug earlier this week, and fixed it. > 2) fix(rand()) may be better implemented with > numpy.random.random_integers() Still familiarizing myself with all the stuff thats in numpy. That would definitely be a better function to use. > With a docstring + unittests this would make a nice addition to > sparse/construct.py Ok, I will dig into the source and figure out what all needs to be done to put this in. Thanks, -viral From hoytak at gmail.com Sat Apr 19 18:48:13 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Sat, 19 Apr 2008 15:48:13 -0700 Subject: [SciPy-dev] random seed in scipy.stats Message-ID: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> Hello, I may have missed something here, so I'll first ask: Is there a way to specify a random number seed in scipy.stats? I'm building a project using it which requires exactly reproducible distribution samples, so being able to specify a random seed is a must. I've been looking around in the code, and I don't see a way to do it. Even if it is there, it calls numpy.random directly, which would not work reliably. However, it seems that it'd be easy to add in by initializing a RandomState object in the constructors of both rv_continuous and rv_discrete, seeding it using an optional kw arg that defaults to None. All the code that currently draws samples from numpy.random directly would then draw samples from this. As far as I can see, this wouldn't break backwards compatibility and would add in a useful feature. The only change (I think) would be to distributions.py. 
Have I missed anything? If this is acceptable, I'll gladly make the change and post a diff and a unit test. --Hoyt -- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ From robert.kern at gmail.com Sun Apr 20 03:14:31 2008 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 20 Apr 2008 02:14:31 -0500 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> Message-ID: <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> On Sat, Apr 19, 2008 at 5:48 PM, Hoyt Koepke wrote: > Hello, > > I may have missed something here, so I'll first ask: Is there a way to > specify a random number seed in scipy.stats? I'm building a project > using it which requires exactly reproducible distribution samples, so > being able to specify a random seed is a must. > > I've been looking around in the code, and I don't see a way to do it. > Even if it is there, it calls numpy.random directly, which would not > work reliably. numpy.random.seed() is exposed, although I only recommend using it to work around poorly designed APIs like the one in scipy.stats. > However, it seems that it'd be easy to add in by initializing a > RandomState object in the constructors of both rv_continuous and > rv_discrete, seeding it using an optional kw arg that defaults to > None. All the code that currently draws samples from numpy.random > directly would then draw samples from this. As far as I can see, this > wouldn't break backwards compatibility and would add in a useful > feature. The only change (I think) would be to distributions.py. > > Have I missed anything? If this is acceptable, I'll gladly make the > change and post a diff and a unit test. 
My preference is for a pattern like this:

def somefunc(x,y,z, prng=numpy.random):
    w = prng.standard_normal()
    ...

This allows the default uses to share the same generator state which could avoid some problematic cases. But yes, such a change would be very welcome. It might be a substantial chunk of work, though, which is why I haven't done it. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From hoytak at gmail.com Sun Apr 20 03:44:29 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Sun, 20 Apr 2008 00:44:29 -0700 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> Message-ID: <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> > My preference is for a pattern like this:
>
> def somefunc(x,y,z, prng=numpy.random):
>     w = prng.standard_normal()
>     ...
>
> This allows the default uses to share the same generator state which
> could avoid some problematic cases.

This makes sense; I agree. I also like the option of passing a straight random seed as an option. I will at least include your suggestion, and both if it isn't too hard and no one objects. Since scipy.stats works with objects, it seems there are three options:

1. pass prng to the constructor; then all of the members use it.
2. pass prng individually to each rvs() function.
3. both 1 & 2

I can see arguments for both, so 3 might be the best option. > But yes, such a change would be very welcome. It might be a > substantial chunk of work, though, which is why I haven't done it. Maybe, but unless I'm missing something it looks fairly straightforward.
It imports numpy.random as mtrand, nice and unique, so it looks like it will be a lot of search and replace. Like I said, it would be quite useful for me, so I'll see what I can do. --Hoyt -- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ From millman at berkeley.edu Sun Apr 20 04:14:34 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Sun, 20 Apr 2008 01:14:34 -0700 Subject: [SciPy-dev] Blitz license request for SciPy In-Reply-To: <969620f20803251155v1acf99catf4dd33a228aa6275@mail.gmail.com> References: <969620f20803251155v1acf99catf4dd33a228aa6275@mail.gmail.com> Message-ID: Hello Todd, Thanks very much for Blitz and for allowing us to include it in SciPy with a BSD license. I plan to go ahead and replace the GNU GPL license with the revised BSD license. I have cced the SciPy developer's list, so that we have a public record of your permission for us to make this change. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ On Tue, Mar 25, 2008 at 11:55 AM, Todd Veldhuizen wrote: > I have no objections to releasing it under a BSD license. > > best > Todd > > On 21/03/2008, Jarrod Millman wrote: > > Julian and Todd, > > > > I'm writing to see if we can get an exception to the Blitz license so > > that it can be released with a BSD license within SciPy. Currently, > > the Fedora project is classifying the entire SciPy package as GPLed > > because one package, scipy.weave, uses blitz. (the rest of the code is > > all BSD). We were about to launch into the re-factoring step to make > > the blitz portion optional, when it occurred to us to just ask you > > guys first. :-) > > > > Blitz is included in the scipy.weave package which allows you to > > inline C/C++ within Python. 
Users have the option of converting NumPy > > arrays to blitz arrays for use in C++. There is also a function > > (weave.blitz) that will automatically convert NumPy expressions in > > Python to Blitz expressions in C++. Often this shows notable speed > > improvements. Our re-factoring step will require us to split the > > blitz code into the blitz and non-blitz portions. This is a bit of a > > pain, and ends up with more installation issues than if we didn't do > > it. > > > > Thanks for your consideration, > > > > -- > > Jarrod Millman From millman at berkeley.edu Sun Apr 20 04:42:16 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Sun, 20 Apr 2008 01:42:16 -0700 Subject: [SciPy-dev] Blitz license request for SciPy In-Reply-To: References: <969620f20803251155v1acf99catf4dd33a228aa6275@mail.gmail.com> Message-ID: Hello, I have filled a ticket for this: http://scipy.org/scipy/scipy/ticket/649 There are 260 files that need to be modified. I doubt that I will have time to do this in the near future, so if anyone feels motivated.... Thanks, Jarrod On Sun, Apr 20, 2008 at 1:14 AM, Jarrod Millman wrote: > Hello Todd, > > Thanks very much for Blitz and for allowing us to include it in SciPy > with a BSD license. I plan to go ahead and replace the GNU GPL > license with the revised BSD license. I have cced the SciPy > developer's list, so that we have a public record of your permission > for us to make this change. > > Thanks, > Jarrod Millman > > On Tue, Mar 25, 2008 at 11:55 AM, Todd Veldhuizen wrote: > > I have no objections to releasing it under a BSD license. 
> > > > best > > Todd From hoytak at gmail.com Sun Apr 20 16:23:51 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Sun, 20 Apr 2008 13:23:51 -0700 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> Message-ID: <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> Okay, I'm planning to add this to the stats module this afternoon, and I need to check with people on the best way to include the new random number features. Option 1: Allow two additional keyword arguments to the constructor and to the rvs functions, namely randseed and prng. If neither are given, it would default to the current behavior and draw straight from numpy.random. If randseed is given, but not prng, it would create a random state object with that seed. prng would be a RandomState object from numpy.random; if given randseed is ignored and that is used to draw the random numbers. Any concerns? Option 2: have a single additional kw arg like rnginfo or randominfo. If it's a RandomState instance, it is used to draw numbers from. If it's anything else, it's used to seed the random number generator. If it's None, it's discarded. With both options, arguments to the rvs method would override the arguments to the constructor, but only for that function call. I would personally go for 1., as it seems the potential for API confusion is a lot less. Also, 2 wouldn't allow people to pass in arbitrary random number generating objects that defined the correct methods, whereas 1 would (though this would probably be pretty rare). Now if we do 1, is prng the best argument name for the randomstate object? I could also use rso, rng, randomstateobject, etc. 
--Hoyt On Sun, Apr 20, 2008 at 12:44 AM, Hoyt Koepke wrote: > > My preference is for a pattern like this: > > > > def somefunc(x,y,z, prng=numpy.random): > > w = prng.standard_normal() > > ... > > > > This allows the default uses to share the same generator state which > > could avoid some problematic cases. > > This makes sense; I agree. I also like the option of passing a > straight random seed as an option. I will at least include your > suggestion, and both if it isn't too hard and no one objects. > > Since scipy.stats works with objects, it seems there are three options: > > 1. pass prng to the constructor; then all of the members use it. > 2. pass prng individually to each rvs() function. > 3. both 1 & 2 > > I can see arguments for both, so 3 might be the best option. > > > > But yes, such a change would be very welcome. It might be a > > substantial chunk of work, though, which is why I haven't done it. > > Maybe, but unless I'm missing something it looks fairly > straightforward. It imports numpy.random as mtrand, nice and unique, > so it looks like it will be a lot of search and replace. Like I said, > it would be quite useful for me, so I'll see what I can do. 
> > > > --Hoyt > > > -- > +++++++++++++++++++++++++++++++++++ > Hoyt Koepke > UBC Department of Computer Science > http://www.cs.ubc.ca/~hoytak/ > hoytak at gmail.com > +++++++++++++++++++++++++++++++++++ > -- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ From robert.kern at gmail.com Sun Apr 20 20:30:43 2008 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 20 Apr 2008 19:30:43 -0500 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> Message-ID: <3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com> On Sun, Apr 20, 2008 at 3:23 PM, Hoyt Koepke wrote: > Okay, I'm planning to add this to the stats module this afternoon, and > I need to check with people on the best way to include the new random > number features. > > Option 1: > Allow two additional keyword arguments to the constructor and to the > rvs functions, namely randseed and prng. If neither are given, it > would default to the current behavior and draw straight from > numpy.random. If randseed is given, but not prng, it would create a > random state object with that seed. prng would be a RandomState > object from numpy.random; if given randseed is ignored and that is > used to draw the random numbers. Any concerns? > > Option 2: > have a single additional kw arg like rnginfo or randominfo. If it's a > RandomState instance, it is used to draw numbers from. If it's > anything else, it's used to seed the random number generator. If it's > None, it's discarded. I prefer just being able to pass in a RandomState object. 
It's a subtle encouragement to do the right thing, which is to share a single RandomState object. People who *really* want to just pass in a seed just need to wrap that seed with RandomState(seed). "There should be one-- and preferably only one --obvious way to do it." > With both options, arguments to the rvs method would override the > arguments to the constructor, but only for that function call. > > I would personally go for 1., as it seems the potential for API > confusion is a lot less. Also, 2 wouldn't allow people to pass in > arbitrary random number generating objects that defined the correct > methods, whereas 1 would (though this would probably be pretty rare). > > Now if we do 1, is prng the best argument name for the randomstate > object? I could also use rso, rng, randomstateobject, etc. I like it. I think it's the best mix of terseness and comprehensibility. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From hoytak at gmail.com Sun Apr 20 20:36:21 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Sun, 20 Apr 2008 17:36:21 -0700 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> <3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com> Message-ID: <4db580fd0804201736t6881949t74bce1fbb234a2f0@mail.gmail.com> > I prefer just being able to pass in a RandomState object. It's a > subtle encouragement to do the right thing, which is to share a single > RandomState object. 
People who *really* want to just pass in a seed > just need to wrap that seed with RandomState(seed). "There should be > one-- and preferably only one --obvious way to do it." Fair enough. I'll do that. It works best for my particular application anyway. > I like it. I think it's the best mix of terseness and comprehensibility. k, prng it is. -- Hoyt -- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ From hoytak at gmail.com Sun Apr 20 23:46:07 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Sun, 20 Apr 2008 20:46:07 -0700 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <4db580fd0804201736t6881949t74bce1fbb234a2f0@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> <3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com> <4db580fd0804201736t6881949t74bce1fbb234a2f0@mail.gmail.com> Message-ID: <4db580fd0804202046h714f12a7h5222e413dc8548ea@mail.gmail.com> Hello, I've got the changes done, complete with test coverage. I also cleaned up a few minor coding things that make it easier to read the code but don't change the behavior. I'll attach the diff if anyone is curious / wants to give me feedback. However, I have a quick question about something that strikes me as rather odd behavior. When rvs() is called with no size or a size of 1 it returns a python array of size (1,). I find this odd for two reasons: 1. It's inconsistent with numpy.random 2. One of the unit tests in test_distributions.py tests to make sure the default returns a scalar (it now fails). While I'm at it, should I change this behavior? It seems like it could break some backwards compatibility stuff, so I'll leave it for everyone else to decide. 
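The pattern settled on above can be sketched like this (`somefunc` is Robert's earlier placeholder, not a real scipy.stats function):

```python
import numpy as np

def somefunc(x, prng=np.random):
    # Drawing through `prng` lets the caller supply a RandomState;
    # the default shares numpy's global generator state.
    return x + prng.standard_normal()

# Share one RandomState object across calls:
shared = np.random.RandomState(1234)
r1 = somefunc(0.0, prng=shared)

# Callers who only have a seed wrap it themselves, keeping one
# obvious way to do it:
r2 = somefunc(0.0, prng=np.random.RandomState(1234))
assert r1 == r2  # first draw from the same seed is identical
```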
-- Hoyt +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ -------------- next part -------------- A non-text attachment was scrubbed... Name: distributions-diff.tar.bz2 Type: application/x-bzip2 Size: 7352 bytes Desc: not available URL: From hoytak at gmail.com Sun Apr 20 23:50:19 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Sun, 20 Apr 2008 20:50:19 -0700 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <4db580fd0804202046h714f12a7h5222e413dc8548ea@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> <3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com> <4db580fd0804201736t6881949t74bce1fbb234a2f0@mail.gmail.com> <4db580fd0804202046h714f12a7h5222e413dc8548ea@mail.gmail.com> Message-ID: <4db580fd0804202050g63ece587g5eb9c3d6d9324c2b@mail.gmail.com> Oops -- ignore the line in the diff that says "reshape(, size..." -- I was playing around with having it return a scalar and forgot to undo it before taking the diff. --Hoyt On Sun, Apr 20, 2008 at 8:46 PM, Hoyt Koepke wrote: > Hello, > > I've got the changes done, complete with test coverage. I also > cleaned up a few minor coding things that make it easier to read the > code but don't change the behavior. I'll attach the diff if anyone is > curious / wants to give me feedback. > > However, I have a quick question about something that strikes me as > rather odd behavior. When rvs() is called with no size or a size of 1 > it returns a python array of size (1,). I find this odd for two > reasons: > > 1. It's inconsistent with numpy.random > 2. One of the unit tests in test_distributions.py tests to make sure > the default returns a scalar (it now fails). 
> > While I'm at it, should I change this behavior? It seems like it > could break some backwards compatibility stuff, so I'll leave it for > everyone else to decide. > > > > -- Hoyt > > +++++++++++++++++++++++++++++++++++ > Hoyt Koepke > UBC Department of Computer Science > http://www.cs.ubc.ca/~hoytak/ > hoytak at gmail.com > +++++++++++++++++++++++++++++++++++ > -- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ From peridot.faceted at gmail.com Mon Apr 21 01:04:50 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Mon, 21 Apr 2008 01:04:50 -0400 Subject: [SciPy-dev] scipy.interpolate missing functions Message-ID: Hi, It was recently pointed out on scipy-user that scipy.interpolate has "lagrange", a function to compute the Lagrange interpolating polynomial, but that it is not documented and doesn't work because of a missing import. Not surprisingly, it wasn't tested either. Attached is a patch to fix all three issues (though I haven't tested it because I have not yet figured out how to compile a local version of scipy that (a) doesn't clobber my real installation and (b) actually works). As I was doing this I noticed that there are some other functions in interpolate.py that are not exported, documented, or tested. They do various things to do with splines; I think they're designed to construct splines satisfying various criteria, as well as evaluate those splines and (optionally) convert them to a piecewise-polynomial representation. What's the story on these functions? I notice a lot of "raise NotImplementedError"s; are they some project that was never finished? Anne -------------- next part -------------- A non-text attachment was scrubbed... 
Name: lagrange.patch Type: text/x-patch Size: 2046 bytes Desc: not available URL: From strawman at astraw.com Mon Apr 21 01:57:05 2008 From: strawman at astraw.com (Andrew Straw) Date: Sun, 20 Apr 2008 22:57:05 -0700 Subject: [SciPy-dev] scipy.interpolate missing functions In-Reply-To: References: Message-ID: <480C2CB1.2040102@astraw.com> Anne Archibald wrote: > I have not yet figured out how to compile a local version of scipy > that (a) doesn't clobber my real installation and (b) actually works). > I suggest virtualenv: http://pypi.python.org/pypi/virtualenv It lets one easily and reproducibly create isolated Python pseudo-installations that you can install packages in without clobbering your system. When I want to test out a development version of scipy, I uninstall scipy from my system (just to be sure that version is not used), and then install to the virtualenv I've created just for the purpose. It doesn't mess with my system Python and it's very easy to use, just "python virtualenv.py ~/ENV" to create a working environment based in ~/ENV and then "source ~/ENV/bin/activate" in any shell where I want to use that environment. (Caveat: I have only used virtualenv in linux.) -Andrew From pav at iki.fi Mon Apr 21 03:25:05 2008 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 21 Apr 2008 07:25:05 +0000 (UTC) Subject: [SciPy-dev] scipy.interpolate missing functions References: <480C2CB1.2040102@astraw.com> Message-ID: Sun, 20 Apr 2008 22:57:05 -0700, Andrew Straw wrote: > Anne Archibald wrote: >> I have not yet figured out how to compile a local version of scipy that >> (a) doesn't clobber my real installation and (b) actually works). 
> > I suggest virtualenv: http://pypi.python.org/pypi/virtualenv Another way: python setup.py install --prefix=dist/inst export PYTHONPATH=dist/inst/lib/python2.5/site-packages Also, use setupegg.py if your Python system uses setuptools; this is actually important because setuptools looks first for .eggs in PYTHONPATH before looking for ordinary packages. -- Pauli Virtanen From stefan at sun.ac.za Mon Apr 21 03:29:19 2008 From: stefan at sun.ac.za (Stéfan van der Walt) Date: Mon, 21 Apr 2008 09:29:19 +0200 Subject: [SciPy-dev] scipy.interpolate missing functions In-Reply-To: References: Message-ID: <9457e7c80804210029j4aeec2es3f3580b891cd1a6e@mail.gmail.com> On 21/04/2008, Anne Archibald wrote: > a missing import. Not surprisingly, it wasn't tested either. Attached > is a patch to fix all three issues (though I haven't tested it because > I have not yet figured out how to compile a local version of scipy > that (a) doesn't clobber my real installation and (b) actually works). I install SciPy to my home directory: python setup.py install --prefix=${HOME} and then make sure that ${HOME}/lib/python2.5/site-packages is in the PYTHONPATH. As for (b), what specifically goes wrong? Regards Stéfan From david at ar.media.kyoto-u.ac.jp Mon Apr 21 05:38:43 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 21 Apr 2008 18:38:43 +0900 Subject: [SciPy-dev] [numscons] 0.6.1 release: it builds scipy, and on windows! Message-ID: <480C60A3.9030201@ar.media.kyoto-u.ac.jp> Hi, I am pleased to announce the 0.6.1 release of numscons. You can get tarballs, eggs for python 2.4/2.5 and windows installers on launchpad: https://launchpad.net/numpy.scons.support/0.6/0.6.1 I did not announce any release of numscons for some time, but this one is a significant milestone: it can build both numpy and scipy on linux, mac os X and windows (mingw only for now), and all tests pass on those three platforms (that is, all tests which pass on distutils build). 
Here is a full changelog since the last announced release: - Update the local copy of scons to 0.98. All internal modifications to scons needed for numscons are now integrated upstream, which means it is theoretically possible to use an external scons - f2py scons tool is now executed in its own process to avoid race issues with parallel builds. Concretely, this means building scipy with the --jobs option to use multiple cores should work. - f2py has been almost entirely rewritten: it can now scan the module name automatically, and should be much more reliable. - Msvc runtime issues are now fixed: the PythonExtension builder does not link the msvc runtime if its sources contain some fortran files - Added initial support for C++ flags customization cheers, David From nmarais at sun.ac.za Mon Apr 21 09:13:41 2008 From: nmarais at sun.ac.za (Neilen Marais) Date: Mon, 21 Apr 2008 13:13:41 +0000 (UTC) Subject: [SciPy-dev] Interesting ARPACK branch Message-ID: Hi, I've found that figuring out what is the latest ARPACK release and whether you've got all the newest patches installed is quite a mission. It seems someone has taken up a little of the maintainership issue: http://mathema.tician.de/software/arpack Might be useful to pull the ARPACK code for future scipy releases from there. Just my lazy web-procrastination-induced 2c Neilen From nwagner at iam.uni-stuttgart.de Mon Apr 21 16:02:34 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Mon, 21 Apr 2008 22:02:34 +0200 Subject: [SciPy-dev] Eigensolver Message-ID: FWIW, there is another eigensolver for sparse matrices available at http://www.cs.umd.edu/~rogerlee/ Cheers Nils From travis at enthought.com Mon Apr 21 19:59:24 2008 From: travis at enthought.com (Travis Vaught) Date: Mon, 21 Apr 2008 18:59:24 -0500 Subject: [SciPy-dev] ANN: EPD - Enthought Python Distribution released Message-ID: Greetings, Enthought is pleased to announce the release of the Enthought Python Distribution (EPD) version 2.5.2001. 
http://www.enthought.com/epd This release makes available both the RedHat 3.x (amd64) and Windows XP (x86) installers. OS X, Ubuntu and more (modern) RHEL versions are coming soon(!). About EPD --------- The Enthought Python Distribution is a "kitchen-sink-included" distribution of the Python Programming Language as well as over 60 additional tools and libraries. It includes NumPy, SciPy, IPython, 2D and 3D visualization, database adapters and a lot of other tools right out of the box. Enthought is offering access to this bundle as a free service to academic and other non-profit organizations. We also offer an annual fee-based subscription service for Commercial and Governmental users to download and update the software bundle. (Everyone may try it out for free. Please see the License Information below.) Included Software ----------------- A short list includes: Python 2.5.2, NumPy, SciPy, Traits, Mayavi, Chaco, Kiva, Enable, Matplotlib, wxPython and VTK. The complete list of software with version numbers is available here: http://www.enthought.com/products/epdlibraries.php License Information ------------------- EPD is a bundle of software, every piece of which is available separately for free under various open-source licenses. Not-for- profit, private-sector access to the bundle and its updates is, and will remain, free under the terms of the Subscription Agreement (see http://www.enthought.com/products/epdlicense.php ). Commercial and Governmental users may try the bundle for free for 30 days. After the trial period, users may purchase a one-year subscription to download and update the bundle. Downloaded software obtained under the subscription agreement may be used by the subscriber in perpetuity. This model should sound familiar, as our commercial offering is quite similar to the business model of a certain linux distributor. More information is also available in the FAQ ( http://www.enthought.com/products/epdfaq.php ). 
For larger deployments, or those with special build or distribution needs, an Enterprise Subscription is also available. Thanks ------ EPD is compelling because it solves a thorny packaging and distribution problem, but also because of the libraries which it includes. The folks here at Enthought would like to thank the Python developer community and the wider community that authors and contributes to these included libraries. We put these things to work every day and would be much less productive without them. So, thanks! From peridot.faceted at gmail.com Mon Apr 21 22:59:19 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Mon, 21 Apr 2008 22:59:19 -0400 Subject: [SciPy-dev] scipy.interpolate missing functions In-Reply-To: <9457e7c80804210029j4aeec2es3f3580b891cd1a6e@mail.gmail.com> References: <9457e7c80804210029j4aeec2es3f3580b891cd1a6e@mail.gmail.com> Message-ID: On 21/04/2008, Stéfan van der Walt wrote: > On 21/04/2008, Anne Archibald wrote: > > a missing import. Not surprisingly, it wasn't tested either. Attached > > is a patch to fix all three issues (though I haven't tested it because > > I have not yet figured out how to compile a local version of scipy > > that (a) doesn't clobber my real installation and (b) actually works). > > I install SciPy to my home directory: > > python setup.py install --prefix=${HOME} > > and then make sure that > > ${HOME}/lib/python2.5/site-packages > > is in the PYTHONPATH. Ah, got it working, nose, cython, eggs, and all. > As for (b), what specifically goes wrong? Well, the problem I had when I sent the email was version skew between numpy and scipy. I've cured that; now I have a fistful of test failures: ====================================================================== ERROR: Tests pdist(X, 'cosine') on the Iris data set. 
---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/cluster/tests/test_hierarchy.py", line 263, in test_pdist_cosine_iris Y_test1 = pdist(X, 'cosine') File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/cluster/hierarchy.py", line 1410, in pdist dm = squareform(dm) File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/cluster/hierarchy.py", line 766, in squareform raise ValueError('The distance matrix must be symmetrical.') ValueError: The distance matrix must be symmetrical. ====================================================================== ERROR: Tests pdist(X, 'cosine') on random data. ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/cluster/tests/test_hierarchy.py", line 244, in test_pdist_cosine_random Y_test1 = pdist(X, 'cosine') File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/cluster/hierarchy.py", line 1410, in pdist dm = squareform(dm) File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/cluster/hierarchy.py", line 766, in squareform raise ValueError('The distance matrix must be symmetrical.') ValueError: The distance matrix must be symmetrical. 
====================================================================== ERROR: Failure: ImportError (cannot import name _bspline) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/loader.py", line 364, in loadTestsFromName addr.filename, addr.module) File "/home/peridot/devlib/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/importer.py", line 39, in importFromPath return self.importFromDir(dir_path, fqname) File "/home/peridot/devlib/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/importer.py", line 84, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/tests/test_bspline.py", line 9, in import scipy.stats.models.bspline as B File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/bspline.py", line 23, in from scipy.stats.models import _bspline ImportError: cannot import name _bspline ====================================================================== ERROR: test_factor3 (test_formula.TestFormula) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/tests/test_formula.py", line 231, in test_factor3 m = fac.main_effect(reference=1) File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/formula.py", line 273, in main_effect reference = names.index(reference) ValueError: list.index(x): x not in list ====================================================================== ERROR: test_factor4 (test_formula.TestFormula) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/tests/test_formula.py", line 239, in test_factor4 m = 
fac.main_effect(reference=2) File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/formula.py", line 273, in main_effect reference = names.index(reference) ValueError: list.index(x): x not in list ====================================================================== ERROR: test_huber (test_scale.TestScale) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/tests/test_scale.py", line 35, in test_huber m = scale.huber(X) File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 82, in __call__ for donothing in self: File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 102, in next scale = N.sum(subset * (a - mu)**2, axis=self.axis) / (self.n * Huber.gamma - N.sum(1. - subset, axis=self.axis) * Huber.c**2) File "/home/peridot/devlib/lib/python2.5/site-packages/numpy/core/fromnumeric.py", line 930, in sum return sum(axis, dtype, out) TypeError: only length-1 arrays can be converted to Python scalars ====================================================================== ERROR: test_huberaxes (test_scale.TestScale) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/tests/test_scale.py", line 40, in test_huberaxes m = scale.huber(X, axis=0) File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 82, in __call__ for donothing in self: File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 102, in next scale = N.sum(subset * (a - mu)**2, axis=self.axis) / (self.n * Huber.gamma - N.sum(1. 
- subset, axis=self.axis) * Huber.c**2) File "/home/peridot/devlib/lib/python2.5/site-packages/numpy/core/fromnumeric.py", line 930, in sum return sum(axis, dtype, out) TypeError: only length-1 arrays can be converted to Python scalars ====================================================================== FAIL: test_gammaincinv (test_basic.TestGamma) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/special/tests/test_basic.py", line 1056, in test_gammaincinv assert_almost_equal(0.05, x, decimal=10) File "/home/peridot/devlib/lib/python2.5/site-packages/numpy/testing/utils.py", line 158, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: ACTUAL: 0.050000000000000003 DESIRED: 0.0 ====================================================================== FAIL: test_namespace (test_formula.TestFormula) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/peridot/devlib/lib/python2.5/site-packages/scipy/stats/models/tests/test_formula.py", line 119, in test_namespace self.assertEqual(xx.namespace, Y.namespace) AssertionError: {} != {'Y': array([ 0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70, 72, 74, 76, 78, 80, 82, 84, 86, 88, 90, 92, 94, 96, 98]), 'X': array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49])} I probably don't have development versions of ATLAS and whatnot installed, but scipy purports to have installed correctly. (Well, there is the usual monstrous stream of messages, but none of them appear to indicate that I should expect a broken installation.) 
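For context, the `lagrange` function this thread set out to fix returns the unique polynomial through the given points; a pure-NumPy sketch of the same computation (the `lagrange_coeffs` helper is hypothetical, and it solves the Vandermonde system directly rather than mirroring scipy's implementation) looks like:

```python
import numpy as np

def lagrange_coeffs(x, y):
    # Coefficients (highest power first) of the unique degree n-1
    # polynomial through the n points (x[i], y[i]) -- the same
    # polynomial scipy.interpolate.lagrange returns as a poly1d.
    return np.linalg.solve(np.vander(x), y)

x = np.array([0.0, 1.0, 2.0])
y = x ** 2                      # samples of p(t) = t**2
c = lagrange_coeffs(x, y)
assert np.allclose(c, [1.0, 0.0, 0.0])  # recovers t**2 exactly
```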
I am able to test scipy.interpolate, and with a minor fix, lagrange() now passes its test. I have write access to numpy SVN, which apparently gives me write access to scipy SVN too (though I haven't tried committing anything). Should I just check in the fix, or file a bug and attach a patch? Anne From david at ar.media.kyoto-u.ac.jp Tue Apr 22 05:13:19 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 22 Apr 2008 18:13:19 +0900 Subject: [SciPy-dev] [numscons] 0.6.3 release: building scipy with MS compilers works Message-ID: <480DAC2F.9010602@ar.media.kyoto-u.ac.jp> Hi, Sorry for announcing one more numscons release in such a short time: this release can finally handle the last common platform, Visual Studio on win32. Win32 installers and source tarballs can be found on Launchpad, as usual: https://code.launchpad.net/numpy.scons.support/0.6/0.6.3 Python eggs on PyPI should follow soon (I cannot upload to PyPI from my lab, unfortunately). Besides the various fixes necessary to make the build work with MS compilers, there were a few improvements:
- some more improvements to the f2py scons tool, which simplified some scipy scons scripts a bit
- the addition of a silent mode, to get terser command-line output (not perfect yet), a la kbuild, for people familiar with compiling Linux (e.g. you get CC foo.c instead of gcc -W -Wall blas blas blas). The goal is to make warnings more obvious (and to fix them at some point, of course :) ). Just use python setupscons.py scons --silent=N with N between 1 and 3 to see the result.
As this was the last major platform I wanted to support, I can now move on to polishing the API as well as starting real documentation for package developers.
cheers, David From ondrej at certik.cz Tue Apr 22 06:07:14 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Tue, 22 Apr 2008 12:07:14 +0200 Subject: [SciPy-dev] Eigensolver In-Reply-To: References: Message-ID: <85b5c3130804220307h5ac4a574s736c5b5138349f84@mail.gmail.com> On Mon, Apr 21, 2008 at 10:02 PM, Nils Wagner wrote: > FWIW, there is another eigensolver for sparse matrices > available at > > http://www.cs.umd.edu/~rogerlee/ Indeed. Thanks for the tip. We really need to get all these solvers under one hood. I was just too busy lately, but I am sure either me, or someone else will finally get fed up and do it. :) Ondrej From stefan at sun.ac.za Tue Apr 22 11:06:52 2008 From: stefan at sun.ac.za (Stéfan van der Walt) Date: Tue, 22 Apr 2008 17:06:52 +0200 Subject: [SciPy-dev] scipy.interpolate missing functions In-Reply-To: References: <9457e7c80804210029j4aeec2es3f3580b891cd1a6e@mail.gmail.com> Message-ID: <9457e7c80804220806p164dcd67o6446162c3a065d55@mail.gmail.com> 2008/4/22 Anne Archibald : > Well, the problem I had when I sent the email was version skew between > numpy and scipy. I've cured that; now I have a fistful of test > failures: Sadly, that is the current state of things. As soon as the next NumPy is out the door (cheers to Chuck who has been hunting down those memory leaks) we'll focus our attention here. Regards Stéfan From swisher at enthought.com Tue Apr 22 12:19:13 2008 From: swisher at enthought.com (Janet Swisher) Date: Tue, 22 Apr 2008 11:19:13 -0500 Subject: [SciPy-dev] [Enthought-dev] [SciPy-user] ANN: EPD - Enthought Python Distribution released In-Reply-To: <480D8B37.6020108@gmail.com> References: <480D8B37.6020108@gmail.com> Message-ID: <480E1001.5030308@enthought.com> Stef Mientki wrote: > Travis Vaught wrote: >> Greetings, >> >> Enthought is pleased to announce the release of the Enthought Python >> Distribution (EPD) version 2.5.2001.
>> >> http://www.enthought.com/epd >> >> > Could someone tell me the difference between EPD and ETS ? > If I look at the summary, I see > EPD = ETS + 10 other packages, > but 8 of the 10 other packages are already in ETS ??? See http://www.enthought.com/products/epdlibraries.php for a complete list of libraries in EPD. It is much more than ETS. -- Janet Swisher, Sr. Technical Writer Enthought, Inc., http://www.enthought.com From nwagner at iam.uni-stuttgart.de Tue Apr 22 13:02:22 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 22 Apr 2008 19:02:22 +0200 Subject: [SciPy-dev] Nastran generated sparse matrices was Re: Eigensolver In-Reply-To: <85b5c3130804220307h5ac4a574s736c5b5138349f84@mail.gmail.com> References: <85b5c3130804220307h5ac4a574s736c5b5138349f84@mail.gmail.com> Message-ID: On Tue, 22 Apr 2008 12:07:14 +0200 "Ondrej Certik" wrote: > On Mon, Apr 21, 2008 at 10:02 PM, Nils Wagner > wrote: >> FWIW, there is another eigensolver for sparse matrices >> available at >> >> http://www.cs.umd.edu/~rogerlee/ > > Indeed. Thanks for the tip. We really need to get all >these solvers > under one hood. I was just too busy lately, but I am >sure either me, > or someone else will finally get fed up and do it. :) > > Ondrej > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev Hi all, Is there a chance to add the following tool as a scikit ? The software by Al Danial is released under GPL. http://cvs.savannah.gnu.org/viewvc/tops/tops/usr/extra/op4tools/ The purpose is to read and write Nastran op4 files. 
Nils From hoytak at gmail.com Tue Apr 22 19:10:44 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Tue, 22 Apr 2008 16:10:44 -0700 Subject: [SciPy-dev] code for weave / blitz function wrapper generation Message-ID: <4db580fd0804221610j6d5924e8ia413889af13d6c39@mail.gmail.com> Hello, I recently wrote a general purpose function for automatically creating wrappers to C++ functions using weave, and I thought others might find it useful as well. In particular, I think a good place for it would be as a method in the weave ext_module class if others agree. I'm also looking for thoughts, suggestions, and for more people to test it. All the heavy lifting is done by weave, so hopefully the subtle errors are minimal. I've attached the code and an example (with setup.py) for people to look at. To try it out, run ./setup.py build and then ./test_wrapper.py. The interesting function is in weave_wrap_functions.py. Here's a brief overview: To create a function module, you specify the python name, the cpp function's call name, a list of argument types as strings, and whether it returns a value. 
It supports argument types of (from the doc string): Scalars: "double" (python float) "float" (python float) "int" (python int) "long int" (python int) Blitz Arrays (converted from numpy arrays), 1 <= dim <= 11 "Array", (numpy dtype=float64) "Array", (numpy dtype=float32) "Array" (numpy dtype=int32) "Array" (numpy dtype=int32) "Array" (numpy dtype=int32) "Array" (numpy dtype=int16) "Array" (numpy dtype=int16) "Array" (numpy dtype=uint32) "Array" (numpy dtype=uint32) "Array" (numpy dtype=uint32) "Array" (numpy dtype=uint32) "Array" (numpy dtype=uint16) C++ types: "string" (python string) Thus, for example, if you have the following c++ function definition: string printData(Array& array2d, Array& idxarray, const string& msg, double a_double, int an_int); You could wrap it by calling add_function_wrapper(wrappermod, "printData", "example::printData", arglist = ["Array", "Array", "string", "double", "int"], returns_value=True) This would add a function to the ext_module wrappermod called printData which simply converts everything and calls the function. Anyway, I hope this might be useful to someone! I'll welcome any feedback. --Hoyt -- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ -------------- next part -------------- A non-text attachment was scrubbed... Name: weave_wrapper.tar.gz Type: application/x-gzip Size: 3688 bytes Desc: not available URL: From peridot.faceted at gmail.com Tue Apr 22 19:19:57 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Tue, 22 Apr 2008 19:19:57 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy Message-ID: Hi, What do people think about including cython code in scipy? 
Pros:
* Easy way to write faster code
* Convenient way to interface to compiled code

Cons:
* Yet another interfacing setup
* Yet another language readers of the code need to understand
* Additional dependency for developers
* cython is in flux and drastic improvements in its handling of numpy arrays are hoped for
* Requires support in build tools (distutils/makefiles/scons magic)

For one data point, a quick and nasty cythonization of my polynomial interpolation code got something like a 4x speedup for large arrays of evaluation points. This is a big improvement, but far from the 20-100x cython claims for raw python. I can't judge the behaviour for shorter arrays without doing a more reasonable job of converting to cython, but by looking at the (horrible) generated C code, the inner loops are raw C with no python objects at all. (The 4x figure also uses gcc's -march and -O options.) I take this as evidence that numpy's arrays are doing a fairly good job of handing off the heavy lifting to compiled code. I can definitely see that if a little work went into cython's support of numpy arrays specifically, the code could get a lot cleaner: things like c[i*k+j] could become c[i,j], with support for arbitrary strides and (possibly) type conversion. So maybe it doesn't make sense to start producing lots of cython code until that happens. Anne From robert.kern at gmail.com Tue Apr 22 19:28:29 2008 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 22 Apr 2008 18:28:29 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: Message-ID: <3d375d730804221628r5dabea98ld7896e1eaa66bf9c@mail.gmail.com> On Tue, Apr 22, 2008 at 6:19 PM, Anne Archibald wrote: > Hi, > > What do people think about including cython code in scipy?
>
> Pros:
> * Easy way to write faster code
> * Convenient way to interface to compiled code
>
> Cons:
> * Yet another interfacing setup
> * Yet another language readers of the code need to understand
> * Additional dependency for developers
> * cython is in flux and drastic improvements in its handling of numpy
> arrays are hoped for
> * Requires support in build tools (distutils/makefiles/scons magic)

Well, we already have SWIG code in scipy and even Pyrex code in numpy. The way we have handled it thus far is to make developers check in the generated code. The setup scripts refer to the generated C files rather than the real sources. In most places, though, the lines to derive directly from the real sources are just commented out so developers can switch between them if they are touching that code. Checking in generated sources into source control is evil, frankly, but it resolves some of these headaches and replaces them with only lesser ones. I am all for your using Cython. Keep track of the version that you are using in a comment near the top of the file to warn other developers to use the same version or otherwise coordinate on upgrading. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From bsouthey at gmail.com Tue Apr 22 21:21:33 2008 From: bsouthey at gmail.com (Bruce Southey) Date: Tue, 22 Apr 2008 20:21:33 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <3d375d730804221628r5dabea98ld7896e1eaa66bf9c@mail.gmail.com> References: <3d375d730804221628r5dabea98ld7896e1eaa66bf9c@mail.gmail.com> Message-ID: Hi, Surely Alan McIntyre's project (Cython-based enhancements to NumPy: http://code.google.com/soc/2008/psf/appinfo.html?csaid=CDEB395EBA2BB1D8 ) for Google Summer of Code will result in a maintained base for everyone who wants to include it.
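The convention Robert describes — shipping the checked-in generated C file, with the line deriving from the real source kept nearby but commented out — might look roughly like the following. This is a hypothetical sketch, not code from scipy's actual setup scripts; all file names are invented.

```python
# Hypothetical sketch of the checked-in-generated-sources convention:
# the build normally consumes the generated C file; a developer who is
# touching the real source swaps in the .pyx line and regenerates.
SOURCES = [
    # 'fastmod.pyx',  # real source -- switch to this to regenerate
    'fastmod.c',      # generated C, checked in to source control
]

def pick_sources(regenerate=False):
    """Return the .pyx sources when regenerating, else the shipped C."""
    if regenerate:
        return [s.replace('.c', '.pyx') for s in SOURCES]
    return SOURCES
```

The trade-off is exactly the one Robert names: the repository carries derived files, but end users never need the generator installed.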
Bruce On Tue, Apr 22, 2008 at 6:28 PM, Robert Kern wrote: > On Tue, Apr 22, 2008 at 6:19 PM, Anne Archibald > wrote: > > Hi, > > > > What do people think about including cython code in scipy? > > > > Pros: > > * Easy way to write faster code > > * Convenient way to interface to compiled code > > > > Cons: > > * Yet another interfacing setup > > * Yet another language readers of the code need to understand > > * Additional dependency for developers > > * cython is in flux and drastic improvements in its handling of numpy > > arrays are hoped for > > * Requires support in build tools (distutils/makefiles/scons magic) > > Well, we already have SWIG code in scipy and even Pyrex code in numpy. > The way we have handled it thus far is to make developers check in the > generated code. The setup scripts refer to the generated C files > rather than the real sources. In most places, though, the lines to > derive directly from the real sources are just commented out so > developers can switch between them if they are touching that code. > Checking in generated sources into source control is evil, frankly, > but it resolves some of these headaches and replaces them with only > lesser ones. > > I am all for your using Cython. Keep track of the version that you are > using in a comment near the top of the file to warn other developers > to use the same version or otherwise coordinate on upgrading. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma that is made terrible by our own mad attempt to interpret it as > though it had an underlying truth." 
> -- Umberto Eco From phaustin at gmail.com Tue Apr 22 22:07:09 2008 From: phaustin at gmail.com (Phil Austin) Date: Tue, 22 Apr 2008 19:07:09 -0700 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <3d375d730804221628r5dabea98ld7896e1eaa66bf9c@mail.gmail.com> Message-ID: <480E99CD.7020008@gmail.com> Bruce Southey wrote: > Hi, > Surely Alan McIntyre's project (Cython-based enhancements to NumPy: > http://code.google.com/soc/2008/psf/appinfo.html?csaid=CDEB395EBA2BB1D8 > ) for Google Summer of Code will result in a maintained base for > everyone who wants to include it. > and there's also Dag Sverre Seljebotn's project: http://code.google.com/soc/2008/psf/appinfo.html?csaid=8C68A3548D9F97B1 From wnbell at gmail.com Wed Apr 23 02:17:33 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 01:17:33 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: Message-ID: On Tue, Apr 22, 2008 at 6:19 PM, Anne Archibald wrote: > > What do people think about including cython code in scipy? > Writing a substantial amount of code in a special-purpose language like Cython is always risky. Will Cython exist in 5, 10, or 15 years? I have little doubt that in 10 years I'll still be able to compile the C/C++ codes I develop today. Furthermore, if Cython/SWIG/SciPy/NumPy disappeared overnight I'd still have a decent library to work with. IMO using Cython for one-off applications or wrappers is fine, but large-scale Cython implementation is probably a bad long-term approach. I don't mean to dismiss the merits of Cython. It certainly has a niche and might very well become the "right" way to interface Python to compiled code.
OTOH I find the Cython proponents to be a bit overzealous, so I'm wary of signing on just yet :) -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From alan.mcintyre at gmail.com Wed Apr 23 02:33:39 2008 From: alan.mcintyre at gmail.com (Alan McIntyre) Date: Wed, 23 Apr 2008 02:33:39 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <3d375d730804221628r5dabea98ld7896e1eaa66bf9c@mail.gmail.com> Message-ID: <1d36917a0804222333q2a2710b5se7a19fcf094f19f6@mail.gmail.com> I suppose I should say hello, so Hi, everybody! :) On Tue, Apr 22, 2008 at 9:21 PM, Bruce Southey wrote: > Surely Alan McIntyre's project (Cython-based enhancements to NumPy: > http://code.google.com/soc/2008/psf/appinfo.html?csaid=CDEB395EBA2BB1D8 > ) for Google of Summer Code will be a result in a maintained base for > everyone who wants to include it. I tend to agree with Anne and Robert; there should be some demonstrably significant advantage before adding a bunch of Cython to the SciPy codebase, and I'm not a big fan of checking in generated code if it can be helped. If it turns out that Cython is not the correct tool for the job, then I think there's enough leeway in the GSoC guidelines to allow me to contribute something useful to SciPy without using Cython. I believe that the indexing improvement Anne listed is on Dag's list of things to do for his project, by the way. (Even if I'm remembering incorrectly and it's not, maybe we can talk him into it. :) Cheers, Alan From alan.mcintyre at gmail.com Wed Apr 23 02:52:07 2008 From: alan.mcintyre at gmail.com (Alan McIntyre) Date: Wed, 23 Apr 2008 02:52:07 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: Message-ID: <1d36917a0804222352y201b490cw2d1736fd5998b051@mail.gmail.com> On Wed, Apr 23, 2008 at 2:17 AM, Nathan Bell wrote: > I don't mean to dismiss the merits of Cython. 
It certainly has a > niche and might very well become the "right" way to interface Python > to compiled code. OTOH I find the Cython proponents to be a bit > overzealous, so I'm wary of signing on just yet :) If it helps any, I'm not a Cython proponent, just a student that (1) wanted to work on some numerical software and (2) needed some income over the summer, so I proposed to do some of the items listed on the NumPy/SciPy wish lists, which happened to mention Cython. :) If using Cython isn't acceptable to the SciPy/NumPy community at large (i.e., if I'm going to work on something for 2-3 months that won't ever be committed), then it would be nice to know that up front. I can either choose to use other tools, or do some work to help make using Cython less of a pain. Either way is fine with me, of course. Alan From matthew.brett at gmail.com Wed Apr 23 03:12:06 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 23 Apr 2008 07:12:06 +0000 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <1d36917a0804222352y201b490cw2d1736fd5998b051@mail.gmail.com> References: <1d36917a0804222352y201b490cw2d1736fd5998b051@mail.gmail.com> Message-ID: <1e2af89e0804230012y26b1b38flcbb6fba9cc004226@mail.gmail.com> Hi, > If using Cython isn't acceptable to the SciPy/NumPy community at large > (i.e., if I'm going to work on something for 2-3 months that won't > ever be committed), then it would be nice to know that up front. I > can either choose to use other tools, or do some work to help make > using Cython less of a pain. Either way is fine with me, of course. It seems to me likely, especially from what I hear about cython in SAGE, that cython is going to be a very important tool for scientific python, and could well be absolutely central. Python makes code readable, and maintainable, and yet, for scientific applications, we often need the speed of C or its libraries. 
There is a much smaller number of Numpy / Scipy developers who feel comfortable working on the Numpy C code, and that means fewer eyes reviewing and maintaining large parts of Numpy. Cython has the potential to make writing C with Numpy very nearly as easy as writing python. If that happens, it could make it an order of magnitude easier to develop with Numpy, and that in turn could really change the speed and quality of development. So, I ask, on bended knee, please, go for it. Many of us are very excited to see how your project goes. Cheers, Matthew From matthew.brett at gmail.com Wed Apr 23 03:48:48 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 23 Apr 2008 07:48:48 +0000 Subject: [SciPy-dev] Scipy tests can't be run when scipy installed as an egg In-Reply-To: <6D17D64C-E4EE-491B-8557-32E7621DBFA7@physics.mcmaster.ca> References: <6D17D64C-E4EE-491B-8557-32E7621DBFA7@physics.mcmaster.ca> Message-ID: <1e2af89e0804230048h67db184agdc915f7813b34e92@mail.gmail.com> Hi, On Thu, Apr 17, 2008 at 10:02 PM, David M. Cooke wrote: > I usually install scipy as an egg, and I noticed[1] that you can't run > the test suite when installed that way: Thanks for tracking this down. As you'd expect, this is a nose testing problem. It's known to the developers: http://code.google.com/p/python-nose/issues/detail?id=78 I'll let them know that this is a problem for us... Thanks, Matthew From pearu at cens.ioc.ee Wed Apr 23 03:56:48 2008 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Wed, 23 Apr 2008 09:56:48 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <3d375d730804221628r5dabea98ld7896e1eaa66bf9c@mail.gmail.com> References: <3d375d730804221628r5dabea98ld7896e1eaa66bf9c@mail.gmail.com> Message-ID: <480EEBC0.6030208@cens.ioc.ee> Robert Kern wrote: > On Tue, Apr 22, 2008 at 6:19 PM, Anne Archibald > wrote: >> Hi, >> >> What do people think about including cython code in scipy?
>>
>> Pros:
>> * Easy way to write faster code
>> * Convenient way to interface to compiled code
>>
>> Cons:
>> * Yet another interfacing setup
>> * Yet another language readers of the code need to understand
>> * Additional dependency for developers
>> * cython is in flux and drastic improvements in its handling of numpy
>> arrays are hoped for
>> * Requires support in build tools (distutils/makefiles/scons magic)

> Well, we already have SWIG code in scipy and even Pyrex code in numpy. > The way we have handled it thus far is to make developers check in the > generated code. The setup scripts refer to the generated C files > rather than the real sources. In most places, though, the lines to > derive directly from the real sources are just commented out so > developers can switch between them if they are touching that code. > Checking in generated sources into source control is evil, frankly, > but it resolves some of these headaches and replaces them with only > lesser ones. > > I am all for your using Cython. Keep track of the version that you are > using in a comment near the top of the file to warn other developers > to use the same version or otherwise coordinate on upgrading. In the longer term, we can add cython support to numpy.distutils (like we currently have support for Pyrex, SWIG, f2py, and template tools). I wonder how big the cython code base is; maybe it would be worth making a snapshot of cython (when it becomes more stable) and including it in numpy, so that there would be no need to commit generated sources to the repository. It might actually make the scipy distribution smaller, as the generated sources may have a higher LOC count than the cython package.
Pearu From stefan at sun.ac.za Wed Apr 23 04:02:27 2008 From: stefan at sun.ac.za (Stéfan van der Walt) Date: Wed, 23 Apr 2008 10:02:27 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: Message-ID: <9457e7c80804230102q3f9f1f04u7fdb87dcf5a641d0@mail.gmail.com> 2008/4/23 Nathan Bell : > Writing a substantial amount of code in a special-purpose language > like Cython is always risky. Will Cython exist in 5, 10, or 15 years? The idea is, of course, that Cython should ultimately *not* be a special-purpose language, but as close to Python as possible. That is also why we rely on those two GSOC projects, which try to eliminate the ndarray indexing differences between Python and Cython. That way, in 10 years you'll still have Python code, even if cython doesn't exist any more. Hopefully, by then, PyPy or some such project would have capitalised on VM developments to such an extent that your Python code runs as fast as the current C code (look at what the jruby guys have accomplished, for example). Your SWIG wrappers are easy to read, but that is not the case for some other parts of NumPy and SciPy, though, so I buy Matthew's argument: to increase the number of eyes on our code, we should improve readability. For example, I could see that implementing lil_matrix in Cython would make sense, given that it runs for loops over Python lists. Also, a great number of bugs in NumPy originate from reference counting errors, which are notoriously hard to catch; Cython helps skirt those issues. To be clear, I'm all for leaving sparsetools alone (it works). For implementing new code in SciPy -- especially sections that interface heavily with Python -- I'd use Cython, though.
Cheers Stéfan From ondrej at certik.cz Wed Apr 23 09:00:53 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Wed, 23 Apr 2008 15:00:53 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: Message-ID: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> On Wed, Apr 23, 2008 at 8:17 AM, Nathan Bell wrote: > On Tue, Apr 22, 2008 at 6:19 PM, Anne Archibald > wrote: > > > > > What do people think about including cython code in scipy? > > > > Writing a substantial amount of code in a special-purpose language > like Cython is always risky. Will Cython exist in 5,10,15 years? > > I have little doubt that in 10 years I'll still be able to compile the > C/C++ codes I develop today. Furthermore, if Cython/SWIG/SciPy/NumPy > disappeared overnight I'd still have a decent library to work with. > IMO using Cython for one-off application or wrappers is fine, but > large-scale Cython implementation is probably a bad long-term > approach. Isn't this a similar (if not the same) problem when writing your code using SWIG? Let's take sparsetools.i as an example: it also contains a lot of non-Python, non-C code. So one needs to depend on (and maintain!) SWIG. Well, Cython is a much more lightweight and robust solution, imho. Of course it'd be great if the Cython code were also regular Python code, but that will still take some time. But they seem to like it; for example, when I first learned about Cython, I told Robert and William -- why use this ugly "for i from 1 <= i < dim:", why not use the Python way "for i in range(1, dim):"? And Robert implemented this in less than a month. The next step could be to generate Cython code from regular Python code with annotations in comments or strings, imho. But even if Cython stays as yet another language -- it's still easier and less error prone than writing the C code by hand.
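The syntax change Ondrej mentions means the same source can be compiled by Cython or run unmodified by the ordinary interpreter. A made-up example (under Pyrex the loop had to be written `for i from 0 <= i < dim:`; with a C-typed `i`, Cython compiles the Python form below to a plain C loop):

```python
# Made-up example: this is ordinary Python, yet the same text is valid
# Cython input.  Pyrex required the special "for i from 0 <= i < dim:"
# form to get a C loop; Cython accepts the range() spelling.
def dot(x, y):
    dim = len(x)
    total = 0.0
    for i in range(dim):     # in Cython, a pure C loop once i is typed
        total += x[i] * y[i]
    return total
```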
Ondrej From wnbell at gmail.com Wed Apr 23 10:41:06 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 09:41:06 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> Message-ID: On Wed, Apr 23, 2008 at 8:00 AM, Ondrej Certik wrote: > > Isn't this a similar (if not the same) problem when writing your code > using SWIG? No, because the implementation (i.e. the part that's harder to recreate) is still written in C++. > Let's take sparsetools.i as an example: it also contains a lot of > non-Python, non-C code. So one needs to depend on (and maintain!) SWIG. > Well, Cython is a much more lightweight and robust solution, imho. A lot? I took numpy.i from the NumPy repository, made a few changes, and added sparsetools.i to generate the templates. If SWIG disappeared overnight, I'd wrap *the very same, unmodified C++ code* using another wrapper generator (perhaps Cython). If I wanted to use sparsetools in another VHLL then I would wrap it for that one too. OTOH if I had written sparsetools in Cython (as opposed to wrapping sparsetools with Cython) and Cython disappeared, then I'd have to rewrite a substantial amount of the implementation. Again, I think using Cython in place of SWIG is a fine thing to do. I just wouldn't want to write too much of it any more than I'd want to write reams of hand-coded C wrappers or reams of SWIG. > But even if Cython stays as yet another language -- it's still easier > and less error prone than writing the C code by hand. I don't disagree. However, writing C++ and wrapping with SWIG is also easier than writing hand-coded extension modules, with the added benefit of decoupling the core implementation from the interface.
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Wed Apr 23 10:52:22 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 09:52:22 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <9457e7c80804230102q3f9f1f04u7fdb87dcf5a641d0@mail.gmail.com> References: <9457e7c80804230102q3f9f1f04u7fdb87dcf5a641d0@mail.gmail.com> Message-ID: On Wed, Apr 23, 2008 at 3:02 AM, Stéfan van der Walt wrote: > > The idea is, of course, that Cython should ultimately *not* be a > special-purpose language, but as close to Python as possible. That is > also why we rely on those two GSOC projects, which try to eliminate > the ndarray indexing differences between Python and Cython. That way, > in 10 years you'll still have Python code, even if cython doesn't > exist any more. Hopefully, by then, PyPy or some such project would > have capitalised on VM developments to such an extent that your Python > code runs as fast as the current C code (look at what the jruby guys > have accomplished, for example). Then, of course, the discussion is moot. However, a cursory look at the current SAGE source code reveals that *in practice* Cython is very much a special-purpose language. > Your SWIG wrappers are easy to read, but that is not the case for some > other parts of NumPy and SciPy, though, so I buy Matthew's argument: > to increase the number of eyes on our code, we should improve > readability. That argument works as long as Cython source code resembles Python source code in appearance and function. > For example, I could see that implementing lil_matrix in > Cython would make sense, given that it runs for loops over Python > lists. Also, a great number of bugs in NumPy originate from reference > counting errors, which are notoriously hard to catch; Cython helps > skirt those issues. I agree, lil_matrix is a good candidate for Cython.
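To make the lil_matrix point concrete: a LIL (list-of-lists) matrix keeps, per row, a Python list of column indices and a parallel list of values, so even matrix-vector multiplication is nested Python-level loops — exactly the pattern Cython compiles well. The following is a toy sketch of the storage scheme, not scipy's actual implementation.

```python
# Toy sketch (not scipy's implementation) of why lil_matrix is
# loop-heavy: rows[i] holds the column indices of row i and data[i]
# the matching nonzero values, so y = A * x is two nested Python loops.
def lil_matvec(rows, data, x):
    y = [0.0] * len(rows)
    for i in range(len(rows)):
        for col, val in zip(rows[i], data[i]):
            y[i] += val * x[col]
    return y
```

In pure Python every `y[i] += val * x[col]` goes through the interpreter; with typed indices, Cython would turn the inner loop into plain C while the source stays readable.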
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Wed Apr 23 11:01:24 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 10:01:24 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <1e2af89e0804230012y26b1b38flcbb6fba9cc004226@mail.gmail.com> References: <1d36917a0804222352y201b490cw2d1736fd5998b051@mail.gmail.com> <1e2af89e0804230012y26b1b38flcbb6fba9cc004226@mail.gmail.com> Message-ID: On Wed, Apr 23, 2008 at 2:12 AM, Matthew Brett wrote: > > It seems to me likely, especially from what I hear about cython in > SAGE, that cython is going to be a very important tool for scientific > python, and could well be absolutely central. I wouldn't look to SAGE for an unbiased assessment on that point. They already have a vested interest in Cython, we don't. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From matthew.brett at gmail.com Wed Apr 23 11:10:38 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 23 Apr 2008 15:10:38 +0000 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <1d36917a0804222352y201b490cw2d1736fd5998b051@mail.gmail.com> <1e2af89e0804230012y26b1b38flcbb6fba9cc004226@mail.gmail.com> Message-ID: <1e2af89e0804230810t43611205v1f6f108c3f5b341f@mail.gmail.com> Hi, > > It seems to me likely, especially from what I hear about cython in > > SAGE, that cython is going to be a very important tool for scientific > > python, and could well be absolutely central. > > I wouldn't look to SAGE for an unbiased assessment on that point. > They already have a vested interest in Cython, we don't. Well, I haven't talked to the SAGE people directly, but in any case, I think their interest is the same as ours - moving difficult-to-maintain C code out into easy-to-maintain cython code. 
Best, Matthew From ggellner at uoguelph.ca Wed Apr 23 11:28:40 2008 From: ggellner at uoguelph.ca (Gabriel Gellner) Date: Wed, 23 Apr 2008 11:28:40 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> Message-ID: <20080423152840.GA11364@encolpuis> I am confused by what Cython disappearing means. It is pure python, and generates C code. Let's say it is no longer maintained, I can still build my existing code using the last version and I am no worse off. For my code to become useless python would also have to disappear (or change so dramatically that I couldn't use my old codes). By this logic no core tool should be written in python either? The code is open source; it can't be taken away from me, it just might become archaic, but that is no different from the current situation with g77, which is still used extensively. I agree that C++ is more portable, but are there examples of open tools really disappearing? Gabriel On Wed, Apr 23, 2008 at 09:41:06AM -0500, Nathan Bell wrote: > On Wed, Apr 23, 2008 at 8:00 AM, Ondrej Certik wrote: > > > > Isn't this a similar (if not the same) problem when writing your code > > using SWIG? > > No, because the implementation (i.e. part that's harder to recreate) > is still written in C++. > > > Let's take sparsetools.i as an example, it also contains a lot of non > > python, non C code. So one needs to depend (and maintain!) SWIG. Well, > > Cython is a lot more lightweight and more robust solution imho. > > A lot? I took numpy.i from the NumPy repository, made a few changes, > and added sparsetools.i to generate the templates. If SWIG > disappeared overnight, I'd wrap *the very same, unmodified C++ code* > using another wrapper generator (perhaps Cython). If I wanted to use > sparsetools in another VHLL then I would wrap it for that one too. 
> > OTOH if I had written sparsetools in Cython (as opposed to wrapping > sparsetools with Cython) and Cython disappeared, then I'd have to > rewrite a substantial amount of the implementation. > > Again, I think using Cython in place of SWIG is a fine thing to do. I > just wouldn't want to write too much of it any more than I'd want to > write reams of hand-coded C wrappers or reams of SWIG. > > > But even if Cython stays as yet another language -- it's still easier > > and less error prone than to write the C code by hand. > > I don't disagree. However, writing C++ and wrapping with SWIG is also > easier than writing hand-coded extension modules, with the added > benefit of decoupling the core implementation from the interface. > > -- > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev From ellisonbg.net at gmail.com Wed Apr 23 12:00:54 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Wed, 23 Apr 2008 10:00:54 -0600 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: Message-ID: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> > Writing a substantial amount of code in a special-purpose language > like Cython is always risky. Will Cython exist in 5, 10, 15 years? At some level, I agree with this assessment. I think it would be a bad decision if there was a strong chance that cython would vanish while scipy/numpy was still using it actively. But I think this is unlikely and here's why: 1) The high-profile scientific computing+python projects that are written using it (or pyrex): mpi4py (mpi binding in python - currently being rewritten to use cython) pytables (hdf5 python bindings) sage numpy.random Right now, the worst case scenario is if cython became unmaintained. 
But, with the large number of people depending on it, I think the community would simply pick up the slack - the beauty of open source. 2) Cython is unique in what it does. All the other solutions for making python code fast (swig, f2py, ctypes, boost, etc.) still require spending lots of time writing low level C/C++/fortran code. Sure, many of us have the skills to do that, but the reality is that few of us have the _time_ to write that code. If I had to write the ipython distributed array package in C/C++ and then wrap it into python it simply _would_not_happen_ (because of time constraints). > I have little doubt that in 10 years I'll still be able to compile the > C/C++ codes I develop today. Furthermore, if Cython/SWIG/SciPy/NumPy > disappeared overnight I'd still have a decent library to work with. > IMO using Cython for one-off application or wrappers is fine, but > large-scale Cython implementation is probably a bad long-term > approach. True, a C/C++ library would live on _if_ these projects vanished. But I don't think you have made a strong argument that these projects will in fact vanish. Furthermore, I think your actions say the opposite. You have invested a huge amount of time working on scipy.sparse (which is greatly appreciated). I could be wrong, but I don't think you would do that if you really thought scipy was going to die soon :) Another relevant point: recently I have spent quite a lot of time wrapping templated C++ code with Cython (a 1 million LOC parallel c++ electromagnetics code). This code uses the STL extensively and Cython has worked really well. If someone really wants to use cython in the same way you have used swig, that is completely possible - you still end up with the nice C++ library underneath. The big difference though between cython and swig is that the top level layer in swig is slow pure python code. 
In cython, you get fast C/python extension types that can be called from other C/C++ extension code without going through a slow python layer. That is a huge difference. Brian > I don't mean to dismiss the merits of Cython. It certainly has a > niche and might very well become the "right" way to interface Python > to compiled code. OTOH I find the Cython proponents to be a bit > overzealous, so I'm wary of signing on just yet :) > > -- > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From ellisonbg.net at gmail.com Wed Apr 23 12:28:38 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Wed, 23 Apr 2008 10:28:38 -0600 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <1d36917a0804222352y201b490cw2d1736fd5998b051@mail.gmail.com> References: <1d36917a0804222352y201b490cw2d1736fd5998b051@mail.gmail.com> Message-ID: <6ce0ac130804230928j34e27decw325ca26dfda11724@mail.gmail.com> > If it helps any, I'm not a Cython proponent, just a student that (1) > wanted to work on some numerical software and (2) needed some income > over the summer, so I proposed to do some of the items listed on the > NumPy/SciPy wish lists, which happened to mention Cython. :) > > If using Cython isn't acceptable to the SciPy/NumPy community at large > (i.e., if I'm going to work on something for 2-3 months that won't > ever be committed), then it would be nice to know that up front. I > can either choose to use other tools, or do some work to help make > using Cython less of a pain. Either way is fine with me, of course. Overall the support for using cython in scipy/numpy is immense. People are _very_ excited about your GSoC project. 
I don't think there is another approach you could use other than cython that would enable you to do all the things you have proposed in such a short period of time. Cheers, Brian > Alan > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From wnbell at gmail.com Wed Apr 23 12:49:22 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 11:49:22 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> Message-ID: On Wed, Apr 23, 2008 at 11:00 AM, Brian Granger wrote: > 1) The high-profile scientific computing+python projects that are > written using it (or pyrex): > > mpi4py (mpi binding in python - currently being rewritten to use cython) > pytables (hdf5 python bindings) These are wrappers. Writing wrappers in Cython is a fine idea. > sage Sage claims over 100K lines of Cython source code, so I agree, they do have an interest in maintaining Cython. They also have an interest in getting others to sign on. > numpy.random Doesn't this wrap the Mersenne Twister PRNG? > RIght now, the worst case scenario is if cython became unmaintained. > But, with the large number of people depending on it, I think the > community would simply pick up the slack - the beauty of open source. I disagree. It is rare for an open source project to reach a sustainable level before the original authors depart. > 2) Cython is unique in what it does. > > All the other solutions for making python code fast (swig, f2py, > ctypes, boost, etc.) still require spending lots of time writing low > level C/C++/fortran code. Sure, many of us have the skills to do > that, but the reality is that few of us have the _time_ to write that > code. 
If I had to write the ipython distributed array package in > C/C++ and then wrap it into python it simply _would_not_happen_ > (because of time constraints). I don't see how writing Cython is easier than writing C/C++/Fortran. Clearly if you only know Python, then Cython is an easier language to learn. However, I doubt the Cython abstraction is so complete that one can avoid learning C. You conflate the issues of writing code and wrapping it for Python. I agree that the latter is a painful exercise, even with tools like SWIG. I'd happily replace my SWIG wrappers with Cython if the need arose. However, in the context of larger-scale implementations, I find it hard to believe that the claimed benefits of Cython outweigh the concerns I've laid out. > True, a C/C++ library would live on _if_ these projects vanished. But > I don't think you have made a strong argument that these projects will > in fact vanish. Furthermore, I think your actions say the opposite. > You have invested a huge amount of time working on scipy.sparse (which > is greatly appreciated). I could be wrong, but I don't think you > would do that if you really thought scipy was going to die soon :) Clearly I think SciPy has a long life ahead of it. At the same time, I spent effort ensuring that sparsetools knows absolutely nothing about NumPy/SciPy. IMO the utility of a stand-alone library is greater than a tighter coupling between sparsetools and Python. For instance, I can now combine the C++ I've written for PyAMG with sparsetools to make a lightweight algebraic multigrid code in pure C++. > Another relevant point: recently I have spent quite a lot of time > wrapping templated C++ code with Cython (a 1 million LOC parallel c++ > electromagnetics code). This code uses the STL extensively and Cython > has worked really well. If someone really wants to use cython in the > same way you have used swig, that is completely possible - you still > end up with the nice C++ library underneath. 
As I've said before, using Cython as a wrapper generator is perfectly reasonable. Using Cython to implement the core functionality of that 1M LOC C++ library would be foolish. > The big difference though between cython and swig is that the top > level layer in swig is slow pure python code. In cython, you get fast > C/python extension types that can be called from other C/C++ extension > code without going through a slow python layer. That is a huge > difference. Which supports the case that Cython is a decent wrapper generator. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Wed Apr 23 12:52:50 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 11:52:50 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <20080423152840.GA11364@encolpuis> References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> Message-ID: On Wed, Apr 23, 2008 at 10:28 AM, Gabriel Gellner wrote: > I am confused by what Cython disappearing means. It is pure python, and > generates C code. So I can take an arbitrary .pyx file from Sage and use it sans-Cython? > Let's say it is no longer maintained, I can still build my > existing code using the last version and I am no worse off. Good luck with that. I hope it's bug-free and feature-complete. > For my code to become useless python would also have to disappear > (or change so dramatically that I couldn't use my old codes). By this logic > no core tool should be written in python either? See first point. > The code is open source; it can't be taken away from me, it just might become > archaic, but that is no different from the current situation with g77, which > is still used extensively. Bad analogy. Fortran 77 still exists because a lot of people have Fortran 77 code that's worth keeping. In fact, the Fortran analogy works against you. 
Consider something like LAPACK which was written long before people cared about Python. You can access LAPACK via a dozen different higher-level and cross-language interfaces. Why? Because it's tied to a stable, well-supported programming language and can be interfaced to other languages/environments as necessary. > I agree that C++ is more portable, but are there examples of open tools > really disappearing? Take a look at sourceforge. While those projects haven't "disappeared", they're not exactly useful either. Am I really the only one who expresses the least bit of concern about writing a lot of Cython code? -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From matthew.brett at gmail.com Wed Apr 23 13:01:48 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 23 Apr 2008 17:01:48 +0000 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> Message-ID: <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> Hi, > Am I really the only one who expresses the least bit of concern about > writing a lot of Cython code? I imagine, like all discussions of this kind, it will be resolved by use. If cython isn't useful, we won't get a lot of cython code. If it is, that is very likely to mean that cython will reach a critical mass of interested developers - even if it has not already. Let's just see what happens. I really don't think there's a reason to discourage it at this point, and if it works, as several of us have guessed, then we have gained a great deal. 
Best, Matthew From stefan at sun.ac.za Wed Apr 23 13:11:10 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Wed, 23 Apr 2008 19:11:10 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> Message-ID: <9457e7c80804231011v64ffd8a0laa6810d2b51b68d7@mail.gmail.com> 2008/4/23 Nathan Bell : > I don't see how writing Cython is easier than writing C/C++/Fortran. > Clearly if you only know Python, then Cython is an easier language to > learn. However, I doubt the Cython abstraction is so complete that > one can avoid learning C. For one, memory management. Teaching someone to use the Python C API properly is a pain, which is why we often revert to tools like SWIG and Ctypes. Look at ndimage, or the numpy core -- why do so few people dare touch it? That code is hard to wrap your head around. If C code has a well designed API (in other words, if it is being used like FORTRAN), then sure -- it is easy to wrap in almost any language. The moment you start to implement anything more advanced than bit twiddling in pre-allocated memory, you're in for a long haul. > Clearly I think SciPy has a long life ahead of it. At the same time, > I spent effort ensuring that sparsetools knows absolutely nothing > about NumPy/SciPy. IMO the utility of a stand-alone library is > greater than a tighter coupling between sparsetools and Python. For > instance, I can now combine the C++ I've written for PyAMG with > sparsetools to make a lightweight algebraic multigrid code in pure > C++. I think you made the right decision for your specific application, but I don't think that is necessarily the route all our packages should take. Besides, most of the code we write is in Python, and cannot be used in non-Python libraries anyway. 
Regards Stéfan From nwagner at iam.uni-stuttgart.de Wed Apr 23 14:16:03 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 23 Apr 2008 20:16:03 +0200 Subject: [SciPy-dev] Nastran generated sparse matrices was Re: Eigensolver In-Reply-To: References: <85b5c3130804220307h5ac4a574s736c5b5138349f84@mail.gmail.com> Message-ID: On Tue, 22 Apr 2008 19:02:22 +0200 "Nils Wagner" wrote: > On Tue, 22 Apr 2008 12:07:14 +0200 > "Ondrej Certik" wrote: >> On Mon, Apr 21, 2008 at 10:02 PM, Nils Wagner >> wrote: >>> FWIW, there is another eigensolver for sparse matrices >>> available at >>> >>> http://www.cs.umd.edu/~rogerlee/ >> >> Indeed. Thanks for the tip. We really need to get all >>these solvers >> under one hood. I was just too busy lately, but I am >>sure either me, >> or someone else will finally get fed up and do it. :) >> >> Ondrej >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev > > Hi all, > > Is there a chance to add the following tool as a scikit >? > The software by Al Danial is released under GPL. > > http://cvs.savannah.gnu.org/viewvc/tops/tops/usr/extra/op4tools/ > > The purpose is to read and write Nastran op4 files. > > Nils Hi all, Who can help me writing an "interface" for these C routines? It should work like io.mmread('A.mtx'). Maybe io.nasread('A.op4') Any pointer would be appreciated. 
Thanks in advance Nils From wnbell at gmail.com Wed Apr 23 14:32:33 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 13:32:33 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> Message-ID: On Wed, Apr 23, 2008 at 12:01 PM, Matthew Brett wrote: > > I really don't think there's a reason to discourage it at this point, and if it works, > as several of us have guessed, then we have gained a great deal. Conversely, there's no reason to blindly encourage the use of an unproven technology. I really *would* like Cython to succeed, but I really *dislike* the approach of "we need to reimplement X in Cython, because Cython is amazing". I'm not arguing against the use of Cython, I'm only moderating what comes across as naive evangelism for a shiny new technology. For example, I have heard numerous statements about how incredibly slow SWIG wrappers are ("because the top layer is pure Python"). This may be true, on the other hand it may not be true. It may be true only for some applications and not for others. Ultimately, the burden of proof is on the Cython proponents to provide supporting evidence for their claims. Once we have this evidence we can make an informed decision about how to proceed. 
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From matthew.brett at gmail.com Wed Apr 23 14:54:18 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 23 Apr 2008 18:54:18 +0000 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> Message-ID: <1e2af89e0804231154m58637870jbc2c0787b488d51b@mail.gmail.com> Hi, > For example, I have heard numerous statements about how incredibly > slow SWIG wrappers are ("because the top layer is pure Python"). This > may be true, on the other hand it may not be true. It may be true > only for some applications and not for others. Ultimately, the burden > of proof is on the Cython proponents to provide supporting evidence > for their claims. Once we have this evidence we can make an informed > decision about how to proceed. But, it seems to me that no-one is going to use cython much if it's not obviously useful. It's the usual open-source idea of allowing people to try tools if they fit - actual contributions of code to answer the question. I don't think cython needs to make an argument, nor do I think it commands blind allegiance. It's just going to be one of those tools that looks very promising, and will show its worth if people find they like using it and move towards it. And if, in practice, it creates good maintainable, readable code. We'll know it's good, if that happens. We'll know it's not good, if it doesn't. And if it is good, it is very likely to survive and flourish. ctypes all over again, no? 
Best, Matthew From wnbell at gmail.com Wed Apr 23 15:44:38 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 14:44:38 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <1e2af89e0804231154m58637870jbc2c0787b488d51b@mail.gmail.com> References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> <1e2af89e0804231154m58637870jbc2c0787b488d51b@mail.gmail.com> Message-ID: On Wed, Apr 23, 2008 at 1:54 PM, Matthew Brett wrote: > > But, it seems to me that no-one is going to use cython much if it's > not obviously useful. It's the usual open-source idea of allowing > people to try tools if they fit The usual open-source approach is fueled by people who enjoy reinventing the wheel over, and over,... > I don't think cython needs to make an argument, It does if it wants to replace existing code > nor do I think it commands blind allegiance. I have a hard time reconciling this statement with the previous. > It's just going to be one of those tools that looks very promising, > and will show its worth if people find they like using it and move towards it. And if, in > practice, it creates good maintainable, readable code. We'll know > it's good, if that happens. We'll know it's not good, if it doesn't. > And if it is good, it is very likely to survive and flourish. And if it's a bad choice, can we blame you for deflecting valid criticism in favor of a wait-and-see approach? The question is not so much whether Cython should be used, but rather to what extent. I have expressed concern that writing hundreds of thousands of lines of Cython code (as SAGE has done) is potentially dangerous. On the other hand using Cython as a wrapper generator carries little risk and might represent an improvement over SWIG for instance. It's irresponsible to continue promoting an unproven technology like Cython without first addressing these concerns. 
Furthermore, it's profoundly ignorant and extremely offensive to argue for replacing existing implementations without first making a solid case about the deficiencies of the status quo. This sort of fanboyism causes people like me, who help maintain an existing part of SciPy, to treat Cython with skepticism. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Wed Apr 23 15:52:17 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 14:52:17 -0500 Subject: [SciPy-dev] [numscons] 0.6.3 release: building scipy with MS compilers works In-Reply-To: <480DAC2F.9010602@ar.media.kyoto-u.ac.jp> References: <480DAC2F.9010602@ar.media.kyoto-u.ac.jp> Message-ID: On Tue, Apr 22, 2008 at 4:13 AM, David Cournapeau wrote: > Hi, > > Sorry for announcing one more numscons release in such a short time: > this release can finally handle the last common platform, Visual Studio > on win32. Win32 installers and source tarballs can be found on > launchpad, as usual: > > https://code.launchpad.net/numpy.scons.support/0.6/0.6.3 > > Python eggs on pypi should follow soon (I cannot upload to pypi from my > lab, unfortunately). > > Except the various fixes necessary to make the build work with MS > compilers, there were a few improvements: > - some more improvements for f2py scons tool, which simplified a bit > some scipy scons scripts > - the addition of a silent mode, to get more terse command lines > output (not perfect yet), ala kbuild for people familiar with compiling > linux (e.g. you get CC foo.c instead of gcc -W -Wall blas blas blas). > The goal is to make warnings more obvious (and to fix them at some point, > of course :) ). Just use python setupscons.py scons --silent=N with N > between 1 and 3 to see the result. > > As this was the last major platform I wanted to support, I can now move > on polishing the API as well as starting a real documentation for > package developers. 
> > cheers, > > David > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Wed Apr 23 15:57:22 2008 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 23 Apr 2008 14:57:22 -0500 Subject: [SciPy-dev] [numscons] 0.6.3 release: building scipy with MS compilers works In-Reply-To: <480DAC2F.9010602@ar.media.kyoto-u.ac.jp> References: <480DAC2F.9010602@ar.media.kyoto-u.ac.jp> Message-ID: On Tue, Apr 22, 2008 at 4:13 AM, David Cournapeau wrote: > Hi, > > Sorry for announcing one more numscons release in such a short time: > this release can finally handle the last common platform, Visual Studio > on win32. Win32 installers and source tarballs can be found on > launchpad, as usual: > > https://code.launchpad.net/numpy.scons.support/0.6/0.6.3 David, does numscons facilitate the creation of binary installers for various platforms (e.g. .deb packages, Windows installers)? If so, is the process straightforward? I'm interested in producing binary packages for PyAMG[1], but so far I've been scared off by the complexity of the packaging process. [1] http://code.google.com/p/pyamg/ -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From matthew.brett at gmail.com Wed Apr 23 16:08:39 2008 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 23 Apr 2008 20:08:39 +0000 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> <1e2af89e0804231154m58637870jbc2c0787b488d51b@mail.gmail.com> Message-ID: <1e2af89e0804231308k20c72379w1597945dca727ff8@mail.gmail.com> Hi, > And if it's a bad choice, can we blame you for deflecting valid > criticism in favor of a wait-and-see approach? 
Of course! > It's irresponsible to continue promoting an unproven technology like > Cython without first addressing these concerns. But that's the point - I'm not promoting it - I mean - I think it's exciting stuff - but it doesn't matter what I think, what matters is whether people find use for it. It would be complete madness to make a set policy of rewriting everything in cython right now - of course - but that's not the suggestion - I don't think it's anyone's suggestion. It also seems severe to say that we're not going to accept any cython code until - well - until when? I'm sorry, really, if I'm being offensive, I don't mean to be - at all. I think Stefan's right - there will be times when a carefully designed C++ library with python wrappers is the right thing to do, and other times when dipping into C in an easy flexible way is the right thing to do. Don't you think? Best, Matthew From ondrej at certik.cz Wed Apr 23 16:21:31 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Wed, 23 Apr 2008 22:21:31 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> <1e2af89e0804231154m58637870jbc2c0787b488d51b@mail.gmail.com> Message-ID: <85b5c3130804231321x81c7d57hd27381567c0c7366@mail.gmail.com> > It's irresponsible to continue promoting an unproven technology like > Cython without first addressing these concerns. Furthermore, it's > profoundly ignorant and extremely offensive to argue for replacing > existing implementations without first making a solid case of the > deficiencies of the status quo. This sort of fanboyism causes people > like me, who help maintain an existing part of SciPy, to treat Cython > with skepticism. 
I think there's a little misunderstanding here -- while I advocate Cython and I for example would write the sparsetools in C (with macros) + Cython, it's the usual "talk is cheap, show me the code" and it's you who wrote it, so it's you who decide on the technologies used and I respect that (and I think others do too). If I wanted to change it, I'd rewrite the same thing using technologies that I like and then we'd see. Until then it's just talk. So I am sorry if it sounded like preaching, I only wanted to discuss things. Ondrej From stefan at sun.ac.za Wed Apr 23 16:51:02 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Wed, 23 Apr 2008 22:51:02 +0200 Subject: [SciPy-dev] Nastran generated sparse matrices was Re: Eigensolver In-Reply-To: References: <85b5c3130804220307h5ac4a574s736c5b5138349f84@mail.gmail.com> Message-ID: <9457e7c80804231351m5f213ff1n7941595becdeaecf@mail.gmail.com> 2008/4/23 Nils Wagner : > Who can help me writing an "interface" for these C > routines ? Is there a bounty involved? I think many people would help you, given a bit of incentive. Regards Stéfan From ellisonbg.net at gmail.com Wed Apr 23 17:16:22 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Wed, 23 Apr 2008 15:16:22 -0600 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <85b5c3130804230600i73fcb142x6f04222836ad3922@mail.gmail.com> <20080423152840.GA11364@encolpuis> <1e2af89e0804231001j7787774rbf5afd48d244f616@mail.gmail.com> Message-ID: <6ce0ac130804231416u64971890v71c25e60af99a2eb@mail.gmail.com> > > I really don't think there's a reason to discourage it at this point, and if it works, > > as several of us have guessed, then we have gained a great deal. > > Conversely, there's no reason to blindly encourage the use of an > unproven technology. Let's be clear on this. Many people consider Cython/Pyrex to be extremely well proven. 
Just ask the developers of any of the projects that have been using cython/pyrex for years. Cython may not have proven itself in your mind, but those of us who are advocating its use _do_ consider it to be a proven technology. Second, for many of us, our advocacy comes not out of blindness, but out of actually using cython for real projects and understanding its strengths and weaknesses. > really *dislike* the approach of "we need to reimplement X in Cython, > because Cython is amazing". I'm not arguing against the use of > Cython, I'm only moderating what comes across as naive evangelism for > a shiny new technology. > For example, I have heard numerous statements about how incredibly > slow SWIG wrappers are ("because the top layer is pure Python"). This > may be true, on the other hand it may not be true. It may be true > only for some applications and not for others. Ultimately, the burden > of proof is on the Cython proponents to provide supporting evidence > for their claims. Once we have this evidence we can make an informed > decision about how to proceed. > > -- > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From nwagner at iam.uni-stuttgart.de Wed Apr 23 17:33:45 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 23 Apr 2008 23:33:45 +0200 Subject: [SciPy-dev] Nastran generated sparse matrices was Re: Eigensolver In-Reply-To: <9457e7c80804231351m5f213ff1n7941595becdeaecf@mail.gmail.com> References: <85b5c3130804220307h5ac4a574s736c5b5138349f84@mail.gmail.com> <9457e7c80804231351m5f213ff1n7941595becdeaecf@mail.gmail.com> Message-ID: On Wed, 23 Apr 2008 22:51:02 +0200 "Stéfan van der Walt" wrote: > 2008/4/23 Nils Wagner : >> Who can help me writing an "interface" for these C >> routines ? > > Is there a bounty involved? What do you have in mind? 
I think many people would >help you, given > a bit of incentive. > Well, Nastran is an industry-leading, linear and nonlinear finite element analysis solver. Therefore it would be nice if system matrices generated by NASTRAN [1] could be imported by SciPy. [1] http://www.mscsoftware.com/assets/MSC_DS_MSCNastran.pdf There are other finite-element packages, e.g. PERMAS [2] which supports the output of system matrices in a user-friendly way :-) You can use K = coo_matrix((vals,(rows,cols))) [2] http://www.intes.de Abaqus 6.7 supports the output of system matrices, too. I am not familiar with Ansys [3] but I think there exists a way to export system matrices as well. [3] http://www.imtek.uni-freiburg.de/simulation/mor4ansys/ Nils From millman at berkeley.edu Wed Apr 23 19:56:38 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 23 Apr 2008 18:56:38 -0500 Subject: [SciPy-dev] scipy.interpolate missing functions In-Reply-To: References: <9457e7c80804210029j4aeec2es3f3580b891cd1a6e@mail.gmail.com> Message-ID: On Mon, Apr 21, 2008 at 9:59 PM, Anne Archibald wrote: > I am able to test scipy.interpolate, and with a minor fix, lagrange() > now passes its test. I have write access to numpy SVN, which > apparently gives me write access to scipy SVN too (though I haven't > tried committing anything). Should I just check in the fix, or file a > bug and attach a patch? Go ahead and just check-in the fix and tests. Your account should work, but feel free to let me know if it doesn't. 
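The coo_matrix((vals,(rows,cols))) recipe Nils mentions above is worth spelling out. The sketch below uses made-up triplet data standing in for an FEM export; note that scipy's COO constructor sums duplicate (row, col) entries, which matches the usual finite-element assembly convention:

```python
import numpy as np
from scipy.sparse import coo_matrix

# Hypothetical (row, col, value) triplets, as a solver like NASTRAN or
# PERMAS might export them; the actual numbers here are illustrative.
rows = np.array([0, 1, 1, 2, 2])
cols = np.array([0, 0, 1, 1, 2])
vals = np.array([4.0, -1.0, 4.0, -1.0, 4.0])

# Build the stiffness matrix in COO (triplet) form.
K = coo_matrix((vals, (rows, cols)), shape=(3, 3))

# Convert to CSR before handing the matrix to a solver or eigensolver;
# COO is a good assembly format but CSR is better for arithmetic.
K_csr = K.tocsr()
print(K_csr.toarray())
```

The same two lines work unchanged for matrices with millions of entries, since only the nonzeros are stored.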
Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From peridot.faceted at gmail.com Wed Apr 23 20:22:54 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Wed, 23 Apr 2008 20:22:54 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> Message-ID: My goodness, I had no intention of touching off such a discussion. On 23/04/2008, Nathan Bell wrote: > I don't see how writing Cython is easier than writing C/C++/Fortran. > Clearly if you only know Python, then Cython is an easier language to > learn. However, I doubt the Cython abstraction is so complete that > one can avoid learning C. Here's a concrete example. I recently wrote code to do Lagrange polynomial interpolation using barycentric interpolation (based on Berrut and Trefethen 2004). It's relatively simple code, but it accomplishes a delicate numerical task. I wrote the initial version in python, along with a test suite. It works fine, and is quite stable numerically. But the whole point of interpolation is that it be fast to evaluate, so I thought about how to make it run faster. Since cython was an active topic on the list these days, I thought I'd give it a try, in spite of never having looked into it before. Simply running cython on the file required me to rewrite a couple of in-place operations, but the nearly unmodified file ran fine with no speedups. Annotating the evaluation routine and rewriting the loops took maybe an hour, and got me a fourfold speedup. Annotating the whole class took another couple of hours (remember I'd never done this before), and got me a significant speedup in the construction (my tests report a twenty-fold improvement but I don't much believe that number). Writing cython to do this is easier than writing C to do this. You do not need to know C to do this. 
There are a number of shortcomings in cython - dealing with numpy arrays could be made vastly easier by one minor change (make A[i] become ((A.data))[i*A.strides[0]] when A is an ndarray and i is an integer), and documentation is currently sparse, but having tried writing C code to interface with python, I'll take this. I'll even take it over writing raw C. It would also be extremely straightforward to turn it back into python if cython were suddenly wiped from the face of the planet. Whether we want to allow it into scipy is a different question, but I thought a specific example would help focus the discussion. Thus find attached polyint.py, cpolyint.pyx, and test_cpolyint.py. Anne -------------- next part -------------- A non-text attachment was scrubbed... Name: polyint.py Type: text/x-python Size: 3626 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: cpolyint.pyx Type: application/octet-stream Size: 6001 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: test_cpolyint.py Type: text/x-python Size: 2056 bytes Desc: not available URL: From ondrej at certik.cz Wed Apr 23 20:55:16 2008 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 24 Apr 2008 02:55:16 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> Message-ID: <85b5c3130804231755m62f42318nfa096c6ecb3a7ab@mail.gmail.com> On Thu, Apr 24, 2008 at 2:22 AM, Anne Archibald wrote: > My goodness, I had no intention of touching off such a discussion. > > > On 23/04/2008, Nathan Bell wrote: > > > I don't see how writing Cython is easier than writing C/C++/Fortran. > > Clearly if you only know Python, then Cython is an easier language to > > learn. However, I doubt the Cython abstraction is so complete that > > one can avoid learning C. > > Here's a concrete example. 
> > I recently wrote code to do Lagrange polynomial interpolation using > barycentric interpolation (based on Berrut and Trefethen 2004). It's > relatively simple code, but it accomplishes a delicate numerical task. > > I wrote the initial version in python, along with a test suite. It > works fine, and is quite stable numerically. But the whole point of > interpolation is that it be fast to evaluate, so I thought about how > to make it run faster. Since cython was an active topic on the list > these days, I thought I'd give it a try, in spite of never having > looked into it before. > > Simply running cython on the file required me to rewrite a couple of > in-place operations, but the nearly unmodified file ran fine with no > speedups. Annotating the evaluation routine and rewriting the loops > took maybe an hour, and got me a fourfold speedup. Annotating the > whole class took another couple of hours (remember I'd never done this > before), and got me a significant speedup in the construction (my > tests report a twenty-fold improvement but I don't much believe that > number). > > Writing cython to do this is easier than writing C to do this. You do > not need to know C to do this. > > There are a number of shortcomings in cython - dealing with numpy > arrays could be made vastly easier by one minor change (make A[i] > become ((A.data))[i*A.strides[0]] when A is an ndarray and i > is an integer), and documentation is currently sparse, but having > tried writing C code to interface with python, I'll take this. I'll > even take it over writing raw C. It would also be extremely > straightforward to turn it back into python if cython were suddenly > wiped from the face of the planet. > > Whether we want to allow it into scipy is a different question, but I > thought a specific example would help focus the discussion. Thus find > attached polyint.py, cpolyint.pyx, and test_cpolyint.py. Thanks for the example, looks good. 
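The barycentric scheme Anne describes is compact enough to sketch in pure NumPy. The following is an illustrative reimplementation of the "second" barycentric formula from Berrut and Trefethen (2004), not her attached cpolyint.pyx; the function names are made up:

```python
import numpy as np

def bary_weights(x):
    """Barycentric weights w_j = 1 / prod_{k != j} (x_j - x_k)."""
    x = np.asarray(x, dtype=float)
    diff = x[:, None] - x[None, :]
    np.fill_diagonal(diff, 1.0)   # avoid the j == k term
    return 1.0 / diff.prod(axis=1)

def bary_eval(x, y, w, xnew):
    """Evaluate the interpolant through (x, y) at the points xnew."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xnew = np.atleast_1d(np.asarray(xnew, dtype=float))
    d = xnew[:, None] - x[None, :]
    exact = d == 0.0              # evaluation point coincides with a node
    d[exact] = 1.0                # dummy value; fixed up below
    c = w / d
    p = (c * y).sum(axis=1) / c.sum(axis=1)
    # At the nodes themselves, return the tabulated value directly,
    # sidestepping the 0/0 in the formula.
    i, j = np.nonzero(exact)
    p[i] = y[j]
    return p
```

Since the interpolant through n nodes reproduces any polynomial of degree below n exactly (up to rounding), a cubic sampled at six points is a convenient self-test.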
One comment: http://wiki.cython.org/DifferencesFromPyrex#head-08526c363e41a3ef37e49ee90449e9a134ebc393 you can leave the range() calls in there (maybe you'll have to change xrange to range), but it will produce the same C code as your version. Ondrej From david at ar.media.kyoto-u.ac.jp Wed Apr 23 22:34:29 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Thu, 24 Apr 2008 11:34:29 +0900 Subject: [SciPy-dev] [numscons] 0.6.3 release: building scipy with MS compilers works In-Reply-To: References: <480DAC2F.9010602@ar.media.kyoto-u.ac.jp> Message-ID: <480FF1B5.3040700@ar.media.kyoto-u.ac.jp> Nathan Bell wrote: > > David, does numscons facilitate the creation of binary installers for > various platforms No. Numscons only aims a better control of the build part. The installation/packaging/deployement is still handled by distutils/setuptools. cheers, David From prabhu at aero.iitb.ac.in Thu Apr 24 00:27:18 2008 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Thu, 24 Apr 2008 09:57:18 +0530 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> Message-ID: <48100C26.2000001@aero.iitb.ac.in> Brian Granger wrote: > The big difference though between cython and swig is that the top > level layer in swig is slow pure python code. In cython, you get fast > C/python extension types that can be called from other C/C++ extension > code without going through a slow python layer. That is a huge > difference. Then why not start a project to get SWIG to do away with the shadow module? The biggest problem IMHO with pyrex (and perhaps cython, which I've only glanced at) is that it isn't straight C or C++ and neither is it pure Python and you are unsure what the right incantation is that is supposed to work most efficiently. 
From what I've seen, most people seem to like to interface straightforward code with the lower level tools like pyrex and cython. I've never seen (please show me an example I can look at) a more complex case where you have complicated data structures (say a large mutable container) rather than simple elementary data types that you need to manipulate. If you need to do this, you always need to go back to the traditional low level languages like C or C++ which SWIG wraps out of the box for the most part. Lets take a simple case of someone wanting to handle a growing collection of say a million particles and do something to them. How do you do that in cython/pyrex and get the performance of C and interface to numpy? Worse, even if it were possible, you'll still need to know something about allocating memory in C and manipulating pointers. I can do that with C++ and SWIG today. cheers, prabhu From stefan at sun.ac.za Thu Apr 24 06:39:17 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 24 Apr 2008 12:39:17 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <48100C26.2000001@aero.iitb.ac.in> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> Message-ID: <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> 2008/4/24 Prabhu Ramachandran : > Lets take a simple case of someone wanting to handle a growing > collection of say a million particles and do something to them. How do > you do that in cython/pyrex and get the performance of C and interface > to numpy? Worse, even if it were possible, you'll still need to know > something about allocating memory in C and manipulating pointers. I can > do that with C++ and SWIG today. That's the point: you, being a well-established programmer can do it easily, but most Python programmers would struggle doing that through some C or C++ API. I think this would be pretty easy to do in Cython: 1. 
Write a function, say create_workspace(nr_elements), that creates a new ndarray and returns it: cdef ndarray results_arr = np.empty((nr_elements,), dtype=np.double) 2. Grab a pointer to the memory (this should become a lot easier after GSOC 2008): cdef double* results = results_arr.data 3. Run your loop in which you produce data points. The moment you have more results than the output array can hold, call create_workspace(current_size**2), and use normal numpy indexing to copy the old results to the new location: new_results_arr[:current_size] = old_results_arr 4. Rinse and repeat The beauty of the Cython approach is that you a) Never have to worry about INCREF and DECREF b) Can use Python calls within C functions. You don't want to do that in your fast inner loop, but take the example above: we only copy arrays infrequently, and then we'd like to have the full power of numpy indexing. Suddenly, sorting, averaging, summing becomes a one-liner, just like in Python, at the expense of one Python call (and this won't affect execution time in the above example). c) Debug in a much cleaner way than C++ or C code: fewer memory leaks, introspection of source etc. Cheers St?fan From stefan at sun.ac.za Thu Apr 24 06:43:42 2008 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 24 Apr 2008 12:43:42 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> Message-ID: <9457e7c80804240343q48a91db6rfe27c835d4fdf472@mail.gmail.com> 2008/4/24 Anne Archibald : > Whether we want to allow it into scipy is a different question, but I > thought a specific example would help focus the discussion. Thus find > attached polyint.py, cpolyint.pyx, and test_cpolyint.py. Cython (Pyrex) is already in NumPy, so unless someone proposed *removing* that code, this is a moot point. Anne, would you start looking at integrating your code with scipy.interpolate? 
This would be a valuable addition for 0.7. Thanks Stéfan From ndbecker2 at gmail.com Thu Apr 24 07:19:30 2008 From: ndbecker2 at gmail.com (Neal Becker) Date: Thu, 24 Apr 2008 07:19:30 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> Message-ID: There are clearly 2 different use cases for cython discussed here. 1) Convert python code to cython 2) Use cython to wrap existing c/c++ code. They have rather different requirements and usage. At this point, there seems to be more documentation for 1) than for 2). From faltet at carabos.com Thu Apr 24 07:42:14 2008 From: faltet at carabos.com (Francesc Altet) Date: Thu, 24 Apr 2008 13:42:14 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> References: <48100C26.2000001@aero.iitb.ac.in> <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> Message-ID: <200804241342.14671.faltet@carabos.com> A Thursday 24 April 2008, Stéfan van der Walt escrigué: > 2008/4/24 Prabhu Ramachandran : > > Lets take a simple case of someone wanting to handle a growing > > collection of say a million particles and do something to them. > > How do you do that in cython/pyrex and get the performance of C and > > interface to numpy? Worse, even if it were possible, you'll still > > need to know something about allocating memory in C and > > manipulating pointers. I can do that with C++ and SWIG today. > > That's the point: you, being a well-established programmer can do it > easily, but most Python programmers would struggle doing that through > some C or C++ API. I think this would be pretty easy to do in > Cython: > > 1. Write a function, say create_workspace(nr_elements), that creates > a new ndarray and returns it: > > cdef ndarray results_arr = np.empty((nr_elements,), > dtype=np.double) > > 2.
Grab a pointer to the memory (this should become a lot easier > after GSOC 2008): > > cdef double* results = results_arr.data > > 3. Run your loop in which you produce data points. The moment you > have more results than > the output array can hold, call create_workspace(current_size**2), > and use normal numpy indexing to copy the old results to the new > location: > > new_results_arr[:current_size] = old_results_arr > > 4. Rinse and repeat > > The beauty of the Cython approach is that you > > a) Never have to worry about INCREF and DECREF > > b) Can use Python calls within C functions. You don't want to do > that in your fast inner loop, but take the example above: we only > copy arrays infrequently, and then we'd like to have the full power > of numpy indexing. Suddenly, sorting, averaging, summing becomes a > one-liner, just like in Python, at the expense of one Python call > (and this won't affect execution time in the above example). > > c) Debug in a much cleaner way than C++ or C code: fewer memory > leaks, introspection of source etc. St?fan has shown excellent points about Pyrex/Cython. Let me just add that if you start to have a large library of extensions, you can also avoid the cost of Python calls if what you want is to use one extension method from another extension method. For example, when I know that a method is going to be public, I'm very used to declare two versions: one that is callable directly from another extension (i.e. without the Python call cost) and another that is callable from Python. So, in the code: def getitem(self, long nslot, ndarray nparr, long start): self.getitem_(nslot, nparr.data, start) cdef getitem_(self, long nslot, void *data, long start): cdef void *cachedata cachedata = self.getitem1_(nslot) # Copy the data in cache to destination memcpy(data + start * self.itemsize, cachedata, self.slotsize * self.itemsize) calling MyClass.getitem_() from another extension will save you the Python call. 
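Stripped of the Cython specifics, Stéfan's grow-the-workspace recipe quoted above is the standard amortized-doubling append. A plain-NumPy sketch (class and method names are illustrative):

```python
import numpy as np

class GrowingArray:
    """Append doubles one at a time, doubling the workspace when full."""

    def __init__(self, capacity=16):
        self._data = np.empty(capacity, dtype=np.double)
        self._n = 0

    def append(self, value):
        if self._n == len(self._data):
            # Workspace is full: allocate a bigger one and copy, as in
            # step 3 of the recipe above.  Geometric growth keeps the
            # amortized cost of an append O(1).
            new_data = np.empty(2 * len(self._data), dtype=np.double)
            new_data[:self._n] = self._data
            self._data = new_data
        self._data[self._n] = value
        self._n += 1

    def result(self):
        # Return only the filled portion of the workspace.
        return self._data[:self._n].copy()
```

In Cython the append body would become a cdef method writing through a raw double pointer, but the growth logic is identical.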
This is not really important on most occasions, but it can certainly be in others. My two cents, -- >0,0< Francesc Altet http://www.carabos.com/ V V Cárabos Coop. V. Enjoy Data "-" From prabhu at aero.iitb.ac.in Thu Apr 24 08:09:50 2008 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Thu, 24 Apr 2008 17:39:50 +0530 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> Message-ID: <4810788E.9010003@aero.iitb.ac.in> Hi Stéfan, Stéfan van der Walt wrote: > 2008/4/24 Prabhu Ramachandran : >> Lets take a simple case of someone wanting to handle a growing >> collection of say a million particles and do something to them. How do >> you do that in cython/pyrex and get the performance of C and interface >> to numpy? Worse, even if it were possible, you'll still need to know >> something about allocating memory in C and manipulating pointers. I can >> do that with C++ and SWIG today. > > That's the point: you, being a well-established programmer can do it > easily, but most Python programmers would struggle doing that through > some C or C++ API. I think this would be pretty easy to do in Cython: > 1. Write a function, say create_workspace(nr_elements), that creates a > new ndarray and returns it: > > cdef ndarray results_arr = np.empty((nr_elements,), dtype=np.double) This is not what I want and precisely the point. I don't want an array of doubles. I want an array of objects (particles). I am not talking about manipulating an array of numbers -- I can do that just fine, thanks. > 3. Run your loop in which you produce data points.
The moment you > have more results than > the output array can hold, call create_workspace(current_size**2), and > use normal numpy indexing to copy the old results to the new location: > > new_results_arr[:current_size] = old_results_arr This just shows how bad things are w.r.t. basic data types that you expect when you program at any lower level. Is there a fast link list? What about maps(dicts)? What about tree structures containing arbitrary data structures. Sure, they can be done but you need a one to one (or at least something close) mapping between commonly used data types and those we are familiar with in Python. All you seem to get with cython/pyrex is arrays and you have to implement everything else. Its almost like having to reinvent a full fledged language. Once again, I urge you to look beyond the simple functions that manipulate arrays of numbers to something more realistic (at least for me). To this end I'll write something up that explains what I am talking about with real code and show you comparisons with different approaches. I'll try and do it this weekend. > The beauty of the Cython approach is that you > > a) Never have to worry about INCREF and DECREF I don't have to with swig or weave for that matter and have one less language to worry about. > c) Debug in a much cleaner way than C++ or C code: fewer memory leaks, > introspection of source etc. I'm afraid I can't buy this at all. Good test-driven programming practice makes debugging easier, but when it comes down to it, cython or otherwise you are just going to have to roll up your sleeves and debug C anyway. Lets face it, C is underneath all of this and if something goes wrong at that level you need to know how it works to debug at all. 
cheers, prabhu From matthieu.brucher at gmail.com Thu Apr 24 09:50:28 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 24 Apr 2008 15:50:28 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <4810788E.9010003@aero.iitb.ac.in> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> <4810788E.9010003@aero.iitb.ac.in> Message-ID: > > I'm afraid I can't buy this at all. Good test-driven programming > practice makes debugging easier, but when it comes down to it, cython or > otherwise you are just going to have to roll up your sleeves and debug C > anyway. Lets face it, C is underneath all of this and if something goes > wrong at that level you need to know how it works to debug at all. > What is cool with Cython is that you can use the Python debugger to start the debugging process, where a lot can be done. Indeed, Cython gives back the Cython code to the Python traceback, and this is great. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Thu Apr 24 09:57:03 2008 From: oliphant at enthought.com (Travis E. 
Oliphant) Date: Thu, 24 Apr 2008 08:57:03 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> Message-ID: <481091AF.6030806@enthought.com> St?fan van der Walt wrote: > 2008/4/24 Prabhu Ramachandran : > >> Lets take a simple case of someone wanting to handle a growing >> collection of say a million particles and do something to them. How do >> you do that in cython/pyrex and get the performance of C and interface >> to numpy? Worse, even if it were possible, you'll still need to know >> something about allocating memory in C and manipulating pointers. I can >> do that with C++ and SWIG today. >> > > Thanks for this example Stefan. It helps to clarify the situation. I don't have strong feelings right now either way and so I'm staying out of the discussion. However, there are a few points I should respond to. > The beauty of the Cython approach is that you > > a) Never have to worry about INCREF and DECREF > This isn't quite true. There are some NumPy C-API calls that Pyrex does not get right (e.g. recognizing that nearly all calls taking a PyArray_Descr object steal the reference to it), and so you have to manage it yourself. This could perhaps be fixed in Cython > b) Can use Python calls within C functions. You don't want to do that > in your fast inner loop, but take the example above: we only copy > arrays infrequently, and then we'd like to have the full power of > numpy indexing. Suddenly, sorting, averaging, summing becomes a > one-liner, just like in Python, at the expense of one Python call (and > this won't affect execution time in the above example). > > c) Debug in a much cleaner way than C++ or C code: fewer memory leaks, > introspection of source etc. > I've also found this to be not quite true, either. 
Every time I've done something serious with Pyrex (Cython), I've had to look at the produced C-code to "get it right" and then it's only because I know the Python C-API and understand reference counting that I can "get it right." I'm not sure how debugging is easier, especially if the bug is a misunderstanding of how the code-generator actually works. Perhaps the code-generator is improving and will get to the point where that is not true. There are a lot of nice things about Cython, however, lest you think I'm against what is being suggested. I'm not. I generally like the direction it is heading. For example, the instrospection of source is useful, however (the way you can look at the Cython source code when getting help on a function). I would prefer it, however, if the ability to compile Python were done without changing the language. It looks like Cython is moving in that direction, but it is not quite there yet. My ideal world would be the ability to emit machine instructions directly at various points in the Python code by just writing a restricted set of Python syntax. Perhaps this could be seamless. Then, it would be much easier to just write Python code and not worry about it being "too slow" during critical loops. -Travis From cookedm at physics.mcmaster.ca Thu Apr 24 11:43:22 2008 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Thu, 24 Apr 2008 11:43:22 -0400 Subject: [SciPy-dev] Scipy tests can't be run when scipy installed as an egg In-Reply-To: <1e2af89e0804230048h67db184agdc915f7813b34e92@mail.gmail.com> References: <6D17D64C-E4EE-491B-8557-32E7621DBFA7@physics.mcmaster.ca> <1e2af89e0804230048h67db184agdc915f7813b34e92@mail.gmail.com> Message-ID: <927793EA-7001-4C38-B2C5-0C9910749116@physics.mcmaster.ca> On Apr 23, 2008, at 03:48 , Matthew Brett wrote: > Hi, > > On Thu, Apr 17, 2008 at 10:02 PM, David M. 
Cooke > wrote: >> I usually install scipy as an egg, and I noticed[1] that you can't >> run >> the test suite when installed that way: > > Thanks for tracking this down. As you'd expect, this is a nose > testing problem. It's known to the developers: > > http://code.google.com/p/python-nose/issues/detail?id=78 > > I'll let them know that this is a problem for us... That bug relates to testing with zipped eggs; scipy installs as an unzipped egg (and the same error with testing occurs when installed with easy_install --always-unzip), so it's not the same thing. (Although it may stem from the same cause.) -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ellisonbg.net at gmail.com Thu Apr 24 11:44:49 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 24 Apr 2008 09:44:49 -0600 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <48100C26.2000001@aero.iitb.ac.in> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> Message-ID: <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> > Then why not start a project to get SWIG to do away with the shadow module? I don't have a passion for actually working seriously on these projects - I am lazy and simply want to use them :) > Lets take a simple case of someone wanting to handle a growing > collection of say a million particles and do something to them. How do > you do that in cython/pyrex and get the performance of C and interface > to numpy? Worse, even if it were possible, you'll still need to know > something about allocating memory in C and manipulating pointers. I can > do that with C++ and SWIG today. C++ STL containers are truly great for things like this. I would write a simple c++ header file that defines the Particle class, create an std::vector to hold them and wrap the whole thing into Cython. 
I have already done stuff like this (templated c++ with STL) with cython and it works great. Furthermore, you end up with an actual C/Python extension type that 1) is super fast 2) has its own C api that can be called from other C extensions. Sure SWIG can do this too. If you don't mind the extra pure python layer that SWIG has, then it is really a matter of personal preference and experience. But, if performance is really important and you don't want the extra python layer, SWIG doesn't cut it. The one place where Cython does not do well with C++ code currently is if you have a very large C++ API (100s of classes). Then SWIG is the way to go. > cheers, > prabhu > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From ndbecker2 at gmail.com Thu Apr 24 11:59:30 2008 From: ndbecker2 at gmail.com (Neal Becker) Date: Thu, 24 Apr 2008 11:59:30 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> Message-ID: Brian Granger wrote: >> Then why not start a project to get SWIG to do away with the shadow >> module? > > I don't have a passion for actually working seriously on these > projects - I am lazy and simply want to use them :) > >> Lets take a simple case of someone wanting to handle a growing >> collection of say a million particles and do something to them. How do >> you do that in cython/pyrex and get the performance of C and interface >> to numpy? Worse, even if it were possible, you'll still need to know >> something about allocating memory in C and manipulating pointers. I can >> do that with C++ and SWIG today. > > C++ STL containers are truly great for things like this. 
I would > write a simple c++ header file that defines the Particle class, create > an std::vector to hold them and wrap the whole thing into > Cython. I have already done stuff like this (templated c++ with STL) > with cython and it works great. Furthermore, you end up with an > actual C/Python extension type that 1) is super fast 2) has its own C > api that can be called from other C extensions. > Could you point me to a simple example of this, please? From ellisonbg.net at gmail.com Thu Apr 24 12:19:01 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 24 Apr 2008 10:19:01 -0600 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> Message-ID: <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> On Thu, Apr 24, 2008 at 9:59 AM, Neal Becker wrote: > Brian Granger wrote: > > >> Then why not start a project to get SWIG to do away with the shadow > >> module? > > > > I don't have a passion for actually working seriously on these > > projects - I am lazy and simply want to use them :) > > > >> Lets take a simple case of someone wanting to handle a growing > >> collection of say a million particles and do something to them. How do > >> you do that in cython/pyrex and get the performance of C and interface > >> to numpy? Worse, even if it were possible, you'll still need to know > >> something about allocating memory in C and manipulating pointers. I can > >> do that with C++ and SWIG today. > > > > C++ STL containers are truly great for things like this. I would > > write a simple c++ header file that defines the Particle class, create > > an std::vector to hold them and wrap the whole thing into > > Cython. I have already done stuff like this (templated c++ with STL) > > with cython and it works great. 
Furthermore, you end up with an > > actual C/Python extension type that 1) is super fast 2) has its own C > > api that can be called from other C extensions. > > Unfortunately, the context I have done this in is proprietary code for my employer. But, I will try to put together a simple example this weekend and post it to launchpad. The documentation on these capabilities of Cython isn't great right now for sure, so some example code would sure help. From faltet at carabos.com Thu Apr 24 12:25:00 2008 From: faltet at carabos.com (Francesc Altet) Date: Thu, 24 Apr 2008 18:25:00 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> References: <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> Message-ID: <200804241825.00626.faltet@carabos.com> A Thursday 24 April 2008, Brian Granger escrigu?: > > Then why not start a project to get SWIG to do away with the shadow > > module? > > I don't have a passion for actually working seriously on these > projects - I am lazy and simply want to use them :) > > > Lets take a simple case of someone wanting to handle a growing > > collection of say a million particles and do something to them. > > How do you do that in cython/pyrex and get the performance of C and > > interface to numpy? Worse, even if it were possible, you'll still > > need to know something about allocating memory in C and > > manipulating pointers. I can do that with C++ and SWIG today. > > C++ STL containers are truly great for things like this. I would > write a simple c++ header file that defines the Particle class, > create an std::vector to hold them and wrap the whole thing > into Cython. I have already done stuff like this (templated c++ with > STL) with cython and it works great. Furthermore, you end up with an > actual C/Python extension type that 1) is super fast 2) has its own C > api that can be called from other C extensions. 
> > Sure SWIG can do this too. If you don't mind the extra pure python > layer that SWIG has, then it is really a matter of personal > preference and experience. But, if performance is really important > and you don't want the extra python layer, SWIG doesn't cut it. The > one place where Cython does not do well with C++ code currently is if > you have a very large C++ API (100s of classes). Then SWIG is the > way to go. Could you explain in a bit more detail why Cython would not be adequate for wrapping 100s of C++ classes? Just curious. Thanks, -- >0,0< Francesc Altet http://www.carabos.com/ V V Cárabos Coop. V. Enjoy Data "-" From ellisonbg.net at gmail.com Thu Apr 24 12:32:56 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 24 Apr 2008 10:32:56 -0600 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <200804241825.00626.faltet@carabos.com> References: <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <200804241825.00626.faltet@carabos.com> Message-ID: <6ce0ac130804240932w3dab35dbw8e2ea82cf1fe23f3@mail.gmail.com> On Thu, Apr 24, 2008 at 10:25 AM, Francesc Altet wrote: > On Thursday 24 April 2008, Brian Granger wrote: > > > > Then why not start a project to get SWIG to do away with the shadow > > > module? > > > > I don't have a passion for actually working seriously on these > > projects - I am lazy and simply want to use them :) > > > > > Let's take a simple case of someone wanting to handle a growing > > > collection of say a million particles and do something to them. > > > How do you do that in cython/pyrex and get the performance of C and > > > interface to numpy? Worse, even if it were possible, you'll still > > > need to know something about allocating memory in C and > > > manipulating pointers. I can do that with C++ and SWIG today. > > > > C++ STL containers are truly great for things like this.
I would > > write a simple c++ header file that defines the Particle class, > > create an std::vector<Particle> to hold them and wrap the whole thing > > into Cython. I have already done stuff like this (templated c++ with > > STL) with cython and it works great. Furthermore, you end up with an > > actual C/Python extension type that 1) is super fast 2) has its own C > > api that can be called from other C extensions. > > > > Sure SWIG can do this too. If you don't mind the extra pure python > > layer that SWIG has, then it is really a matter of personal > > preference and experience. But, if performance is really important > > and you don't want the extra python layer, SWIG doesn't cut it. The > > one place where Cython does not do well with C++ code currently is if > > you have a very large C++ API (100s of classes). Then SWIG is the > > way to go. > > Could you explain in a bit more detail why Cython would not be adequate > for wrapping 100s of C++ classes? Just curious. Sure. With Cython, you have to wrap each C++ class by hand. It is not a lot of coding, but to do it well, it does take time. Whereas for SWIG, a lot of that can be automated with a good .i file. To make matters worse, because Cython does not have _native_ support for generating template instances, you currently have to wrap each template instance by hand. If you have classes that are templated over multiple things, each of which is templated over multiple things itself, you end up having an exponential explosion in the number of hand-wrapped classes you need to handle. For some of this, simple code generation techniques would work fine, but for other things, it would be quite painful. It really depends on how patient you are :) Brian > Thanks, > > > -- > >0,0< Francesc Altet http://www.carabos.com/ > V V Cárabos Coop. V.
Enjoy Data > "-" > _______________________________________________ > > > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From prabhu at aero.iitb.ac.in Thu Apr 24 12:38:03 2008 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Thu, 24 Apr 2008 22:08:03 +0530 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> Message-ID: <4810B76B.6000204@aero.iitb.ac.in> Brian Granger wrote: >> > C++ STL containers are truly great for things like this. I would >> > write a simple c++ header file that defines the Particle class, create >> > an std::vector<Particle> to hold them and wrap the whole thing into >> > Cython. I have already done stuff like this (templated c++ with STL) >> > with cython and it works great. Furthermore, you end up with an >> > actual C/Python extension type that 1) is super fast 2) has its own C >> > api that can be called from other C extensions. For some strange reason I still haven't gotten this email, but Brian, that is the point I was making: STL has all the needed container classes for this, and this is precisely what I use. I just use SWIG to wrap this C++ code and it works great. Much of the STL is already wrapped via SWIG, and with numpy.i it interfaces quite well with numpy; in addition, you can inject doxygen-generated documentation almost automatically, so your generated wrapper code is fully documented. I've also explored using D and PyD and it makes for a fantastic combination. D seems to be (gdc) about 1.7 times slower than C++ but is really nice to work with. PyD's support for numpy is lacking but could be fixed.
The point is (and what I understand of what Nathan was saying), pyrex and cython let you wrap the code to an extent but do not provide the underlying solution. People were arguing that Cython may be used in place of C, and I think our point is that it isn't there yet, and for people used to C or C++ it is much easier to just use those instead. > Unfortunately, the context I have done this in is proprietary code for > my employer. But, I will try to put together a simple example this > weekend and post it to launchpad. The documentation on these > capabilities of Cython isn't great right now for sure, so some example > code would sure help. So why don't we do this together? I'll put together my tiny example of a shootout with Python, C++ and D and let's see how they stack up. If you would be so kind, you can add the cython/pyrex parts to this. Thanks. cheers, prabhu From ndbecker2 at gmail.com Thu Apr 24 12:47:55 2008 From: ndbecker2 at gmail.com (Neal Becker) Date: Thu, 24 Apr 2008 12:47:55 -0400 Subject: [SciPy-dev] Inclusion of cython code in scipy References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> Message-ID: Prabhu Ramachandran wrote: > Brian Granger wrote: >>> > C++ STL containers are truly great for things like this. I would >>> > write a simple c++ header file that defines the Particle class, >>> > create an std::vector to hold them and wrap the whole thing >>> > into >>> > Cython. I have already done stuff like this (templated c++ with STL) >>> > with cython and it works great. Furthermore, you end up with an >>> > actual C/Python extension type that 1) is super fast 2) has its own C >>> > api that can be called from other C extensions.
> > For some strange reason I still haven't gotten this email but Brian, > that is the point I was making, STL has all the needed container classes > for this and this is precisely what I use. I just use SWIG to wrap this > C++ code and it works great. Much of the STL is already wrapped via > SWIG and with numpy.i it interfaces quite well with numpy, in addition > you can inject doxygen generated documentation almost automatically so > your generated wrapper code is fully documented. > > I've also explored using D and PyD and it makes for a fantastic > combination. D seems to be (gdc) about 1.7 times slower than C++ but is > really nice to work with. PyD's support for numpy is lacking but could > be fixed. > > The point is (and what I understand of what Nathan was saying), pyrex > and cython let you wrap the code to an extent but do not provide the > underlying solution. People were arging that Cython may be used in > place of C, and I think our point is that it isn't there yet and for > people used to C or C++ it is much easier to just use those instead. > >> Unfortunately, the context I have done this in is proprietary code for >> my employer. But, I will try to put together a simple example this >> weekend and post it to launchpad. The documentation on these >> capabilities of Cython isn't great right now for sure, so some example >> code would sure help. > > So why don't we do this together. I'll put together my tiny example of > a shootout with Python C++ and D and lets see how they stack up. If you > would be so kind you can add the cython/pyrex parts to this. Thanks. > Is pyd still actively developed? Seems to be rather quiet for a while. 
From ellisonbg.net at gmail.com Thu Apr 24 13:28:17 2008 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 24 Apr 2008 11:28:17 -0600 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <4810B76B.6000204@aero.iitb.ac.in> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> Message-ID: <6ce0ac130804241028n54dbbad9n41dc2e3a65d1b99a@mail.gmail.com> On Thu, Apr 24, 2008 at 10:38 AM, Prabhu Ramachandran wrote: > Brian Granger wrote: > >> > C++ STL containers are truly great for things like this. I would > >> > write a simple c++ header file that defines the Particle class, create > >> > an std::vector to hold them and wrap the whole thing into > >> > Cython. I have already done stuff like this (templated c++ with STL) > >> > with cython and it works great. Furthermore, you end up with an > >> > actual C/Python extension type that 1) is super fast 2) has its own C > >> > api that can be called from other C extensions. > > For some strange reason I still haven't gotten this email but Brian, > that is the point I was making, STL has all the needed container classes > for this and this is precisely what I use. I just use SWIG to wrap this > C++ code and it works great. Much of the STL is already wrapped via > SWIG and with numpy.i it interfaces quite well with numpy, in addition > you can inject doxygen generated documentation almost automatically so > your generated wrapper code is fully documented. > > I've also explored using D and PyD and it makes for a fantastic > combination. D seems to be (gdc) about 1.7 times slower than C++ but is > really nice to work with. PyD's support for numpy is lacking but could > be fixed. 
> > The point is (and what I understand of what Nathan was saying), pyrex > and cython let you wrap the code to an extent but do not provide the > underlying solution. People were arguing that Cython may be used in > place of C, and I think our point is that it isn't there yet and for > people used to C or C++ it is much easier to just use those instead. I do think that in many cases, Cython _can_ be used quite well instead of C/C++. But, for the example you gave, the C++ STL is a great tool and probably the tool I would use if I really needed the performance. But, for someone that doesn't know or want to deal with C/C++, Cython does provide a nice way of writing the hypothetical Particle class as a C/Python extension type. Here is the Cython code:

    cdef class Particle:
        cdef public double x, y, z
        cdef public double px, py, pz
        cdef public double mass, charge

The resulting extension type would be nearly as fast as the corresponding c++ class (I will test this), and is already wrapped into Python. Of course, the user would need to store these objects in a Python list rather than a c++ std::vector, but depending on what they are doing, it could be totally fine. For a user who would otherwise write this class in pure Python, Cython represents a _massive_ improvement. One cool possibility would be to make an std::vector of these python extension types, which I think Cython would allow. > > > Unfortunately, the context I have done this in is proprietary code for > > my employer. But, I will try to put together a simple example this > > weekend and post it to launchpad. The documentation on these > > capabilities of Cython isn't great right now for sure, so some example > > code would sure help. > > So why don't we do this together? I'll put together my tiny example of > a shootout with Python, C++ and D and let's see how they stack up. If you > would be so kind, you can add the cython/pyrex parts to this. Thanks.
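For comparison, the pure-Python class such a user would otherwise write might look roughly like this (a hypothetical sketch, not code from this thread; the kinetic_energy method is just an illustrative per-particle computation):

```python
class Particle(object):
    # Pure-Python counterpart of the Cython extension type sketched above.
    # __slots__ gives fixed attribute storage, loosely mimicking the C
    # struct fields, but every attribute access still goes through the
    # interpreter -- which is exactly what a compiled cdef class avoids.
    __slots__ = ('x', 'y', 'z', 'px', 'py', 'pz', 'mass', 'charge')

    def __init__(self, x=0.0, y=0.0, z=0.0,
                 px=0.0, py=0.0, pz=0.0, mass=1.0, charge=0.0):
        self.x, self.y, self.z = x, y, z
        self.px, self.py, self.pz = px, py, pz
        self.mass = mass
        self.charge = charge

    def kinetic_energy(self):
        # |p|^2 / (2m): the kind of inner-loop arithmetic that compiling
        # to C speeds up dramatically.
        return (self.px ** 2 + self.py ** 2 + self.pz ** 2) / (2.0 * self.mass)
```

With a million of these in a Python list, a loop summing kinetic_energy() is the workload where the compiled cdef version (or a std::vector<Particle> behind SWIG) pays off.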
> > cheers, > prabhu > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From wbaxter at gmail.com Thu Apr 24 20:35:30 2008 From: wbaxter at gmail.com (Bill Baxter) Date: Fri, 25 Apr 2008 09:35:30 +0900 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> Message-ID: On Fri, Apr 25, 2008 at 1:47 AM, Neal Becker wrote: > > Prabhu Ramachandran wrote: > > > Brian Granger wrote: > >>> > C++ STL containers are truly great for things like this. I would > >>> > write a simple c++ header file that defines the Particle class, > >>> > create an std::vector to hold them and wrap the whole thing > >>> > into > >>> > Cython. I have already done stuff like this (templated c++ with STL) > >>> > with cython and it works great. Furthermore, you end up with an > >>> > actual C/Python extension type that 1) is super fast 2) has its own C > >>> > api that can be called from other C extensions. > > > > For some strange reason I still haven't gotten this email but Brian, > > that is the point I was making, STL has all the needed container classes > > for this and this is precisely what I use. I just use SWIG to wrap this > > C++ code and it works great. Much of the STL is already wrapped via > > SWIG and with numpy.i it interfaces quite well with numpy, in addition > > you can inject doxygen generated documentation almost automatically so > > your generated wrapper code is fully documented. > > > > I've also explored using D and PyD and it makes for a fantastic > > combination. D seems to be (gdc) about 1.7 times slower than C++ but is > > really nice to work with. 
PyD's support for numpy is lacking but could > > be fixed. > > > > The point is (and what I understand of what Nathan was saying), pyrex > > and cython let you wrap the code to an extent but do not provide the > > underlying solution. People were arging that Cython may be used in > > place of C, and I think our point is that it isn't there yet and for > > people used to C or C++ it is much easier to just use those instead. > > > >> Unfortunately, the context I have done this in is proprietary code for > >> my employer. But, I will try to put together a simple example this > >> weekend and post it to launchpad. The documentation on these > >> capabilities of Cython isn't great right now for sure, so some example > >> code would sure help. > > > > So why don't we do this together. I'll put together my tiny example of > > a shootout with Python C++ and D and lets see how they stack up. If you > > would be so kind you can add the cython/pyrex parts to this. Thanks. > > > > Is pyd still actively developed? Seems to be rather quiet for a while. D! Yay! I don't think Kirk is doing a lot of development on it, but he's still around in the D community. Nice to see some other folks here keeping an eye on D. BUT ... I think this thread was about alternatives for rewriting parts of NumPy, and I have to say that writing *any* of numpy in D seems like the wrong choice at this point. Examples of using D with numpy on the numpy wiki would be great though. Also, Prabhu, I can back up your "1.7x slower than C++" assessment. My own tests have usually come out around that ballpark too. But I prefer to phrase it as "560 times faster than pure Python". :-) --bb From kirklin.mcdonald at gmail.com Thu Apr 24 21:02:22 2008 From: kirklin.mcdonald at gmail.com (Kirk McDonald) Date: Thu, 24 Apr 2008 18:02:22 -0700 Subject: [SciPy-dev] Pyd Message-ID: <25bd58d10804241802x83d3b1fr580f00dc87b5c0f8@mail.gmail.com> Neal Becker wrote: > Is pyd still actively developed? 
Seems to be rather quiet for a while. I haven't worked on it in months, it is true. I ran into certain issues with D and became discouraged (Pyd pushes D to its metaprogramming limits, and D's compile-time introspection is still slightly less than perfect). After that, I started working on some other projects. However, I don't consider the project dead, and I've had a number of people request better interoperability with NumPy. It's certainly a feature I would like to add. However, I don't think I'll be getting back to work on Pyd in the immediate future. I have a new job, and I am moving, so things may take a while to settle down. In the future, I may be reworking Pyd's API (again) into a more SWIG-like model. (That is, Pyd would become a utility which generates code, rather than a library which generates code entirely with D's templates.) This would neatly sidestep many of the current weaknesses of Pyd, and it might even motivate me to do more serious work on it. --Kirk McDonald http://pyd.dsource.org http://kirkmcdonald.blogspot.com From david at ar.media.kyoto-u.ac.jp Fri Apr 25 08:19:47 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 25 Apr 2008 21:19:47 +0900 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <4810B76B.6000204@aero.iitb.ac.in> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> Message-ID: <4811CC63.6030501@ar.media.kyoto-u.ac.jp> Prabhu Ramachandran wrote: > Brian Granger wrote: >>> > C++ STL containers are truly great for things like this. I would >>> > write a simple c++ header file that defines the Particle class, create >>> > an std::vector<Particle> to hold them and wrap the whole thing into >>> > Cython.
I have already done stuff like this (templated c++ with STL) >>> > with cython and it works great. Furthermore, you end up with an >>> > actual C/Python extension type that 1) is super fast 2) has its own C >>> > api that can be called from other C extensions. > > For some strange reason I still haven't gotten this email but Brian, > that is the point I was making, STL has all the needed container classes > for this and this is precisely what I use. I just use SWIG to wrap this > C++ code and it works great. Much of the STL is already wrapped via > SWIG and with numpy.i it interfaces quite well with numpy, in addition > you can inject doxygen generated documentation almost automatically so > your generated wrapper code is fully documented.

- there is almost no C++ code in scipy, and none in numpy
- there is already some pyrex code in numpy (numpy.random)
- there is no STL use, as far as I know at least, in numpy or scipy.

So I don't see how a discussion about C++ and stl is relevant for numpy/scipy. The original email was about including cython code: if the choice is between no code and cython code, I know which side I am on. Swig has problems too, BTW. For example, when I was working on scipy.cluster last year, I could not regenerate the source file because of SWIG version mismatches, and at the end, I just found it easier to rewrite the extension in pure C (it was really small). I am sure that for you, swig + C++ is better. But for others, it isn't. If swig is allowed, I really don't see why cython would not be. > > I've also explored using D and PyD and it makes for a fantastic > combination. D seems to be (gdc) about 1.7 times slower than C++ but is > really nice to work with. PyD's support for numpy is lacking but could > be fixed. > > The point is (and what I understand of what Nathan was saying), pyrex > and cython let you wrap the code to an extent but do not provide the > underlying solution.
People were arguing that Cython may be used in > place of C, and I think our point is that it isn't there yet and for > people used to C or C++ it is much easier to just use those instead. I consider myself to be a decent C programmer, and I don't like swig. It just depends on people and their problems. cheers, David From david at ar.media.kyoto-u.ac.jp Fri Apr 25 08:23:48 2008 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 25 Apr 2008 21:23:48 +0900 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <481091AF.6030806@enthought.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <9457e7c80804240339w6d4e4ec9ndfd0b23fd0431ddd@mail.gmail.com> <481091AF.6030806@enthought.com> Message-ID: <4811CD54.3040300@ar.media.kyoto-u.ac.jp> Travis E. Oliphant wrote: > > I would prefer it, however, if the ability to compile Python were done > without changing the language. That's certainly the best way, and that's why I am a bit wary about replacing python code by cython (once it becomes possible to compile/accelerate pure python code, we would have to recode the cython modules again; I would much prefer replacing some C by cython, and be sure there is always a pure python implementation available, even if really slow). > It looks like Cython is moving in that > direction, but it is not quite there yet. > > My ideal world would be the ability to emit machine instructions > directly at various points in the Python code by just writing a > restricted set of Python syntax. Perhaps this could be seamless.
> Not that it is in any useful form, but pypy folks are now working on supporting floating point operations in their jit: http://morepypy.blogspot.com/2008/04/float-operations-for-jit.html cheers, David From wnbell at gmail.com Fri Apr 25 10:43:37 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 25 Apr 2008 09:43:37 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <4811CC63.6030501@ar.media.kyoto-u.ac.jp> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> <4811CC63.6030501@ar.media.kyoto-u.ac.jp> Message-ID: On Fri, Apr 25, 2008 at 7:19 AM, David Cournapeau wrote: > I am sure that for you, swig + C++ is better. But for others, it isn't. > If swig is allowed, I really don't see why cython would not be. No one has objected to using Cython as a wrapper like SWIG. Replacing existing Python/C/C++ code with Cython is my only objection. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From stefan at sun.ac.za Fri Apr 25 11:23:29 2008 From: stefan at sun.ac.za (Stéfan van der Walt) Date: Fri, 25 Apr 2008 17:23:29 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> <4811CC63.6030501@ar.media.kyoto-u.ac.jp> Message-ID: <9457e7c80804250823i4576bdeev6a64711681e7e78a@mail.gmail.com>
Regards St?fan From matthieu.brucher at gmail.com Fri Apr 25 11:32:08 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Fri, 25 Apr 2008 17:32:08 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <9457e7c80804250823i4576bdeev6a64711681e7e78a@mail.gmail.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> <4811CC63.6030501@ar.media.kyoto-u.ac.jp> <9457e7c80804250823i4576bdeev6a64711681e7e78a@mail.gmail.com> Message-ID: 2008/4/25, St?fan van der Walt : > > 2008/4/25 Nathan Bell : > > > Replacing exisiting Python/C/C++ code with Cython is my only objection. > > > Just to be clear: do you think using the raw NumPy and Python C API is > easier and cleaner than using Cython? I think he meant developing a C/C++ module, without any Python object/API, ..., and then wrap it with Cython and steer it from Python. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nwagner at iam.uni-stuttgart.de Fri Apr 25 12:53:59 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 25 Apr 2008 18:53:59 +0200 Subject: [SciPy-dev] svn scipy.sparse Message-ID: Hi Nathan, I found some new errors in scipy svn ====================================================================== ERROR: test_matvec (test_base.TestDOK) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/tests/test_base.py", line 257, in test_matvec assert_array_almost_equal(M * col, M.todense() * col) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 361, in __mul__ return self.dot(other) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/base.py", line 376, in dot return self.matvec(other) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 492, in matvec new = array(new) ValueError: setting an array element with a sequence. ====================================================================== ERROR: Does the matrix's .mean(axis=...) method work? 
---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/tests/test_base.py", line 125, in test_mean assert_array_equal(self.dat.mean(), self.datsp.mean()) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/base.py", line 474, in mean return self.sum(None) * 1.0 / (self.shape[0]*self.shape[1]) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/base.py", line 457, in sum return ( self * asmatrix(ones((n, 1), dtype=self.dtype)) ).sum() File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 361, in __mul__ return self.dot(other) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/base.py", line 376, in dot return self.matvec(other) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 492, in matvec new = array(new) ValueError: setting an array element with a sequence. ====================================================================== ERROR: test_rmatvec (test_base.TestDOK) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/tests/test_base.py", line 252, in test_rmatvec assert_array_almost_equal(row*M, row*M.todense()) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 387, in __rmul__ return self.transpose().dot(tr).transpose() File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/base.py", line 376, in dot return self.matvec(other) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 492, in matvec new = array(new) ValueError: setting an array element with a sequence. ====================================================================== ERROR: Does the matrix's .sum(axis=...) method work? 
---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/tests/test_base.py", line 117, in test_sum assert_array_equal(self.dat.sum(), self.datsp.sum()) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/base.py", line 457, in sum return ( self * asmatrix(ones((n, 1), dtype=self.dtype)) ).sum() File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 361, in __mul__ return self.dot(other) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/base.py", line 376, in dot return self.matvec(other) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/dok.py", line 492, in matvec new = array(new) ValueError: setting an array element with a sequence. Nils From prabhu at aero.iitb.ac.in Fri Apr 25 12:55:14 2008 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Fri, 25 Apr 2008 22:25:14 +0530 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <4811CC63.6030501@ar.media.kyoto-u.ac.jp> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> <4811CC63.6030501@ar.media.kyoto-u.ac.jp> Message-ID: <48120CF2.2090704@aero.iitb.ac.in> David Cournapeau wrote: > So I don't see how a discussion about C++ and stl is relevant for > numpy/scipy. The original email was about including cython code: if the > choice is between no code and cython code, I know which side I am on. The thread has meandered far from the original topic. Sorry for helping it along. I'll start another thread once I put together what I've been talking about. Just to be clear. I have nothing against Cython or Pyrex and even use Pyrex in some of my own code. There are cases when it is an excellent option. 
I don't think Nathan had said anything of the sort either. I'll elaborate the problem I see later on and start a new thread about it. > Swig has problems too, BTW. For example, when I was working on > scipy.cluster last year, I could not regenerate the source file because > of SWIG version mismatches, and at the end, I just found easier to > rewrite the extension in pure C (it was really small). Of course SWIG has its fair share of problems. Not the least the fact that it changes so fast, so much -- at least in the past this was a problem. cheers, prabhu From wnbell at gmail.com Fri Apr 25 14:34:31 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 25 Apr 2008 13:34:31 -0500 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: <9457e7c80804250823i4576bdeev6a64711681e7e78a@mail.gmail.com> References: <6ce0ac130804230900w76a74321nb8b1ceb38e72076f@mail.gmail.com> <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> <4811CC63.6030501@ar.media.kyoto-u.ac.jp> <9457e7c80804250823i4576bdeev6a64711681e7e78a@mail.gmail.com> Message-ID: On Fri, Apr 25, 2008 at 10:23 AM, St?fan van der Walt wrote: > 2008/4/25 Nathan Bell : > > > Replacing exisiting Python/C/C++ code with Cython is my only objection. > > Just to be clear: do you think using the raw NumPy and Python C API is > easier and cleaner than using Cython? That's a false dichotomy. I think writing useful C/C++ libraries and then interfacing them w/ SWIG/Cython or handwritten wrappers is a more sustainable and safer approach. IMO separating the wrappers from the implementation is a good idea. Wrapping C code, especially C code that was written in a way conducive to wrapping, is not difficult. This holds true for SWIG/Cython/flavor of the week. 
-- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From wnbell at gmail.com Fri Apr 25 14:50:37 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 25 Apr 2008 13:50:37 -0500 Subject: [SciPy-dev] svn scipy.sparse In-Reply-To: References: Message-ID: On Fri, Apr 25, 2008 at 11:53 AM, Nils Wagner wrote: > Hi Nathan, > > I found some new errors in scipy svn > > ====================================================================== > ERROR: test_matvec (test_base.TestDOK) > ---------------------------------------------------------------------- I can't reproduce those errors, but I made some changes to r4178 that might fix the problem. Please let me know if the problem persists. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From nwagner at iam.uni-stuttgart.de Fri Apr 25 15:53:27 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 25 Apr 2008 21:53:27 +0200 Subject: [SciPy-dev] svn scipy.sparse In-Reply-To: References: Message-ID: On Fri, 25 Apr 2008 13:50:37 -0500 "Nathan Bell" wrote: > On Fri, Apr 25, 2008 at 11:53 AM, Nils Wagner > wrote: >> Hi Nathan, >> >> I found some new errors in scipy svn >> >> ====================================================================== >> ERROR: test_matvec (test_base.TestDOK) >> ---------------------------------------------------------------------- > > I can't reproduce those errors, but I made some changes >to r4178 that > might fix the problem. Please let me know if the >problem persists. > > -- > Nathan Bell wnbell at gmail.com > http://graphics.cs.uiuc.edu/~wnbell/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev Thank you. 
Works for me with r4178 :-) Nils From peridot.faceted at gmail.com Fri Apr 25 16:12:10 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Fri, 25 Apr 2008 22:12:10 +0200 Subject: [SciPy-dev] Inclusion of cython code in scipy In-Reply-To: References: <48100C26.2000001@aero.iitb.ac.in> <6ce0ac130804240844rb68b927n942f065f38687f76@mail.gmail.com> <6ce0ac130804240919h68cce1efp69756b362cd7a6dc@mail.gmail.com> <4810B76B.6000204@aero.iitb.ac.in> <4811CC63.6030501@ar.media.kyoto-u.ac.jp> <9457e7c80804250823i4576bdeev6a64711681e7e78a@mail.gmail.com> Message-ID: On 25/04/2008, Nathan Bell wrote: > On Fri, Apr 25, 2008 at 10:23 AM, Stéfan van der Walt wrote: > > 2008/4/25 Nathan Bell : > > > > > Replacing existing Python/C/C++ code with Cython is my only objection. > > > > Just to be clear: do you think using the raw NumPy and Python C API is > > easier and cleaner than using Cython? > > That's a false dichotomy. I think writing useful C/C++ libraries and > then interfacing them w/ SWIG/Cython or handwritten wrappers is a more > sustainable and safer approach. > > IMO separating the wrappers from the implementation is a good idea. > Wrapping C code, especially C code that was written in a way conducive > to wrapping, is not difficult. This holds true for SWIG/Cython/flavor > of the week. If you're writing a pure C/C++ module to be interfaced by SWIG, how do you use functions from the python API (or pure python functions from scipy)? Can you do this without delving into the python C api? To be clear: suppose I want to write a simulation object in raw C and wrap it through SWIG. But in the (one-time) construction of the simulation I want to use the percentile point function of the von Mises distribution (conveniently available as scipy.stats.vonmises(x).ppf). Does SWIG let my pure C code easily get at this function, or do I need to reimplement it myself?
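[Editor's sketch] One common answer to Anne's question, sketched here rather than prescribed: do the one-time Python-level work on the Python side -- in her case, evaluating scipy.stats.vonmises(kappa).ppf -- and hand the wrapped C constructor nothing but plain double arrays, so the C code never needs the Python C API. To keep the snippet self-contained, `cauchy_ppf` and `make_ppf_table` below are hypothetical stand-ins, not scipy functions.

```python
import numpy as np

def cauchy_ppf(q):
    """Percentile-point (inverse CDF) of the standard Cauchy distribution.

    Stand-in for the real one-time call, e.g. scipy.stats.vonmises(kappa).ppf.
    """
    return np.tan(np.pi * (np.asarray(q) - 0.5))

def make_ppf_table(ppf, n=256):
    """Tabulate a ppf on a regular quantile grid, avoiding the 0/1 endpoints."""
    q = np.linspace(0.0, 1.0, n + 2)[1:-1]
    return q, ppf(q)

q, table = make_ppf_table(cauchy_ppf)
# A SWIG-wrapped C constructor would now receive `table` as a double* plus a
# length -- no Python involvement remains inside the simulation itself.
```

The C code stays pure C; only the (cheap, one-time) setup touches scipy.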
Thanks, Anne From wnbell at gmail.com Fri Apr 25 18:06:20 2008 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 25 Apr 2008 17:06:20 -0500 Subject: [SciPy-dev] Interesting ARPACK branch In-Reply-To: References: Message-ID: On Mon, Apr 21, 2008 at 8:13 AM, Neilen Marais wrote: > I've found that figuring out what is the latest ARPACK release and > whether you've got all the newest patches installed is quite a mission. > It seems someone has taken up a little of the maintainership issue: > > http://mathema.tician.de/software/arpack > > Might be useful to pull the ARPACK code for future scipy releases from > there. Thanks for the pointer. It's surprising that such a well-known package would be without an official maintainer. I don't know which version of the ARPACK code we're using in sparse.linalg, but we should definitely look carefully at the upstream option(s) before addressing any bugs ourselves. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From tournesol33 at gmail.com Fri Apr 25 21:17:20 2008 From: tournesol33 at gmail.com (tournesol) Date: Sat, 26 Apr 2008 10:17:20 +0900 Subject: [SciPy-dev] 2D array to 3D Message-ID: <481282A0.1090800@gmail.com> Hi All. Is there an easy way to insert 1D(j) array into another 2D array(B:jxk) and convert B to B:ixjxk ? ex:) >>> from numpy import * >>> a=arange(4) >>> a array([0, 1, 2, 3]) >>> b=arange(9) >>> b.shape=3,3 >>> b array([[0, 1, 2], [3, 4, 5], [6, 7, 8]]) I just wanna insert A into B B:1x3x3, [[[ 0, 1, 2, 3], [ 4, 5, 6, 7], [ 8, 9, 10, 11]] B:2x3x3, [[[ 0, 1, 2, 3], [ 4, 5, 6, 7], [ 8, 9, 10, 11]] ??
[[ 0, 1, 2, 3], [ 4, 5, 6, 7], [ 8, 9, 10, 11]]] B:3x3x3, [[[ 0, 1, 2, 3], [ 4, 5, 6, 7], [ 8, 9, 10, 11]] [[ 0, 1, 2, 3], [ 4, 5, 6, 7], [ 8, 9, 10, 11]] [[ 0, 1, 2, 3], [ 4, 5, 6, 7], [ 8, 9, 10, 11]]] Thanks for your help From millman at berkeley.edu Sat Apr 26 11:57:47 2008 From: millman at berkeley.edu (Jarrod Millman) Date: Sat, 26 Apr 2008 08:57:47 -0700 Subject: [SciPy-dev] Status of sandbox.spline and my other code. In-Reply-To: <3a1077e70804080156g3ce29db3k2f63895b0003aeda@mail.gmail.com> References: <3a1077e70706100955p4a705acy3b26d9cf9cf0298b@mail.gmail.com> <3a1077e70804080155p22fad7bcj9f805155a95fefa8@mail.gmail.com> <3a1077e70804080156g3ce29db3k2f63895b0003aeda@mail.gmail.com> Message-ID: On Tue, Apr 8, 2008 at 1:56 AM, John Travers wrote: > The code in sandbox.spline should be a complete replacement for the > code in scipy.interpolate.fitpack and fitpack2 including the f2py > wrappers of the fitpack fortran code (i.e. replacing the hand crafted > c wrappers). Apart from using only f2py, a number of improvements were > made to the docstrings and error checking and some bugs fixed and > quite a few unit tests added. > > Originally I was advocating that we have a separate spline package > with the fitpack stuff and a separate interpolate package that can > utilise either the spline or rbf or delaunay etc. packages as these > can all be used for more than just interpolation. This didn't seem to > gain any traction on the mailing list, so I left it. I would still > advocate this, but recognise the reasons that people might not want to > expand the number of packages so much. > > I left this code about a year ago and then got bogged down with my > thesis work which I finished a few weeks back. If the > code/improvements might be used then I can find the time this week to > check it is all working and still up to date.
Then I could either > supply a set of patches against the main interpolate code which could > be more easily understood, or provide a mercurial/bzr branch for > someone else to merge from (if the bzr/mercurial stuff is set up yet). > > Let me know if that might be useful. There are also further things > that were on my todo list for interpolate, including using > RectBivariateSpline where possible as it is much better for > interpolation than the currently used surfit. (I have a unit test > which segfaults the current code, but not the RectBivariateSpline > code). Hey John, I would like to see you move forward on this. I am not sure that you need to create patches or a bzr/mercurial branch. Why not just commit to the trunk? That way it will be much simpler for everyone to test your changes. If there are problems, we can always just fix them or back them out. I assume you have commit rights, since your code is in the scipy sandbox. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From nwagner at iam.uni-stuttgart.de Sun Apr 27 04:28:45 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Sun, 27 Apr 2008 10:28:45 +0200 Subject: [SciPy-dev] Changeset 4181 Message-ID: Hi All, Changeset 4181 introduces ====================================================================== ERROR: test_construction (test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 168, in test_construction P = PiecewisePolynomial(self.xi,self.yi,3) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 448, in __init__ self.extend(xi[1:],yi[1:],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 539, in extend self.append(xi[i],yi[i],orders) File
"/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar ====================================================================== ERROR: test_derivative (test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 175, in test_derivative P = PiecewisePolynomial(self.xi,self.yi,3) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 448, in __init__ self.extend(xi[1:],yi[1:],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 539, in extend self.append(xi[i],yi[i],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar ====================================================================== ERROR: test_derivatives (test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 178, in test_derivatives P = PiecewisePolynomial(self.xi,self.yi,3) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 448, in __init__ self.extend(xi[1:],yi[1:],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 539, in extend self.append(xi[i],yi[i],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar ====================================================================== ERROR: test_incremental 
(test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 199, in test_incremental P.append(self.xi[i],self.yi[i],3) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar ====================================================================== ERROR: test_scalar (test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 171, in test_scalar P = PiecewisePolynomial(self.xi,self.yi,3) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 448, in __init__ self.extend(xi[1:],yi[1:],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 539, in extend self.append(xi[i],yi[i],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar ====================================================================== ERROR: test_shapes_scalarvalue (test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 203, in test_shapes_scalarvalue P = PiecewisePolynomial(self.xi,self.yi,4) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 448, in __init__ self.extend(xi[1:],yi[1:],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 539, in extend self.append(xi[i],yi[i],orders) 
File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar ====================================================================== ERROR: test_shapes_scalarvalue_derivative (test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 209, in test_shapes_scalarvalue_derivative P = PiecewisePolynomial(self.xi,self.yi,4) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 448, in __init__ self.extend(xi[1:],yi[1:],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 539, in extend self.append(xi[i],yi[i],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar ====================================================================== ERROR: test_vector (test_polyint.CheckPiecewise) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/tests/test_polyint.py", line 189, in test_vector for i in xrange(len(ys[0][0]))] File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 448, in __init__ self.extend(xi[1:],yi[1:],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 539, in extend self.append(xi[i],yi[i],orders) File "/usr/local/lib64/python2.5/site-packages/scipy/interpolate/polyint.py", line 503, in append raise ValueError, "Each derivative must be a scalar" ValueError: Each derivative must be a scalar Nils From wnbell at gmail.com Sun Apr 27 14:57:02 2008 From: wnbell at gmail.com 
(Nathan Bell) Date: Sun, 27 Apr 2008 13:57:02 -0500 Subject: [SciPy-dev] MIT license Message-ID: We've received some MIT-licensed code to read matrices in the Harwell-Boeing file format[1]. My understanding is that the MIT license is essentially the same as the BSD, so is there any need to relicense the code? Generally, can MIT licensed code be included in SciPy without further consideration? [1] http://projects.scipy.org/scipy/scipy/ticket/354 Dominique, thanks for your contribution and sorry for the slow progress on this ticket. -- Nathan Bell wnbell at gmail.com http://graphics.cs.uiuc.edu/~wnbell/ From dominique.orban at gmail.com Sun Apr 27 17:18:38 2008 From: dominique.orban at gmail.com (Dominique Orban) Date: Sun, 27 Apr 2008 17:18:38 -0400 Subject: [SciPy-dev] MIT license In-Reply-To: References: Message-ID: <8793ae6e0804271418w74e0f6c6g24521b05ece9bf1c@mail.gmail.com> On Sun, Apr 27, 2008 at 2:57 PM, Nathan Bell wrote: > We've received some MIT-licensed code to read matrices in the > Harwell-Boeing file format[1]. > > My understanding is that the MIT license is essentially the same as > the BSD, so is there any need to relicense the code? Generally, can > MIT licensed code be included in SciPy without further consideration? > > [1] http://projects.scipy.org/scipy/scipy/ticket/354 Nathan, I chose the MIT license because I had to pick one. I am not partial to it and am open to picking another. The HB and RB readers depend on Konrad Hinsen's FortranFormat module, which I think is part of his ScientificPython package---that's released under the "CeCILL" license (www.cecill.info/index.en.html) so I'm not sure how that works with SciPy licensing. > Dominique, thanks for your contribution and sorry for the slow > progress on this ticket. No problem. I am not sure what the best way to alert the SciPy team is. At the time I posted the module, I had e-mailed the developers list but strangely, the message bounced back to me. 
Dominique From matthieu.brucher at gmail.com Sun Apr 27 17:27:00 2008 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 27 Apr 2008 23:27:00 +0200 Subject: [SciPy-dev] MIT license In-Reply-To: <8793ae6e0804271418w74e0f6c6g24521b05ece9bf1c@mail.gmail.com> References: <8793ae6e0804271418w74e0f6c6g24521b05ece9bf1c@mail.gmail.com> Message-ID: 2008/4/27 Dominique Orban : > On Sun, Apr 27, 2008 at 2:57 PM, Nathan Bell wrote: > > We've received some MIT-licensed code to read matrices in the > > Harwell-Boeing file format[1]. > > > > My understanding is that the MIT license is essentially the same as > > the BSD, so is there any need to relicense the code? Generally, can > > MIT licensed code be included in SciPy without further consideration? > > > > [1] http://projects.scipy.org/scipy/scipy/ticket/354 > > Nathan, > > I chose the MIT license because I had to pick one. I am not partial to > it and am open to picking another. The HB and RB readers depend on > Konrad Hinsen's FortranFormat module, which I think is part of his > ScientificPython package---that's released under the "CeCILL" license > (www.cecill.info/index.en.html) so I'm not sure how that works with > SciPy licensing. CeCILL is the French version of the GPL, so if those headers are needed to build the package, I don't think they can be included into Scipy. > > Dominique, thanks for your contribution and sorry for the slow > > progress on this ticket. > > > No problem. I am not sure what the best way to alert the SciPy team > is. At the time I posted the module, I had e-mailed the developers > list but strangely, the message bounced back to me. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed...
URL: From peridot.faceted at gmail.com Mon Apr 28 01:00:09 2008 From: peridot.faceted at gmail.com (Anne Archibald) Date: Mon, 28 Apr 2008 01:00:09 -0400 Subject: [SciPy-dev] Changeset 4181 In-Reply-To: References: Message-ID: 2008/4/27 Nils Wagner : > Hi All, > > Changeset 4181 introduces [embarrassing ream of errors] Well I feel like an idiot. Fixed in r4195. Anne From emanuele at relativita.com Mon Apr 28 08:09:41 2008 From: emanuele at relativita.com (Emanuele Olivetti) Date: Mon, 28 Apr 2008 14:09:41 +0200 Subject: [SciPy-dev] MIT license In-Reply-To: References: <8793ae6e0804271418w74e0f6c6g24521b05ece9bf1c@mail.gmail.com> Message-ID: <4815BE85.6090406@relativita.com> Matthieu Brucher wrote: > > > 2008/4/27 Dominique Orban >: > > On Sun, Apr 27, 2008 at 2:57 PM, Nathan Bell > wrote: > > We've received some MIT-licensed code to read matrices in the > > Harwell-Boeing file format[1]. > > > > My understanding is that the MIT license is essentially the same as > > the BSD, so is there any need to relicense the code? > Generally, can > > MIT licensed code be included in SciPy without further > consideration? > > > > [1] http://projects.scipy.org/scipy/scipy/ticket/354 > > Nathan, > > I chose the MIT license because I had to pick one. I am not partial to > it and am open to picking another. The HB and RB readers depend on > Konrad Hinsen's FortranFormat module, which I think is part of his > ScientificPython package---that's released under the "CeCILL" license > (www.cecill.info/index.en.html > ) so I'm not sure how that > works with > SciPy licensing. > > > > CeCILL is the French version of the GPL, so if those headers are > needed to build the package, I don't think they can be included into > Scipy. > CeCILL (v2) is a free software licence _similar_ to the GPL and compatible with the GPL.
Here is a brief comment by the Free Software Foundation: http://www.gnu.org/philosophy/license-list.html To Dominique: as with most minor licenses, be careful about adopting it; its robustness is not well tested and you could run into unexpected trouble later. Emanuele From aisaac at american.edu Mon Apr 28 10:18:04 2008 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 28 Apr 2008 10:18:04 -0400 Subject: [SciPy-dev] MIT license In-Reply-To: <4815BE85.6090406@relativita.com> References: <8793ae6e0804271418w74e0f6c6g24521b05ece9bf1c@mail.gmail.com> <4815BE85.6090406@relativita.com> Message-ID: Did anyone answer Nathan's question about whether MIT licensed code can go into SciPy? My understanding is: yes. Wikipedia gives a fairly standard summary of the MIT/BSD difference: The MIT License is similar to the 3-clause "modified" BSD license, except that the BSD license contains a notice prohibiting the use of the name of the copyright holder in promotion. This is sometimes present in versions of the MIT License, as noted above.
Cheers, Alan Isaac From nwagner at iam.uni-stuttgart.de Mon Apr 28 14:15:36 2008 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Mon, 28 Apr 2008 20:15:36 +0200 Subject: [SciPy-dev] Changeset 4181 In-Reply-To: References: Message-ID: On Sun, 27 Apr 2008 10:28:45 +0200 "Nils Wagner" wrote: > Hi All, > > Changeset 4181 introduces > > Just a few errors persist in r4195 :-) ====================================================================== ERROR: Failure: ImportError (cannot import name _bspline) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/nose/loader.py", line 363, in loadTestsFromName module = self.importer.importFromPath( File "/usr/lib/python2.4/site-packages/nose/importer.py", line 39, in importFromPath return self.importFromDir(dir_path, fqname) File "/usr/lib/python2.4/site-packages/nose/importer.py", line 84, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/usr/lib/python2.4/site-packages/scipy/stats/models/tests/test_bspline.py", line 9, in ? import scipy.stats.models.bspline as B File "/usr/lib/python2.4/site-packages/scipy/stats/models/bspline.py", line 23, in ? 
from scipy.stats.models import _bspline ImportError: cannot import name _bspline ====================================================================== ERROR: test_factor3 (test_formula.TestFormula) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/stats/models/tests/test_formula.py", line 231, in test_factor3 m = fac.main_effect(reference=1) File "/usr/lib/python2.4/site-packages/scipy/stats/models/formula.py", line 273, in main_effect reference = names.index(reference) ValueError: list.index(x): x not in list ====================================================================== ERROR: test_factor4 (test_formula.TestFormula) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/stats/models/tests/test_formula.py", line 239, in test_factor4 m = fac.main_effect(reference=2) File "/usr/lib/python2.4/site-packages/scipy/stats/models/formula.py", line 273, in main_effect reference = names.index(reference) ValueError: list.index(x): x not in list ====================================================================== ERROR: test_huber (test_scale.TestScale) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/stats/models/tests/test_scale.py", line 35, in test_huber m = scale.huber(X) File "/usr/lib/python2.4/site-packages/scipy/stats/models/robust/scale.py", line 82, in __call__ for donothing in self: File "/usr/lib/python2.4/site-packages/scipy/stats/models/robust/scale.py", line 102, in next scale = N.sum(subset * (a - mu)**2, axis=self.axis) / (self.n * Huber.gamma - N.sum(1. 
- subset, axis=self.axis) * Huber.c**2) File "/usr/lib/python2.4/site-packages/numpy/core/fromnumeric.py", line 979, in sum return sum(axis, dtype, out) TypeError: only length-1 arrays can be converted to Python scalars ====================================================================== ERROR: test_huberaxes (test_scale.TestScale) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/stats/models/tests/test_scale.py", line 40, in test_huberaxes m = scale.huber(X, axis=0) File "/usr/lib/python2.4/site-packages/scipy/stats/models/robust/scale.py", line 82, in __call__ for donothing in self: File "/usr/lib/python2.4/site-packages/scipy/stats/models/robust/scale.py", line 102, in next scale = N.sum(subset * (a - mu)**2, axis=self.axis) / (self.n * Huber.gamma - N.sum(1. - subset, axis=self.axis) * Huber.c**2) File "/usr/lib/python2.4/site-packages/numpy/core/fromnumeric.py", line 979, in sum return sum(axis, dtype, out) TypeError: only length-1 arrays can be converted to Python scalars ====================================================================== FAIL: test_imresize (test_pilutil.TestPILUtil) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/testing/decorators.py", line 83, in skipper return f(*args, **kwargs) File "/usr/lib/python2.4/site-packages/scipy/misc/tests/test_pilutil.py", line 25, in test_imresize assert_equal(im1.shape,(11,22)) File "/usr/lib/python2.4/site-packages/numpy/testing/utils.py", line 137, in assert_equal assert_equal(len(actual),len(desired),err_msg,verbose) File "/usr/lib/python2.4/site-packages/numpy/testing/utils.py", line 145, in assert_equal assert desired == actual, msg AssertionError: Items are not equal: ACTUAL: 0 DESIRED: 2 ====================================================================== FAIL: test_namespace 
(test_formula.TestFormula) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/stats/models/tests/test_formula.py", line 119, in test_namespace self.assertEqual(xx.namespace, Y.namespace) AssertionError: {} != {'Y': array([ 0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70, 72, 74, 76, 78, 80, 82, 84, 86, 88, 90, 92, 94, 96, 98]), 'X': array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49])} ---------------------------------------------------------------------- Ran 2165 tests in 106.200s FAILED (failures=2, errors=5) >>> scipy.__version__ '0.7.0.dev4195' From hoytak at gmail.com Tue Apr 29 15:20:49 2008 From: hoytak at gmail.com (Hoyt Koepke) Date: Tue, 29 Apr 2008 12:20:49 -0700 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <4db580fd0804202050g63ece587g5eb9c3d6d9324c2b@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> <3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com> <4db580fd0804201736t6881949t74bce1fbb234a2f0@mail.gmail.com> <4db580fd0804202046h714f12a7h5222e413dc8548ea@mail.gmail.com> <4db580fd0804202050g63ece587g5eb9c3d6d9324c2b@mail.gmail.com> Message-ID: <4db580fd0804291220j58406a44qbdbd5b0de54dac1a@mail.gmail.com> Okay, I have a patch together that does the above. I've been using it in my code for the past week or so, and I'm happy with how it works. If someone with svn commit access could review it and commit it, that would be helpful. 
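[Editor's sketch] The idiom such a patch would rely on -- accepting an optional RandomState and falling back to the global numpy.random module so that existing callers see no behaviour change -- can be sketched as follows. This is illustrative only, not the attached patch; `normal_rvs` is a made-up name.

```python
import numpy as np

def normal_rvs(loc=0.0, scale=1.0, size=1, random_state=None):
    """Draw normal variates, optionally from a caller-supplied RandomState.

    When random_state is None, fall back to the module-level numpy.random
    generator -- preserving the previous default behaviour.
    """
    rng = np.random if random_state is None else random_state
    return rng.normal(loc, scale, size)

# Passing an explicit RandomState makes draws reproducible and isolated
# from any other use of the global generator:
a = normal_rvs(size=3, random_state=np.random.RandomState(42))
b = normal_rvs(size=3, random_state=np.random.RandomState(42))
assert np.allclose(a, b)  # same seed, same samples
```

Because the parameter defaults to None, code that never heard of RandomState objects keeps working unchanged.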
It allows users to specify a RandomState object to use in calculating the random samples in the stats code. It defaults to the previous behavior of drawing them directly from the numpy.random module, so it shouldn't break any backwards compatibility. There is also full test coverage. I'm a bit new to the process of submitting patches like this. The attached is a tarball of three svndiff outputs. In general, is that the best way to submit a patch? I plan on contributing to scipy more in the future, so I want to get the process right from the start. Thanks! --Hoyt On Sun, Apr 20, 2008 at 8:50 PM, Hoyt Koepke wrote: > Oops -- ignore the line in the diff that says "reshape(, size..." -- > I was playing around with having it return a scalar and forgot to undo > it before taking the diff. > > --Hoyt > > > > On Sun, Apr 20, 2008 at 8:46 PM, Hoyt Koepke wrote: > > Hello, > > > > I've got the changes done, complete with test coverage. I also > > cleaned up a few minor coding things that make it easier to read the > > code but don't change the behavior. I'll attach the diff if anyone is > > curious / wants to give me feedback. > > > > However, I have a quick question about something that strikes me as > > rather odd behavior. When rvs() is called with no size or a size of 1 > > it returns a python array of size (1,). I find this odd for two > > reasons: > > > > 1. It's inconsistent with numpy.random > > 2. One of the unit tests in test_distributions.py tests to make sure > > the default returns a scalar (it now fails). > > > > While I'm at it, should I change this behavior? It seems like it > > could break some backwards compatibility stuff, so I'll leave it for > > everyone else to decide. 
> > > > > > > > -- Hoyt > > > > +++++++++++++++++++++++++++++++++++ > > Hoyt Koepke > > UBC Department of Computer Science > > http://www.cs.ubc.ca/~hoytak/ > > hoytak at gmail.com > > +++++++++++++++++++++++++++++++++++ > > > > > > -- > > > +++++++++++++++++++++++++++++++++++ > Hoyt Koepke > UBC Department of Computer Science > http://www.cs.ubc.ca/~hoytak/ > hoytak at gmail.com > +++++++++++++++++++++++++++++++++++ > -- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak at gmail.com +++++++++++++++++++++++++++++++++++ -------------- next part -------------- A non-text attachment was scrubbed... Name: stats_rng_patch.tar.gz Type: application/x-gzip Size: 8500 bytes Desc: not available URL: From robert.kern at gmail.com Tue Apr 29 15:33:13 2008 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 29 Apr 2008 14:33:13 -0500 Subject: [SciPy-dev] random seed in scipy.stats In-Reply-To: <4db580fd0804291220j58406a44qbdbd5b0de54dac1a@mail.gmail.com> References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com> <3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com> <4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com> <4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com> <3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com> <4db580fd0804201736t6881949t74bce1fbb234a2f0@mail.gmail.com> <4db580fd0804202046h714f12a7h5222e413dc8548ea@mail.gmail.com> <4db580fd0804202050g63ece587g5eb9c3d6d9324c2b@mail.gmail.com> <4db580fd0804291220j58406a44qbdbd5b0de54dac1a@mail.gmail.com> Message-ID: <3d375d730804291233x516e40ffg2fc2b3053567eb4a@mail.gmail.com> On Tue, Apr 29, 2008 at 2:20 PM, Hoyt Koepke wrote: > Okay, I have a patch together that does the above. I've been using it > in my code for the past week or so, and I'm happy with how it works. > If someone with svn commit access could review it and commit it, that > would be helpful. Thank you! 
I will try to review it tomorrow.

> It allows users to specify a RandomState object to use in calculating
> the random samples in the stats code. It defaults to the previous
> behavior of drawing them directly from the numpy.random module, so it
> shouldn't break any backwards compatibility. There is also full test
> coverage.
>
> I'm a bit new to the process of submitting patches like this. The
> attached is a tarball of three svn diff outputs. In general, is that
> the best way to submit a patch?

For related changes like this, it is easier for the reviewer/committer
to have a single .diff made from the root of the source tree rather
than one .diff for each changed file.

> I plan on contributing to scipy more
> in the future, so I want to get the process right from the start.

Usually, they should be attached to a ticket on the scipy Trac. You
will need to register before you can make a ticket.

http://projects.scipy.org/scipy/scipy
http://projects.scipy.org/scipy/scipy/register

However, I imagine you will probably get SVN access in short order if
you keep this up.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
 -- Umberto Eco

From hoytak at gmail.com Tue Apr 29 15:55:48 2008
From: hoytak at gmail.com (Hoyt Koepke)
Date: Tue, 29 Apr 2008 12:55:48 -0700
Subject: [SciPy-dev] random seed in scipy.stats
In-Reply-To: <3d375d730804291233x516e40ffg2fc2b3053567eb4a@mail.gmail.com>
References: <4db580fd0804191548j1f7d3487u67a1ed423b95b55f@mail.gmail.com>
	<3d375d730804200014n6ca95a8eh8b010bab1ebfb8f2@mail.gmail.com>
	<4db580fd0804200044v342e1979i59914b9054e1b98b@mail.gmail.com>
	<4db580fd0804201323k671dc470w9670b91a0d69c843@mail.gmail.com>
	<3d375d730804201730x8a8a49dv11b186d6d4fdc5db@mail.gmail.com>
	<4db580fd0804201736t6881949t74bce1fbb234a2f0@mail.gmail.com>
	<4db580fd0804202046h714f12a7h5222e413dc8548ea@mail.gmail.com>
	<4db580fd0804202050g63ece587g5eb9c3d6d9324c2b@mail.gmail.com>
	<4db580fd0804291220j58406a44qbdbd5b0de54dac1a@mail.gmail.com>
	<3d375d730804291233x516e40ffg2fc2b3053567eb4a@mail.gmail.com>
Message-ID: <4db580fd0804291255i5dba8937q914b63e0df83aa32@mail.gmail.com>

> Thank you! I will try to review it tomorrow.

Excellent. Let me know if you have any suggestions or would like me to
change anything. About half the lines in the distributions.py diff
correct some shorthand I found confusing (e.g. asarray abbreviated as
arr). I thought it better to use the plain numpy names, so I cleaned
it up.

> For related changes like this, it is easier for the reviewer/committer
> to have a single .diff made from the root of the source tree rather
> than one .diff for each changed file.

Okay, sounds good; I attached that version.

> However, I imagine you will probably get SVN access in short order if
> you keep this up.

That would be awesome... :-)

--
+++++++++++++++++++++++++++++++++++
Hoyt Koepke
UBC Department of Computer Science
http://www.cs.ubc.ca/~hoytak/
hoytak at gmail.com
+++++++++++++++++++++++++++++++++++
-------------- next part --------------
A non-text attachment was scrubbed...
Name: stats_rng.diff.gz
Type: application/x-gzip
Size: 8899 bytes
Desc: not available
URL: 

From travis at enthought.com Wed Apr 30 10:47:30 2008
From: travis at enthought.com (Travis Vaught)
Date: Wed, 30 Apr 2008 09:47:30 -0500
Subject: [SciPy-dev] [ANN] EuroSciPy Abstracts Deadline Reminder
Message-ID: 

Greetings,

Just a reminder: abstracts for the EuroSciPy Conference in Leipzig are
due by midnight tonight (CST, US [UTC-6]), April 30. If you'd like to
present, please submit your abstract as a PDF, MS Word, or plain text
file to euroabstracts at scipy.org.

For more information on the EuroSciPy Conference, please see:

http://www.scipy.org/EuroSciPy2008
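Robert's advice earlier in the thread (one .diff made from the root of the source tree rather than one per file; in an svn checkout, a single `svn diff > stats_rng.diff` run from the top of the tree produces exactly that) can be sketched with the standard library. The paths and file contents below are invented for illustration:

```python
import difflib

# Pretend "pristine" and "patched" versions of two files in a source
# tree (paths and contents are made up; they echo the thread's patch).
tree_old = {
    "scipy/stats/distributions.py": ["def rvs():\n", "    pass\n"],
    "scipy/stats/tests/test_distributions.py": ["# tests\n"],
}
tree_new = {
    "scipy/stats/distributions.py": ["def rvs(random_state=None):\n", "    pass\n"],
    "scipy/stats/tests/test_distributions.py": ["# tests for random_state\n"],
}

# Emit one combined unified diff with root-relative paths, instead of
# a separate diff per changed file.
patch_lines = []
for path in sorted(tree_old):
    patch_lines += difflib.unified_diff(
        tree_old[path], tree_new[path],
        fromfile="a/" + path, tofile="b/" + path,
    )
patch = "".join(patch_lines)

# Both changed files show up under a single patch header sequence.
assert patch.count("+++ b/") == 2
```

A single root-relative patch like this applies in one `patch -p1` (or `svn patch`) invocation, which is why reviewers prefer it over a tarball of per-file diffs.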