From steve at shrogers.com Sun Jan 1 00:10:22 2006 From: steve at shrogers.com (Steven H. Rogers) Date: Sat, 31 Dec 2005 22:10:22 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B72B79.8040807@ieee.org> References: <43B72B79.8040807@ieee.org> Message-ID: <43B7643E.6030402@shrogers.com> +1 scicore From brendansimons at yahoo.ca Sun Jan 1 01:21:31 2006 From: brendansimons at yahoo.ca (Brendan Simons) Date: Sun, 1 Jan 2006 01:21:31 -0500 Subject: [SciPy-dev] Scipy-dev Digest, Vol 26, Issue 47 In-Reply-To: References: Message-ID: Speaking as a newbie, I find the "import scipy (which?)" ambiguity confusing enough to be annoying. I can't think of any advantage of overlapping the namespaces. I nominate sciarray or numsci as potential replacements. Brendan -- Brendan Simons, Project Engineering Stern Laboratories, Hamilton Ontario On 31-Dec-05, at 7:08 PM, Travis wrote: > > > I would like to hear some more opinions about this. The one that is > most convincing to me is that > you can't tell if something comes from full scipy or scipy_core > just by > looking at the import statement. > > That's an interesting concern. I know, this would require quite a bit > of re-factoring. So, I'm not sure it is worth it. > It might help with the image of scipy_core. > > We could call the scipy_core package scicore, or numcore, if that > is the > only concern. The biggest concern is that right now, a lot of the > sub-package management system is builtin to scipy_core from the get- > go. > That would have to be re-factored. I'm not sure how easy that > would be. > > I'm ambivalent at this point, and mostly concerned with "moving > forward" > and having people view scipy_core as relatively stable.... > > -Travis -------------- next part -------------- An HTML attachment was scrubbed... URL: From arnd.baecker at web.de Sun Jan 1 10:26:37 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 1 Jan 2006 16:26:37 +0100 (CET) Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B72B79.8040807@ieee.org> References: <43B72B79.8040807@ieee.org> Message-ID: On Sat, 31 Dec 2005, Travis Oliphant wrote: > Fernando brings up an interesting point that might be worth some > discussion. It's might be too late to have this discussion, but with a > more stable release coming up, it might not be. > > Here is what he says: > > > My main point is this: I worry a little that using 'scipy' as a single > > name > > for both scipy_core and scipy_full is causing more problems than it > > solves, > > and may be an unnecessary long-term headache. There is the issue of > > confusion > > of installation amongst newbies, as well as technical problems with > > packaging > > because of file collisions, package manager issues, etc. Interesting - thanks for bringing this up. When talking about scipy recently, I always felt the need to be very precise, whether the issue is related to "scipy *core*" or "*full* scipy" (and in addition to distinguish from "*old* scipy"); I was about to voice my concerns a couple, but swamped by all the other stuff... > > I wonder if we wouldn't be better off with the scipy_core package > > being named > > something else altogether, let's say 'ndarray' (or whatever: I suck at > > coming > > up with good names). Are there any other options? For example (not meant to be complete): - num - numpy - numeric - numsci - numcore - numarr - numlib - arrcore - arrlib - ndarray - Array - sciarray - scicore - numcore Personally, I am not sure if we really need "core" in the name. 
The shorter the better. In view of the new chararrays, `num` or `sci` might be already too specific, so would `Array` be a good option? In addition, maybe one could even think of replacing "scipy" (for `full scipy`) by `scikits` (or some other name). This would mean that Numeric, numarray, scipy (old), `scipy core`, `scipy new` could exist in parallel in site-packages. This could make the transition to the new scipy easier. Best, Arnd From Chris.Fonnesbeck at MyFWC.com Sun Jan 1 12:48:04 2006 From: Chris.Fonnesbeck at MyFWC.com (Fonnesbeck, Chris) Date: Sun, 1 Jan 2006 12:48:04 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: Message-ID: On 1/1/06 10:26 AM, "Arnd Baecker" wrote: >>> I wonder if we wouldn't be better off with the scipy_core package >>> being named >>> something else altogether, let's say 'ndarray' (or whatever: I suck at >>> coming >>> up with good names). > > > Are there any other options? > For example (not meant to be complete): > > - num > - numpy > - numeric > - numsci > - numcore > - numarr > - numlib > - arrcore > - arrlib > - ndarray > - Array > - sciarray > - scicore > - numcore > I like numpy, which was the informal name for Numeric. I agree that there is little merit in keeping "core" in the name. Alternatively, what about using "scipy" for scipy_core, and "scipy_ext" or "scipy_plus" for the full package? C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: Chris.Fonnesbeck at MyFWC.com -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2131 bytes Desc: not available URL: From jh at oobleck.astro.cornell.edu Sun Jan 1 13:17:41 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Sun, 1 Jan 2006 13:17:41 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: (scipy-dev-request@scipy.net) References: Message-ID: <200601011817.k01IHfCB015817@oobleck.astro.cornell.edu> +1 (because I'm not allowed to do +10!) on renaming. This will be a *lot* simpler for newbies, and for the doc and web-page and support email writers who are trying to help them. And the other reasons. I like keeping "sci" as the common prefix, so scicore or sciarray are both nice. If we do ultimately (I'm thinking 5 years) want to get scicore into the mainstream Python, in addition to ndarray, using the native distutils might be a plus. Otherwise, could such a move make trouble for the author of an add-on who didn't want to depend on full scipy but who wanted scipy.distutils? (And if not, then why do we have it at all?) Is that scenario more likely than an author wanting just one other package in full scipy (thus meriting special treatment for scipy.distutils)? --jh-- From charlesr.harris at gmail.com Sun Jan 1 14:34:04 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 1 Jan 2006 12:34:04 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: References: <43B72B79.8040807@ieee.org> Message-ID: On 1/1/06, Arnd Baecker wrote: > > On Sat, 31 Dec 2005, Travis Oliphant wrote: > > > Are there any other options? 
> For example (not meant to be complete): > > - num > - numpy > - numeric > - numsci > - numcore > - numarr > - numlib > - arrcore > - arrlib > - ndarray > - Array > - sciarray > - scicore > - numcore I would go for ndarray so I could use nd as the short version, but sciarray or scicore would also suit. I think numpy has too strong a connection to numeric. Where does the nd in ndarray come from, anyway? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sun Jan 1 14:43:50 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 1 Jan 2006 12:43:50 -0700 Subject: [SciPy-dev] Example of power of new data-type descriptors. In-Reply-To: <200512301946.05452.faltet@carabos.com> References: <43AFB124.9060507@ieee.org> <200512301946.05452.faltet@carabos.com> Message-ID: On 12/30/05, Francesc Altet wrote: > > A Dilluns 26 Desembre 2005 10:00, Travis Oliphant va escriure: > > I'd like more people to know about the new power that is in scipy core > > due to the general data-type descriptors that can now be used to define > > numeric arrays. Towards that effort here is a simple example (be sure > > to use latest SVN -- there were a coupld of minor changes that improve > > usability made recently). Notice this example does not use a special > > "record" array subclass. This is just a regular array. > > IMO, this is very good stuff and it opens the door to support > homogeneous, heterogeneous and character strings in just one object. > That makes the inclusion of such an object in Python a very big > improvement because people will finally have a very effective > container for virtually *any* kind of large datasets in an easy way. > I'm personally very excited about this new functionality :-) I wonder if it couldn't be adapted to read binary c structures off disk. There are lots of data acquisition programs that store their data that way, even though doing so introduces all sorts of packing and alignment problems due differences between compilers and architectures. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From pearu at scipy.org Sun Jan 1 14:07:37 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 1 Jan 2006 13:07:37 -0600 (CST) Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B73EFD.5070406@ieee.org> References: <43B72B79.8040807@ieee.org> <3d375d730512311745l4114c2d1q7de121ae5ef6a14b@mail.gmail.com> <43B73EFD.5070406@ieee.org> Message-ID: Renaming and installing scipy_core packages under different namespace from scipy is certainly possible. Technically it does not simplify importing packages, we would still need pkgload hooks for setting scipy.__doc__ and scipy.__all__ without importing scipy subpackages. It is a bit unfortunate that now, when many of the issues with importing scipy_core and "full scipy" packages have been resolved and importing scipy became stable, we need a major renaming patches again. I really hope it will be the last one if we decide to do it. OT: Btw, with the current scipy_core from svn import scipy # (i) and from scipy import * # (ii) do already the right thing with proper scipy.__all__ list, that is, (i) imports only subpackages that have global symbols and (ii) imports all subpackages. Anyway, I am in favor of renaming scipy_core, but preferably to something shorter. `scicore` sounds good to me, I would propose also `scico` or `scire`, but they seem to be taken for other project names, note that also `scicore` is taken. 
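Chuck's question above -- whether the new data-type descriptors could be used to read binary C structures off disk -- is exactly the kind of thing they enable. The sketch below is illustrative only: it uses today's numpy names as a stand-in for the scipy_core API discussed in this thread, and the struct layout, field names and file name are all invented.

    import numpy as np   # stand-in for scipy_core; names assumed

    # Layout matching a hypothetical packed C struct:
    #   struct sample { int32_t id; double value; char tag[4]; };
    # np.dtype(fields, align=True) would instead add C-compiler-style padding.
    sample_t = np.dtype([('id', '<i4'), ('value', '<f8'), ('tag', 'S4')])

    # Write a couple of records to disk, then read them back as a plain array.
    recs = np.array([(1, 3.14, b'abc'), (2, 2.72, b'def')], dtype=sample_t)
    recs.tofile('samples.bin')

    data = np.fromfile('samples.bin', dtype=sample_t)
    print(data['value'])   # field access on an ordinary ndarray
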
Hmm, also `sciarray` is taken. As for applying the renaming, it should be mostly a replace scipy->.. task. I'll be tomorrow away but I'll be back on January 3rd to help with the renaming. And as for dropping scipy.distutils, scipy.f2py etc from scipy_core, I'm -1. We have discussed that before. scipy.distutils have many features over std distutils that considerably simplify maintaining scipy setup.py files etc. I am continuously cleaning up scipy.distutils to use more and more std distutils. However, I doubt that scipy.distutils anytime soon or ever, to be honest, could be replaced with std distutils. Pearu From Fernando.Perez at colorado.edu Sun Jan 1 15:31:13 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 01 Jan 2006 13:31:13 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: References: <43B72B79.8040807@ieee.org> Message-ID: <43B83C11.7000004@colorado.edu> Hi all, > I would go for ndarray so I could use nd as the short version, but sciarray > or scicore would also suit. I think numpy has too strong a connection to > numeric. Where does the nd in ndarray come from, anyway? I think the 'nd' is short for n-dimensional, to emphasize that it's more than a simple array (1-d) or matrix (2-d) library. Here's a suggestion. I'll use 'scicore' and 'ndarray' as working names, but the words can be changed. What I want to emphasize here is the organizational structure, not the actual names. How about the package being called scicore, which would include the modules: scicore/ __init__.py ->ndarray/ __init__.py ->f2py/ ->weave/ ->distutils/ ->anythingelse (fft/linalg/... - the 'lite' versions we have today)/ scicore's __init__ would have: __all__ = ['whatever'] + ndarray.__all__ With this organization, we get: 1. The scicore package will provide all the necessary array functionality (which actually lives in ndarray), as well as a number of infrastructure utilities. These are needed by scipy (full), but they can also be very useful to other scientifically-oriented packages. This would mean that third-party library writers can rely on these useful tools, without the fear of pulling 'the full scipy behemoth' in as a dependency. I know that, regardless of how much we succeed in making scipy_full work well and be easy to install, as a third-party author myself (for ipython), the thought of every new non-stdlib dependency for me requires very, very careful consideration. Offering this light core as common ground could have beneficial effects in the long term, as it would be a good unifying layer. 2. If we ever decide to push anything into the python core proper, it would most likely be just (part or all of) ndarray. This is the object targeted by Travis' array PEP, and the piece of most wide-ranging usefulness for non-scientific use (PIL, WX, etc.). This would keep our namespaces isolated, so that if at some point in the future this is considered, one could just package ndarray standalone and give it to Guido and his team. It might require a bit of distutils cleanup, but that's something we can worry about later (by that point, the functionality in scicore.distutils may even have been pushed upstream to the Python's official distutils). Aside from the necessary refactoring work right now, I don't see any major drawbacks to this, and do see benefits for the long term. 
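Fernando's proposal amounts to a very thin top-level package. A rough sketch of what its __init__.py could look like under that organization follows; note that 'scicore' and 'ndarray' are only the working names used in this thread, the subpackage list is illustrative, and the file obviously assumes those subpackages exist rather than being runnable on its own.

    # scicore/__init__.py  -- sketch of the layout proposed above
    from . import ndarray                  # the core n-dimensional array object
    from . import f2py, distutils          # build/wrapping infrastructure
    from .ndarray import *                 # re-export the array API at the top level

    __all__ = ['ndarray', 'f2py', 'distutils'] + ndarray.__all__
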
Regards, f From pearu at scipy.org Sun Jan 1 14:56:55 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 1 Jan 2006 13:56:55 -0600 (CST) Subject: [SciPy-dev] scipy.pkgload (was Re: Moving random.py) In-Reply-To: <43B71DEB.3050002@astraw.com> References: <43A2B069.9000300@ieee.org> <43A2C1B6.1070502@bigpond.net.au> <43A36202.7090504@colorado.edu> <43A36A45.6080107@ieee.org> <43A4A036.2050704@colorado.edu> <43B71DEB.3050002@astraw.com> Message-ID: On Sat, 31 Dec 2005, Andrew Straw wrote: > This is great. I've made a few changes to allow the new system to work > with setuptools. Basically, the new patches don't assume that the > package __path__ variable contains only a single directory, but rather > searches all directories in __path__ for subpackages and modules. This > allows scipy_core and full scipy to both exist as .eggs without any > other changes. I hope you can commit this patch or something similar > before the release this coming week. Thanks for the patch. I have applied the patch with modifications. I haven't test it with eggs, so, could give a try. Thanks! Pearu From strawman at astraw.com Sun Jan 1 17:12:19 2006 From: strawman at astraw.com (Andrew Straw) Date: Sun, 01 Jan 2006 14:12:19 -0800 Subject: [SciPy-dev] scipy.pkgload (was Re: Moving random.py) In-Reply-To: References: <43A2B069.9000300@ieee.org> <43A2C1B6.1070502@bigpond.net.au> <43A36202.7090504@colorado.edu> <43A36A45.6080107@ieee.org> <43A4A036.2050704@colorado.edu> <43B71DEB.3050002@astraw.com> Message-ID: <43B853C3.4020407@astraw.com> Pearu Peterson wrote: >On Sat, 31 Dec 2005, Andrew Straw wrote: > > > >>This is great. I've made a few changes to allow the new system to work >>with setuptools. Basically, the new patches don't assume that the >>package __path__ variable contains only a single directory, but rather >>searches all directories in __path__ for subpackages and modules. This >>allows scipy_core and full scipy to both exist as .eggs without any >>other changes. I hope you can commit this patch or something similar >>before the release this coming week. >> >> > >Thanks for the patch. I have applied the patch with modifications. I >haven't test it with eggs, so, could give a try. > > It works fine. Thanks. Incidentally, this means that eggs with "namespace packages" will work with scipy now. In other words, we could implement "scikits" that are distributed separately but reside in the scipy.* namespace with no further work (for both egg-based and normal distributions). And then there's the question of whether this is a good idea... From oliphant.travis at ieee.org Sun Jan 1 17:57:43 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 01 Jan 2006 15:57:43 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: References: <43B72B79.8040807@ieee.org> Message-ID: <43B85E67.1010600@ieee.org> Thanks for the comments. I think we should go ahead with the renaming. Yes, it will be a pain for a bit. But, I agree that long term it's a good idea. I don't want to explain the difference between full scipy and scipy_core anymore either... I don't think scipy should change its name. The (full) scipy package will still be called scipy. Now, options are open for naming. I've liked many of the suggestions so far. Here are my votes. The parenthesized names are other possibilities. sci (arrlab, scicore)/ __init__.py ->numpy (ndarray, narray)/ __init__.py ->f2py/ ->distutils/ ->weave/ ** see comments ** ->linalg/ ->fft/ ->random/ **Comment: I'm thinking about moving weave over to full scipy. 
I recognize it's utility, but with pyrex gaining popularity and the fact that weave is not maintained actively by anyone, it concerns me to keep it in the basic package. I know we talked about this before, but it's the only sub-package I still feel timid around. Is someone willing to step up and keep weave up-to-date? I think we can do this soon. Could we move all of the package loading stuff over to scipy and keep sci a relatively simple package? The name changes are the worst, but I'm willing to do them (in matplotlib CVS as well). It's time to speak up about names ore forever hold your peace :-) -Travis From robert.kern at gmail.com Sun Jan 1 18:07:29 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 01 Jan 2006 17:07:29 -0600 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B85E67.1010600@ieee.org> References: <43B72B79.8040807@ieee.org> <43B85E67.1010600@ieee.org> Message-ID: <43B860B1.3050603@gmail.com> Travis Oliphant wrote: > sci (arrlab, scicore)/ > __init__.py > > ->numpy (ndarray, narray)/ > __init__.py > > ->f2py/ > > ->distutils/ > > ->weave/ ** see comments ** > > ->linalg/ > > ->fft/ Conflict with the function scicore.fft()? -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charlesr.harris at gmail.com Sun Jan 1 18:32:52 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 1 Jan 2006 16:32:52 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B85E67.1010600@ieee.org> References: <43B72B79.8040807@ieee.org> <43B85E67.1010600@ieee.org> Message-ID: On 1/1/06, Travis Oliphant wrote: > > > Thanks for the comments. I think we should go ahead with the renaming. > Yes, it will be a pain for a bit. But, I agree that long term it's a > good idea. I don't want to explain the difference between full scipy > and scipy_core anymore either... > > I don't think scipy should change its name. The (full) scipy package > will still be called scipy. > > Now, options are open for naming. I've liked many of the suggestions so > far. Here are my votes. The parenthesized names are other possibilities. > > sci (arrlab, scicore)/ > __init__.py > > ->numpy (ndarray, narray)/ > __init__.py > > ->f2py/ > > ->distutils/ > > ->weave/ ** see comments ** > > ->linalg/ > > ->fft/ > > ->random/ I would rather replace sci by numpy and then use ndarray in the package. Sci doesn't tell me the these are array utilities. Neither does numpy, really, but it has a history... Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From Chris.Fonnesbeck at MyFWC.com Sun Jan 1 18:37:57 2006 From: Chris.Fonnesbeck at MyFWC.com (Fonnesbeck, Chris) Date: Sun, 1 Jan 2006 18:37:57 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: Message-ID: On 1/1/06 6:32 PM, "Charles R Harris" wrote: > I would rather replace sci by numpy and then use ndarray in the package. Sci > doesn't tell me the these are array utilities. Neither does numpy, really, but > it has a history... If we want an informative name, how about "arrayutils"? I still like "numpy" though, both because of the connection to the packages it is replacing, and because it matches the form of "scipy" C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. 
Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: Chris.Fonnesbeck at MyFWC.com -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2131 bytes Desc: not available URL: From Fernando.Perez at colorado.edu Sun Jan 1 18:58:54 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 01 Jan 2006 16:58:54 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B85E67.1010600@ieee.org> References: <43B72B79.8040807@ieee.org> <43B85E67.1010600@ieee.org> Message-ID: <43B86CBE.1060903@colorado.edu> Travis Oliphant wrote: > sci (arrlab, scicore)/ I'm +0.5 on scicore: I think it emphasizes the fact that this constitutes the necessary core functionality for scipy, without any of the current confusion problems. The following is a little experiment, to see how a blurb would read (remove the weave reference as needed). Feel free to s/scicore/your_favorite/ to see how it would read: """scicore is a package for array-based numerical computing in Python. It provides a collection of modules useful for a wide variety of numerical tasks, similar in spirit to the basic functionality of systems such as Matlab or IDL. At its center is a flexible array type which can be very useful even for non-scientific tasks that interface with arrays in C libraries. The scicore package consists of: - ndarray: a package exposing a flexible datatype to efficiently manipulate n-dimensional arrays. Such objects are implemented as C extensions and support array element-wise arithmetic and other mathematical operations. While they are conceptually similar to the arrays of languages like Fortran90, they are in fact much more capable objects, as their contents can consist of arbitrary python objects and they have very rich indexing, data access and low-level maninpulation capabilities. - f2py: ... The scicore package can be installed on any standard python distribution, and it has no other dependencies for installation than Python itself (and a C compiler for a source-based install, as there is extension code in it). At runtime, some of its components do require the presence of compilers (f2py needs a Fortran compiler, while weave needs a C++ one), but these are not needed unless you explicitly import and use f2py or weave. The scipy package builds on top of scicore, to expose a large collection of algorithms and libraries for many tasks in scientific computing. scipy wraps many well-known and field-tested Fortran and C libraries, as well as providing new routines written both in C and in Python. """ Just a little experiment in language :) > ->numpy (ndarray, narray)/ -1 on 'numpy': if at some point in the future this is considered a candidate for the stdlib, I don't think we want anything with a 'py' suffix in the package name (not a single module in the stdlib ends in 'py' today, except for 'copy'). I also don't want to have to explain to anybody "well, this isn't really the old numpy, it's the new numpy, which was written by the guy who was maintaining the old numpy, but it's a new code, and it also has features from numarray, so it's like the old numpy + numarray + better, but it's called numpy again. Easy, no?" Really, I don't. +1 on 'ndarray', because it emphasizes the fact that these are generic, very flexible (esp. with all the recent enhancements) N-dimensional data structures. 
In that sense, 'narray' has a stronger hint of 'n for numbers', while Travis has shown that these objects can be much more than traditional Fortran-type arrays. I also think ndarray goes well as an extension to the 'array' module in the stdlib. They are, after all, conceptually close. > **Comment: I'm thinking about moving weave over to full scipy. I > recognize it's utility, but with pyrex gaining popularity and the fact that > weave is not maintained actively by anyone, it concerns me to keep it in > the basic package. I know we talked about this before, but it's the only > sub-package I still feel timid around. Is someone willing to step up and > keep weave up-to-date? I really think that weave is great, but I also know how annoying it is to debug (each patch I've commited for it has taken me an afternoon to understand the code flow). I can volunteer to help with it, but can't really make very hard promises on my performance there (ipython is my top priority in terms of free software development time). So if you think it will help to move it over to scipy, by all means do so (unless someone else wants to play with weave: Prabhu and Robert two others who have tackled the beast in the past...). Cheers, f From faltet at carabos.com Sun Jan 1 19:04:00 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 01:04:00 +0100 Subject: [SciPy-dev] Some suggestions for scipy_core Message-ID: <200601020104.01140.faltet@carabos.com> Hi, I've started looking at giving suport of scipy_core objects into PyTables. I begun by homogenous array (not chararrays or recarrays yet), and discovered some issues that perhaps can be improved in scipy_core: 1.- Is there any reason for this? In [67]: type(scicore.array(3, dtype='D')) Out[67]: In [68]: type(scicore.array(3).astype('D')) Out[68]: !!!!!!!! However, astype() seems to work correctly with arrays: In [69]: type(scicore.array([3], dtype='D')) Out[69]: In [70]: type(scicore.array([3]).astype('D')) Out[70]: 2.- For the sake of compatibility with numarray, I'm wondering if you can include entries in the typeDict dictionary that follows numarray conventions. In [98]: scicore.typeDict['UInt16'] --------------------------------------------------------------------------- exceptions.KeyError Traceback (most recent call last) /home/faltet/python.nobackup/scipy_core/ KeyError: 'UInt16' However, this exists: In [99]: scicore.typeDict['Uint16'] Out[99]: Better yet, would the change 'UintXX' --> 'UIntXX' possible? I also like this because 1) is a standard (at least is what the users of numarray are used to, so why invent yet another convention?) and 2) if you want to obtain the unsigned version of a type, you just have to write: 'U'+signed_type, and you are done. Also, for numarray compatibility, could 'complexXXX' --> 'ComplexXXX' be made? And add a 'Bool'? I know that a type 'Bool8' exists, but perhaps 'Bool' can be added as an alias of 'Bool8'. 3.- Finally, I'd advocate to have a more handy string representation for .dtype. Right now, one have: In [117]: a.dtype --------> a.dtype() Out[117]: 0 (why dtype supports the call protocol? Is it really necessary?) In [118]: str(a.dtype) Out[118]: "" In [121]: repr(a.dtype) Out[121]: "" I'd suggest that the string representation for dtype would follow the numarray convention, so that str(a.dtype) would give 'Int32', 'UInt32' and so on. This would enhance the interoperativity between numarray and scipy_core (during the time they have to coexist). 
Also, having a simple string representation for types may easy the type manipulation. The repr(a.dtype) would be left as it is now. Thanks and Happy New Year! -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From faltet at carabos.com Sun Jan 1 19:08:17 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 01:08:17 +0100 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B7643E.6030402@shrogers.com> References: <43B72B79.8040807@ieee.org> <43B7643E.6030402@shrogers.com> Message-ID: <200601020108.17734.faltet@carabos.com> A Diumenge 01 Gener 2006 06:10, Steven H. Rogers va escriure: > +1 scicore -1 numpy (may sound confusing with existing Numeric) +1 scicore +5 bmh (basic matrix handling) +10 bmatrix (basic matrix) Just to add more noise into this ;-) -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From oliphant.travis at ieee.org Sun Jan 1 18:07:41 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 01 Jan 2006 16:07:41 -0700 Subject: [SciPy-dev] Alternative to Docstrings in C-code Message-ID: <43B860BD.3080507@ieee.org> I want to let people know that I've started a little project to remove docstrings from c-code and place them in a python module that will dynamically add docstrings to builtin functions. I don't know about you, but I hate writing docstrings in C-code. It's much harder to get the formatting to look right, and it's much more painful than in Python. Also, having all the docstrings in one place might actually make for a nice little reference. I don't know what kind of loading speed impact this will have (if any), so I'm proceeding cautiously. Currently, there is a simple utitlity called add_docstring in the _compiled_base module that can add a docstring to a builtin-function-or-method object or a type object. It only works if the object does not already have a docstring (because once added the Python string is never released until Python exits). I've heard some things on blogs that convince me that (unfortunately) I need to reassure certain people that this is *not* some sneaky attempt to remove docstrings. I'm actually trying to make them more convenient to add, update, and maintain. I like docstrings as much as anybody. I also like them to look good. -Travis From Fernando.Perez at colorado.edu Sun Jan 1 19:35:29 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 01 Jan 2006 17:35:29 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <200601020108.17734.faltet@carabos.com> References: <43B72B79.8040807@ieee.org> <43B7643E.6030402@shrogers.com> <200601020108.17734.faltet@carabos.com> Message-ID: <43B87551.1070802@colorado.edu> Francesc Altet wrote: > A Diumenge 01 Gener 2006 06:10, Steven H. Rogers va escriure: > >>+1 scicore > > > -1 numpy (may sound confusing with existing Numeric) > +1 scicore > +5 bmh (basic matrix handling) > +10 bmatrix (basic matrix) > > Just to add more noise into this ;-) -1 to anything with a direct reference to 'matrix': we're trying to convey that these objects go beyond 2-d matrices, in contrast with matlab and other 'matrix-oriented' languages/systems. 
Cheers, f From Fernando.Perez at colorado.edu Sun Jan 1 19:37:34 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 01 Jan 2006 17:37:34 -0700 Subject: [SciPy-dev] Alternative to Docstrings in C-code In-Reply-To: <43B860BD.3080507@ieee.org> References: <43B860BD.3080507@ieee.org> Message-ID: <43B875CE.90809@colorado.edu> Travis Oliphant wrote: > I want to let people know that I've started a little project to remove > docstrings from c-code and place them in a python module that will > dynamically add docstrings to builtin functions. I don't know about > you, but I hate writing docstrings in C-code. It's much harder to get > the formatting to look right, and it's much more painful than in > Python. Also, having all the docstrings in one place might actually > make for a nice little reference. > > I don't know what kind of loading speed impact this will have (if any), > so I'm proceeding cautiously. Currently, there is a simple utitlity > called add_docstring in the _compiled_base module that can add a > docstring to a builtin-function-or-method object or a type object. It > only works if the object does not already have a docstring (because once > added the Python string is never released until Python exits). Great! I also hate writing docstrings in C, and this may significantly lower the barrier for users to contribute docstring patches. It's a lot easier (esp. if a user is not a C programmer) to edit a plain python file full of """strings""" than to dig into the C sources. Since documentation is an area where users who don't feel comfortable with the internals can still make excellent contributions, I think this is an excellent development. Cheers, f From schofield at ftw.at Sun Jan 1 20:45:26 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 2 Jan 2006 01:45:26 +0000 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B86CBE.1060903@colorado.edu> References: <43B72B79.8040807@ieee.org> <43B85E67.1010600@ieee.org> <43B86CBE.1060903@colorado.edu> Message-ID: <724E3D8F-AF9B-4147-BBEF-FF66B6CA1A5C@ftw.at> On 01/01/2006, at 11:58 PM, Fernando Perez wrote: > Travis Oliphant wrote: > >> sci (arrlab, scicore)/ > > I'm +0.5 on scicore: I think it emphasizes the fact that this > constitutes the > necessary core functionality for scipy, without any of the current > confusion > problems. The following is a little experiment, to see how a blurb > would read > (remove the weave reference as needed). Feel free to s/scicore/ > your_favorite/ > to see how it would read: > > """scicore is a package for array-based numerical computing in Python. > > It provides a collection of modules useful for a wide variety of > numerical > tasks, similar in spirit to the basic functionality of systems such > as Matlab > or IDL. At its center is a flexible array type which can be very > useful even > for non-scientific tasks that interface with arrays in C libraries. > > The scicore package consists of: > > - ndarray: a package exposing a flexible datatype to efficiently > manipulate > n-dimensional arrays. Such objects are implemented as C extensions > and > support array element-wise arithmetic and other mathematical > operations. > While they are conceptually similar to the arrays of languages like > Fortran90, > they are in fact much more capable objects, as their contents can > consist of > arbitrary python objects and they have very rich indexing, data > access and > low-level maninpulation capabilities. > > - f2py: ... 
> > > The scicore package can be installed on any standard python > distribution, and > it has no other dependencies for installation than Python itself > (and a C > compiler for a source-based install, as there is extension code in > it). At > runtime, some of its components do require the presence of > compilers (f2py > needs a Fortran compiler, while weave needs a C++ one), but these > are not > needed unless you explicitly import and use f2py or weave. > > > The scipy package builds on top of scicore, to expose a large > collection of > algorithms and libraries for many tasks in scientific computing. > scipy wraps > many well-known and field-tested Fortran and C libraries, as well > as providing > new routines written both in C and in Python. > """ > > Just a little experiment in language :) > >> ->numpy (ndarray, narray)/ > > -1 on 'numpy': if at some point in the future this is considered a > candidate > for the stdlib, I don't think we want anything with a 'py' suffix > in the > package name (not a single module in the stdlib ends in 'py' today, > except for > 'copy'). > > I also don't want to have to explain to anybody "well, this isn't > really the > old numpy, it's the new numpy, which was written by the guy who was > maintaining the old numpy, but it's a new code, and it also has > features from > numarray, so it's like the old numpy + numarray + better, but it's > called > numpy again. Easy, no?" Really, I don't. > > +1 on 'ndarray', because it emphasizes the fact that these are > generic, > very flexible (esp. with all the recent enhancements) N-dimensional > data > structures. In that sense, 'narray' has a stronger hint of 'n for > numbers', > while Travis has shown that these objects can be much more than > traditional > Fortran-type arrays. > > I also think ndarray goes well as an extension to the 'array' > module in the > stdlib. They are, after all, conceptually close. > I'm with Fernando on this, for all the same reasons. So: +1 on 'ndarray' for the basic array module. +1 on 'scicore' for the whole package. And ... +1 on moving weave to scipy, to keep scicore lean and mean ;) -- Ed From charlesr.harris at gmail.com Sun Jan 1 21:02:00 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 1 Jan 2006 19:02:00 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <724E3D8F-AF9B-4147-BBEF-FF66B6CA1A5C@ftw.at> References: <43B72B79.8040807@ieee.org> <43B85E67.1010600@ieee.org> <43B86CBE.1060903@colorado.edu> <724E3D8F-AF9B-4147-BBEF-FF66B6CA1A5C@ftw.at> Message-ID: On 1/1/06, Ed Schofield wrote: > > > On 01/01/2006, at 11:58 PM, Fernando Perez wrote: > > > Travis Oliphant wrote: > > > >> sci (arrlab, scicore)/ > > > > I'm +0.5 on scicore: I think it emphasizes the fact that this > > constitutes the > > necessary core functionality for scipy, without any of the current > > confusion > > problems. The following is a little experiment, to see how a blurb > > would read > > (remove the weave reference as needed). Feel free to s/scicore/ > > your_favorite/ > > to see how it would read: > > > > """scicore is a package for array-based numerical computing in Python. > > > > It provides a collection of modules useful for a wide variety of > > numerical > > tasks, similar in spirit to the basic functionality of systems such > > as Matlab > > or IDL. At its center is a flexible array type which can be very > > useful even > > for non-scientific tasks that interface with arrays in C libraries. 
> > > > The scicore package consists of: > > > > - ndarray: a package exposing a flexible datatype to efficiently > > manipulate > > n-dimensional arrays. Such objects are implemented as C extensions > > and > > support array element-wise arithmetic and other mathematical > > operations. > > While they are conceptually similar to the arrays of languages like > > Fortran90, > > they are in fact much more capable objects, as their contents can > > consist of > > arbitrary python objects and they have very rich indexing, data > > access and > > low-level maninpulation capabilities. > > > > - f2py: ... > > > > > > The scicore package can be installed on any standard python > > distribution, and > > it has no other dependencies for installation than Python itself > > (and a C > > compiler for a source-based install, as there is extension code in > > it). At > > runtime, some of its components do require the presence of > > compilers (f2py > > needs a Fortran compiler, while weave needs a C++ one), but these > > are not > > needed unless you explicitly import and use f2py or weave. > > > > > > The scipy package builds on top of scicore, to expose a large > > collection of > > algorithms and libraries for many tasks in scientific computing. > > scipy wraps > > many well-known and field-tested Fortran and C libraries, as well > > as providing > > new routines written both in C and in Python. > > """ > > > > Just a little experiment in language :) > > > >> ->numpy (ndarray, narray)/ > > > > -1 on 'numpy': if at some point in the future this is considered a > > candidate > > for the stdlib, I don't think we want anything with a 'py' suffix > > in the > > package name (not a single module in the stdlib ends in 'py' today, > > except for > > 'copy'). > > > > I also don't want to have to explain to anybody "well, this isn't > > really the > > old numpy, it's the new numpy, which was written by the guy who was > > maintaining the old numpy, but it's a new code, and it also has > > features from > > numarray, so it's like the old numpy + numarray + better, but it's > > called > > numpy again. Easy, no?" Really, I don't. > > > > +1 on 'ndarray', because it emphasizes the fact that these are > > generic, > > very flexible (esp. with all the recent enhancements) N-dimensional > > data > > structures. In that sense, 'narray' has a stronger hint of 'n for > > numbers', > > while Travis has shown that these objects can be much more than > > traditional > > Fortran-type arrays. > > > > I also think ndarray goes well as an extension to the 'array' > > module in the > > stdlib. They are, after all, conceptually close. > > > > I'm with Fernando on this, for all the same reasons. So: > +1 on 'ndarray' for the basic array module. > +1 on 'scicore' for the whole package. > > And ... > +1 on moving weave to scipy, to keep scicore lean and mean ;) > > > -- Ed There seems to be a consensus developing. So +1 on 'ndarray' for the basic array module. +1 on 'scicore' for the whole package. And ... +1 on moving weave to scipy. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From schofield at ftw.at Sun Jan 1 21:38:31 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 2 Jan 2006 02:38:31 +0000 Subject: [SciPy-dev] [SciPy-user] default dtype References: <9E95E32C-373A-4830-A8B3-FF16605B7181@ftw.at> Message-ID: <2698AB6B-03E0-484B-A2CC-305FC2D55134@ftw.at> On 30/12/2005, at 7:14 PM, Travis Oliphant wrote: > Alan G Isaac wrote: > >> Is an integer data type the obvious >> default for 'empty'? I expected float. >> > This question comes up occasionally. The reason for int is largely > historical --- that's what was decided long ago when Numeric came out. > Changing this in some places would break a lot of code, I'm afraid. > And the default for empty is done for consistency. I felt it > better to > have one default rather than many. > > The default can be changed in one place in the C-code if we did decide > to change it. Now's the time because version 1.0 is approaching > in the > next couple of months. Version 0.9 will be the first-of-the-year > release. +5 on changing the default to float. I think we'd look back on this decision in several years as difficult but right. Here are some ideas on how we could ease the transition: (1) We could provide new functions intzeros(), intones(), and intempty () with the same behaviour as the current functions. That is, integer types would be the default, but this could be overridden by a dtype keyword argument. Then converting any old Numeric / numarray code would just require another global string substitution in convertcode.py. (2) We could provide two sets of functions, intzeros() etc. and floatzeros() etc., and remove the default interpretation altogether from the standard zeros() functions. This is not ideal long term, but could be a useful temporary measure during a transition for shaking out bugs from the scicore and scipy trees. (3) The default type could be chosen by the user as a package-level global variable. I think this would be the best solution. Then the old integer default could be turned on with one line of Python code. I suppose that Python functions using this default, given the static evaluation of default argument values, would need the "dtype=None" idiom in function headers followed by dtype=global_dtype in the function body. -- Ed From oliphant.travis at ieee.org Sun Jan 1 22:34:01 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Sun, 01 Jan 2006 20:34:01 -0700 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601020104.01140.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> Message-ID: Francesc Altet wrote: > Hi, > > I've started looking at giving suport of scipy_core objects into > PyTables. I begun by homogenous array (not chararrays or recarrays > yet), and discovered some issues that perhaps can be improved in > scipy_core: > > 1.- Is there any reason for this? > > In [67]: type(scicore.array(3, dtype='D')) > Out[67]: > > In [68]: type(scicore.array(3).astype('D')) > Out[68]: !!!!!!!! > The reason is that you are getting an "array scalar" because 0-dimensional arrays are only returned from the array function, itself. All other functions return array scalars. Please note that complex128_arrtype is an actual python type object that can hold a single item of the array. > > 2.- For the sake of compatibility with numarray, I'm wondering if you > can include entries in the typeDict dictionary that follows numarray > conventions. 
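The distinction Travis describes -- a zero-dimensional array versus an array scalar -- is easy to poke at interactively. A small sketch, using today's numpy names as a stand-in for the scicore calls quoted above (the 2006 behaviour may differ in detail, so nothing is asserted about the second result):

    import numpy as np   # stand-in for the scicore package discussed here

    a = np.array(3, dtype='D')        # the array() constructor: a 0-dimensional array
    b = np.array(3).astype('D')      # other entry points may hand back an array scalar

    print(type(a), a.ndim, a.shape)   # ndarray, 0, ()
    print(type(b))                    # inspect what astype() actually returned
    print(isinstance(b, np.generic))  # array scalars all derive from np.generic
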
> > In [98]: scicore.typeDict['UInt16'] > --------------------------------------------------------------------------- > exceptions.KeyError Traceback (most > recent call last) > > /home/faltet/python.nobackup/scipy_core/ > > KeyError: 'UInt16' > > However, this exists: > > In [99]: scicore.typeDict['Uint16'] > Out[99]: This is a mistake. Uint16 should *be* UInt16. That is one of the big purposes of typeDict (compatibility with numarray). > > Also, for numarray compatibility, could 'complexXXX' --> 'ComplexXXX' > be made? And add a 'Bool'? I know that a type 'Bool8' exists, but > perhaps 'Bool' can be added as an alias of 'Bool8'. Sounds fine on the Bool. But, note that unlike in numarray, the lower-case types are actual Python type objects. The capitalized names are just string typecodes. Also, ComplexXXX uses the old (bit-width of the float) convention for backwards compatibility, whereas the new complexXXX names use item-bit-width convention. > > 3.- Finally, I'd advocate to have a more handy string representation > for .dtype. Right now, one have: > > In [117]: a.dtype > --------> a.dtype() > Out[117]: 0 > > (why dtype supports the call protocol? Is it really necessary?) > Because it's an actual type object. I think we could disable the tp_new method of the type. Curently, this allows you do say int32([1,2,3,4]) to construct an array. > In [118]: str(a.dtype) > Out[118]: "" > > In [121]: repr(a.dtype) > Out[121]: "" This would require creating a new metaclass because a.dtype is a typeobject and so prints using the standard type object printing. We would have to inherit from the typeobject to change that standard. I think rather than shield people from the idea that dtype is an actual typeobject (which changing the printing would do) we should emphaize that fact. Thanks for your suggestions. -Travis From oliphant.travis at ieee.org Sun Jan 1 22:45:51 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 01 Jan 2006 20:45:51 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B860B1.3050603@gmail.com> References: <43B72B79.8040807@ieee.org> <43B85E67.1010600@ieee.org> <43B860B1.3050603@gmail.com> Message-ID: <43B8A1EF.6040703@ieee.org> Robert Kern wrote: >Travis Oliphant wrote: > > > >>sci (arrlab, scicore)/ >> __init__.py >> >> ->numpy (ndarray, narray)/ >> __init__.py >> >> ->f2py/ >> >> ->distutils/ >> >> ->weave/ ** see comments ** >> >> ->linalg/ >> >> ->fft/ >> >> > >Conflict with the function scicore.fft()? > > I was thinking we wouldn't pull up fft, ifft, rand, and randn in scicore, but let scipy and/or pylab do that kind of name-space manipulation. Otherwise, we would have to call it fftlib or something. -Travis From prabhu_r at users.sf.net Sun Jan 1 23:18:27 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Mon, 2 Jan 2006 09:48:27 +0530 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B85E67.1010600@ieee.org> References: <43B72B79.8040807@ieee.org> <43B85E67.1010600@ieee.org> Message-ID: <17336.43411.318433.401353@monster.iitb.ac.in> >>>>> "Travis" == Travis Oliphant writes: Travis> The name changes are the worst, but I'm willing to do them Travis> (in matplotlib CVS as well). 
It's time to speak up about Travis> names ore forever hold your peace :-) How about the following crazy (and not so crazy) alternatives (humor me, please): s (too short?), sc (short for scipy_core), score, cores, sarray, sarr, sar, scar, sciray, siray, travis, toarr (Travis Oliphant's Array), toa, tao (Travis "Array" Oliphant), art (Array Travis), arto, star. I'd better stop before someone thinks I've been smoking funny things. Somehow scicore sounds unimaginative and too serious. cheers, prabhu From steve at shrogers.com Sun Jan 1 23:41:09 2006 From: steve at shrogers.com (Steven H. Rogers) Date: Sun, 01 Jan 2006 21:41:09 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B87551.1070802@colorado.edu> References: <43B72B79.8040807@ieee.org> <43B7643E.6030402@shrogers.com> <200601020108.17734.faltet@carabos.com> <43B87551.1070802@colorado.edu> Message-ID: <43B8AEE5.3020901@shrogers.com> -1 for anything with matrix in it -1 scicore (withdrawing previous vote) +1 ndarray +1 arrlab +1 arrproc +1 ap Regards, Steve ///////////////////// Fernando Perez wrote: > Francesc Altet wrote: > >>A Diumenge 01 Gener 2006 06:10, Steven H. Rogers va escriure: >> >> >>>+1 scicore >> >> >>-1 numpy (may sound confusing with existing Numeric) >>+1 scicore >>+5 bmh (basic matrix handling) >>+10 bmatrix (basic matrix) >> >>Just to add more noise into this ;-) > > > -1 to anything with a direct reference to 'matrix': we're trying to convey > that these objects go beyond 2-d matrices, in contrast with matlab and other > 'matrix-oriented' languages/systems. > > Cheers, > > f > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > > -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From jdhunter at ace.bsd.uchicago.edu Sun Jan 1 23:43:14 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Sun, 01 Jan 2006 22:43:14 -0600 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B72B79.8040807@ieee.org> (Travis Oliphant's message of "Sat, 31 Dec 2005 18:08:09 -0700") References: <43B72B79.8040807@ieee.org> Message-ID: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Travis" == Travis Oliphant writes: Travis> We could call the scipy_core package scicore, or numcore, Travis> if that is the only concern. The biggest concern is that I've been happy with the name "numerix" in matplotlib. It is intended to connote "Numeric" in plural, eg numeric, numarray and (now) scipy core. Also, the import abbrev import numerix as nx is a good mnemonic and is less likely to clash with the local namespace than other popular historical choices import Numeric as N import scipy as s import pylab as p And lots of folks are using it already as an array-object interface. I've been happy in my own code with nx.mlab.mean, nx.arange, and so forth.... It appears to me that scipy core is for the most part more Numeric and less scipy, in that is has most of what Numeric/numarray offered but does not have the bulk of scipy. Thus a name like numerix may better reflect what the package offers: a core array object with benefits. JDH From charlesr.harris at gmail.com Mon Jan 2 00:07:52 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 1 Jan 2006 22:07:52 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? 
In-Reply-To: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: On 1/1/06, John Hunter wrote: > > >>>>> "Travis" == Travis Oliphant writes: > > Travis> We could call the scipy_core package scicore, or numcore, > Travis> if that is the only concern. The biggest concern is that > > I've been happy with the name "numerix" in matplotlib. It is intended > to connote "Numeric" in plural, eg numeric, numarray and (now) scipy > core. Also, the import abbrev > > import numerix as nx > That sounds good. Is that Asterix's nerdy nephew? -------------- next part -------------- An HTML attachment was scrubbed... URL: From prabhu_r at users.sf.net Mon Jan 2 00:23:30 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Mon, 2 Jan 2006 10:53:30 +0530 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <17336.47314.52741.74517@prpc.aero.iitb.ac.in> >>>>> "John" == John Hunter writes: >>>>> "Travis" == Travis Oliphant writes: Travis> We could call the scipy_core package scicore, or numcore, Travis> if that is the only concern. The biggest concern is that [...] John> scipy. Thus a name like numerix may better reflect what the John> package offers: a core array object with benefits. Obelix, Vitalstatistix and other Asterix character names come readily to mind. cheers, prabhu-going-completely-nuts From golux at comcast.net Mon Jan 2 00:31:59 2006 From: golux at comcast.net (Stephen Waterbury) Date: Mon, 02 Jan 2006 00:31:59 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <17336.47314.52741.74517@prpc.aero.iitb.ac.in> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> <17336.47314.52741.74517@prpc.aero.iitb.ac.in> Message-ID: <43B8BACF.5030008@comcast.net> Prabhu Ramachandran wrote: >>>>>>"John" == John Hunter writes: > >>>>>>"Travis" == Travis Oliphant writes: > [...] > John> scipy. Thus a name like numerix may better reflect what the > John> package offers: a core array object with benefits. > > Obelix, Vitalstatistix and other Asterix character names come readily > to mind. Wow, I just realized that Asterix's mother must have been ... matrix! (Pronounced with a short "a", of course ... sorry, couldn't resist :) Steve From prabhu_r at users.sf.net Mon Jan 2 00:55:25 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Mon, 2 Jan 2006 11:25:25 +0530 Subject: [SciPy-dev] [ANN] The Enthought Tool Suite Message-ID: <17336.49229.573301.509742@prpc.aero.iitb.ac.in> Greetings, This is to announce the *existence* of the Enthought Tool Suite (ETS). ETS is a collection of several useful components that together provide a powerful application framework with special emphasis on scientific and engineering applications. There are a large number of components, some stable and some under heavy development. These include: * Traits: (general purpose: stable) - Traits allow one to define special Python object attributes that elegantly support initialization, validation, delegation, notification and visualization (i.e. automatic generation of user interfaces) * Endo: (general purpose: stable) - Endo is a Traits aware API documentation tool. * PyFace: (user interface: stable) - PyFace is a collection of Traits-based MVC components for wxPython. 
* Envisage: (general purpose: UNSTABLE) - Envisage is an extensible Traits-based application framework. It offers a plug-in based architecture similar to the Java Eclipse framework. While most of the existing plug-ins are stable, the overall package itself, and the UI plug-in in particular, are still under heavy development. * Kiva: (graphics: stable) - Kiva is a multi-platform Display PDF 2D drawing engine that supports multiple output backends, including wxWidgets on Windows, GTK, and Carbon, a variety of raster image formats, PDF, and Postscript. * Chaco: (plotting: being re-factored) - Chaco is a Python toolkit for producing interactive plotting applications. Chaco applications can range from simple line plotting scripts up to GUI applications for interactively exploring different aspects of interrelated data. Chaco leverages other Enthought technologies such as Kiva, Enable, and Traits to produce highly interactive plots of publication quality. * TVTK: (graphics: stable) - TVTK is a Pythonic VTK wrapper for visualization, graphics and imaging in 2D/3D. It transparently supports Traits and Numeric/numarray/scipy arrays. * MayaVi2: (scientific/engineering: semi-stable, unfinished) - MayaVi2 is an Envisage plug-in for 2D/3D scientific data visualization. It is the next generation of the MayaVi scientific data visualizer. These tools together form the Enthought Tool Suite (ETS). ETS has been under heavy development for the last year or two. While some of the tools are still under heavy development, many of them like Traits, PyFace, Kiva, TVTK and others are stable enough for general use. Enthought's enhanced Python distribution (Enthon) and installers for several of the stable ETS components mentioned above are available here: http://code.enthought.com/ The source code is released under a BSD-like license. Up-to-date installers for the ETS are on their way and will be announced when ready. Those who are interested in the components that are still under development should check out the SVN repository (a fresh working copy is likely to occupy about 250MB). Trac interface (Wiki, code, issue tracker): http://www.enthought.com/enthought Build instructions: http://www.enthought.com/enthought/wiki/GrabbingAndBuilding Mailing list: http://mail.enthought.com/mailman/listinfo/envisage-dev ETS is developed and funded by Enthought, Inc., Austin, TX. Enjoy using the Enthought Tool Suite! Cheers, The Enthought Tool Suite Team From nwagner at mecha.uni-stuttgart.de Mon Jan 2 05:02:37 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 02 Jan 2006 11:02:37 +0100 Subject: [SciPy-dev] AttributeError: PackageLoader instance has no attribute '_obj2str' Message-ID: Hi all, A Happy New Year and my first bug report for latest svn of core/scipy. from scipy import * File "/usr/local/lib/python2.4/site-packages/scipy/__init__.py", line 335, in ? pkgload(verbose=SCIPY_IMPORT_VERBOSE,postpone=True) File "/usr/local/lib/python2.4/site-packages/scipy/__init__.py", line 216, in __call__ self.warn('Overwriting %s=%s (was %s)' \ AttributeError: PackageLoader instance has no attribute '_obj2str' Nils From cimrman3 at ntc.zcu.cz Mon Jan 2 05:35:39 2006 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 02 Jan 2006 11:35:39 +0100 Subject: [SciPy-dev] Renaming scipy_core ??? 
In-Reply-To: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <43B901FB.2090801@ntc.zcu.cz> John Hunter wrote: >>>>>>"Travis" == Travis Oliphant writes: > > > Travis> We could call the scipy_core package scicore, or numcore, > Travis> if that is the only concern. The biggest concern is that > > I've been happy with the name "numerix" in matplotlib. It is intended > to connote "Numeric" in plural, eg numeric, numarray and (now) scipy > core. Also, the import abbrev > > import numerix as nx > > is a good mnemonic and is less likely to clash with the local > namespace than other popular historical choices > > import Numeric as N > import scipy as s > import pylab as p > > And lots of folks are using it already as an array-object interface. > I've been happy in my own code with nx.mlab.mean, nx.arange, and so > forth.... I cannot resist to generate some more names :) I do often import (numeric/scipy.base) as nm which brings me to the idea to use directly the ultrashort 'nm' (or 'nx'!). r. From dd55 at cornell.edu Mon Jan 2 06:10:49 2006 From: dd55 at cornell.edu (Darren Dale) Date: Mon, 2 Jan 2006 06:10:49 -0500 (EST) Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <1711.68.67.236.115.1136200249.squirrel@webmail.cornell.edu> >>>>>> "Travis" == Travis Oliphant writes: > > Travis> We could call the scipy_core package scicore, or numcore, > Travis> if that is the only concern. The biggest concern is that > > I've been happy with the name "numerix" in matplotlib. It is intended > to connote "Numeric" in plural, eg numeric, numarray and (now) scipy > core. Also, the import abbrev > > import numerix as nx I like this idea a lot From arnd.baecker at web.de Mon Jan 2 06:18:56 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 2 Jan 2006 12:18:56 +0100 (CET) Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <1711.68.67.236.115.1136200249.squirrel@webmail.cornell.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> <1711.68.67.236.115.1136200249.squirrel@webmail.cornell.edu> Message-ID: On Mon, 2 Jan 2006, Darren Dale wrote: [ John's suggestion] > > I've been happy with the name "numerix" in matplotlib. It is intended > > to connote "Numeric" in plural, eg numeric, numarray and (now) scipy > > core. Also, the import abbrev > > > > import numerix as nx > > I like this idea a lot +1 Still, let me *try* to keep an (presumably incomplete) overview about the suggestions: * "scipy core" - scicore (containing ndarray, which could go into normal python - FP) - numerix - Array - ap - arrcore - arrayutils - arrlab - arrlib - arrproc - darray - narray - ndarray - nm - num - numarr - numeric - numcore - numpy - numsci - numlib - nx - sci - sciarray I also like the following ones, but ... - tao or tAo ? - scivoo (especially for Prabhu ;-) General remarks: - IMHO: no core in the name of the package - no matrix (or vec either) in the name of the package (it is about *arrays* ) - no py in the name of the part which does the arrays (Reason: if we want this to go into normal python at some point, there should be no "*py") * "full scipy" - scipy - scikits Personally, I like - numerix/scikits - numerix/scipy (I also like `numeric`, but that might be too close to "Numeric" ...) 
Best, Arnd From faltet at carabos.com Mon Jan 2 06:35:43 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 12:35:43 +0100 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: References: <43B72B79.8040807@ieee.org> <1711.68.67.236.115.1136200249.squirrel@webmail.cornell.edu> Message-ID: <200601021235.44711.faltet@carabos.com> A Dilluns 02 Gener 2006 12:18, Arnd Baecker va escriure: > On Mon, 2 Jan 2006, Darren Dale wrote: > > [ John's suggestion] > > > > I've been happy with the name "numerix" in matplotlib. It is intended > > > to connote "Numeric" in plural, eg numeric, numarray and (now) scipy > > > core. Also, the import abbrev > > > > > > import numerix as nx > > > > I like this idea a lot > > +1 Yeah, I like it too. Also, it sounds great to my (not very english used) ears! -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From faltet at carabos.com Mon Jan 2 06:47:48 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 12:47:48 +0100 Subject: [SciPy-dev] [SciPy-user] default dtype In-Reply-To: <2698AB6B-03E0-484B-A2CC-305FC2D55134@ftw.at> References: <9E95E32C-373A-4830-A8B3-FF16605B7181@ftw.at> <2698AB6B-03E0-484B-A2CC-305FC2D55134@ftw.at> Message-ID: <200601021247.49311.faltet@carabos.com> A Dilluns 02 Gener 2006 03:38, Ed Schofield va escriure: > +5 on changing the default to float. I think we'd look back on this > decision in several years as difficult but right. Sorry, but I don't agree. If we want Python to include the container for array objects, making the default be double seems to stress the fact that this object is meant primarily for scientific use (which is true to some extent). However, for the sake of stablishing a *real* standard to keep datasets, I'd advocate the default to remain int. In addition, there are a lot of uses for integer arrays (indices, images...). IMO, making the double the default would discourage the use of the object between people not used to write scientific/technical apps. The only issue is the possible confusion in users when they will receive Int32 arrays in 32-bit platforms and Int64 arrays in 64-bit ones. I don't know, but perhaps this single reason is strong enough to change the default to Float64. Ups, I think I've created more confusion in the discussion, -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From schofield at ftw.at Mon Jan 2 08:53:17 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 02 Jan 2006 13:53:17 +0000 Subject: [SciPy-dev] [SciPy-user] default dtype In-Reply-To: <200601021247.49311.faltet@carabos.com> References: <9E95E32C-373A-4830-A8B3-FF16605B7181@ftw.at> <2698AB6B-03E0-484B-A2CC-305FC2D55134@ftw.at> <200601021247.49311.faltet@carabos.com> Message-ID: <43B9304D.1040701@ftw.at> Francesc Altet wrote: >A Dilluns 02 Gener 2006 03:38, Ed Schofield va escriure: > > >>+5 on changing the default to float. I think we'd look back on this >>decision in several years as difficult but right. >> >> >Sorry, but I don't agree. If we want Python to include the container >for array objects, making the default be double seems to stress the >fact that this object is meant primarily for scientific use (which is >true to some extent). However, for the sake of stablishing a *real* >standard to keep datasets, I'd advocate the default to remain int. In >addition, there are a lot of uses for integer arrays (indices, >images...). 
IMO, making the double the default would discourage the >use of the object between people not used to write >scientific/technical apps. > >The only issue is the possible confusion in users when they will >receive Int32 arrays in 32-bit platforms and Int64 arrays in 64-bit >ones. I don't know, but perhaps this single reason is strong enough to >change the default to Float64. > > Interesting point. Then what do you think about an integer default that is redefinable by the user? For example: >>> import scicore / whatever >>> scicore.default_dtype = float64 Then zeros(), empty(), and ones() (any others?) would use the new default. I think the flexibility would be nice, and it should be feasible ... Perhaps the most compelling argument against an integer default is the current behaviour with unsafe casting: >>> a = zeros(10) >>> a[0] = 1.159262 >>> a[0] 1 and this argument would evaporate if unsafe casts were required to be more explicit. I promised to provide a patch and run some timings for this, but I haven't done this yet. -- Ed From schofield at ftw.at Mon Jan 2 09:11:55 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 02 Jan 2006 14:11:55 +0000 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <43B934AB.4000805@ftw.at> On 02/01/2006, at 4:43 AM, John Hunter wrote: > I've been happy with the name "numerix" in matplotlib. It is intended > to connote "Numeric" in plural, eg numeric, numarray and (now) scipy > core. Also, the import abbrev > > import numerix as nx > > is a good mnemonic and is less likely to clash with the local > namespace than other popular historical choices > > import Numeric as N > import scipy as s > import pylab as p > > And lots of folks are using it already as an array-object interface. > I've been happy in my own code with nx.mlab.mean, nx.arange, and so > forth.... > > It appears to me that scipy core is for the most part more Numeric and > less scipy, in that is has most of what Numeric/numarray offered but > does not have the bulk of scipy. Thus a name like numerix may better > reflect what the package offers: a core array object with benefits. I like this. The x in 'numerix' connotes a kind of genericity, as if all other array packages are just splinters off it. +1 on numerix I also had scary dreams last night about the name 'scicore'. So I'll withdraw my previous vote: -1 on scicore How about 'sciarray'? It fits in well with 'scipy' and sounds like a more full-featured version of numarray. It also seems that 'sciarray.org' is free, no?! Arguing about the new name is fun, and much easier than real work. I think we should change it twice a year. -- Ed From faltet at carabos.com Mon Jan 2 09:56:17 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 15:56:17 +0100 Subject: [SciPy-dev] [SciPy-user] default dtype In-Reply-To: <43B9304D.1040701@ftw.at> References: <9E95E32C-373A-4830-A8B3-FF16605B7181@ftw.at> <200601021247.49311.faltet@carabos.com> <43B9304D.1040701@ftw.at> Message-ID: <200601021556.18401.faltet@carabos.com> A Dilluns 02 Gener 2006 14:53, Ed Schofield va escriure: > Then what do you think about an integer default that is redefinable by > > the user? For example: > >>> import scicore / whatever > >>> scicore.default_dtype = float64 > > Then zeros(), empty(), and ones() (any others?) would use the new > default. 
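As a concrete illustration of the proposal being quoted here, a minimal sketch of what a user-redefinable default could look like (the names set_default_dtype and the toy zeros below are hypothetical stand-ins, not the actual scipy_core API):

    # Hypothetical sketch only: a module-level default consulted by the
    # constructors whenever no dtype is passed explicitly.
    _default_dtype = int                 # today's behaviour: integer default

    def set_default_dtype(dtype):
        """Redefine the dtype used when zeros()/ones()/empty() get no dtype."""
        global _default_dtype
        _default_dtype = dtype

    def zeros(n, dtype=None):
        """Toy stand-in for zeros(): fall back to the module-wide default."""
        if dtype is None:
            dtype = _default_dtype
        return [dtype(0)] * n            # plain list, just to show the default at work

    print zeros(3)                       # [0, 0, 0]    -- int is the default
    set_default_dtype(float)
    print zeros(3)                       # [0.0, 0.0, 0.0]

Note that a module-level global like this is exactly the kind of "fully global setting" that gets questioned a little further down in the thread.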
I think the flexibility would be nice, and it should be > feasible ... Yes. I think this would be really nice to have. That way, the 64/32-bit dichotomy would disappear. > Perhaps the most compelling argument against an integer default is the > > current behaviour with unsafe casting: > >>> a = zeros(10) > >>> a[0] = 1.159262 > >>> a[0] > > 1 > > and this argument would evaporate if unsafe casts were required to be > more explicit. I promised to provide a patch and run some timings for > this, but I haven't done this yet. Frankly, I don't find this specially bad. If double would be the default we have similar problems: >>> a = zeros(10) >>> a[0] = 1 >>> a[0] 1.0 i.e. the value has been upgraded without our knowlegde. Of course, you may say that upgrading is always better than downgrading, but if we follow this reasoning to the end, then complex128 should be the default, which is somewhat crazy. IMO, the user has to be concious about the default type when creating arrays. Once he is used to the default, everything should go well. And again, allowing a user-definable default would be very handy. -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From Fernando.Perez at colorado.edu Mon Jan 2 10:01:41 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 08:01:41 -0700 Subject: [SciPy-dev] [SciPy-user] default dtype In-Reply-To: <200601021556.18401.faltet@carabos.com> References: <9E95E32C-373A-4830-A8B3-FF16605B7181@ftw.at> <200601021247.49311.faltet@carabos.com> <43B9304D.1040701@ftw.at> <200601021556.18401.faltet@carabos.com> Message-ID: <43B94055.9000109@colorado.edu> Francesc Altet wrote: > A Dilluns 02 Gener 2006 14:53, Ed Schofield va escriure: > >>Then what do you think about an integer default that is redefinable by >> >>the user? For example: >> >>> import scicore / whatever >> >>> scicore.default_dtype = float64 >> >>Then zeros(), empty(), and ones() (any others?) would use the new >>default. I think the flexibility would be nice, and it should be >>feasible ... > > > Yes. I think this would be really nice to have. That way, the > 64/32-bit dichotomy would disappear. Is this supposed to be a fully global setting? If so, what about import numerix numerix.default_dtype = 'something' import somemodule somemodule.foo() boom! Now, the foo() call either: 1. blows up, because it had unqualified zeros() calls whose dtype has now changed, or 2. resets numerix.default_type back, and your code blows up next. I think this can be handled with a _call_ (numerix.set_default_dtype()), but it requires special stack-handling code so that numerix can know to apply the new default only to calls made from the same module. Cheers, f From oliphant.travis at ieee.org Mon Jan 2 10:50:46 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Mon, 02 Jan 2006 08:50:46 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <43B94BD6.5050104@ieee.org> John Hunter wrote: >>>>>>"Travis" == Travis Oliphant writes: >>>>>> >>>>>> > > Travis> We could call the scipy_core package scicore, or numcore, > Travis> if that is the only concern. The biggest concern is that > >I've been happy with the name "numerix" in matplotlib. It is intended >to connote "Numeric" in plural, eg numeric, numarray and (now) scipy >core. 
Also, the import abbrev > > import numerix as nx > >is a good mnemonic and is less likely to clash with the local >namespace than other popular historical choices > > I like the numerix name. My only concern is would that mess up the use of numerix as a superset of all the array packages? Perhaps that would be a "good thing" :-) -Travis From Fernando.Perez at colorado.edu Mon Jan 2 11:17:26 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 09:17:26 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B94BD6.5050104@ieee.org> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> <43B94BD6.5050104@ieee.org> Message-ID: <43B95216.5040601@colorado.edu> Travis Oliphant wrote: >>I've been happy with the name "numerix" in matplotlib. It is intended >>to connote "Numeric" in plural, eg numeric, numarray and (now) scipy >>core. Also, the import abbrev >> >> import numerix as nx >> >>is a good mnemonic and is less likely to clash with the local >>namespace than other popular historical choices >> >> > > I like the numerix name. My only concern is would that mess up the use > of numerix as a superset of all the array packages? Perhaps that would > be a "good thing" :-) I'm also +1 on numerix (I'd forgotten about it, but I've always liked it). There seems to be some consensus building on this one, at least from the votes today. Even though I like a lot the idea of naming it just 'nx' as Robert suggested, I think that nx is too likely to collide with codes that have (nx,ny,nz) variables in them (quite common). Leaving the actual package name as 'numerix' allows everybody to choose a local shorthand that doesn't cause name collisions in their own namespace. So my vote is +1 numerix -> the new scipy_core name +1 ndarray -> the new name for the actual array package with the possibility that, in the future, ndarray may be pitched for inclusion in the core. Cheers, f ps - Travis, thanks for humoring my concerns on this at such a late time. Your patience is much appreciated. From guyer at nist.gov Mon Jan 2 11:24:23 2006 From: guyer at nist.gov (Jonathan Guyer) Date: Mon, 2 Jan 2006 11:24:23 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B934AB.4000805@ftw.at> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> <43B934AB.4000805@ftw.at> Message-ID: <7292494e4748628a073363508e43a3a0@nist.gov> On Jan 2, 2006, at 9:11 AM, Ed Schofield wrote: > I also had scary dreams last night about the name 'scicore'. That's what happens when you stay up late to watch Babylon 5 reruns. From faltet at carabos.com Mon Jan 2 11:54:02 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 17:54:02 +0100 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: References: <200601020104.01140.faltet@carabos.com> Message-ID: <200601021754.03309.faltet@carabos.com> A Dilluns 02 Gener 2006 04:34, Travis E. Oliphant va escriure: > > In [98]: scicore.typeDict['UInt16'] > > ------------------------------------------------------------------------- > >-- exceptions.KeyError Traceback (most > > recent call last) > > > > /home/faltet/python.nobackup/scipy_core/ > > > > KeyError: 'UInt16' > > > This is a mistake. Uint16 should *be* UInt16. That is one of the big > purposes of typeDict (compatibility with numarray). Good! > > Also, for numarray compatibility, could 'complexXXX' --> 'ComplexXXX' > > be made? And add a 'Bool'? 
I know that a type 'Bool8' exists, but > > perhaps 'Bool' can be added as an alias of 'Bool8'. > > Sounds fine on the Bool. But, note that unlike in numarray, the > lower-case types are actual Python type objects. The capitalized names > are just string typecodes. Also, ComplexXXX uses the old (bit-width of > the float) convention for backwards compatibility, whereas the new > complexXXX names use item-bit-width convention. Mmm, I'm afraid that you added 'bool' but forgot about 'Bool'. Also, ComplexXXX seems to follow item-bit-width convention instead of numarray convention. Although I'm a bit hesitant in this regard in the sense that having a 'complex64' equivalent to 'Complex32' can be a source of confusion. For the sake of consistency, I'd advocate leaving things as they are now, i.e. 'Complex64' --> 'complex64' and 'Complex128' --> 'complex128'. I know that this violates the numarray convention, but consistency should always be a must. Perhaps a possible solution would be to define another dictionary just for numarray mappings. So, we can have typeDict with the new types defined in scipy_core and, say, typeDictNA that contains the types defined in numarray. I think this would be a good way to avoid unnecessary confusions and to simplify the already cluttered typeDict. What do others think? > > 3.- Finally, I'd advocate to have a more handy string representation > > for .dtype. Right now, one has: > > > > In [117]: a.dtype > > --------> a.dtype() > > Out[117]: 0 > > > > (why dtype supports the call protocol? Is it really necessary?) > > Because it's an actual type object. I think we could disable the tp_new > method of the type. Currently, this allows you to say int32([1,2,3,4]) > to construct an array. Having the possibility to create arrays with int32([1,2,3,4]) sounds appealing indeed. However, I wonder if you are providing too much functionality to the dtype object. For me, it should just be a container for the type of the data, but allowing it to be also a factory for arrays may be confusing. This object could also define some methods to write the different string representations for dtype. For example dtype.char() will be equivalent to .dtypechar() defined in ndarray object. In the same way, dtype.str() would be equivalent to .dtypestr and dtype.descr() would be equivalent to .dtypedescr(). Maybe unrelated to this discussion, but important anyway because of the (supposedly) large userbase that ipython has is the fact that ipython will call the __call__() function for the object, if it finds it. This makes things worse for the (ipython) user that wants just to print the value of the dtype object. So, my opinion on this subject is that dtype should just be a container for information, and not a factory, and that __call__ should be disabled on it (it is no longer necessary in this scenario). You may add different intxx(), floatxx()... factories, but having array() I think that would be more than enough. > > In [118]: str(a.dtype) > > Out[118]: "" > > > > In [121]: repr(a.dtype) > > Out[121]: "" > > This would require creating a new metaclass because a.dtype is a > typeobject and so prints using the standard type object printing. We > would have to inherit from the typeobject to change that standard. > > I think rather than shield people from the idea that dtype is an actual > typeobject (which changing the printing would do) we should emphasize > that fact. I see your point.
However, for better numarray interaction, I would like to have a mean for a direct translation between numarray<-->scipy_core types. For example, given a dtype like , I'd like to be able to have a direct map into numarray type (I think this is not possible now, is it?). In fact, this is another point in favor of defining a new typeDictNA (or maybe typeNA, for short), with keys for mapping the types in both senses. For example: {: 'Int8', : 'Int16',... 'b': 'Int8', 'h': 'Int16', ... # For shorter char keys 'i1': 'Int8', 'i2': 'Int16', ... # For string convention keys 'Int8': 'b', 'Int16': 'h', ...} # The reverse mapping Maybe this is too specific to my needs, but I'm sure that this can facilitate the migration from numarray to scipy_core to other people as well. Finally, I have yet another (the last one, at least for today ;-) question about scipy_core: In [35]: print scicore.typeDict['I'], scicore.typeDict['L'] In [36]: scicore.typeDict['I'] == scicore.typeDict['L'] Out[36]: False The same goes for: In [38]: print scicore.typeDict['i'], scicore.typeDict['l'] In [39]: print scicore.typeDict['i'] == scicore.typeDict['l'] False Is this consistent?. I would say no (I'm on a 32-bit platform). Cheers, -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From Fernando.Perez at colorado.edu Mon Jan 2 12:05:04 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 10:05:04 -0700 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601021754.03309.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> Message-ID: <43B95D40.5050005@colorado.edu> > Maybe unrelated with this discussion, but important anyway because of > the (supposedly) large userbase that ipython have is the fact that > ipython will call the __call__() function for the object, if it founds > it. This makes things worse for the (ipython) user that wants just to > print the value of the dtype object. This is a known case of the tension between convenience and correctness, that ipython tries (not always successfully) to balance. Note that to aid a little in situations like these, ipython offers some options: 1. Toggle autocalling off for a while: In [1]: int ------> int() Out[1]: 0 In [2]: autocall Automatic calling is: OFF In [3]: int Out[3]: You can turn it on again as needed. 2. Turn it off permanently in your ~/.ipython/ipythonrc file, if it's causing more problems than not: autocall 0 3. If you just want to quickly print a value, the %p magic is short for 'print': In [7]: int ------> int() Out[7]: 0 In [8]: p int (if you happen to have a 'p' variable, call %p, which is still a little less typing than 'print' :) I hope this helps, f From faltet at carabos.com Mon Jan 2 13:35:09 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 19:35:09 +0100 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <43B95D40.5050005@colorado.edu> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> <43B95D40.5050005@colorado.edu> Message-ID: <200601021935.10046.faltet@carabos.com> A Dilluns 02 Gener 2006 18:05, Fernando Perez va escriure: > > Maybe unrelated with this discussion, but important anyway because of > > the (supposedly) large userbase that ipython have is the fact that > > ipython will call the __call__() function for the object, if it founds > > it. 
This makes things worse for the (ipython) user that wants just to > > print the value of the dtype object. > > This is a known case of the tension between convenience and correctness, > that ipython tries (not always successfully) to balance. > > Note that to aid a little in situations like these, ipython offers some > options: [snip] Yes, I knew this. But the point is that newbie users (and not as newbie, because I find toggling autocall on and off a bit annoying), will be somewhat surprised about the results. This was actually the main reason why I've decided to turn off the __call__() in many objects in PyTables. The other reason is that I replaced __call__() by more meaningful names and the result is far more readable (you know, explicit is better than implicit). Thanks anyway for remembering, -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From Fernando.Perez at colorado.edu Mon Jan 2 13:49:29 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 11:49:29 -0700 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601021935.10046.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> <43B95D40.5050005@colorado.edu> <200601021935.10046.faltet@carabos.com> Message-ID: <43B975B9.5000407@colorado.edu> Francesc Altet wrote: > A Dilluns 02 Gener 2006 18:05, Fernando Perez va escriure: > >>>Maybe unrelated with this discussion, but important anyway because of >>>the (supposedly) large userbase that ipython have is the fact that >>>ipython will call the __call__() function for the object, if it founds >>>it. This makes things worse for the (ipython) user that wants just to >>>print the value of the dtype object. >> >>This is a known case of the tension between convenience and correctness, >>that ipython tries (not always successfully) to balance. >> >>Note that to aid a little in situations like these, ipython offers some >>options: > > [snip] > > Yes, I knew this. But the point is that newbie users (and not as > newbie, because I find toggling autocall on and off a bit annoying), > will be somewhat surprised about the results. This was actually the > main reason why I've decided to turn off the __call__() in many > objects in PyTables. The other reason is that I replaced __call__() by > more meaningful names and the result is far more readable (you know, > explicit is better than implicit). I've toyed with the idea of making the autocall flag be a 0,1,2 integer, where today's 'on' would become 2 for 'full', while 1 would be an intermediate mode, doing autocall ONLY if there are arguments: 1: foo bar -> foo(bar) foo -> foo 2: foo bar -> foo(bar) foo -> foo() But I'm out of time for more significant ipython work for now, I'm afraid (a lot of new features went in recently, but now I need to work on other things). Vote if you like it, though, and I'll consider it for the future. Cheers, f From jh at oobleck.astro.cornell.edu Mon Jan 2 16:09:53 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Mon, 2 Jan 2006 16:09:53 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? 
In-Reply-To: (scipy-dev-request@scipy.net) References: Message-ID: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> Besides seeking a name that is attractive and meaningful, we should remember functionality: - nx is a super-common variable name - numerix is used by some to mean "any array package" - some suggested names are taken already by other packages Despite any political desires on our part, shouldn't we play nice and not use any of these? Think the Google mantra:...no evil...no evil... How about: ndarray actual array package (and try to get it in mainstream py) arrmath core math setup scipy big-tent, well-maintained, well-documented, stable scisand unmaintained, experimental, or prototype (alt: coventry) Note that, while the word "math" appears in arrmath, and may thus send some people screaming, those individuals are unlikely to use FFTs or linear algebra. "ndarray" is unassuming and informative, and probably what those people want anyway (if they want our distutils, they have to face the music, but they don't have to dance). We need to resolve this so Travis can actually *do* it! Perhaps someone could set up a poll on scipy.org or new.scipy.org? --jh-- From Fernando.Perez at colorado.edu Mon Jan 2 16:18:14 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 14:18:14 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> Message-ID: <43B99896.8060003@colorado.edu> Joe Harrington wrote: > Besides seeking a name that is attractive and meaningful, we should > remember functionality: > > - nx is a super-common variable name > - numerix is used by some to mean "any array package" Note that, to my knowledge, the only two uses of 'numerix' in the wild right now are by matplotlib and enthought. John specifically already suggested numerix, so he's obviously OK with that. And in enthought, numerix is also (I think) used as a compatibility layer, so once we only have one array package, I imagine they'd be OK (though confirmation from them would be welcome). On technical grounds I think there is no real conflict, as their current imports are of the form import {matplotlib,enthought}.numerix vs our proposed top-level name import numerix. Those who do from matplotlib import numerix would actually be OK, since matplotlib.numerix would return, as it does today, an indirection object which refers to either Numeric, numarray or numerix (the top-level one). So I'm not sure that I see either a real technical or social objection to 'numerix' (modulo final confirmation from Enthought. Robert?) Cheers, f From perry at stsci.edu Mon Jan 2 16:21:54 2006 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 2 Jan 2006 16:21:54 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: numerix is also my favorite (and not just because we coined it; I wished I had thought of it in place of numarray; at least it is now available for the next incarnation). Perry On Jan 1, 2006, at 11:43 PM, John Hunter wrote: >>>>>> "Travis" == Travis Oliphant writes: > > Travis> We could call the scipy_core package scicore, or numcore, > Travis> if that is the only concern. The biggest concern is that > > I've been happy with the name "numerix" in matplotlib.
It is intended > to connote "Numeric" in plural, eg numeric, numarray and (now) scipy > core. Also, the import abbrev > > import numerix as nx > > is a good mnemonic and is less likely to clash with the local > namespace than other popular historical choices > > import Numeric as N > import scipy as s > import pylab as p > > And lots of folks are using it already as an array-object interface. > I've been happy in my own code with nx.mlab.mean, nx.arange, and so > forth.... > > It appears to me that scipy core is for the most part more Numeric and > less scipy, in that is has most of what Numeric/numarray offered but > does not have the bulk of scipy. Thus a name like numerix may better > reflect what the package offers: a core array object with benefits. > > JDH > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From perry at stsci.edu Mon Jan 2 16:24:05 2006 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 2 Jan 2006 16:24:05 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> Message-ID: <281fb160e386919f2084c1f906e7afdd@stsci.edu> On Jan 2, 2006, at 4:09 PM, Joe Harrington wrote: > Besides seeking a name that is attractive and meaningful, we should > remember functionality: > > - nx is a super-common variable name But any two letter combination is likely to be found in programs so this is not a significant issue (imho) > - numerix is used by some to mean "any array package" As Fernando points out, it's use is quite limited at the moment, and John has no problem with it being used differently. Perry From strawman at astraw.com Mon Jan 2 16:38:28 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 02 Jan 2006 13:38:28 -0800 Subject: [SciPy-dev] poll for renaming scipy_core online Message-ID: <43B99D54.2050202@astraw.com> To keep tallying easy, I've put up a poll at the new wiki: http://new.scipy.org/Wiki/RenamingCore I didn't list all the wonderful names proposed, since I figured if they were liked enough, someone would add them. Although I do think if Travis wants to adopt the new(?) middle name of "Array", we should pay any legal fees he incurs. :) Now, hopefully we won't have any election fraud here... From Fernando.Perez at colorado.edu Mon Jan 2 16:42:28 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 14:42:28 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B99D54.2050202@astraw.com> References: <43B99D54.2050202@astraw.com> Message-ID: <43B99E44.6080000@colorado.edu> Andrew Straw wrote: > To keep tallying easy, I've put up a poll at the new wiki: > > http://new.scipy.org/Wiki/RenamingCore > > I didn't list all the wonderful names proposed, since I figured if they > were liked enough, someone would add them. Although I do think if Travis > wants to adopt the new(?) middle name of "Array", we should pay any > legal fees he incurs. :) > > Now, hopefully we won't have any election fraud here... Notes: - scipy_core is NOT the current name, it's 'scipy'. We're defining "name" as "what goes after the import keyword in the code", not the lingo. So the 'no change' case would be 'scipy'. See, even you get confused, so we HAVE to change it :) - numerix is a proposed name for the whole package, not for the array core. 
I think for the array core, ndarray is so far leading (though other options have been proposed). Cheers, f From faltet at carabos.com Mon Jan 2 16:47:51 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 2 Jan 2006 22:47:51 +0100 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <43B975B9.5000407@colorado.edu> References: <200601020104.01140.faltet@carabos.com> <200601021935.10046.faltet@carabos.com> <43B975B9.5000407@colorado.edu> Message-ID: <200601022247.56236.faltet@carabos.com> A Dilluns 02 Gener 2006 19:49, Fernando Perez va escriure: > I've toyed with the idea of making the autocall flag be a 0,1,2 integer, > where today's 'on' would become 2 for 'full', while 1 would be an > intermediate mode, doing autocall ONLY if there are arguments: > > 1: foo bar -> foo(bar) > foo -> foo > > 2: foo bar -> foo(bar) > foo -> foo() > > But I'm out of time for more significant ipython work for now, I'm afraid > (a lot of new features went in recently, but now I need to work on other > things). > > Vote if you like it, though, and I'll consider it for the future. That's an interesting proposal, indeed. +1 for the improved autocall intermediate level. -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From Fernando.Perez at colorado.edu Mon Jan 2 16:51:05 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 14:51:05 -0700 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601022247.56236.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> <200601021935.10046.faltet@carabos.com> <43B975B9.5000407@colorado.edu> <200601022247.56236.faltet@carabos.com> Message-ID: <43B9A049.7030602@colorado.edu> Francesc Altet wrote: > A Dilluns 02 Gener 2006 19:49, Fernando Perez va escriure: > >>I've toyed with the idea of making the autocall flag be a 0,1,2 integer, >>where today's 'on' would become 2 for 'full', while 1 would be an >>intermediate mode, doing autocall ONLY if there are arguments: >> >>1: foo bar -> foo(bar) >> foo -> foo >> >>2: foo bar -> foo(bar) >> foo -> foo() >> >>But I'm out of time for more significant ipython work for now, I'm afraid >>(a lot of new features went in recently, but now I need to work on other >>things). >> >>Vote if you like it, though, and I'll consider it for the future. > > > That's an interesting proposal, indeed. > > +1 for the improved autocall intermediate level. Duly noted, thanks. Cheers, f From strawman at astraw.com Mon Jan 2 17:30:49 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 02 Jan 2006 14:30:49 -0800 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B99E44.6080000@colorado.edu> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> Message-ID: <43B9A999.8030009@astraw.com> Fernando, I've updated the wiki page to be more clear about what exactly the votes are for. Let me know if I'm still confused! 
:) Andrew From strawman at astraw.com Mon Jan 2 17:41:29 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 02 Jan 2006 14:41:29 -0800 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B99E44.6080000@colorado.edu> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> Message-ID: <43B9AC19.1050606@astraw.com> Fernando Perez wrote: >Andrew Straw wrote: > > >>To keep tallying easy, I've put up a poll at the new wiki: >> >>http://new.scipy.org/Wiki/RenamingCore >> >>I didn't list all the wonderful names proposed, since I figured if they >>were liked enough, someone would add them. Although I do think if Travis >>wants to adopt the new(?) middle name of "Array", we should pay any >>legal fees he incurs. :) >> >>Now, hopefully we won't have any election fraud here... >> >> > >Notes: > >- scipy_core is NOT the current name, it's 'scipy'. We're defining "name" as >"what goes after the import keyword in the code", not the lingo. So the 'no >change' case would be 'scipy'. See, even you get confused, so we HAVE to >change it :) > >- numerix is a proposed name for the whole package, not for the array core. I >think for the array core, ndarray is so far leading (though other options have >been proposed). > > OK, so following up on this, I've realized there's a potential issue here: If the "package name" (what you call "name") above becomes, for the sake of argument, numerix, does this mean that our distutils becomes numerix.distutils? And numerix.distutils and numerix.f2py would be necessary for scipy? An alternative is that the distribution could distribute two top-level packages (e.g. numerix and scipy). From Fernando.Perez at colorado.edu Mon Jan 2 17:58:22 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 15:58:22 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B9AC19.1050606@astraw.com> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> Message-ID: <43B9B00E.5040806@colorado.edu> Andrew Straw wrote: > OK, so following up on this, I've realized there's a potential issue here: > > If the "package name" (what you call "name") above becomes, for the > sake of argument, numerix, does this mean that our distutils becomes > numerix.distutils? And numerix.distutils and numerix.f2py would be > necessary for scipy? An alternative is that the distribution could > distribute two top-level packages (e.g. numerix and scipy). Why is this an issue? numerix, the package (consisting of ndarray, f2py, distutils and more) would be a dependency for scipy. It would constitute the 'core' (as we call it today) needed by the 'full' scipy, and it can be adopted as a reliable, easy to install dependency by many other projects (matplotlib, for example). I don't see this as a problem: numerix doesn't have runtime dependencies on Fortran or anything else, and only a gcc (or similar) build-time dependency, just like any other python extension module. With Win32 binaries being provided, this shouldn't be a problem. Keep in mind that, in terms of included packages (major internal rewrites, renaming and reorganization aside), we have: numerix ~ today's Numeric/numarray + f2py + scipy.distutils Both f2py and distutils are pure-python systems. Offering them in numerix means that no extra dependencies are needed for 'full' scipy. 
We then have the following dependency table (beyond Python itself): numerix dependencies -------------------- * binary install: - None * source install: - A C compiler. * runtime: - (optional) the f2py package, provided as part of numerix, requires a Fortran compiler to operate. f2py is not needed for numerix's normal functioning, it is a utility package. scipy dependencies ------------------ * binary install: - the numerix package * source install: - the numerix package - (optionally) the ATLAS libraries. - C and Fortran compilers. * runtime: - the numerix package - (optional) The weave module, which allows runtime dynaminc inlining of C and C++ code, requires C/C++ compilers to operate. weave is not needed for scipy's normal functioning, it is a utility package. This looks like a pretty decent situation to me. Am I missing something? Cheers, f From guyer at nist.gov Mon Jan 2 18:25:20 2006 From: guyer at nist.gov (Jonathan Guyer) Date: Mon, 2 Jan 2006 18:25:20 -0500 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B99896.8060003@colorado.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> Message-ID: <971690209da5c7d5fd07001afce2d026@nist.gov> On Jan 2, 2006, at 4:18 PM, Fernando Perez wrote: > Note that, to my knowledge, the only two uses of 'numerix' in the wild > right > now are by matplotlib and enthought. John specifically already > suggested > nuemrix, so he's obviously OK with that. And in enthought, numerix is > also (I > think) used as a compatibility layer, so once we only have one array > package, > I imagine they'd be OK (though confirmation from them would be > welcome). We use "numerix" in FiPy for the same reason, but I have no complaint (and some favor) with this becoming the new name of scipy-core. If it causes us problems, then it means that we haven't done a good enough job of isolating the specific array library used. From strawman at astraw.com Mon Jan 2 18:46:28 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 02 Jan 2006 15:46:28 -0800 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B9B00E.5040806@colorado.edu> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> Message-ID: <43B9BB54.60103@astraw.com> Fernando Perez wrote: >Andrew Straw wrote: > > > >>OK, so following up on this, I've realized there's a potential issue here: >> >>If the "package name" (what you call "name") above becomes, for the >>sake of argument, numerix, does this mean that our distutils becomes >>numerix.distutils? And numerix.distutils and numerix.f2py would be >>necessary for scipy? An alternative is that the distribution could >>distribute two top-level packages (e.g. numerix and scipy). >> >> > >Why is this an issue? > I just wanted to make sure the situation was clear to all. I found it potentially confusing that scipy.distutils and scipy.f2py would become part of the new package. I think it is worth reiterating why distutils and f2py are included with the new base package and not full scipy: As far as I can tell (correct me if I'm wrong), distutils is included because it does a lot of nice auto-detection of atlas, mkl, acml, and vecLib. Those packages are linked into the base array package (rather than lapack_lite) where possible. 
(This same issue is currently also the most challenging aspect of installing Numeric or numarray, so the situation doesn't go away if we, hypothetically speaking, shifted to Python distutils.) f2py's reasons for inclusion aren't so clear to me, but as it's pure Python, I don't mind its inclusion. Cheers! Andrew From Fernando.Perez at colorado.edu Mon Jan 2 19:00:00 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 17:00:00 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B9BB54.60103@astraw.com> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> Message-ID: <43B9BE80.7000901@colorado.edu> Andrew Straw wrote: > I just wanted to make sure the situation was clear to all. I found it > potentially confusing that scipy.distutils and scipy.f2py would become > part of the new package. > > I think it is worth reiterating why distutils and f2py are included with > the new base package and not full scipy: As far as I can tell (correct > me if I'm wrong), distutils is included because it does a lot of nice > auto-detection of atlas, mkl, acml, and vecLib. Those packages are > linked into the base array package (rather than lapack_lite) where > possible. (This same issue is currently also the most challenging aspect > of installing Numeric or numarray, so the situation doesn't go away if > we, hypothetically speaking, shifted to Python distutils.) f2py's > reasons for inclusion aren't so clear to me, but as it's pure Python, I > don't mind its inclusion. I see the f2py/distutils inclusion into numerix as serving two benefits: 1. It simplifies the scipy bootstrapping. Installing scipy from source will 'just work' (yeah, right :), given that it can assume that numerix.{f2py,distutils} are always there. No need to play path tricks to find them from within the install directory itself. It's quite possible that this bootstrapping issue is already well solved and doesn't really matter, I'm not too familiar with the most current internals, so correct me if I'm wrong. 2. It increases the value of numerix as a base layer for third-party scientific packages. I think authors will appreciate getting a few really good utilities for writing python scientific software, without making the full scipy a dependency. Weave (I think) also falls into this category, but its maintenance difficulty seems to tilt the decision in the direction of moving it to full scipy. I wish I could commit to maintaining it in numerix, but I really can't, and I understand Travis' desire to have numerix be a rock-solid, no-hassles-to-adopt foundation. It would be fantastic if someone with a C++ penchant wanted to pick this ball up. Just my take on it. cheers, f From listservs at mac.com Mon Jan 2 19:48:28 2006 From: listservs at mac.com (listservs at mac.com) Date: Mon, 2 Jan 2006 19:48:28 -0500 Subject: [SciPy-dev] linalg namespace changed? Message-ID: <3FBB4CDF-1C19-4C6E-81E8-468C9A122053@mac.com> It appears that recent changes to scipy_core in svn have broken the linalg namespace: >>> from scipy import linalg Traceback (most recent call last): File "", line 1, in ? ImportError: cannot import name linalg This worked a few days ago, but now does not (and breaks my PyMC module). Was this intended? C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. 
Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: Chris.Fonnesbeck at MyFWC.com From eric at enthought.com Mon Jan 2 21:31:18 2006 From: eric at enthought.com (eric jones) Date: Mon, 02 Jan 2006 20:31:18 -0600 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B99896.8060003@colorado.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> Message-ID: <43B9E1F6.6070609@enthought.com> Fernando Perez wrote: >Joe Harrington wrote: > > >>Besides seeking a name that is attractive and meaningful, we should >>remember functionality: >> >>- nx is a super-common variable name >>- numerix is used by some to mean "any array package" >> >> > >Note that, to my knowledge, the only two uses of 'numerix' in the wild right >now are by matplotlib and enthought. John specifically already suggested >nuemrix, so he's obviously OK with that. And in enthought, numerix is also (I >think) used as a compatibility layer, so once we only have one array package, >I imagine they'd be OK (though confirmation from them would be welcome). > > Unless Robert or Prabhu have any reason not to approve, I'm fine with it. eric From eric at enthought.com Mon Jan 2 21:37:29 2006 From: eric at enthought.com (eric jones) Date: Mon, 02 Jan 2006 20:37:29 -0600 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B9BE80.7000901@colorado.edu> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> Message-ID: <43B9E369.1070701@enthought.com> Fernando Perez wrote: >2. It increases the value of numerix as a base layer for third-party >scientific packages. I think authors will appreciate getting a few really >good utilities for writing python scientific software, without making the full >scipy a dependency. > >Weave (I think) also falls into this category, but its maintenance difficulty >seems to tilt the decision in the direction of moving it to full scipy. I >wish I could commit to maintaining it in numerix, but I really can't, and I >understand Travis' desire to have numerix be a rock-solid, no-hassles-to-adopt >foundation. It would be fantastic if someone with a C++ penchant wanted to >pick this ball up. > > I would love to say yes but am up to my ears in other commitments. Everyone is back in the office tomorrow, so I'll talk it over with them and see if we can come up with a strategy for maintaining weave.
thanks, eric From Fernando.Perez at colorado.edu Mon Jan 2 22:19:19 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 20:19:19 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B9E369.1070701@enthought.com> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> <43B9E369.1070701@enthought.com> Message-ID: <43B9ED37.7050207@colorado.edu> eric jones wrote: > Fernando Perez wrote: > > >>2. It increases the value of numerix as a base layer for third-party >>scientific packages. I think authors will appreciate getting a few really >>good utilities for writing python scientific software, without making the full >>scipy a dependency. >> >>Weave (I think) also falls into this category, but its maintenance difficulty >>seems to tilt the decision in the direction of moving it to full scipy. I >>wish I could commit to maintaining it in numerix, but I really can't, and I >>understand Travis' desire to have numerix be a rock-solid, no-hassles-to-adopt >>foundation. It would be fantastic if someone with a C++ penchant wanted to >>pick this ball up. >> >> > > I would love to say yes but am up to my ears in other commitments. > Everyone is back in the office tomorrow, so I'll talk it over with them > and see if we can come up with a strategy for maintaining weave. I think it's worth mentioning that weave up to the transition into the new scipy (I haven't really checked since) was working fairly well. All the bugs I had been seeing related to either compiler warnings or spurious recompilations had been fixed, and with the inclusion of the blitz 0.9 sources, things work even with gcc4. So this isn't really a major development commitment, but rather one of 'being there' if the need arises, I think. However, since I was the first to say I couldn't do it myself, I understand your position :) I just wanted to give a brief overview of where I think things stand. Perhaps someone else can contribute some additional, more recent info (specifically re. weave and scipy-current) if the above is incorrect or misleading. For many scientific users it may not really be an issue, as they will probably (I know I will) always have the full scipy around. It's really more for third-party developers who want to use the core array package and attending utilities, for whom weave falling on the numerix rather than the scipy side of the fence, may be a significant convenience win. Cheers, f From eric at enthought.com Tue Jan 3 00:02:28 2006 From: eric at enthought.com (eric jones) Date: Mon, 02 Jan 2006 23:02:28 -0600 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43B9ED37.7050207@colorado.edu> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> <43B9E369.1070701@enthought.com> <43B9ED37.7050207@colorado.edu> Message-ID: <43BA0564.8030402@enthought.com> Fernando Perez wrote: >eric jones wrote: > > >>Fernando Perez wrote: >> >> >> >> >>>2. It increases the value of numerix as a base layer for third-party >>>scientific packages. I think authors will appreciate getting a few really >>>good utilities for writing python scientific software, without making the full >>>scipy a dependency. 
>>> >>>Weave (I think) also falls into this category, but its maintenance difficulty >>>seems to tilt the decision in the direction of moving it to full scipy. I >>>wish I could commit to maintaining it in numerix, but I really can't, and I >>>understand Travis' desire to have numerix be a rock-solid, no-hassles-to-adopt >>>foundation. It would be fantastic if someone with a C++ penchant wanted to >>>pick this ball up. >>> >>> >>> >>> >>I would love to say yes but am up to my ears in other commitments. >>Everyone is back in the office tomorrow, so I'll talk it over with them >>and see if we can come up with a strategy for maintaining weave. >> >> > >I think it's worth mentioning that weave up to the transition into the new >scipy (I haven't really checked since) was working fairly well. All the bugs >I had been seeing related to either compiler warnings or spurious >recompilations had been fixed, and with the inclusion of the blitz 0.9 >sources, things work even with gcc4. So this isn't really a major development >commitment, but rather one of 'being there' if the need arises, I think. > > Thanks for the info. Have you tried it out with scipy_core arrays? Does it work with these now? thanks, eric From jdhunter at ace.bsd.uchicago.edu Tue Jan 3 00:24:39 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Mon, 02 Jan 2006 23:24:39 -0600 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B94BD6.5050104@ieee.org> (Travis Oliphant's message of "Mon, 02 Jan 2006 08:50:46 -0700") References: <43B72B79.8040807@ieee.org> <87oe2vxmsd.fsf@peds-pc311.bsd.uchicago.edu> <43B94BD6.5050104@ieee.org> Message-ID: <87bqytnaso.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Travis" == Travis Oliphant writes: Travis> I like the numerix name. My only concern is would that Travis> mess up the use of numerix as a superset of all the array Travis> packages? Perhaps that would be a "good thing" :-) I think this could be potentially confusing for some in a transition, but I don't think it is a deal breaker. Not that many people use matplotlib.numerix explicitly: most get the symbols from pylab which imports matplotlib.numerix. And as Fernando notes, the matplotlib.numerix namespace has 90% overlap with the Numeric / numarray/ scipy core namespace anyhow. If numerix is adopted as the name for the scipy core base package, I would be inclined to rename the matplotlib compatibility layer to something like matplotlib.nxinterface which I think will still be useful for compatibility and profiling for the near term. This with proper deprecation warnings should help clear any confusions. JDH From oliphant.travis at ieee.org Tue Jan 3 00:38:07 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Mon, 02 Jan 2006 22:38:07 -0700 Subject: [SciPy-dev] linalg namespace changed? In-Reply-To: <3FBB4CDF-1C19-4C6E-81E8-468C9A122053@mac.com> References: <3FBB4CDF-1C19-4C6E-81E8-468C9A122053@mac.com> Message-ID: <43BA0DBF.5070607@ieee.org> listservs at mac.com wrote: >It appears that recent changes to scipy_core in svn have broken the >linalg namespace: > > >>> from scipy import linalg >Traceback (most recent call last): > File "", line 1, in ? >ImportError: cannot import name linalg > > There has been quite a bit of work on package loading. It might be that you need to re-install the package. I don't think there has been any specific work with linalg (but their might be a naming issue). We are going to do some major re-naming this week. 
This should actually make scipy itself more stable as most of the changes to scipy were changes to scipy_core (which will be called something different). -Travis From Fernando.Perez at colorado.edu Tue Jan 3 01:07:32 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 23:07:32 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43BA0564.8030402@enthought.com> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> <43B9E369.1070701@enthought.com> <43B9ED37.7050207@colorado.edu> <43BA0564.8030402@enthought.com> Message-ID: <43BA14A4.7050303@colorado.edu> eric jones wrote: >>I think it's worth mentioning that weave up to the transition into the new >>scipy (I haven't really checked since) was working fairly well. All the bugs >>I had been seeing related to either compiler warnings or spurious >>recompilations had been fixed, and with the inclusion of the blitz 0.9 >>sources, things work even with gcc4. So this isn't really a major development >>commitment, but rather one of 'being there' if the need arises, I think. >> >> > > Thanks for the info. Have you tried it out with scipy_core arrays? > Does it work with these now? I'm going to give a half-assed answer, because I haven't really played with weave in the new code yet (all my production stuff was still using scipy-old; once we move to the numerix stable naming scheme, I'll begin testing a port). But a _quick_ and dirty test with an up-to-the-minute SVN copy shows there are problems: In [6]: weave.inline('std::cout << "hello\\n";') cc1plus: warning: command line option "-Wstrict-prototypes" is valid for Ada/C/ObjC but not for C++ /home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp: In member function `void numpy_type_handler::conversion_numpy_check_type(PyArrayObject*, int, const char*)': /home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp:488: error: `PyArray_EquivalentTypenums' undeclared (first use this function) /home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp:488: error: (Each undeclared identifier is reported only once for each function it appears in.) /home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp: In member function `void numpy_type_handler::numpy_check_type(PyArrayObject*, int, const char*)': /home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp:514: error: `PyArray_EquivalentTypenums' undeclared (first use this function) blah blah blah. None of the files in the examples/ directory (it seems they are the same from the old scipy, as they still bear your name and paths in the comments) seem to work at all (I didn't really tried them all, but several failed, and given that the simple 'hello world' above doesn't compile tells me a lot). However, I'd be surprised if this were all that hard to fix. The major underpinnings of weave don't really need to change, and the underlying C structure of the new numerix arrays isn't all that different (in their simplest form) to the old. A blitz constructor is little more than a few integers indicating the dimensions/strides and a raw pointer to the memory buffer. Perhaps Travis or someone more familiar with the current code at the C level can give a better evaluation of how much would be needed to make this work. 
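A compile-only smoke test in the spirit of this discussion might look like the sketch below. The import locations are assumptions (weave has just been moved between packages, and linspace is taken from the core namespace), and the blitz expression is written as an assignment, which blitz expects; otherwise these are essentially the one-liners discussed in this thread:

    from scipy import weave, linspace   # assumed import locations

    results = []

    try:
        weave.inline('std::cout << "hello\\n";')
        results.append(('inline hello', True))
    except Exception:
        results.append(('inline hello', False))

    x = linspace(0.0, 1.0, 10)
    try:
        weave.blitz('x = x + 1.0')       # blitz looks x up in this frame
        results.append(('blitz one-liner', True))
    except Exception:
        results.append(('blitz one-liner', False))

    for name, ok in results:
        if ok:
            print(name + ': compiled ok')
        else:
            print(name + ': FAILED to compile')

If either entry fails to build, the generated wrapper code is no longer consistent with the array C-API, which is exactly the class of breakage shown in the traceback above.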
I'd suggest that as a simple starting point, the test suite could just call all the scripts in weave/examples. Even if we don't check their numerical results yet, just seeing that they compile would be a testing advance. Obviously, some work does need to be done, and we all know that takes time. Cheers, f From oliphant.travis at ieee.org Tue Jan 3 01:20:57 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Mon, 02 Jan 2006 23:20:57 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43BA14A4.7050303@colorado.edu> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> <43B9E369.1070701@enthought.com> <43B9ED37.7050207@colorado.edu> <43BA0564.8030402@enthought.com> <43BA14A4.7050303@colorado.edu> Message-ID: <43BA17C9.80005@ieee.org> Fernando Perez wrote: >eric jones wrote: > > > >>>I think it's worth mentioning that weave up to the transition into the new >>>scipy (I haven't really checked since) was working fairly well. All the bugs >>>I had been seeing related to either compiler warnings or spurious >>>recompilations had been fixed, and with the inclusion of the blitz 0.9 >>>sources, things work even with gcc4. So this isn't really a major development >>>commitment, but rather one of 'being there' if the need arises, I think. >>> >>> >>> >>> >>Thanks for the info. Have you tried it out with scipy_core arrays? >>Does it work with these now? >> >> > >I'm going to give a half-assed answer, because I haven't really played with >weave in the new code yet (all my production stuff was still using scipy-old; >once we move to the numerix stable naming scheme, I'll begin testing a port). > But a _quick_ and dirty test with an up-to-the-minute SVN copy shows there >are problems: > >In [6]: weave.inline('std::cout << "hello\\n";') > >cc1plus: warning: command line option "-Wstrict-prototypes" is valid for >Ada/C/ObjC but not for C++ >/home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp: In >member function `void >numpy_type_handler::conversion_numpy_check_type(PyArrayObject*, int, const >char*)': >/home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp:488: >error: `PyArray_EquivalentTypenums' undeclared (first use this function) >/home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp:488: >error: (Each undeclared identifier is reported only once for each function it >appears in.) >/home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp: In >member function `void numpy_type_handler::numpy_check_type(PyArrayObject*, >int, const char*)': >/home/fperez/.python24_compiled/sc_23f50baa189f86f27c668f2d0fcfd7df0.cpp:514: >error: `PyArray_EquivalentTypenums' undeclared (first use this function) > > > Doh... Weave *was* working, but I changed the C-API a couple of iterations ago for consistency and this function is called something else. This is an example of why I'm uncomfortable with it in the core: it's just not on my radar... >None of the files in the examples/ directory (it seems they are the same from >the old scipy, as they still bear your name and paths in the comments) seem to >work at all (I didn't really tried them all, but several failed, and given >that the simple 'hello world' above doesn't compile tells me a lot). > >However, I'd be surprised if this were all that hard to fix. 
The major >underpinnings of weave don't really need to change, and the underlying C >structure of the new numerix arrays isn't all that different (in their >simplest form) to the old. A blitz constructor is little more than a few >integers indicating the dimensions/strides and a raw pointer to the memory buffer. > >Perhaps Travis or someone more familiar with the current code at the C level >can give a better evaluation of how much would be needed to make this work. > > > It should work as before for the old data-types, but I doubt the new fancy array data-types will be handled correctly. >I'd suggest that as a simple starting point, the test suite could just call >all the scripts in weave/examples. Even if we don't check their numerical >results yet, just seeing that they compile would be a testing advance. > >Obviously, some work does need to be done, and we all know that takes time. > > Again, weave was working for simple examples before the refactoring that occurred in December. -Travis From Fernando.Perez at colorado.edu Tue Jan 3 01:26:20 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 02 Jan 2006 23:26:20 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43BA17C9.80005@ieee.org> References: <43B99D54.2050202@astraw.com> <43B99E44.6080000@colorado.edu> <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> <43B9E369.1070701@enthought.com> <43B9ED37.7050207@colorado.edu> <43BA0564.8030402@enthought.com> <43BA14A4.7050303@colorado.edu> <43BA17C9.80005@ieee.org> Message-ID: <43BA190C.7050809@colorado.edu> Travis Oliphant wrote: > Doh... Weave *was* working, but I changed the C-API a couple of > iterations ago for consistency and this function is called something > else. This is an example of why I'm uncomfortable with it in the core: > it's just not on my radar... Putting even something as simple as weave.inline('std::cout << "hello\\n";') and other one-liners like x=linspace(0,1,10) weave.blitz('x+1;') # this doesn't work either would help keep it on the radar (yours or anyone else's). With a codebase the size and complexity of scipy, we really need the test harness to be as tight as possible (not that I have any authority to say this: ipython is 18.000 lines of python and not ONE line of testing code; it's like flying a 747 with no instruments and the cockpit windows painted black...) But hey, do as I say and all that ;) Best, f From oliphant.travis at ieee.org Tue Jan 3 02:04:34 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 00:04:34 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43B99896.8060003@colorado.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> Message-ID: <43BA2202.1040100@ieee.org> It sounds like a lot of people are in favor of numerix. I'm not sure I'm completely sold on the name yet. I'm concerned about name clashes with current uses of the numerix label in Python. What about psipi (which could then have a cool label like $\psi \pi$ ;-) Or some other clever name that is completely new? My wife suggested "venom" --- but that is already used by another Python project. But, regardless, I don't want the name of what is currently in scipy.base to be ndarray, because I would prefer the ndarray object to be in the numerix name space. 
In other words, I would rather have all what is now in base installed to the numerix name-space directly, so no need to come up with a separate name (unless we want the numerix directory under site-packages to be cleaner in which case numerix.core or numerix.base or numerix.lib would suffice as the *full* package name. Then we would still have: numerix.f2py numerix.distutils numerix.testing numerix.linalg numerix.fft (with the fft function fft in this name-space as numerix.fft.fft) numerix.random (I've already removed weave and placed it in full scipy for now. Fernando seems to be the only one clamoring for it, but is unable to maintain it ;-) ) I'll hold off on this until tomorrow, but we need a new name *very soon* because I agree with Joe Harrington that the work actually just needs to be done to convert to it. Thanks for all of the comments. Keep them coming. -Travis From Fernando.Perez at colorado.edu Tue Jan 3 02:15:25 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 00:15:25 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43BA2202.1040100@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> Message-ID: <43BA248D.3020906@colorado.edu> Travis Oliphant wrote: > (I've already removed weave and placed it in full scipy for now. > Fernando seems to be the only one clamoring for it, but is unable to > maintain it ;-) ) > > I'll hold off on this until tomorrow, but we need a new name *very soon* > because I agree with Joe Harrington that the work actually just needs to > be done to convert to it. > > Thanks for all of the comments. Keep them coming. Well, here's my dying gasp in battle :) My idea of ndarray is simply to keep the pure array functionality fairly standalone, thinking of potential inclusion of that part in the python core at some point, or of the needs of embedders, py2exe, py2app users, etc. I never thought that a user would ever do from numerix.ndarray import ... because we'd have, in the numerix __init__ file, __all__ = numerix.__all__ + ['more','stuff'] Numerix will always be the well-rounded project, the collection of basic scientific packages, and the namespace for everyday use. All examples, tutorials, etc, will use import numerix as some_really_convenient_shorthand x = some_really_convenient_shorthand.linspace(0,1,100) etc. But if somebody comes one day asking for 'how can I strip just the array piece out so py2exe can build the smallest package possible for my project to ship', it would be nice to say: 'just grab the ndarray package standalone, and write your code to do all array imports from ndarray'. So there, now I die :) Cheers, f ps - at this point, I really won't fight: I am _thriled_ to see the name changed, and that's all I really care. The ndarray thing is (thin) icing on the cake, and I'm trying to lose weight :) From oliphant.travis at ieee.org Tue Jan 3 02:35:17 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 00:35:17 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43BA248D.3020906@colorado.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> Message-ID: <43BA2935.5080508@ieee.org> Fernando Perez wrote: >Well, here's my dying gasp in battle :) > > Perhaps I can channel you a bit, if you are dead... 
>My idea of ndarray is simply to keep the pure array functionality fairly >standalone, thinking of potential inclusion of that part in the python core at >some point, or of the needs of embedders, py2exe, py2app users, etc. I never >thought that a user would ever do > >from numerix.ndarray import ... > >because we'd have, in the numerix __init__ file, > >__all__ = numerix.__all__ + ['more','stuff'] > > Great (assuming you mean __all__ = ndarray.__all__ + ... ;-) ) That's what I'm thinking, too. But should we clobber the numerix.ndarray *sub-package* with the numerix.ndarray.ndarray *object*. Isn't that the kind of thing we are trying to avoid doing? So, that somebody could do import numerix.ndarray if they were pedantic? -Travis From arnd.baecker at web.de Tue Jan 3 02:37:20 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 3 Jan 2006 08:37:20 +0100 (CET) Subject: [SciPy-dev] commentary system Message-ID: Hi, for those not following Doc-SIG/Python-Dev, the commentary system looks very interesting. (Maybe it could also be used in modified form for the SciPy live-docs) Best, Arnd ---------- Forwarded message ---------- Date: Tue, 03 Jan 2006 01:08:26 -0600 From: Ian Bicking To: Ian Bicking Cc: Doc-Sig at python.org, python-dev at python.org Subject: Re: [Doc-SIG] [Python-Dev] that library reference, again I've put an instance of Commentary up against the full 2.4 documentation: http://pythonpaste.org/comment/python24/ It writes to a Subversion repository so you can see what the backend data looks like: http://pythonpaste.org/comment/svn/python24/ -- not much there yet. Obviously things like notification and reports would be useful, but I don't know what the state of the art for Subversion is either, so potentially these tools already exist. But last I really looked at the tools for Subversion, I wasn't overly impressed. Things like self-signup for pages you were interested in would be useful. But anyway, there's nothing too difficult about those features. Anyway, it's just meant as a demo for now, but give it a look, and if you have problems or suggestions write a ticket: http://pythonpaste.org/commentary/trac/newticket Cheers. -- Ian Bicking | ianb at colorstudy.com | http://blog.ianbicking.org _______________________________________________ Doc-SIG maillist - Doc-SIG at python.org http://mail.python.org/mailman/listinfo/doc-sig From Fernando.Perez at colorado.edu Tue Jan 3 02:42:12 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 00:42:12 -0700 Subject: [SciPy-dev] commentary system In-Reply-To: References: Message-ID: <43BA2AD4.7080506@colorado.edu> Arnd Baecker wrote: > Hi, > > for those not following Doc-SIG/Python-Dev, the commentary system looks > very interesting. > (Maybe it could also be used in modified form for the SciPy live-docs) Yup, just saw that. BTW, if anyone is going to test it, the action to put in a comment is a double-click (not obvious, at least to me). I was just thinking also on how this would fit with a Wiki, though. In a sense, this came about on python-dev/doc-sig due to the problems with the python standard docs, and the process for those being seen as too rigid. Now that for scipy we have the new moin site, which I expect to grow into a nice collection of examples, cookbook-type recipes and other useful information, perhaps this isn't such a concern. 
We don't have a large body of existing LaTeX docs to worry about (though ironically, in this group that would be FAR less of a problem than it is for python-dev), and it may be that the Moin site alone is flexible enough to create nice docs. we'll see... Cheers, f From Fernando.Perez at colorado.edu Tue Jan 3 02:51:06 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 00:51:06 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43BA2935.5080508@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> Message-ID: <43BA2CEA.9050806@colorado.edu> Travis Oliphant wrote: >>My idea of ndarray is simply to keep the pure array functionality fairly >>standalone, thinking of potential inclusion of that part in the python core at >>some point, or of the needs of embedders, py2exe, py2app users, etc. I never >>thought that a user would ever do >> > >>from numerix.ndarray import ... > >>because we'd have, in the numerix __init__ file, >> >>__all__ = numerix.__all__ + ['more','stuff'] >> >> > > Great (assuming you mean __all__ = ndarray.__all__ + ... ;-) ) yup, sorry :) > That's what I'm thinking, too. But should we clobber the > numerix.ndarray *sub-package* with the numerix.ndarray.ndarray > *object*. Isn't that the kind of thing we are trying to avoid doing? > So, that somebody could do import numerix.ndarray if they were pedantic? Mmh, I'm confused. I thought that the 'array' object would be _defined_ in the ndarray package, and would only 'propagate up' to the numerix namespace via the above mechanism. So I guess that somebody could do, if they wanted from numerix import ndarray x = ndarray.array([1,2,3]) since ndarray would be a normal python package (a sub-package of numerix, hence automatically found by python if it lives in a sub-dir of numerix/ and has an __init__ file). The point is that WE will never ship ndarray as a standalone package, but there will be nothing technically preventing that from being done. This isolation will probably keep us in check design-wise, as well as allowing somebody who may need just the array functionality for an embedded/py2exe project to grab it. None of this seems like unnecessary namespace pollution to me. But again: don't worry, and move on. Make the decision you feel best with, and go ahead. We have work to do ;) Announce a final form, and the anxious doc-writers and website gang can get to documenting the currently incomprehensible mess into a nice, clear and compact set of instructions for newcomers. Cheers, f From charlesr.harris at gmail.com Tue Jan 3 06:33:47 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 3 Jan 2006 04:33:47 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43BA2935.5080508@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> Message-ID: On 1/3/06, Travis Oliphant wrote: > > > That's what I'm thinking, too. But should we clobber the > numerix.ndarray *sub-package* with the numerix.ndarray.ndarray I'm as confused Fernando. Probably more so, actually. What would ndarray.ndarray be? I thought ndarray was the equivalent of the old numeric or numarray *without* the subpackages, i.e., all the array functionality, or basically multiarray. All the other stuff goes under numerix. 
What am I missing? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From pebarrett at gmail.com Tue Jan 3 09:49:47 2006 From: pebarrett at gmail.com (Paul Barrett) Date: Tue, 3 Jan 2006 09:49:47 -0500 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <43B975B9.5000407@colorado.edu> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> <43B95D40.5050005@colorado.edu> <200601021935.10046.faltet@carabos.com> <43B975B9.5000407@colorado.edu> Message-ID: <40e64fa20601030649q536762eax35790b9e0d7c058c@mail.gmail.com> On 1/2/06, Fernando Perez wrote: > > Francesc Altet wrote: > > A Dilluns 02 Gener 2006 18:05, Fernando Perez va escriure: > > > >>>Maybe unrelated with this discussion, but important anyway because of > >>>the (supposedly) large userbase that ipython have is the fact that > >>>ipython will call the __call__() function for the object, if it founds > >>>it. This makes things worse for the (ipython) user that wants just to > >>>print the value of the dtype object. > >> > >>This is a known case of the tension between convenience and correctness, > >>that ipython tries (not always successfully) to balance. > >> > >>Note that to aid a little in situations like these, ipython offers some > >>options: > > > > [snip] > > > > Yes, I knew this. But the point is that newbie users (and not as > > newbie, because I find toggling autocall on and off a bit annoying), > > will be somewhat surprised about the results. This was actually the > > main reason why I've decided to turn off the __call__() in many > > objects in PyTables. The other reason is that I replaced __call__() by > > more meaningful names and the result is far more readable (you know, > > explicit is better than implicit). > > I've toyed with the idea of making the autocall flag be a 0,1,2 integer, > where > today's 'on' would become 2 for 'full', while 1 would be an intermediate > mode, > doing autocall ONLY if there are arguments: > > 1: foo bar -> foo(bar) > foo -> foo > > 2: foo bar -> foo(bar) > foo -> foo() > > But I'm out of time for more significant ipython work for now, I'm afraid > (a > lot of new features went in recently, but now I need to work on other > things). > > Vote if you like it, though, and I'll consider it for the future. > > +1 for me too. -- Paul -- Paul Barrett, PhD Johns Hopkins University Assoc. Research Scientist Dept of Physics and Astronomy Phone: 410-516-5190 Baltimore, MD 21218 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Fernando.Perez at colorado.edu Tue Jan 3 12:03:41 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 10:03:41 -0700 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <40e64fa20601030649q536762eax35790b9e0d7c058c@mail.gmail.com> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> <43B95D40.5050005@colorado.edu> <200601021935.10046.faltet@carabos.com> <43B975B9.5000407@colorado.edu> <40e64fa20601030649q536762eax35790b9e0d7c058c@mail.gmail.com> Message-ID: <43BAAE6D.3030403@colorado.edu> Paul Barrett wrote: >>I've toyed with the idea of making the autocall flag be a 0,1,2 integer, >>where >>today's 'on' would become 2 for 'full', while 1 would be an intermediate >>mode, >>doing autocall ONLY if there are arguments: >> >>1: foo bar -> foo(bar) >> foo -> foo >> >>2: foo bar -> foo(bar) >> foo -> foo() >> >>But I'm out of time for more significant ipython work for now, I'm afraid >>(a >>lot of new features went in recently, but now I need to work on other >>things). >> >>Vote if you like it, though, and I'll consider it for the future. >> >> > > +1 for me too. OK, it looks like the idea has support. I've put it on the todo list, probably for 0.7.1 (but I may sneak it in, if I can find a bit of time this weekend, it's fairly straightforward). Thanks for the feedback, f From Fernando.Perez at colorado.edu Tue Jan 3 12:10:16 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 10:10:16 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> Message-ID: <43BAAFF8.5050105@colorado.edu> Charles R Harris wrote: > On 1/3/06, Travis Oliphant wrote: > > >> >> >>That's what I'm thinking, too. But should we clobber the >>numerix.ndarray *sub-package* with the numerix.ndarray.ndarray > > > > I'm as confused Fernando. Probably more so, actually. What would > ndarray.ndarray be? I thought ndarray was the equivalent of the old numeric > or numarray *without* the subpackages, i.e., all the array functionality, or > basically multiarray. All the other stuff goes under numerix. What am I > missing? Mmh, I agree that ndarray would not have the subpackages (fft, linalg-lite, random, etc.). But doesn't the array datatype go in there? If not, what is left? Sorry if my lack of clarity has caused confusion. To try and recap, my take on it is this: ndarray would be a package containing only the standalone functionality necessary to _create_ multidimensional arrays. This package would ship as part of numerix, but in principle it could be decoupled (it wouldn't _depend_ on anything else). At a bare minimum, the ndarray namespace could be (this is stretching it, I'm sure some basic utilities will be there as well): ndarray/__init__.py: __all__ = ['array'] But again, if this discussion is just causing confusion, ignore me :) The important point was to clear the scipycore/full-scipy/scipy namespace mess, and we got that. At this point, ndarray is an 'implementation detail' I'm happy to let Travis decide on as he sees fit. Best, f From prabhu_r at users.sf.net Tue Jan 3 12:36:30 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 3 Jan 2006 23:06:30 +0530 Subject: [SciPy-dev] Renaming scipy_core ??? 
In-Reply-To: <43B9E1F6.6070609@enthought.com> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43B9E1F6.6070609@enthought.com> Message-ID: <17338.46622.299851.319948@monster.iitb.ac.in> >>>>> "Eric" == eric jones writes: Eric> Fernando Perez wrote: >> Note that, to my knowledge, the only two uses of 'numerix' in >> the wild right now are by matplotlib and enthought. John >> specifically already suggested nuemrix, so he's obviously OK >> with that. And in enthought, numerix is also (I think) used as >> a compatibility layer, so once we only have one array package, >> I imagine they'd be OK (though confirmation from them would be >> welcome). >> Eric> Unless Robert or Prabhu have any reason not to approve, I'm Eric> fine with it. I am OK with it. After all, numerix is different from enthought.util.numerix so it should not cause any conflicts. cheers, prabhu From faltet at carabos.com Tue Jan 3 13:07:36 2006 From: faltet at carabos.com (Francesc Altet) Date: Tue, 3 Jan 2006 19:07:36 +0100 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601021754.03309.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> Message-ID: <200601031907.37400.faltet@carabos.com> Hi, Attached is a patch (against SVN 1764) that implements a new typeNA dictionary that provides mappings between numarray<-->scipy_core datatypes. This solves a couple of issues: - Logically separates the datatypes in typeDict (where it remains just the scipy_core types) from the datatypes present in numarray, that are in the new typeNA. This also relieves doing ugly mappings in current typeDict, like 'Complex32'-->complex64_arrtype and 'Complex64'-->complex128_arrtype. - Now, there is a direct way to get the numarray type coming from a scipy_core type. For example: In [27]: q=scicore.array([1],dtype='Q') In [28]: scicore.typeNA[q.dtypechar] Out[28]: 'UInt64' In [29]: scicore.typeNA[q.dtype] Out[29]: 'UInt64' that was not possible before. OTOH, I'd have preferred to follow the numarray convention (except perhaps for complex types were the full bit-wide would have been more consistent) for the long string representation of scipy_core types. For example, we could have: In [30]: q.dtypedescr.dtypestr Out[30]: 'UInt64' which I find clearer than the current: In [30]: q.dtypedescr.dtypestr Out[30]: ' numerix.ndarray *sub-package* with the numerix.ndarray.ndarray > > > I'm as confused Fernando. Probably more so, actually. What would > ndarray.ndarray be? Perhaps what people are missing, is that the name of the new array object is *ndarray*. Try type(array([10,20])) to see what I mean. Because it is a *new-style* builtin type, it can be constructed by calling the type object: scipy.ndarray(...) Normally people use scipy.array to construct an array but subclasses need to be able to access the class constructor. So, I'm concerned about confusing the class constructor with a *sub-package*. Does, that make sense? 
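A small sketch may make the clash concrete. The layout is only the proposal under discussion (numerix and its core sub-package are placeholder names, not a released package), but it shows both the namespace hoisting Fernando describes and why ndarray is best reserved for the type itself:

    # Proposed layout (hypothetical names):
    #
    #   numerix/
    #       __init__.py   <- hoists the core namespace, roughly:
    #                            from numerix import core
    #                            from numerix.core import *
    #                            __all__ = core.__all__ + ['linalg', 'fft', 'random']
    #       core/         <- array object, dtypes, ufuncs
    #       linalg/  fft/  random/  ...
    #
    # Against the current scipy_core install (still imported as "scipy"),
    # the constructor/type distinction looks like this:

    import scipy as nx              # stand-in for whatever the package is finally called

    a = nx.array([10, 20])          # the factory function ordinary code uses
    print(type(a) is nx.ndarray)    # True: ndarray is the array *type*

    raw = nx.ndarray((2, 3))        # the type object is callable: shape in, uninitialized data out
    print(raw.shape)                # (2, 3)

    class MyArray(nx.ndarray):      # subclasses are the reason the type must stay importable
        pass

Naming a sub-package ndarray as well would shadow exactly this type, which is the clash being pointed out here.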
-Travis From paustin at eos.ubc.ca Tue Jan 3 15:40:11 2006 From: paustin at eos.ubc.ca (Philip Austin) Date: Tue, 3 Jan 2006 12:40:11 -0800 Subject: [SciPy-dev] Googling numerix In-Reply-To: <43BADFDF.2030807@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BADFDF.2030807@ieee.org> Message-ID: <17338.57643.969609.796904@owl.eos.ubc.ca> Here's one other group it would be good to include in discussions: http://sml.nicta.com.au/crest/tiki-index.php?page=CREST+and+numerix Regards, Phil From Fernando.Perez at colorado.edu Tue Jan 3 16:09:39 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 14:09:39 -0700 Subject: [SciPy-dev] Renaming scipy_core ??? In-Reply-To: <43BADFDF.2030807@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BADFDF.2030807@ieee.org> Message-ID: <43BAE813.1040906@colorado.edu> Travis Oliphant wrote: > Charles R Harris wrote: > > >>On 1/3/06, *Travis Oliphant* >> wrote: >> >> >> >> That's what I'm thinking, too. But should we clobber the >> numerix.ndarray *sub-package* with the numerix.ndarray.ndarray >> >> >>I'm as confused Fernando. Probably more so, actually. What would >>ndarray.ndarray be? > > > > Perhaps what people are missing, is that the name of the new array > object is *ndarray*. Try type(array([10,20])) to see what I mean. > > Because it is a *new-style* builtin type, it can be constructed by > calling the type object: > > scipy.ndarray(...) > > Normally people use scipy.array to construct an array but subclasses > need to be able to access the class constructor. Ah! I certainly was missing that part, thanks for the clarification. Mmh, I'm not sure then what the cleanest solution would be: numerix.ndarray.ndarray() looks ugly. Do you want to expose the raw type as a top-level? If so, then that seems inevitable. If not, I guess you could have an ndarray subpackage for the types: ndarray.types.ndarray() Are there more types, or is ndarray() the only new-style type constructor? I now see that, regardless of how you choose to name things, you'll still need to make a packaging decision on where to expose the ndarray type constructor (unless you _only_ put it in the numerix level, but then it becomes impossible to ever separate it as a standalone component. I guess you could decide to go that route, too). Cheers, f From oliphant.travis at ieee.org Tue Jan 3 16:12:03 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 14:12:03 -0700 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <43BAAFF8.5050105@colorado.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> Message-ID: <43BAE8A3.8070009@ieee.org> It looks like the name of the new package will be numerix. Thanks to all who offered suggestions. The numerix name is the favorite on the straw poll and in the comments to the list. There were no deal-breaker negative comments either... There are a few other people using the name (OCaml library, a financial company, etc.). 
There is also a group of Python PETSci users who seem to want to create their own numerical world rather than help make the one we are trying to use better (read Phil Austin's post...). But, it looks like that Python group is trying to rename their stuff so I don't think it's a problem. Besides numerix has been in use in matplotlib for at least a couple of years, right? The numerix directory will contain the following sub-packages: numerix/ core/ --- the basic arrayobject only and it's standard sub-classes. The core sub-package will also have its name-space placed in the numerix name-space. lib/ --- extra functions using core (mlab, polynomial, function_base, etc...) distutils/ f2py/ linalg/ dft/ (fft, ifft will also be brought up to the numerix name-space) random/ (rand, randn will also be brought up to the numerix name-space) testing/ Will be the remaining sub-packages... The new pkgloader facility will be present in numerix. This may take a while to convert. I'll make a branch of svn to do it called numerix. If anybody has time to help, please let me know and dive in. There are lots of names to fix.... I'd still like to get something out by tomorrow. Best, -Travis From charlesr.harris at gmail.com Tue Jan 3 16:18:34 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 3 Jan 2006 14:18:34 -0700 Subject: [SciPy-dev] Googling numerix In-Reply-To: <17338.57643.969609.796904@owl.eos.ubc.ca> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <17338.57643.969609.796904@owl.eos.ubc.ca> Message-ID: On 1/3/06, Philip Austin wrote: > > > Here's one other group it would be good to include in discussions: > > http://sml.nicta.com.au/crest/tiki-index.php?page=CREST+and+numerix And there was also this comment there, Simon: See also: http://numeric.scipy.org/PEP.txt This is a proposal for "Multidimensional Arrays for Python". It's a kind of api that numarray/numeric/whoever-else can build their array's on top of. If possible we should get our act together and try and work with these guys. It looks like these guys pretty much want to duplicate MatLab as is and interface to PETSc. They seem to be interested mainly in matrices and sparse matrices. I'm guessing they do a lot of PDE. Anyway, this does lead to a naming problem, although I note that they only started up last July, so I think we could make an argument on first use. Do we want to worry about this? Is their project going to go anywhere? I didn't get a look at their svn repository, so I have no idea if they are very active. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From Fernando.Perez at colorado.edu Tue Jan 3 16:21:59 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 14:21:59 -0700 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <43BAE8A3.8070009@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> Message-ID: <43BAEAF7.3040804@colorado.edu> Travis Oliphant wrote: > It looks like the name of the new package will be numerix. Thanks to > all who offered suggestions. The numerix name is the favorite on the > straw poll and in the comments to the list. 
There were no deal-breaker > negative comments either... Yes! Thank you, Travis. > There are a few other people using the name (OCaml library, a financial > company, etc.). There is also a group of Python PETSci users who seem > to want to create their own numerical world rather than help make the > one we are trying to use better (read Phil Austin's post...). But, it > looks like that Python group is trying to rename their stuff so I don't > think it's a problem. Besides numerix has been in use in matplotlib for > at least a couple of years, right? I don't want to sound hostile, but I completely agree with you on ignoring the group Phil's googling found. If they can't be bothered to participate on a large community effort and contribute improvements but would rather start over from scratch reinventing the same wheels, I don't feel compelled to bend over backwards to accomodate them. Robert Cimrman and his gang also had problems with sparse support, but their approach was to work (hard) to bring the best of the existing libraries over to the new system. I place a lot of value and respect on that. And besides, on their site there is exactly ZERO code for open, anonymous download (you need to create an account to see the code, see http://sml.nicta.com.au/crest/tiki-index.php?page=Numerix+SVN+instructions). Something about that just annoys me as far as 'open source' practices. Given that we have historical precedent (matplotlib had numerix before July'05) and an open development process where they could have participated but didn't, I say: Go numerix! Cheers, f From jdhunter at ace.bsd.uchicago.edu Tue Jan 3 16:26:49 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Tue, 03 Jan 2006 15:26:49 -0600 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <43BAE8A3.8070009@ieee.org> (Travis Oliphant's message of "Tue, 03 Jan 2006 14:12:03 -0700") References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> Message-ID: <873bk5dmue.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Travis" == Travis Oliphant writes: Travis> There are a few other people using the name (OCaml Travis> library, a financial company, etc.). There is also a Travis> group of Python PETSci users who seem to want to create Travis> their own numerical world rather than help make the one we Travis> are trying to use better (read Phil Austin's post...). Travis> But, it looks like that Python group is trying to rename Travis> their stuff so I don't think it's a problem. Besides Travis> numerix has been in use in matplotlib for at least a Travis> couple of years, right? Todd added numerix in matplotlib-0.51, according to the release notes at http://matplotlib.sourceforge.net/whats_new.html which was released in March of 2004. It looks like the matplotlib use is the first in the python world; a search of google groups for numerix group:*python* returns http://groups.google.com/groups?q=numerix+group%3A*python*&start=0&scoring=d&hl=en& with all the posts save the most recent one are clearly referring to matplotlib numerix. So as long as Perry and Todd are willing to give it up, and I gather they are from Perry's comments, it doesn't look like anyone else can rightfully claim the name. 
JDH From perry at stsci.edu Tue Jan 3 17:59:53 2006 From: perry at stsci.edu (Perry Greenfield) Date: Tue, 3 Jan 2006 17:59:53 -0500 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <873bk5dmue.fsf@peds-pc311.bsd.uchicago.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <873bk5dmue.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <4b5db4bad628024d2171dbe509149c46@stsci.edu> On Jan 3, 2006, at 4:26 PM, John Hunter wrote: > > So as long as Perry and Todd are willing to give it up, and I gather Yes, we are willing to give it up. -- Perry > they are from Perry's comments, it doesn't look like anyone else can > rightfully claim the name. From oliphant.travis at ieee.org Tue Jan 3 18:09:22 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 16:09:22 -0700 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <4b5db4bad628024d2171dbe509149c46@stsci.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <873bk5dmue.fsf@peds-pc311.bsd.uchicago.edu> <4b5db4bad628024d2171dbe509149c46@stsci.edu> Message-ID: <43BB0422.5020106@ieee.org> I've made a new svn repository which will house the numerix library: http://svn.scipy.org/svn/numerix/trunk will be the main library. It is not ready to be checked out at all. But, this will be the new name. I've given permissions to just a few for the new server. If I've forgotten you and you are willing to help in the next few days, let me know and I'll update your permissions. -Travis From gruben at bigpond.net.au Tue Jan 3 18:10:06 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Wed, 04 Jan 2006 10:10:06 +1100 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <43BAE8A3.8070009@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> Message-ID: <43BB044E.4080407@bigpond.net.au> If you look at this page on the site link provided by Philip, there's a comment: http://sml.nicta.com.au/crest/tiki-index.php says Note: the names "CREST" and "numerix" will have to change since both have trademark problems. We are soliciting naming suggestions - please contribute. I don't know what trademark problems they're referring to, but you might have to continue looking for a different name. Gary R. Travis Oliphant wrote: > It looks like the name of the new package will be numerix. Thanks to > all who offered suggestions. The numerix name is the favorite on the > straw poll and in the comments to the list. There were no deal-breaker > negative comments either... 
From paustin at eos.ubc.ca Tue Jan 3 18:16:07 2006 From: paustin at eos.ubc.ca (Philip Austin) Date: Tue, 3 Jan 2006 15:16:07 -0800 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <43BB044E.4080407@bigpond.net.au> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <43BB044E.4080407@bigpond.net.au> Message-ID: <17339.1463.681290.226040@owl.eos.ubc.ca> Gary Ruben writes: > I don't know what trademark problems they're referring to, but you might > have to continue looking for a different name. This may be what they're talking about: http://www.numerix-dsp.com From oliphant.travis at ieee.org Tue Jan 3 18:50:52 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 16:50:52 -0700 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <17339.1463.681290.226040@owl.eos.ubc.ca> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <43BB044E.4080407@bigpond.net.au> <17339.1463.681290.226040@owl.eos.ubc.ca> Message-ID: <43BB0DDC.70303@ieee.org> Philip Austin wrote: >Gary Ruben writes: > > I don't know what trademark problems they're referring to, but you might > > have to continue looking for a different name. > > > numerik could also be used. -Travis From Fernando.Perez at colorado.edu Tue Jan 3 19:06:00 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 17:06:00 -0700 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <43BB0DDC.70303@ieee.org> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <43BB044E.4080407@bigpond.net.au> <17339.1463.681290.226040@owl.eos.ubc.ca> <43BB0DDC.70303@ieee.org> Message-ID: <43BB1168.8020607@colorado.edu> Travis Oliphant wrote: > Philip Austin wrote: > > >>Gary Ruben writes: >> >>>I don't know what trademark problems they're referring to, but you might >>>have to continue looking for a different name. >> God, I hate this (but good call, better now than when the cease-and-desist letter arrives, probably straight to Enthought's offices [lawyers will always go first for a company rather than an individual, chasing the smell of money]). I suck at names, I've said it before. Best I can do: mantis: multidimensional arrays, numerics and tables in science I really don't like 'cute' names for functional tools, but I fail to find anything better in the 'array lab numerics' direction. Cheers, f From oliphant.travis at ieee.org Tue Jan 3 19:06:01 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 17:06:01 -0700 Subject: [SciPy-dev] [Fwd: Re: Renaming scipy_core: winner is *numerix*] Message-ID: <43BB1169.5070405@ieee.org> >Gary Ruben writes: > > I don't know what trademark problems they're referring to, but you might > > have to continue looking for a different name. > >This may be what they're talking about: > >http://www.numerix-dsp.com > > > > ***Bad News*** Well, I did a search on the U.S. 
Trademark database and it looks like they have trademarked the word Numerix. Here is the result of that search. * Typed Drawing* ------------------------------------------------------------------------ *Word Mark* *NUMERIX* *Goods and Services* IC 009. US 021 023 026 036 038. G & S: computer software for pricing, valuing, and managing risk of financial instruments, derivatives, investment securities, and portfolios therefor, and instructional manuals sold as a unit therewith. FIRST USE: 19970912. FIRST USE IN COMMERCE: 19971224 *Mark Drawing Code* (1) TYPED DRAWING *Design Search Code* *Serial Number* 75092619 *Filing Date* April 18, 1996 *Current Filing Basis* 1A *Original Filing Basis* 1B *Published for Opposition* July 29, 1997 *Registration Number* 2302369 *Registration Date* December 21, 1999 *Owner* (REGISTRANT) Numerix LLC LTD LIAB CO DELAWARE 67 East Cedar Chicago ILLINOIS 60611 *Attorney of Record* Randall M. Whitmeyer *Type of Mark* TRADEMARK *Register* PRINCIPAL *Live/Dead Indicator* LIVE I'm afraid this is a deal-breaker as our library could be used in this kind of computation, I'm very concerned that in the future, this company may decide to try and enforce their trademark which would be a lot of upheaval. Should I be concerned? I also noticed that lots of people have trademarked the PYTHON word. How much should I care about this? By the way, neither psipi or scicore have trademark problems. Also numerics (VISUAL NUMERICS is a trademark) and numeriks could be used. Please help... ---Travis From schofield at ftw.at Tue Jan 3 17:20:05 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 03 Jan 2006 22:20:05 +0000 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <43BAEAF7.3040804@colorado.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <43BAEAF7.3040804@colorado.edu> Message-ID: <43BAF895.4090604@ftw.at> Fernando Perez wrote: >Go numerix! > > It says on the CREST/Numerix site that: "Numerix is the codename of the new CREST interface to backend numerical libraries." If it's just a codename, with any luck Vishy and Simon won't mind choosing a different name for their release when it does happen. Or, even better, they'll see how good the new numerix and scipy are and want to contribute :) -- Ed From oliphant.travis at ieee.org Tue Jan 3 18:25:01 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 16:25:01 -0700 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <17339.1463.681290.226040@owl.eos.ubc.ca> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <43BB044E.4080407@bigpond.net.au> <17339.1463.681290.226040@owl.eos.ubc.ca> Message-ID: <43BB07CD.5050503@ieee.org> Philip Austin wrote: >Gary Ruben writes: > > I don't know what trademark problems they're referring to, but you might > > have to continue looking for a different name. > >This may be what they're talking about: > >http://www.numerix-dsp.com > > > There is also a http://www.numerix.com Will this be a problem? The *official* name will be Numerix Python: python-numerix package so I don't see the problem. Has the name numerix been trademarked? 
How is that possible if multiple people are already using it? This needs to be settled quickly. I don't want to spend hours changing the name only to have to change it to something else... -Travis From strawman at astraw.com Tue Jan 3 19:55:03 2006 From: strawman at astraw.com (Andrew Straw) Date: Tue, 03 Jan 2006 16:55:03 -0800 Subject: [SciPy-dev] [Fwd: Re: Renaming scipy_core: winner is *numerix*] In-Reply-To: <43BB1169.5070405@ieee.org> References: <43BB1169.5070405@ieee.org> Message-ID: <43BB1CE7.4020902@astraw.com> I always liked "numstar" to, presumably mean num* when referring to numeric or numarray, much like "numerix" did. Maybe numstar is unencumbered? It really gives a little marketing-friendly element to our name. Travis Oliphant wrote: >>Gary Ruben writes: >> >> >>>I don't know what trademark problems they're referring to, but you might >>>have to continue looking for a different name. >>> >>> >>This may be what they're talking about: >> >>http://www.numerix-dsp.com >> >> >> >> >> >> > >***Bad News*** > >Well, I did a search on the U.S. Trademark database and it looks like >they have trademarked the word Numerix. Here is the result of that search. >* >Typed Drawing* >------------------------------------------------------------------------ >*Word Mark* *NUMERIX* >*Goods and Services* IC 009. US 021 023 026 036 038. G & S: computer >software for pricing, valuing, and managing risk of financial >instruments, derivatives, investment securities, and portfolios >therefor, and instructional manuals sold as a unit therewith. FIRST USE: >19970912. FIRST USE IN COMMERCE: 19971224 >*Mark Drawing Code* (1) TYPED DRAWING >*Design Search Code* >*Serial Number* 75092619 >*Filing Date* April 18, 1996 >*Current Filing Basis* 1A >*Original Filing Basis* 1B >*Published for Opposition* July 29, 1997 >*Registration Number* 2302369 >*Registration Date* December 21, 1999 >*Owner* (REGISTRANT) Numerix LLC LTD LIAB CO DELAWARE 67 East Cedar >Chicago ILLINOIS 60611 >*Attorney of Record* Randall M. Whitmeyer >*Type of Mark* TRADEMARK >*Register* PRINCIPAL >*Live/Dead Indicator* LIVE > > > >I'm afraid this is a deal-breaker as our library could be used in this >kind of computation, I'm very concerned that in the future, this company >may decide to try and enforce their trademark which would be a lot of >upheaval. Should I be concerned? I also noticed that lots of people >have trademarked the PYTHON word. > >How much should I care about this? > >By the way, neither psipi or scicore have trademark problems. > >Also numerics (VISUAL NUMERICS is a trademark) and numeriks could be used. > >Please help... > >---Travis > > > > > > > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From pebarrett at gmail.com Tue Jan 3 20:16:02 2006 From: pebarrett at gmail.com (Paul Barrett) Date: Tue, 3 Jan 2006 20:16:02 -0500 Subject: [SciPy-dev] [Fwd: Re: Renaming scipy_core: winner is *numerix*] In-Reply-To: <43BB1CE7.4020902@astraw.com> References: <43BB1169.5070405@ieee.org> <43BB1CE7.4020902@astraw.com> Message-ID: <40e64fa20601031716r27d234cbsaa1053f9483aaad8@mail.gmail.com> How about 'arrakor' (pron. array core)? This means repetitious in the Basque language, so I'm told. :-) -- Paul On 1/3/06, Andrew Straw wrote: > > I always liked "numstar" to, presumably mean num* when referring to > numeric or numarray, much like "numerix" did. Maybe numstar is > unencumbered? 
It really gives a little marketing-friendly element to > our name. > > Travis Oliphant wrote: > > >>Gary Ruben writes: > >> > >> > >>>I don't know what trademark problems they're referring to, but you > might > >>>have to continue looking for a different name. > >>> > >>> > >>This may be what they're talking about: > >> > >>http://www.numerix-dsp.com > >> > >> > >> > >> > >> > >> > > > >***Bad News*** > > > >Well, I did a search on the U.S. Trademark database and it looks like > >they have trademarked the word Numerix. Here is the result of that > search. > >* > >Typed Drawing* > >------------------------------------------------------------------------ > >*Word Mark* *NUMERIX* > >*Goods and Services* IC 009. US 021 023 026 036 038. G & S: computer > >software for pricing, valuing, and managing risk of financial > >instruments, derivatives, investment securities, and portfolios > >therefor, and instructional manuals sold as a unit therewith. FIRST USE: > >19970912. FIRST USE IN COMMERCE: 19971224 > >*Mark Drawing Code* (1) TYPED DRAWING > >*Design Search Code* > >*Serial Number* 75092619 > >*Filing Date* April 18, 1996 > >*Current Filing Basis* 1A > >*Original Filing Basis* 1B > >*Published for Opposition* July 29, 1997 > >*Registration Number* 2302369 > >*Registration Date* December 21, 1999 > >*Owner* (REGISTRANT) Numerix LLC LTD LIAB CO DELAWARE 67 East > Cedar > >Chicago ILLINOIS 60611 > >*Attorney of Record* Randall M. Whitmeyer > >*Type of Mark* TRADEMARK > >*Register* PRINCIPAL > >*Live/Dead Indicator* LIVE > > > > > > > >I'm afraid this is a deal-breaker as our library could be used in this > >kind of computation, I'm very concerned that in the future, this company > >may decide to try and enforce their trademark which would be a lot of > >upheaval. Should I be concerned? I also noticed that lots of people > >have trademarked the PYTHON word. > > > >How much should I care about this? > > > >By the way, neither psipi or scicore have trademark problems. > > > >Also numerics (VISUAL NUMERICS is a trademark) and numeriks could be > used. > > > >Please help... > > > >---Travis > > > > > > > > > > > > > > > >_______________________________________________ > >Scipy-dev mailing list > >Scipy-dev at scipy.net > >http://www.scipy.net/mailman/listinfo/scipy-dev > > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > -- Paul Barrett, PhD Johns Hopkins University Assoc. Research Scientist Dept of Physics and Astronomy Phone: 410-516-5190 Baltimore, MD 21218 -------------- next part -------------- An HTML attachment was scrubbed... URL: From dd55 at cornell.edu Tue Jan 3 20:20:21 2006 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 3 Jan 2006 20:20:21 -0500 Subject: [SciPy-dev] [Fwd: Re: Renaming scipy_core: winner is *numerix*] In-Reply-To: <43BB1169.5070405@ieee.org> References: <43BB1169.5070405@ieee.org> Message-ID: <200601032020.21489.dd55@cornell.edu> On Tuesday 03 January 2006 7:06 pm, Travis Oliphant wrote: > >Gary Ruben writes: > > > I don't know what trademark problems they're referring to, but you > > > might have to continue looking for a different name. > > > >This may be what they're talking about: > > > >http://www.numerix-dsp.com > > ***Bad News*** > > Well, I did a search on the U.S. Trademark database and it looks like > they have trademarked the word Numerix. Here is the result of that search. 
> * > Typed Drawing* > ------------------------------------------------------------------------ > *Word Mark* *NUMERIX* > *Goods and Services* IC 009. US 021 023 026 036 038. G & S: computer > software for pricing, valuing, and managing risk of financial > instruments, derivatives, investment securities, and portfolios > therefor, and instructional manuals sold as a unit therewith. FIRST USE: > 19970912. FIRST USE IN COMMERCE: 19971224 > *Mark Drawing Code* (1) TYPED DRAWING > *Design Search Code* > *Serial Number* 75092619 > *Filing Date* April 18, 1996 > *Current Filing Basis* 1A > *Original Filing Basis* 1B > *Published for Opposition* July 29, 1997 > *Registration Number* 2302369 > *Registration Date* December 21, 1999 > *Owner* (REGISTRANT) Numerix LLC LTD LIAB CO DELAWARE 67 East Cedar > Chicago ILLINOIS 60611 > *Attorney of Record* Randall M. Whitmeyer > *Type of Mark* TRADEMARK > *Register* PRINCIPAL > *Live/Dead Indicator* LIVE > > > > I'm afraid this is a deal-breaker as our library could be used in this > kind of computation, I'm very concerned that in the future, this company > may decide to try and enforce their trademark which would be a lot of > upheaval. Should I be concerned? I also noticed that lots of people > have trademarked the PYTHON word. > > How much should I care about this? > > By the way, neither psipi or scicore have trademark problems. > > Also numerics (VISUAL NUMERICS is a trademark) and numeriks could be used. > > Please help... Can we reconsider using numpy? numpy/scipy is a consistent naming scheme, and it reflects our roots. Darren From oliphant.travis at ieee.org Tue Jan 3 20:30:50 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Tue, 03 Jan 2006 18:30:50 -0700 Subject: [SciPy-dev] Renaming scipy_core: back to square 1 Message-ID: Well unfortunately, numerix is out. Here are some ideas: scicore --- still avaiable psipi --- a cute transliteration of "SciPy" into mathy Greek letters (could use symbols in print...) nagini --- "he who should remane nameless" 's snake in Harry Potter ( since we are having such trouble with a name numerics numerik numeriks This are along the same lines as numerix (but if we are concerned about the numerix trademark, then those bringing a suit could try the "these are too similar approach" --- basically I don't want to deal with any future legal action). -Travis From oliphant.travis at ieee.org Tue Jan 3 20:36:24 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Tue, 03 Jan 2006 18:36:24 -0700 Subject: [SciPy-dev] Please vote on the names Message-ID: Please give a vote (+1, 0, -1) on the following names: 1. psipi 2. scicore 3. muscle 4. numstar 5. mantis 6. numpy We have to move from numerix due to trademark concerns (I don't want to have any that I can obviously avoid). These are the leading candidates. -Travis From oliphant.travis at ieee.org Tue Jan 3 20:35:19 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Tue, 03 Jan 2006 18:35:19 -0700 Subject: [SciPy-dev] Another name Message-ID: Here's another name. Python's are powerful because of their muscle --- Multidimensional-arrays Uniting for Scientific and Computational Library Enhancement So far my favorite. -Travis From paustin at eos.ubc.ca Tue Jan 3 20:41:56 2006 From: paustin at eos.ubc.ca (Philip Austin) Date: Tue, 3 Jan 2006 17:41:56 -0800 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <17339.10212.111640.518065@owl.eos.ubc.ca> Travis E. 
Oliphant writes: > Please give a vote (+1, 0, -1) on the following names: > > 1. psipi; 0 > 2. scicore; 0 > 3. muscle; -1 > 4. numstar; +1 > 5. mantis; -1 > 6. numpy; +1 > From oliphant.travis at ieee.org Tue Jan 3 20:01:53 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 18:01:53 -0700 Subject: [SciPy-dev] More possible names Message-ID: <43BB1E81.2010707@ieee.org> It seems some of my messages are not getting through. So, I apologize for the funny order they are coming in. Here's another name. Python's are powerful because of their muscle --- Multidimensional arrays Uniting for Scientific and Computational Language Enhancement Somebody stop me.... -Travis From oliphant.travis at ieee.org Tue Jan 3 19:55:00 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 17:55:00 -0700 Subject: [SciPy-dev] Renaming scipy_core: back to square 1 Message-ID: <43BB1CE4.1090709@ieee.org> Well unfortunately, numerix is out. Here are some ideas: scicore --- still avaiable psipi --- a cute transliteration of "SciPy" into mathy Greek letters (could use symbols in print...) nagini --- "he who should remane nameless" 's snake in Harry Potter ( since we are having such trouble with a name numerics numerik numeriks This are along the same lines as numerix (but if we are concerned about the numerix trademark, then those bringing a suit could try the "these are too similar approach" --- basically I don't want to deal with any future legal action). -Travis From dd55 at cornell.edu Tue Jan 3 20:49:46 2006 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 3 Jan 2006 20:49:46 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <200601032049.46472.dd55@cornell.edu> On Tuesday 03 January 2006 8:36 pm, Travis E. Oliphant wrote: > Please give a vote (+1, 0, -1) on the following names: > 0 psipi -1 scicore 0 muscle -1 numstar 0 mantis +1 numpy From oliphant.travis at ieee.org Tue Jan 3 20:39:30 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Tue, 03 Jan 2006 18:39:30 -0700 Subject: [SciPy-dev] Muscle name explained Message-ID: My mail doesn't seem to be getting through. I'm not sure why. Here is my previously posted explanation for the name muscle. First, it's the means by which Python's are powerful. Second, it can be thought of as the acronym: Multi-dimensional-arrays Uniting for Scientific and Computational Library Enhancement. -Travis From charlesr.harris at gmail.com Tue Jan 3 21:11:09 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 3 Jan 2006 19:11:09 -0700 Subject: [SciPy-dev] Renaming scipy_core: back to square 1 In-Reply-To: <43BB1CE4.1090709@ieee.org> References: <43BB1CE4.1090709@ieee.org> Message-ID: Hmm, numerique - a sophisticated blend of arrays and algorithms for the decerning scientist. OK, that wasn't serious. Chuck On 1/3/06, Travis Oliphant wrote: > > > Well unfortunately, numerix is out. > > Here are some ideas: > > scicore --- still avaiable > > psipi --- a cute transliteration of "SciPy" into mathy Greek letters > (could use symbols in print...) > > nagini --- "he who should remane nameless" 's snake in Harry Potter ( > since we are having such trouble with a name > > numerics > numerik > numeriks > > This are along the same lines as numerix (but if we are concerned > about the numerix trademark, then those bringing a suit could try the > "these are too similar approach" --- basically I don't want to deal with > any future legal action). 
> > -Travis > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant.travis at ieee.org Tue Jan 3 18:48:33 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 16:48:33 -0700 Subject: [SciPy-dev] Renaming scipy_core: winner is *numerix* In-Reply-To: <17339.1463.681290.226040@owl.eos.ubc.ca> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <43BB044E.4080407@bigpond.net.au> <17339.1463.681290.226040@owl.eos.ubc.ca> Message-ID: <43BB0D51.4040206@ieee.org> Philip Austin wrote: >Gary Ruben writes: > > I don't know what trademark problems they're referring to, but you might > > have to continue looking for a different name. > >This may be what they're talking about: > >http://www.numerix-dsp.com > > > > Well, I did a search on the U.S. Trademark database and it looks like they have trademarked the word Numerix. Here is the result of that search. * Typed Drawing* ------------------------------------------------------------------------ *Word Mark* *NUMERIX* *Goods and Services* IC 009. US 021 023 026 036 038. G & S: computer software for pricing, valuing, and managing risk of financial instruments, derivatives, investment securities, and portfolios therefor, and instructional manuals sold as a unit therewith. FIRST USE: 19970912. FIRST USE IN COMMERCE: 19971224 *Mark Drawing Code* (1) TYPED DRAWING *Design Search Code* *Serial Number* 75092619 *Filing Date* April 18, 1996 *Current Filing Basis* 1A *Original Filing Basis* 1B *Published for Opposition* July 29, 1997 *Registration Number* 2302369 *Registration Date* December 21, 1999 *Owner* (REGISTRANT) Numerix LLC LTD LIAB CO DELAWARE 67 East Cedar Chicago ILLINOIS 60611 *Attorney of Record* Randall M. Whitmeyer *Type of Mark* TRADEMARK *Register* PRINCIPAL *Live/Dead Indicator* LIVE I'm afraid this is a deal-breaker as our library could be used in this kind of computation, I'm very concerned that in the future, this company may decide to try and enforce their trademark which would be a lot of upheaval. Should I be concerned? I also noticed that lots of people have trademarked the PYTHON word. How much should I care about this? By the way, neither psipi or scicore have trademark problems. Also numerics (VISUAL NUMERICS is a trademark) and numeriks could be used. Please help... ---Travis From oliphant.travis at ieee.org Tue Jan 3 19:41:18 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 17:41:18 -0700 Subject: [SciPy-dev] Renaming scipy_core: back to square 1 In-Reply-To: <43BB1168.8020607@colorado.edu> References: <200601022109.k02L9rxa007432@oobleck.astro.cornell.edu> <43B99896.8060003@colorado.edu> <43BA2202.1040100@ieee.org> <43BA248D.3020906@colorado.edu> <43BA2935.5080508@ieee.org> <43BAAFF8.5050105@colorado.edu> <43BAE8A3.8070009@ieee.org> <43BB044E.4080407@bigpond.net.au> <17339.1463.681290.226040@owl.eos.ubc.ca> <43BB0DDC.70303@ieee.org> <43BB1168.8020607@colorado.edu> Message-ID: <43BB19AE.4030305@ieee.org> Well unfortunately, numerix is out. Here are some ideas: scicore --- still avaiable psipi --- a cute transliteration of "SciPy" into mathy Greek letters (could use symbols in print...) 
nagini --- "he who should remane nameless" 's snake in Harry Potter ( since we are having such trouble with a name numerics numerik numeriks This are along the same lines as numerix (but if we are concerned about the numerix trademark, then those bringing a suit could try the "these are too similar approach" --- basically I don't want to deal with any future legal action). -Travis From oliphant.travis at ieee.org Tue Jan 3 19:00:34 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 17:00:34 -0700 Subject: [SciPy-dev] [Fwd: Re: Renaming scipy_core: winner is *numerix*] Message-ID: <43BB1022.60901@ieee.org> -------- Original Message -------- Subject: Re: [SciPy-dev] Renaming scipy_core: winner is *numerix* Date: Tue, 03 Jan 2006 16:48:33 -0700 From: Travis Oliphant To: SciPy Developers List References: <200601022109.k02L9rxa007432 at oobleck.astro.cornell.edu> <43B99896.8060003 at colorado.edu> <43BA2202.1040100 at ieee.org> <43BA248D.3020906 at colorado.edu> <43BA2935.5080508 at ieee.org> <43BAAFF8.5050105 at colorado.edu> <43BAE8A3.8070009 at ieee.org> <43BB044E.4080407 at bigpond.net.au> <17339.1463.681290.226040 at owl.eos.ubc.ca> Philip Austin wrote: >Gary Ruben writes: > > I don't know what trademark problems they're referring to, but you might > > have to continue looking for a different name. > >This may be what they're talking about: > >http://www.numerix-dsp.com > > > > Well, I did a search on the U.S. Trademark database and it looks like they have trademarked the word Numerix. Here is the result of that search. * Typed Drawing* ------------------------------------------------------------------------ *Word Mark* *NUMERIX* *Goods and Services* IC 009. US 021 023 026 036 038. G & S: computer software for pricing, valuing, and managing risk of financial instruments, derivatives, investment securities, and portfolios therefor, and instructional manuals sold as a unit therewith. FIRST USE: 19970912. FIRST USE IN COMMERCE: 19971224 *Mark Drawing Code* (1) TYPED DRAWING *Design Search Code* *Serial Number* 75092619 *Filing Date* April 18, 1996 *Current Filing Basis* 1A *Original Filing Basis* 1B *Published for Opposition* July 29, 1997 *Registration Number* 2302369 *Registration Date* December 21, 1999 *Owner* (REGISTRANT) Numerix LLC LTD LIAB CO DELAWARE 67 East Cedar Chicago ILLINOIS 60611 *Attorney of Record* Randall M. Whitmeyer *Type of Mark* TRADEMARK *Register* PRINCIPAL *Live/Dead Indicator* LIVE I'm afraid this is a deal-breaker as our library could be used in this kind of computation, I'm very concerned that in the future, this company may decide to try and enforce their trademark which would be a lot of upheaval. Should I be concerned? I also noticed that lots of people have trademarked the PYTHON word. How much should I care about this? By the way, neither psipi or scicore have trademark problems. Also numerics (VISUAL NUMERICS is a trademark) and numeriks could be used. Please help... ---Travis From oliphant.travis at ieee.org Tue Jan 3 20:08:44 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 18:08:44 -0700 Subject: [SciPy-dev] Renaming scipy_core: here we go again Message-ID: <43BB201C.7050801@ieee.org> Please give feedback on the following names: 1. psipi 2. scicore 3. muscle 4. numstar We have to move from numerix due to trademark concerns (I don't want to have any that I can obviously avoid). 
-Travis From oliphant.travis at ieee.org Tue Jan 3 20:10:28 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 18:10:28 -0700 Subject: [SciPy-dev] [Fwd: More possible names] Message-ID: <43BB2084.1000303@ieee.org> -------- Original Message -------- Subject: More possible names Date: Tue, 03 Jan 2006 18:01:53 -0700 From: Travis Oliphant To: SciPy Developers List It seems some of my messages are not getting through. So, I apologize for the funny order they are coming in. Here's another name. Python's are powerful because of their muscle --- Multidimensional arrays Uniting for Scientific and Computational Language Enhancement Somebody stop me.... -Travis From oliphant.travis at ieee.org Tue Jan 3 20:10:47 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 18:10:47 -0700 Subject: [SciPy-dev] [Fwd: Renaming scipy_core: back to square 1] Message-ID: <43BB2097.1010102@ieee.org> -------- Original Message -------- Subject: Renaming scipy_core: back to square 1 Date: Tue, 03 Jan 2006 17:55:00 -0700 From: Travis Oliphant To: SciPy Developers List Well unfortunately, numerix is out. Here are some ideas: scicore --- still avaiable psipi --- a cute transliteration of "SciPy" into mathy Greek letters (could use symbols in print...) nagini --- "he who should remane nameless" 's snake in Harry Potter ( since we are having such trouble with a name numerics numerik numeriks This are along the same lines as numerix (but if we are concerned about the numerix trademark, then those bringing a suit could try the "these are too similar approach" --- basically I don't want to deal with any future legal action). -Travis From Chris.Fonnesbeck at MyFWC.com Tue Jan 3 21:40:14 2006 From: Chris.Fonnesbeck at MyFWC.com (Fonnesbeck, Chris) Date: Tue, 3 Jan 2006 21:40:14 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: Message-ID: On 1/3/06 8:36 PM, "Travis E. Oliphant" wrote: > Please give a vote (+1, 0, -1) on the following names: > > 1. psipi -1 > 2. scicore 0 > 3. muscle -1 > 4. numstar -1 > 5. mantis 0 > 6. numpy +1 nagini +1 -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: Chris.Fonnesbeck at MyFWC.com -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2131 bytes Desc: not available URL: From strawman at astraw.com Tue Jan 3 21:59:38 2006 From: strawman at astraw.com (Andrew Straw) Date: Tue, 03 Jan 2006 18:59:38 -0800 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <43BB3A1A.1080709@astraw.com> A new Straw poll: http://new.scipy.org/Wiki/RenamingCore (I transcribed Darren Dale and Chris Fonnesbeck's votes.) Question to the legal types: Do names like "numerics" or "numeriks" open us up to cease-and-desist letters with anything more than empty-threat status from someone who owns a trademark on "numerix"? Travis E. Oliphant wrote: >Please give a vote (+1, 0, -1) on the following names: > >1. psipi >2. scicore >3. muscle >4. numstar >5. mantis >6. numpy > >We have to move from numerix due to trademark concerns (I don't want to >have any that I can obviously avoid). > >These are the leading candidates. 
> >-Travis > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From strawman at astraw.com Tue Jan 3 22:04:41 2006 From: strawman at astraw.com (Andrew Straw) Date: Tue, 03 Jan 2006 19:04:41 -0800 Subject: [SciPy-dev] [Fwd: Renaming scipy_core: back to square 1] In-Reply-To: <43BB2097.1010102@ieee.org> References: <43BB2097.1010102@ieee.org> Message-ID: <43BB3B49.5020706@astraw.com> > >nagini --- "he who should remane nameless" 's snake in Harry Potter ( >since we are having such trouble with a name > A nice excerpt from http://en.wikipedia.org/wiki/Nagini : "Nagini is the female name for cobra in Hindi and most other Indian languages." From Chris.Fonnesbeck at MyFWC.com Tue Jan 3 22:08:31 2006 From: Chris.Fonnesbeck at MyFWC.com (Fonnesbeck, Chris) Date: Tue, 3 Jan 2006 22:08:31 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: <43BB3A1A.1080709@astraw.com> Message-ID: On 1/3/06 9:59 PM, "Andrew Straw" wrote: > Do names like "numerics" or "numeriks" open us up to cease-and-desist > letters with anything more than empty-threat status from someone who > owns a trademark on "numerix"? According to my wife (a pretty good lawyer), they are close enough as to be confusing for a potential consumer (recall Lindows), so they should be avoided. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: Chris.Fonnesbeck at MyFWC.com -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2131 bytes Desc: not available URL: From faltet at carabos.com Tue Jan 3 22:16:56 2006 From: faltet at carabos.com (Francesc Altet) Date: Wed, 4 Jan 2006 04:16:56 +0100 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601021754.03309.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> Message-ID: <200601040416.57705.faltet@carabos.com> A Dilluns 02 Gener 2006 17:54, Francesc Altet va escriure: > Finally, I have yet another (the last one, at least for today ;-) > question about scipy_core: > > In [35]: print scicore.typeDict['I'], scicore.typeDict['L'] > > > In [36]: scicore.typeDict['I'] == scicore.typeDict['L'] > Out[36]: False > > The same goes for: > > In [38]: print scicore.typeDict['i'], scicore.typeDict['l'] > > > In [39]: print scicore.typeDict['i'] == scicore.typeDict['l'] > False > > Is this consistent?. I would say no (I'm on a 32-bit platform). I've made some progress for comparing datatypes of type 'i' and 'l' correctly. Right now, in 32-bit platforms, scipy_core gives: In [25]: a=scicore.arange(10,dtype='i') In [26]: b=scicore.arange(10,dtype='l') In [27]: a.dtypedescr == b.dtypedescr Out[27]: False # !! With the attached patch (you should revise it, because I'm not used to write C extensions in the wild way), it works as it should: In [27]: a.dtypedescr == b.dtypedescr Out[27]: True However, I cannot see the way to make the a.dtype == b.dtype comparison to work well. I'm trying to figure out where the heck should I implement the tp_compare function for this case, but I can't. Some clues anyone? 
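To put the question in concrete terms, here is a minimal sketch of the equivalence test being asked for, written with today's numpy spelling of the descriptor attributes (kind, itemsize) and a made-up helper name, not the scipy_core API of the time:

import numpy as np   # today's name for the package this thread is busy renaming

def equivalent_dtypes(dt1, dt2):
    # Hypothetical helper: two descriptors are interchangeable when their
    # kind and item size agree, which is exactly the sense in which C int
    # ('i') and C long ('l') coincide on a 32-bit platform.
    dt1, dt2 = np.dtype(dt1), np.dtype(dt2)
    return dt1.kind == dt2.kind and dt1.itemsize == dt2.itemsize

print(equivalent_dtypes('i', 'l'))     # True wherever int and long have the same width
print(np.dtype('i') == np.dtype('l'))  # in later numpy releases dtype equality tests
                                       # this kind of equivalence, not object identity
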
Well, perhaps this is not very important, but I have been bitten enough by the 'i' vs 'l' dichotomy and I'd love that scipy_core finally does the correct thing. BTW, it seems that support for scipy_core homogeneous arrays (both numerical and strings) in PyTables is in place. Now, it remains records and unicode strings. I'll keep you informed on my progress. -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" -------------- next part -------------- A non-text attachment was scrubbed... Name: arrayobject.patch Type: text/x-diff Size: 1561 bytes Desc: not available URL: From pebarrett at gmail.com Tue Jan 3 22:39:30 2006 From: pebarrett at gmail.com (Paul Barrett) Date: Tue, 3 Jan 2006 22:39:30 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <40e64fa20601031939l5cc2329bm28fd67f31a7880aa@mail.gmail.com> On 1/3/06, Travis E. Oliphant wrote: > > Please give a vote (+1, 0, -1) on the following names: > > 1. psipi -1 > 2. scicore +1 > 3. muscle -1 > 4. numstar -1 > 5. mantis -1 > 6. numpy +1 > > And 7. arrakor +1 -- Paul Barrett, PhD Johns Hopkins University Assoc. Research Scientist Dept of Physics and Astronomy Phone: 410-516-5190 Baltimore, MD 21218 -------------- next part -------------- An HTML attachment was scrubbed... URL: From jonathan.taylor at utoronto.ca Tue Jan 3 23:12:13 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Tue, 3 Jan 2006 23:12:13 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <463e11f90601032012g2945ab1chffd0a7da2d27d276@mail.gmail.com> > 1. psipi -1 > 2. scicore 0 > 3. muscle -1 > 4. numstar -1 > 5. mantis -1 > 6. numpy +1 From charlesr.harris at gmail.com Tue Jan 3 23:22:10 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 3 Jan 2006 21:22:10 -0700 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: On 1/3/06, Travis E. Oliphant wrote: > > Please give a vote (+1, 0, -1) on the following names: > > 1. psipi -1 2. scicore 0 or scikor 3. muscle -1 4. numstar -1 5. mantis 0 6. numpy -1 (numpy++, to bad no +'s allowed) -------------- next part -------------- An HTML attachment was scrubbed... URL: From jonathan.taylor at utoronto.ca Tue Jan 3 23:22:45 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Tue, 3 Jan 2006 23:22:45 -0500 Subject: [SciPy-dev] [Fwd: Re: Renaming scipy_core: winner is *numerix*] In-Reply-To: <200601032020.21489.dd55@cornell.edu> References: <43BB1169.5070405@ieee.org> <200601032020.21489.dd55@cornell.edu> Message-ID: <463e11f90601032022n1426c110sa835f393002bdce8@mail.gmail.com> I love the idea of numpy/scipy. It is incredibly consistent and simple. If I recall correctly when I first was looking at numerical python solutions numpy seemed to be related to Numeric and numarray anyways, and thus it would be perfect to represent the hybrid that scipy_core is going for. Ya/no? Jon. On 1/3/06, Darren Dale wrote: > On Tuesday 03 January 2006 7:06 pm, Travis Oliphant wrote: > > >Gary Ruben writes: > > > > I don't know what trademark problems they're referring to, but you > > > > might have to continue looking for a different name. > > > > > >This may be what they're talking about: > > > > > >http://www.numerix-dsp.com > > > > ***Bad News*** > > > > Well, I did a search on the U.S. Trademark database and it looks like > > they have trademarked the word Numerix. Here is the result of that search. 
> > * > > Typed Drawing* > > ------------------------------------------------------------------------ > > *Word Mark* *NUMERIX* > > *Goods and Services* IC 009. US 021 023 026 036 038. G & S: computer > > software for pricing, valuing, and managing risk of financial > > instruments, derivatives, investment securities, and portfolios > > therefor, and instructional manuals sold as a unit therewith. FIRST USE: > > 19970912. FIRST USE IN COMMERCE: 19971224 > > *Mark Drawing Code* (1) TYPED DRAWING > > *Design Search Code* > > *Serial Number* 75092619 > > *Filing Date* April 18, 1996 > > *Current Filing Basis* 1A > > *Original Filing Basis* 1B > > *Published for Opposition* July 29, 1997 > > *Registration Number* 2302369 > > *Registration Date* December 21, 1999 > > *Owner* (REGISTRANT) Numerix LLC LTD LIAB CO DELAWARE 67 East Cedar > > Chicago ILLINOIS 60611 > > *Attorney of Record* Randall M. Whitmeyer > > *Type of Mark* TRADEMARK > > *Register* PRINCIPAL > > *Live/Dead Indicator* LIVE > > > > > > > > I'm afraid this is a deal-breaker as our library could be used in this > > kind of computation, I'm very concerned that in the future, this company > > may decide to try and enforce their trademark which would be a lot of > > upheaval. Should I be concerned? I also noticed that lots of people > > have trademarked the PYTHON word. > > > > How much should I care about this? > > > > By the way, neither psipi or scicore have trademark problems. > > > > Also numerics (VISUAL NUMERICS is a trademark) and numeriks could be used. > > > > Please help... > > Can we reconsider using numpy? numpy/scipy is a consistent naming scheme, and > it reflects our roots. > > Darren > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From charlesr.harris at gmail.com Tue Jan 3 23:23:53 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 3 Jan 2006 21:23:53 -0700 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: On 1/3/06, Charles R Harris wrote: > > > On 1/3/06, Travis E. Oliphant wrote: > > > > Please give a vote (+1, 0, -1) on the following names: > > > > 1. psipi -1 > 2. scicore 0 or scikor > 3. muscle -1 > 4. numstar -1 > 5. mantis 0 > 6. numpy -1 (numpy++, to bad no +'s allowed) > or numpyxx -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at shrogers.com Tue Jan 3 23:27:02 2006 From: steve at shrogers.com (Steven H. Rogers) Date: Tue, 03 Jan 2006 21:27:02 -0700 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <43BB4E96.9070402@shrogers.com> Travis E. Oliphant wrote: > Please give a vote (+1, 0, -1) on the following names: > > 1. psipi +1 > 2. scicore 0 > 3. muscle -1 > 4. numstar +1 > 5. mantis 0 > 6. numpy +1 > If none of these looks like a winner, how about psinum, scinum, or numeriq? - Steve From Fernando.Perez at colorado.edu Tue Jan 3 23:26:17 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 21:26:17 -0700 Subject: [SciPy-dev] Please vote on the names In-Reply-To: <463e11f90601032012g2945ab1chffd0a7da2d27d276@mail.gmail.com> References: <463e11f90601032012g2945ab1chffd0a7da2d27d276@mail.gmail.com> Message-ID: <43BB4E69.4000303@colorado.edu> 1. psipi: -1 2. scicore: 0 3. muscle: -1 4. numstar: -1 5. mantis: -1 6. 
numpy: +1 Cheers, f From jonathan.taylor at utoronto.ca Tue Jan 3 23:28:03 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Tue, 3 Jan 2006 23:28:03 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <463e11f90601032028u5a2a1a11h8f311747e4bd582a@mail.gmail.com> What is wrong with numpy? It really is the only one that makes sense in describing what the package is supposed to do while retaining consistentsy with standard naming conventions. On 1/3/06, Charles R Harris wrote: > > > > On 1/3/06, Charles R Harris wrote: > > > > > > On 1/3/06, Travis E. Oliphant < oliphant.travis at ieee.org> wrote: > > > Please give a vote (+1, 0, -1) on the following names: > > > > > > > > 1. psipi -1 > > 2. scicore 0 or scikor > > 3. muscle -1 > > 4. numstar -1 > > 5. mantis 0 > > 6. numpy -1 (numpy++, to bad no +'s allowed) > > or numpyxx > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > > > From pebarrett at gmail.com Tue Jan 3 23:28:49 2006 From: pebarrett at gmail.com (Paul Barrett) Date: Tue, 3 Jan 2006 23:28:49 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: <43BB4E96.9070402@shrogers.com> References: <43BB4E96.9070402@shrogers.com> Message-ID: <40e64fa20601032028ie62ae91u4abd87ccc61541e6@mail.gmail.com> numeriq has the same encumbrance as numerix. On 1/3/06, Steven H. Rogers wrote: > > > > Travis E. Oliphant wrote: > > Please give a vote (+1, 0, -1) on the following names: > > > > 1. psipi +1 > > 2. scicore 0 > > 3. muscle -1 > > 4. numstar +1 > > 5. mantis 0 > > 6. numpy +1 > > > > If none of these looks like a winner, how about psinum, scinum, or > numeriq? > > - Steve > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > -- Paul Barrett, PhD Johns Hopkins University Assoc. Research Scientist Dept of Physics and Astronomy Phone: 410-516-5190 Baltimore, MD 21218 -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant.travis at ieee.org Tue Jan 3 23:34:25 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Tue, 03 Jan 2006 21:34:25 -0700 Subject: [SciPy-dev] Round 3 on naming Message-ID: I'm keeping running totals on the naming on the straw pole page. You can either vote their or respond to this email: Round 3 names are (names surviving round 2 plus with positive votes plus a few new ones) nagini scicore numpy numpyxx arrakor scikor numeriq scinum Please vote on these: -Travis From prabhu_r at users.sf.net Wed Jan 4 00:07:30 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Wed, 4 Jan 2006 10:37:30 +0530 Subject: [SciPy-dev] [Fwd: Renaming scipy_core: back to square 1] In-Reply-To: <43BB3B49.5020706@astraw.com> References: <43BB2097.1010102@ieee.org> <43BB3B49.5020706@astraw.com> Message-ID: <17339.22546.117803.594481@prpc.aero.iitb.ac.in> >>>>> "Andrew" == Andrew Straw writes: >> >> nagini --- "he who should remane nameless" 's snake in Harry >> Potter ( since we are having such trouble with a name >> Andrew> A nice excerpt from http://en.wikipedia.org/wiki/Nagini : Andrew> "Nagini is the female name for cobra in Hindi Andrew> and most other Indian Andrew> languages." 
If you want more along "snake" in another language: nag naga sarpa Vasuki (king of snakes) pambu (snake in Tamil) I can get you more if you are interested -- there are O(20) recognized languages in India and lots of dialects, they share similar roots but there should be a lot to pick from. Along the lines of mythical characters why not Kaa from the Jungle book? http://disney.go.com/vault/archives/villains/kaa/kaa.html cheers, prabhu From prabhu_r at users.sf.net Wed Jan 4 00:17:59 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Wed, 4 Jan 2006 10:47:59 +0530 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <17339.23175.875763.180169@prpc.aero.iitb.ac.in> >>>>> "Travis" == Travis E Oliphant writes: Travis> Please give a vote (+1, 0, -1) on the following names: 1. psipi -1 (too similar to scipy -- think of a talk when you say, install psipi first and then scipy or "scipy depends on psipi". Reminds me of Thomson and Thompson in the Tintin series). 2. scicore 0 (verbose, unimaginative) 3. muscle 0 (nice, but might be confusing to biologists?) 4. numstar -1 (not very pretty, verbose) 5. mantis 0 (cute, but we love snakes, not insects ;) 6. numpy +0.5 (OK, but problems with history and thereby confusion for new users?) How about num, numpp, npp? cheers, prabhu From Fernando.Perez at colorado.edu Wed Jan 4 00:20:33 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 22:20:33 -0700 Subject: [SciPy-dev] [Fwd: Renaming scipy_core: back to square 1] In-Reply-To: <17339.22546.117803.594481@prpc.aero.iitb.ac.in> References: <43BB2097.1010102@ieee.org> <43BB3B49.5020706@astraw.com> <17339.22546.117803.594481@prpc.aero.iitb.ac.in> Message-ID: <43BB5B21.4040000@colorado.edu> Prabhu Ramachandran wrote: > If you want more along "snake" in another language: > > nag Like in 'Numerical Algorithms Group' (http://www.nag.co.uk) ? Talk about asking for a lawsuit :) > naga I actually like it, but in its more English 'naja' form (http://www.tiscali.co.uk/reference/encyclopaedia/hutchinson/m0007527.html). It's also an excellent ice axe, which has to count for something :) The cobra certainly qualifies for a fast and mean snake, which rings nice to me with the python theme ("pythons are good, but slow killers: when you need the job done fast, get the naja out") Man, we need to hire one of those name marketing companies... We suck at this. Cheers, f From oliphant.travis at ieee.org Wed Jan 4 01:45:22 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 03 Jan 2006 23:45:22 -0700 Subject: [SciPy-dev] Leaning toward numpy Message-ID: <43BB6F02.3080704@ieee.org> Hi everyone, Well, the naming game has got to end soon. My semester starts soon and I've got to get a release out. I'm leaning toward numpy for several reasons: 1) It has history --- and therefore a web-domain name, and a sourceforge site. Googling for numpy already returns stuff about Python and numbers. 2) Currently both numarray and Numeric are under the sourceforge site, numpy could go there without difficulty and lead out as a merger of the two. 3) We've been using the concept of numpy for a long time, but never the namespace. I know it's not cool like naja would be, but it is descriptive and carries with it marketing power already. If you can come up with arguments as to why numpy is a *terrible* choice, please send them in *quickly*. 
Best, -Travis From Fernando.Perez at colorado.edu Wed Jan 4 01:48:55 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 03 Jan 2006 23:48:55 -0700 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB6F02.3080704@ieee.org> References: <43BB6F02.3080704@ieee.org> Message-ID: <43BB6FD7.8080203@colorado.edu> Travis Oliphant wrote: > Hi everyone, > > Well, the naming game has got to end soon. My semester starts soon and > I've got to get a release out. I'm leaning toward numpy for several > reasons: > > 1) It has history --- and therefore a web-domain name, and a sourceforge > site. Googling for numpy already returns stuff about Python and numbers. > 2) Currently both numarray and Numeric are under the sourceforge site, > numpy could go there without difficulty and lead out as a merger of the two. > 3) We've been using the concept of numpy for a long time, but never the > namespace. > > I know it's not cool like naja would be, but it is descriptive and > carries with it marketing power already. If you can come up with > arguments as to why numpy is a *terrible* choice, please send them in > *quickly*. +1 on numpy (naja was almost a joke: I suck at this, and I don't really like 'cute' names, as I said before). We'll live, even if it's not as nice as numerix. And any more of your time spent on this is time not spent coding: we don't want that :) Best, f From eric at enthought.com Wed Jan 4 01:59:21 2006 From: eric at enthought.com (eric jones) Date: Wed, 04 Jan 2006 00:59:21 -0600 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB6F02.3080704@ieee.org> References: <43BB6F02.3080704@ieee.org> Message-ID: <43BB7249.7030404@enthought.com> I'm for one of the following names: numpy really_really_fast_multi_dimensional_array_library_for_science_and_math The benefit of the second is that typing singular_value_decomposition() would seem charmingly short. eric multi_dimensional_array Travis Oliphant wrote: >Hi everyone, > >Well, the naming game has got to end soon. My semester starts soon and >I've got to get a release out. I'm leaning toward numpy for several >reasons: > >1) It has history --- and therefore a web-domain name, and a sourceforge >site. Googling for numpy already returns stuff about Python and numbers. >2) Currently both numarray and Numeric are under the sourceforge site, >numpy could go there without difficulty and lead out as a merger of the two. >3) We've been using the concept of numpy for a long time, but never the >namespace. > >I know it's not cool like naja would be, but it is descriptive and >carries with it marketing power already. If you can come up with >arguments as to why numpy is a *terrible* choice, please send them in >*quickly*. 
> >Best, > >-Travis > > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From Fernando.Perez at colorado.edu Wed Jan 4 02:00:11 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 00:00:11 -0700 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601022247.56236.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> <200601021935.10046.faltet@carabos.com> <43B975B9.5000407@colorado.edu> <200601022247.56236.faltet@carabos.com> Message-ID: <43BB727B.7040406@colorado.edu> Francesc Altet wrote: >>I've toyed with the idea of making the autocall flag be a 0,1,2 integer, >>where today's 'on' would become 2 for 'full', while 1 would be an >>intermediate mode, doing autocall ONLY if there are arguments: >> >>1: foo bar -> foo(bar) >> foo -> foo >> >>2: foo bar -> foo(bar) >> foo -> foo() >> >>But I'm out of time for more significant ipython work for now, I'm afraid >>(a lot of new features went in recently, but now I need to work on other >>things). >> >>Vote if you like it, though, and I'll consider it for the future. > > > That's an interesting proposal, indeed. > > +1 for the improved autocall intermediate level. Done, in SVN ('smart' mode -1- is the default now) Cheers, f From prabhu_r at users.sf.net Wed Jan 4 02:10:57 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Wed, 4 Jan 2006 12:40:57 +0530 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB6F02.3080704@ieee.org> References: <43BB6F02.3080704@ieee.org> Message-ID: <17339.29953.368457.132676@prpc.aero.iitb.ac.in> >>>>> "Travis" == Travis Oliphant writes: Travis> I know it's not cool like naja would be, but it is Travis> descriptive and carries with it marketing power already. Travis> If you can come up with arguments as to why numpy is a Travis> *terrible* choice, please send them in *quickly*. I say go for it. numpy is dead, long live numpy! :) cheers, prabhu From oliphant.travis at ieee.org Wed Jan 4 02:12:57 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Wed, 04 Jan 2006 00:12:57 -0700 Subject: [SciPy-dev] Some suggestions for scipy_core In-Reply-To: <200601040416.57705.faltet@carabos.com> References: <200601020104.01140.faltet@carabos.com> <200601021754.03309.faltet@carabos.com> <200601040416.57705.faltet@carabos.com> Message-ID: Francesc Altet wrote: > > > I've made some progress for comparing datatypes of type 'i' and 'l' > correctly. Right now, in 32-bit platforms, scipy_core gives: > > In [25]: a=scicore.arange(10,dtype='i') > > In [26]: b=scicore.arange(10,dtype='l') > > In [27]: a.dtypedescr == b.dtypedescr > Out[27]: False # !! > > With the attached patch (you should revise it, because I'm not used to > write C extensions in the wild way), it works as it should: Thanks for the patch. I'll add a compare function for the typedescriptors. There is already a C-API call for testing equivalence that we can use. I'll probably use casting conventions to decide on greater or less than if they are not the same... > > However, I cannot see the way to make the a.dtype == b.dtype > comparison to work well. I'm trying to figure out where the heck > should I implement the tp_compare function for this case, but I can't. > Some clues anyone? Well, perhaps this is not very important, but I > have been bitten enough by the 'i' vs 'l' dichotomy and I'd love that > scipy_core finally does the correct thing. 
We would have to create a meta-class and re-define the comparison function. This would also let us change how these type objects print. The a.dtype is a typeobject which means that it's an *instance* of the PyType_Type. We would create our own PyScalarType_Type, inherit from PyType_Type and then use this has the base-type of all the array scalars (instead of object). So, I'm not surprised you didn't know where to add the compare. I barely learned myself during the course of this work. -Travis > > BTW, it seems that support for scipy_core homogeneous arrays (both > numerical and strings) in PyTables is in place. Now, it remains > records and unicode strings. I'll keep you informed on my progress. > Fantastic. If you could help us develop a numcompat module that would be great... From gruben at bigpond.net.au Wed Jan 4 02:16:13 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Wed, 04 Jan 2006 18:16:13 +1100 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB6F02.3080704@ieee.org> References: <43BB6F02.3080704@ieee.org> Message-ID: <43BB763D.70305@bigpond.net.au> FWIW +1 on numpy. I'd also like to point out that Prabhu's snake word 'sarpa' could stand for scientific array package :-) Gary R. Travis Oliphant wrote: > Hi everyone, > > Well, the naming game has got to end soon. My semester starts soon and > I've got to get a release out. I'm leaning toward numpy for several > reasons: From jonathan.taylor at utoronto.ca Wed Jan 4 02:20:45 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Wed, 4 Jan 2006 02:20:45 -0500 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB763D.70305@bigpond.net.au> References: <43BB6F02.3080704@ieee.org> <43BB763D.70305@bigpond.net.au> Message-ID: <463e11f90601032320t6518a228v4aba43f1109d141b@mail.gmail.com> Absolutely... numpy makes by far the most sense. Vive numpy and scipy J. On 1/4/06, Gary Ruben wrote: > FWIW +1 on numpy. > > I'd also like to point out that Prabhu's snake word 'sarpa' could stand > for scientific array package :-) > > Gary R. > > Travis Oliphant wrote: > > Hi everyone, > > > > Well, the naming game has got to end soon. My semester starts soon and > > I've got to get a release out. I'm leaning toward numpy for several > > reasons: > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From arnd.baecker at web.de Wed Jan 4 02:24:56 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 4 Jan 2006 08:24:56 +0100 (CET) Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB7249.7030404@enthought.com> References: <43BB6F02.3080704@ieee.org> <43BB7249.7030404@enthought.com> Message-ID: Good morning all (at least from this side of the world ;-), This is indeed a great start - just 85 messages in my `comp` inbox... On Wed, 4 Jan 2006, eric jones wrote: > I'm for one of the following names: > > numpy > really_really_fast_multi_dimensional_array_library_for_science_and_math > > The benefit of the second is that typing singular_value_decomposition() > would seem charmingly short. OK, so we would just need to do import really_really_fast_multi_dimensional_array_library_for_science_and_math as rrfmdalfsam svd=rrfmdalfsam.singular_value_decomposition svd(...) looks fine to me! 
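A Python-level toy of the metaclass idea Travis describes above (names made up for illustration; the real scalar types are built in C, where the same effect means filling in the compare slot on a custom metatype):

class ScalarTypeMeta(type):
    # Sketch: the *type objects themselves* are instances of this metatype,
    # so comparing two scalar types dispatches here instead of falling back
    # to plain object identity.
    def __eq__(cls, other):
        if isinstance(other, ScalarTypeMeta):
            return (cls._kind, cls._itemsize) == (other._kind, other._itemsize)
        return NotImplemented
    def __ne__(cls, other):
        result = cls.__eq__(other)
        return result if result is NotImplemented else not result
    def __hash__(cls):
        return hash((cls._kind, cls._itemsize))

# Two distinct type objects that both describe a 32-bit signed integer:
int_like  = ScalarTypeMeta('int_like',  (object,), {'_kind': 'i', '_itemsize': 4})
long_like = ScalarTypeMeta('long_like', (object,), {'_kind': 'i', '_itemsize': 4})

print(int_like == long_like)    # True, even though int_like is not long_like
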
What about renaming to really_truly_fast_multi_dimensional_array_library_for_science_and_math Reasons: a bit shorter and allows the shorthand version rtfmdalfsam (umpf - any hidden trademark problems here with "alf"?), maybe with some more days of joint work we could get the second part better ... Now seriously to this one - I did not yet recieve the original message: > Travis Oliphant wrote: > > >Hi everyone, > > > >Well, the naming game has got to end soon. My semester starts soon and > >I've got to get a release out. I'm leaning toward numpy for several > >reasons: > > > >1) It has history --- and therefore a web-domain name, and a sourceforge > >site. Googling for numpy already returns stuff about Python and numbers. > >2) Currently both numarray and Numeric are under the sourceforge site, > >numpy could go there without difficulty and lead out as a merger of the two. > >3) We've been using the concept of numpy for a long time, but never the > >namespace. +1 numpy/scipy combo sounds good and still ndarray could go into main python at some point. Best, Arnd From cimrman3 at ntc.zcu.cz Wed Jan 4 03:45:30 2006 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 04 Jan 2006 09:45:30 +0100 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB6FD7.8080203@colorado.edu> References: <43BB6F02.3080704@ieee.org> <43BB6FD7.8080203@colorado.edu> Message-ID: <43BB8B2A.9000604@ntc.zcu.cz> Fernando Perez wrote: > Travis Oliphant wrote: > >>Hi everyone, >> >>Well, the naming game has got to end soon. My semester starts soon and >>I've got to get a release out. I'm leaning toward numpy for several >>reasons: >> >>1) It has history --- and therefore a web-domain name, and a sourceforge >>site. Googling for numpy already returns stuff about Python and numbers. >>2) Currently both numarray and Numeric are under the sourceforge site, >>numpy could go there without difficulty and lead out as a merger of the two. >>3) We've been using the concept of numpy for a long time, but never the >>namespace. >> >>I know it's not cool like naja would be, but it is descriptive and >>carries with it marketing power already. If you can come up with >>arguments as to why numpy is a *terrible* choice, please send them in >>*quickly*. > > > +1 on numpy (naja was almost a joke: I suck at this, and I don't really like > 'cute' names, as I said before). > > We'll live, even if it's not as nice as numerix. And any more of your time > spent on this is time not spent coding: we don't want that :) > > Best, +1 for numpy, and do not forget one can always use 'import numpy as numerix' :-) cheers, r. From arnd.baecker at web.de Wed Jan 4 04:08:06 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 4 Jan 2006 10:08:06 +0100 (CET) Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BB8B2A.9000604@ntc.zcu.cz> References: <43BB6F02.3080704@ieee.org> <43BB6FD7.8080203@colorado.edu> <43BB8B2A.9000604@ntc.zcu.cz> Message-ID: On Wed, 4 Jan 2006, Robert Cimrman wrote: > +1 for numpy, and do not forget one can always use 'import numpy as > numerix' :-) A colleague of mine just suggested numpie, scipie, f2pie - not sure if it is a good time to through in another bunch of names, but I am certain that these would be good names for the `numpy cookbook` and the `scipy cookbook`. Suggestions for an appendix with recipes for `apple pie` are welcome ... 
Couldn't-resist-ly, yours, Arnd From schofield at ftw.at Wed Jan 4 07:33:06 2006 From: schofield at ftw.at (Ed Schofield) Date: Wed, 4 Jan 2006 12:33:06 +0000 Subject: [SciPy-dev] Round 3 on naming In-Reply-To: References: Message-ID: Here are two suggestions for round 4: newmerix newmeric -- Ed On 04/01/2006, at 4:34 AM, Travis E. Oliphant wrote: > > I'm keeping running totals on the naming on the straw pole page. You > can either vote their or respond to this email: > > Round 3 names are (names surviving round 2 plus with positive votes > plus > a few new ones) > > nagini > scicore > numpy > numpyxx > arrakor > scikor > numeriq > scinum > > > Please vote on these: > > -Travis > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From vel.accel at gmail.com Wed Jan 4 07:45:41 2006 From: vel.accel at gmail.com (Deiter Hering) Date: Wed, 04 Jan 2006 07:45:41 -0500 Subject: [SciPy-dev] Round 3 on naming In-Reply-To: References: Message-ID: <43BBC375.30808@gmail.com> On 04/01/2006, at 4:34 AM, Travis E. Oliphant wrote: > > >>I'm keeping running totals on the naming on the straw pole page. You >>can either vote their or respond to this email: >> >>Round 3 names are (names surviving round 2 plus with positive votes >>plus >>a few new ones) >> >>nagini >>scicore >>numpy >>numpyxx >>arrakor >>scikor >>numeriq >>scinum >> >> >>Please vote on these: >> >>-Travis >> >>_______________________________________________ >>Scipy-dev mailing list >>Scipy-dev at scipy.net >>http://www.scipy.net/mailman/listinfo/scipy-dev >> >> > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > > The python community (aside from sig sci folks) are already familiar with numpy as being a "collective" reference to the numeric/numarray. Therefore when all roads lead to the "All New Numpy" It will be intuitively understood as the convergence and eventual of the deprecation of the prior packages. +1 Numpy -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at shrogers.com Wed Jan 4 08:27:37 2006 From: steve at shrogers.com (Steven H. Rogers) Date: Wed, 04 Jan 2006 06:27:37 -0700 Subject: [SciPy-dev] Round 3 on naming In-Reply-To: References: Message-ID: <43BBCD49.3060602@shrogers.com> Travis E. Oliphant wrote: > I'm keeping running totals on the naming on the straw pole page. You > can either vote their or respond to this email: > > Round 3 names are (names surviving round 2 plus with positive votes plus > a few new ones) > > nagini 0 > scicore 0 > numpy +1 > numpyxx -1 > arrakor 0 > scikor -1 > numeriq 0 > scinum +1 > From pearu at scipy.org Wed Jan 4 07:48:46 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 06:48:46 -0600 (CST) Subject: [SciPy-dev] Round 3 on naming In-Reply-To: References: Message-ID: > nagini -1 > scicore +1 > numpy +1 > numpyxx 0 > arrakor -1 > scikor -1 > numeriq 0 > scinum 0 Has anyone offered arrpy already? 
Pearu From faltet at carabos.com Wed Jan 4 09:20:31 2006 From: faltet at carabos.com (Francesc Altet) Date: Wed, 4 Jan 2006 15:20:31 +0100 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <17339.29953.368457.132676@prpc.aero.iitb.ac.in> References: <43BB6F02.3080704@ieee.org> <17339.29953.368457.132676@prpc.aero.iitb.ac.in> Message-ID: <200601041520.31786.faltet@carabos.com> A Dimecres 04 Gener 2006 08:10, Prabhu Ramachandran va escriure: > I say go for it. numpy is dead, long live numpy! :) +1 :-D -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From aisaac at american.edu Wed Jan 4 09:56:43 2006 From: aisaac at american.edu (Alan G Isaac) Date: Wed, 4 Jan 2006 09:56:43 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: <17339.23175.875763.180169@prpc.aero.iitb.ac.in> References: <17339.23175.875763.180169@prpc.aero.iitb.ac.in> Message-ID: On Wed, 4 Jan 2006, Prabhu Ramachandran apparently wrote: > 6. numpy +0.5 (OK, but problems with history and > thereby confusion for new users?) I suspect *new* users will - be no more confused than by a version change - be more likely to find numpy because they are more likely to have heard of it - are more likely to understand from the name what it is Some old Numeric users might be momentarily confused, but I doubt this is significant. A Google search on numpy suggests the current status of things ... which favors numpy IMO. Cheers, Alan Isaac From perry at stsci.edu Wed Jan 4 10:37:04 2006 From: perry at stsci.edu (Perry Greenfield) Date: Wed, 4 Jan 2006 10:37:04 -0500 Subject: [SciPy-dev] Please vote on the names In-Reply-To: References: Message-ID: <59685e5fbeb5b442d8ecf0ee031d073b@stsci.edu> On Jan 3, 2006, at 10:08 PM, Fonnesbeck, Chris wrote: > On 1/3/06 9:59 PM, "Andrew Straw" wrote: > >> Do names like "numerics" or "numeriks" open us up to cease-and-desist >> letters with anything more than empty-threat status from someone who >> owns a trademark on "numerix"? > > According to my wife (a pretty good lawyer), they are close enough as > to be > confusing for a potential consumer (recall Lindows), so they should be > avoided. > To belabor what is almost certainly a moot point now, I doubt that a generic term like numerics could be argued to be a trademark violation of a alternate spelling of that term. I even wonder if use of numerix is a problem here since confusion doesn't seem likely; one is the name of a company and the other the name of a free software package. There is obviously no attempt to leverage off of their market or awareness of their name. Despite that there certainly seems to be a risk in that no one wants to have to argue the point in court even if it seems baseless. Probably the company's biggest concern is that google might show our numerix over theirs at some point detracting from their visibility. Perry From jh at oobleck.astro.cornell.edu Wed Jan 4 10:43:05 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Wed, 4 Jan 2006 10:43:05 -0500 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: (scipy-dev-request@scipy.net) References: Message-ID: <200601041543.k04Fh5Yd015094@oobleck.astro.cornell.edu> +1 numpy, -googolplex (ok, -1) anything else! The only objection someone raised to numpy-ish names earlier was the inclusion of "-py" in the name, which other Python packages lack, perhaps deliberately. However, "scipy" already offends on this score, and nobody favored changing it in Round 1. Maybe we don't get double penalty for two crimes. 
I haven't heard any complaints about "scipy" over the past few years (well, not about the name, anyway). I like "numpy" for several reasons: 1. it has history/name recognition 2. it is short 3. it is descriptive 4. it is easy to remember 5. it is not "cute" (nobody will be embarrassed to type it in a lecture) 6. we will not have to explain the etymology to every new user (eg, nagini) 7. it is the hands-down winner of Round 3 voting on the web page 8. it is the only name we will agree on *today* Travis uttered chilling words: "My semester starts soon"! Please, let's bring this historic conversation to an end and go with numpy now (before historic becomes hysteric...). [You can all breathe a sigh of relief that I chose the better part of valor and deleted the "Grease" chorus I had typed here, with "Numpy's" substituted for "Grease is"... or maybe you can't...;-)] --jh-- From oliphant.travis at ieee.org Wed Jan 4 11:49:46 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 04 Jan 2006 09:49:46 -0700 Subject: [SciPy-dev] New name is *numpy* Message-ID: <43BBFCAA.50001@ieee.org> Thank you for all the great ideas regarding a new name. There were some very clever ones. In the end practicality beat out purity and numpy will be the new name. With all the history behind the name (yet no namespace usage) it seems the perfect candidate. Yes, there are reasons against it, but in the end it's just a name. I think Fernando said it best: "We'll get used to it..." There is a new name for the repository: http://svn.scipy.org/svn/numpy/trunk I did a dump and load operation so the new repository contains all the old history. I will delete the scipy_core repository as soon as the trac pages are set up. Note that the new repository still needs to do the name changes (these are happening on the branches/numpy branch so the trunk will be usable during the change). I will work very hard to make the name changes quickly, but it may take several hours to a full day. I think this will push our release day out a bit... If you run into permissions difficulties on the new svn repo, let me know. Everybody who had access should still have access... If you still need access, send me a username... If I know who you are, I'll set you up immediately :-) otherwise I may need some persuading :-) -Travis From swisher at enthought.com Wed Jan 4 12:19:02 2006 From: swisher at enthought.com (Janet M. Swisher) Date: Wed, 04 Jan 2006 11:19:02 -0600 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: References: Message-ID: <43BC0386.5090409@enthought.com> > What is wrong with numpy? The following are the objections that were raised against "numpy", before "numerix" was selected and then abandoned. I have no opinion, just reminding y'all of the previous discussion. From: Charles R Harris I think numpy has too strong a connection to numeric. From: Fernando Perez -1 on 'numpy': if at some point in the future this is considered a candidate for the stdlib, I don't think we want anything with a 'py' suffix in the package name (not a single module in the stdlib ends in 'py' today, except for 'copy'). I also don't want to have to explain to anybody "well, this isn't really the old numpy, it's the new numpy, which was written by the guy who was maintaining the old numpy, but it's a new code, and it also has features from numarray, so it's like the old numpy + numarray + better, but it's called numpy again. Easy, no?" Really, I don't. 
From: Francesc Altet -1 numpy (may sound confusing with existing Numeric) From: Ed Schofield I'm with Fernando on this, for all the same reasons. -- Janet Swisher --- Senior Technical Writer Enthought, Inc. http://www.enthought.com From jonathan.taylor at utoronto.ca Wed Jan 4 12:30:56 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Wed, 4 Jan 2006 12:30:56 -0500 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <43BC0386.5090409@enthought.com> References: <43BC0386.5090409@enthought.com> Message-ID: <463e11f90601040930l79e686b2n9d9c908610ac6281@mail.gmail.com> > I also don't want to have to explain to anybody "well, this isn't really the > old numpy, it's the new numpy, which was written by the guy who was > maintaining the old numpy, but it's a new code, and it also has features from > numarray, so it's like the old numpy + numarray + better, but it's called > numpy again. Easy, no?" Really, I don't. Why explain this to anyone? Simply put it on the web page. If it is really a source of confusion, just up the version number to numpy 2.x and say that as of 2.x, numpy incorporates all the features of Numeric and numarray. I sincerely believe numpy/scipy is really is the most consistent naming scheme and the most natural for people to look for, when they are first looking for numerical python software. Cheers. Jon. From Fernando.Perez at colorado.edu Wed Jan 4 13:10:44 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 11:10:44 -0700 Subject: [SciPy-dev] Leaning toward numpy In-Reply-To: <463e11f90601040930l79e686b2n9d9c908610ac6281@mail.gmail.com> References: <43BC0386.5090409@enthought.com> <463e11f90601040930l79e686b2n9d9c908610ac6281@mail.gmail.com> Message-ID: <43BC0FA4.8070602@colorado.edu> Jonathan Taylor wrote: >>I also don't want to have to explain to anybody "well, this isn't really the >>old numpy, it's the new numpy, which was written by the guy who was >>maintaining the old numpy, but it's a new code, and it also has features from >>numarray, so it's like the old numpy + numarray + better, but it's called >>numpy again. Easy, no?" Really, I don't. > > > Why explain this to anyone? Simply put it on the web page. If it is > really a source of confusion, just up the version number to numpy 2.x > and say that as of 2.x, numpy incorporates all the features of Numeric > and numarray. I sincerely believe numpy/scipy is really is the most > consistent naming scheme and the most natural for people to look for, > when they are first looking for numerical python software. Not to worry: numpy is set now, we're moving on. The above was my reason to -1 numpy *when numerix was in play*. Now that numerix is out, I'm happy that Travis went for numpy. We'll adapt and explain it in the transition period, and in a few months all possible confusion will be history. My other con for numpy was the 'no py endings for the stdlib'. But now we've agreed that the only part for (potential, maybe, future) inclusion in the stdlib would be the ndarray subpackage, so the 'py' suffix is not an issue for numpy. In the words of our very own Prabhu, 'numpy is dead, long live numpy!' 
Cheers, f From perry at stsci.edu Wed Jan 4 13:59:25 2006 From: perry at stsci.edu (Perry Greenfield) Date: Wed, 4 Jan 2006 13:59:25 -0500 Subject: [SciPy-dev] chararry array method In-Reply-To: <43B45C97.2090103@ee.byu.edu> References: <43B43957.6080509@stsci.edu> <43B45569.4040408@ee.byu.edu> <43B45918.7030902@stsci.edu> <43B45C97.2090103@ee.byu.edu> Message-ID: On Dec 29, 2005, at 5:00 PM, Travis Oliphant wrote: > > So, this is taking a buffer and chopping it into string bits. > Currently, the chararray array function does not take a buffer input. > Yes, this is common for us as we usually create these from tables obtained from files where some columns of the tables contain fixed width strings. It would be uncommon for the data buffer to contain only strings, but we generally need to create such arrays from data buffers. > I would suggest not using character arrays in pyfits just yet. They > are > not really necessary, because normal arrays can be of string type. If > you really need the functionality of the chararray (string methods or > equality testing), then create it after creating the normal array (no > data will be copied). I'd like to better understand use cases of > special string arrays a little better. I'm not sure I completely > understand why numarray split everything into different array types. > Much more is supported in the basic array type in scipy. > > -Travis I suppose this points to the fact that I'm not clear on what different roles the string array (and unicode) and character arrays play. In numarray it was thought that eventually that character arrays would support all the string methods (within reason considering the constraints of fixed size) and that made it different enough from numeric arrays. Is this detailed anywhere? I tried finding it in the latest version of the Guide but it seems that the topic of string arrays isn't discussed a lot. So a brief outline of how you see this working might help (e.g., should we really be working on enhancing the string array instead of focusing on character arrays?) Perry From oliphant at ee.byu.edu Wed Jan 4 14:19:08 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 12:19:08 -0700 Subject: [SciPy-dev] chararry array method In-Reply-To: References: <43B43957.6080509@stsci.edu> <43B45569.4040408@ee.byu.edu> <43B45918.7030902@stsci.edu> <43B45C97.2090103@ee.byu.edu> Message-ID: <43BC1FAC.8000305@ee.byu.edu> Perry Greenfield wrote: >On Dec 29, 2005, at 5:00 PM, Travis Oliphant wrote: > > >>So, this is taking a buffer and chopping it into string bits. >>Currently, the chararray array function does not take a buffer input. >> >> >> >Yes, this is common for us as we usually create these from tables >obtained >from files where some columns of the tables contain fixed width strings. >It would be uncommon for the data buffer to contain only strings, but we >generally need to create such arrays from data buffers. > > Well, the new chararray function actually does support this (it was easy enough to just do it). Right now, the chararray's are essentially string and/or unicode ndarray's with added methods for rich-comparisons, and the same methods as strings and unicode objects. It's also a nice example for how to do broadcasting in Python alone.... I would like to move the rich comparisions into the ndarray object at some point (either by having ufuncs supported for extended types or by special-casing the richcompare for string and unicode type ndarray's), so that any string or unicode type can use them... 
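A minimal sketch of the two layers under discussion, using only names that appear in this thread and assuming the array() constructor accepts the 'S' dtype spelling mentioned below:

    from numpy import array

    # A plain ndarray can already hold fixed-width strings ('S30') or unicode ('U45'):
    names = array(['foo', 'bar', 'baz'], dtype='S3')

    # Moving rich comparisons into ndarray would allow elementwise tests such as
    #     names == array(['foo', 'xxx', 'baz'], dtype='S3')   ->  [True, False, True]
    # which in this snapshot work only through the chararray subclass, the same
    # subclass that applies the string methods (upper(), strip(), ...) element by
    # element with broadcasting.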
>I suppose this points to the fact that I'm not clear on what different >roles the string array (and unicode) and character arrays play. In >numarray >it was thought that eventually that character arrays would support all >the >string methods (within reason considering the constraints of fixed >size) and >that made it different enough from numeric arrays. Is this detailed >anywhere? > > The string and unicode arrays are separate data-types for ndarray's. They are supported at a fundamental level throughout the code base. In other words you can have an ndarray of type (string, 30) (i.e. 'S30') or (unicode, 45) (i.e. 'U45'). However, because ufuncs do not support extended types at this time, and the richcompare for the ndarray defaults to use ufuncs, rich comparisons don't work on them. Now, it would be possible to make it so that the ndarray supported the string methods for string and unicode arrays, but it also makes sense to subclass for that kind of special support, which is what is done now. >I tried finding it in the latest version of the Guide but it seems that >the >topic of string arrays isn't discussed a lot. So a brief outline of how >you see this working might help (e.g., should we really be working on >enhancing the string array instead of focusing on character arrays?) > > > My thinking is that we should get at least the rich comparisons working for string/unicode arrays (whether this makes sense by expanding the ufuncs or simply special casing support for them in the richcomparison function is an immediate question). I can see how it would be possible (but not trivial) to do it in the ufuncs (which would make the ufunc interface more flexible -- but maybe too flexible... I'm not sure I know the use case beyond the comparisions). Whether or not we should look at over-riding the getattribute function to add string and unicode methods for all string/unicode chararrays is another question, but that could also be done... Then, again, it is an easy enough thing to wrap a string array in a subclass if you really want to call the string methods on all the items in the array... So, what is there now is workable and is essentially the same as what was in numarray (I think numarray string comparison functions are faster though --- they were compiled). -Travis From aisaac at american.edu Wed Jan 4 15:14:15 2006 From: aisaac at american.edu (Alan G Isaac) Date: Wed, 4 Jan 2006 15:14:15 -0500 Subject: [SciPy-dev] New name is *numpy* In-Reply-To: <43BBFCAA.50001@ieee.org> References: <43BBFCAA.50001@ieee.org> Message-ID: On Wed, 04 Jan 2006, Travis Oliphant apparently wrote: > In the end practicality beat out purity and numpy will be > the new name. Perhaps you should assert common law trademark rights in numpy. I think you do that just by referring to it as numpy (TM) but IANAL. fwiw, Alan Isaac From pearu at scipy.org Wed Jan 4 15:24:43 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 14:24:43 -0600 (CST) Subject: [SciPy-dev] New name is *numpy* In-Reply-To: <43BBFCAA.50001@ieee.org> References: <43BBFCAA.50001@ieee.org> Message-ID: Hi Travis, On Wed, 4 Jan 2006, Travis Oliphant wrote: > Note that the new repository still needs to do the name changes (these > are happening on the branches/numpy branch so the trunk will be usable > during the change). I will work very hard to make the name changes > quickly, but it may take several hours to a full day. I think this will > push our release day out a bit... How are you doing with the translation? 
I have almost finished script that converts codes to using 'numpy', if you are almost finished with converting scipy_core, I'll concentrate on converting "full scipy" to use numpy. Pearu From oliphant at ee.byu.edu Wed Jan 4 16:30:56 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 14:30:56 -0700 Subject: [SciPy-dev] New name is *numpy* In-Reply-To: References: <43BBFCAA.50001@ieee.org> Message-ID: <43BC3E90.8060900@ee.byu.edu> Pearu Peterson wrote: >Hi Travis, > >On Wed, 4 Jan 2006, Travis Oliphant wrote: > > > >>Note that the new repository still needs to do the name changes (these >>are happening on the branches/numpy branch so the trunk will be usable >>during the change). I will work very hard to make the name changes >>quickly, but it may take several hours to a full day. I think this will >>push our release day out a bit... >> >> > >How are you doing with the translation? I have almost finished script >that converts codes to using 'numpy', if you are almost finished with >converting scipy_core, I'll concentrate on converting "full scipy" to use >numpy. > > I'm almost done with numpy. I'm trying to get tests to pass, right now... Focus on full scipy. Especially, the __init__.py file. Should we move the Package Loader Class out of the numpy/__init__.py file or not? I've split out old scipy.base into numpy.core numpy.lib but both namespaces will be in numpy so from numpy import xxx, yyy should work as a replacement for from scipy.base import *** In numpy I could just search and replace scipy->numpy, but in full scipy this will be a bit trickier I think. Thanks, -Travis From pearu at scipy.org Wed Jan 4 15:40:41 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 14:40:41 -0600 (CST) Subject: [SciPy-dev] New name is *numpy* In-Reply-To: <43BC3E90.8060900@ee.byu.edu> References: <43BBFCAA.50001@ieee.org> <43BC3E90.8060900@ee.byu.edu> Message-ID: On Wed, 4 Jan 2006, Travis Oliphant wrote: > I'm almost done with numpy. I'm trying to get tests to pass, right now... Ok, let me know if you have commited your changes to svn, then I can relay on numpy when testing full scipy. > Focus on full scipy. Especially, the __init__.py file. Should we move > the Package Loader Class out of the numpy/__init__.py file or not? I think there is no need for using pkgload in numpy as there are only few subpackages in numpy and they are known. We needed pkgload only because the contents of installed scipy/ changed after installing full scipy but scipy/__init__.py came from scipy_core. So, yes, you can remove Package Loader Class out of the numpy/__init__.py and insert explicit imports of subpackages. > I've split out old scipy.base into > > numpy.core > numpy.lib > > but both namespaces will be in numpy > > so from numpy import xxx, yyy > > should work as a replacement for from scipy.base import *** > > In numpy I could just search and replace scipy->numpy, but in full > scipy this will be a bit trickier I think. Yes. Once numpy is stable and I have converted full scipy to use numpy so that all its tests will pass, I'll make a commit without creating intermediate branch. Is that ok with you? Pearu From oliphant at ee.byu.edu Wed Jan 4 17:39:37 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 15:39:37 -0700 Subject: [SciPy-dev] New numpy svn ready Message-ID: <43BC4EA9.9060500@ee.byu.edu> O.K. The new numpy svn is ready for testing. Check it out, and try it please... I'd like to make an official release of it soon... 
svn co http://svn.scipy.org/svn/numpy/trunk numpy cd numpy python setup.py install -Travis From Fernando.Perez at colorado.edu Wed Jan 4 17:46:00 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 15:46:00 -0700 Subject: [SciPy-dev] New name is *numpy* In-Reply-To: <43BC3E90.8060900@ee.byu.edu> References: <43BBFCAA.50001@ieee.org> <43BC3E90.8060900@ee.byu.edu> Message-ID: <43BC5028.2090406@colorado.edu> Travis Oliphant wrote: > I'm almost done with numpy. I'm trying to get tests to pass, right now... > > Focus on full scipy. Especially, the __init__.py file. Should we move > the Package Loader Class out of the numpy/__init__.py file or not? > > I've split out old scipy.base into > > numpy.core > numpy.lib > > but both namespaces will be in numpy > > so from numpy import xxx, yyy > > should work as a replacement for from scipy.base import *** > > In numpy I could just search and replace scipy->numpy, but in full > scipy this will be a bit trickier I think. One word of advice: a quick scan of the sources showed me several unqualified imports of the type import mtrand I think it would be a good idea to ban these altogether, in light of (as of yet unenforced) PEP 328: http://www.python.org/peps/pep-0328.html I made that change in ipython a while back (import genutils -> from IPython import genutils) to future-proof myself and avoid possible clashes with the stdlib at some point. Cheers, f From pearu at scipy.org Wed Jan 4 16:46:51 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 15:46:51 -0600 (CST) Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <43BC4EA9.9060500@ee.byu.edu> References: <43BC4EA9.9060500@ee.byu.edu> Message-ID: On Wed, 4 Jan 2006, Travis Oliphant wrote: > > O.K. > > The new numpy svn is ready for testing. Check it out, and try it > please... I'd like to make an official release of it soon... > > svn co http://svn.scipy.org/svn/numpy/trunk numpy Do yoy mean svn co http://svn.scipy.org/svn/numpy/branches/numpy numpy ? trunk contains scipy/. Pearu From chris at trichech.us Wed Jan 4 17:48:54 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Wed, 4 Jan 2006 17:48:54 -0500 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <43BC4EA9.9060500@ee.byu.edu> References: <43BC4EA9.9060500@ee.byu.edu> Message-ID: <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> On Jan 4, 2006, at 5:39 PM, Travis Oliphant wrote: > The new numpy svn is ready for testing. Perhaps related to my error reported the other day, but numpy also breaks PyMC via linalg importing: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site- packages/PyMC/MCMC.py 27 28 # Import scipy functions ---> 29 from scipy import random, linalg global scipy = undefined random = None linalg = undefined 30 # Generalized inverse 31 inverse = linalg.pinv ImportError: cannot import name linalg -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... 
Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From pearu at scipy.org Wed Jan 4 16:52:17 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 15:52:17 -0600 (CST) Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> References: <43BC4EA9.9060500@ee.byu.edu> <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> Message-ID: On Wed, 4 Jan 2006, Christopher Fonnesbeck wrote: > On Jan 4, 2006, at 5:39 PM, Travis Oliphant wrote: > >> The new numpy svn is ready for testing. > > Perhaps related to my error reported the other day, but numpy also breaks > PyMC via linalg importing: > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/PyMC/MCMC.py > 27 > 28 # Import scipy functions > ---> 29 from scipy import random, linalg > global scipy = undefined > random = None > linalg = undefined > 30 # Generalized inverse > 31 inverse = linalg.pinv > > ImportError: cannot import name linalg Full scipy does not work with numpy yet. We are working on it.. Pearu From chris at trichech.us Wed Jan 4 17:53:19 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Wed, 4 Jan 2006 17:53:19 -0500 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: References: <43BC4EA9.9060500@ee.byu.edu> Message-ID: <75C5C58E-03E8-4495-B823-3CF79818381C@trichech.us> On Jan 4, 2006, at 4:46 PM, Pearu Peterson wrote: > Do yoy mean > > svn co http://svn.scipy.org/svn/numpy/branches/numpy numpy > > ? trunk contains scipy/. trunk worked for me. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From pearu at scipy.org Wed Jan 4 16:55:31 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 15:55:31 -0600 (CST) Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <75C5C58E-03E8-4495-B823-3CF79818381C@trichech.us> References: <43BC4EA9.9060500@ee.byu.edu> <75C5C58E-03E8-4495-B823-3CF79818381C@trichech.us> Message-ID: On Wed, 4 Jan 2006, Christopher Fonnesbeck wrote: > On Jan 4, 2006, at 4:46 PM, Pearu Peterson wrote: > >> Do yoy mean >> >> svn co http://svn.scipy.org/svn/numpy/branches/numpy numpy >> >> ? trunk contains scipy/. > > trunk worked for me. As far as I can tell, trunk contains old scipy core, not numpy. 
Pearu From chris at trichech.us Wed Jan 4 17:56:23 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Wed, 4 Jan 2006 17:56:23 -0500 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: References: <43BC4EA9.9060500@ee.byu.edu> <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> Message-ID: <1D74ECCA-D066-446D-ABA7-74EF87CB3F0C@trichech.us> On Jan 4, 2006, at 4:52 PM, Pearu Peterson wrote: >> >> Perhaps related to my error reported the other day, but numpy also >> breaks >> PyMC via linalg importing: >> >> /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/ >> site-packages/PyMC/MCMC.py >> 27 >> 28 # Import scipy functions >> ---> 29 from scipy import random, linalg >> global scipy = undefined >> random = None >> linalg = undefined >> 30 # Generalized inverse >> 31 inverse = linalg.pinv >> >> ImportError: cannot import name linalg > > Full scipy does not work with numpy yet. We are working on it.. numpy downloaded according to Travis' svn instructions creates "numpy" directory, but installs into "scipy" on site-packages. Replacing my error-generating import above with: from scipy corelinalg as linalg makes my code run perfectly. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From chris at trichech.us Wed Jan 4 17:59:56 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Wed, 4 Jan 2006 17:59:56 -0500 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: References: <43BC4EA9.9060500@ee.byu.edu> <75C5C58E-03E8-4495-B823-3CF79818381C@trichech.us> Message-ID: <7560E74E-0D61-4D38-A29C-FFAAF414E887@trichech.us> On Jan 4, 2006, at 4:55 PM, Pearu Peterson wrote: >> On Jan 4, 2006, at 4:46 PM, Pearu Peterson wrote: >> >>> Do yoy mean >>> >>> svn co http://svn.scipy.org/svn/numpy/branches/numpy numpy >>> >>> ? trunk contains scipy/. >> >> trunk worked for me. > > As far as I can tell, trunk contains old scipy core, not numpy. Ok, well it downloaded something called numpy, but you are right, it appears to be scipy_core. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From chris at trichech.us Wed Jan 4 18:05:17 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Wed, 4 Jan 2006 18:05:17 -0500 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <43BC4EA9.9060500@ee.byu.edu> References: <43BC4EA9.9060500@ee.byu.edu> Message-ID: <10130DD3-D71E-4059-A681-7072C6E6C4AF@trichech.us> On Jan 4, 2006, at 5:39 PM, Travis Oliphant wrote: > The new numpy svn is ready for testing. A build from numpy in branch builds and installs fine, then causes the following when trying to install PyMC: Traceback (most recent call last): File "setup.py", line 18, in ? 
ext_modules = [flib] File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/distutils/core.py", line 93, in setup return old_setup(**new_attr) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/distutils/core.py", line 149, in setup dist.run_commands() File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/distutils/dist.py", line 946, in run_commands self.run_command(cmd) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/distutils/command/build.py", line 112, in run self.run_command(cmd_name) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/distutils/cmd.py", line 333, in run_command self.distribution.run_command(command) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/distutils/command/build_src.py", line 86, in run self.build_sources() File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/distutils/command/build_src.py", line 99, in build_sources self.build_extension_sources(ext) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/distutils/command/build_src.py", line 149, in build_extension_sources sources = self.f2py_sources(sources, ext) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/distutils/command/build_src.py", line 352, in f2py_sources import numpy.f2py as f2py2e File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/f2py/__init__.py", line 10, in ? import f2py2e File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/f2py/f2py2e.py", line 43, in ? __usage__ = """\ NameError: name 'numpy_core_version' is not defined C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From Fernando.Perez at colorado.edu Wed Jan 4 18:10:20 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 16:10:20 -0700 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <10130DD3-D71E-4059-A681-7072C6E6C4AF@trichech.us> References: <43BC4EA9.9060500@ee.byu.edu> <10130DD3-D71E-4059-A681-7072C6E6C4AF@trichech.us> Message-ID: <43BC55DC.2080407@colorado.edu> Christopher Fonnesbeck wrote: > On Jan 4, 2006, at 5:39 PM, Travis Oliphant wrote: > python2.4/site-packages/numpy/f2py/f2py2e.py", line 43, in ? > __usage__ = """\ > NameError: name 'numpy_core_version' is not defined I imagine that, since we now have numpy/scipy as separate package, we'll be able to revert to standard python naming conventions: scipy.__core_version__ -> numpy.__version__ scipy.__full_version__ -> scipy.__version__ no? A number of python tools (such as pydoc) use __version__, so it's nice to define them in the standard name form. 
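A small illustration of the convention being argued for here, i.e. what the attributes would become rather than what the snapshot exposed at this point:

    import numpy
    print(numpy.__version__)   # e.g. '0.9.2.1797', the revision-tagged string reported later in the thread

    import scipy               # assuming the full package is installed as well
    print(scipy.__version__)

Tools such as pydoc then have a single standard attribute to look up for either package.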
f From oliphant at ee.byu.edu Wed Jan 4 17:57:25 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 15:57:25 -0700 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> References: <43BC4EA9.9060500@ee.byu.edu> <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> Message-ID: <43BC52D5.6040104@ee.byu.edu> Christopher Fonnesbeck wrote: > On Jan 4, 2006, at 5:39 PM, Travis Oliphant wrote: > >> The new numpy svn is ready for testing. > > > Perhaps related to my error reported the other day, but numpy also > breaks PyMC via linalg importing: Full scipy is not ready yet... To use the new numpy, you must convert scipy.base --> numpy and scipy --> numpy corefft --> numpy.dft corelinalg --> numpy.linalg and so forth. Also, my previous announcement was a tiny bit premature as I had not merged the branch into the trunk yet. But, that should be done in about 5 minutes from the time this was sent (probably by the time you read this...) -Travis From oliphant at ee.byu.edu Wed Jan 4 18:21:28 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 16:21:28 -0700 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <10130DD3-D71E-4059-A681-7072C6E6C4AF@trichech.us> References: <43BC4EA9.9060500@ee.byu.edu> <10130DD3-D71E-4059-A681-7072C6E6C4AF@trichech.us> Message-ID: <43BC5878.7080505@ee.byu.edu> Christopher Fonnesbeck wrote: > On Jan 4, 2006, at 5:39 PM, Travis Oliphant wrote: > >> The new numpy svn is ready for testing. > > > A build from numpy in branch builds and installs fine, then causes > the following when trying to install PyMC: > > Traceback (most recent call last): > File "setup.py", line 18, in ? > ext_modules = [flib] > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/distutils/core.py", line 93, in setup > return old_setup(**new_attr) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/distutils/core.py", line 149, in setup > dist.run_commands() > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/distutils/dist.py", line 946, in run_commands > self.run_command(cmd) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/distutils/dist.py", line 966, in run_command > cmd_obj.run() > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/distutils/command/build.py", line 112, in run > self.run_command(cmd_name) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/distutils/cmd.py", line 333, in run_command > self.distribution.run_command(command) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/distutils/dist.py", line 966, in run_command > cmd_obj.run() > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/distutils/command/build_src.py", line > 86, in run > self.build_sources() > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/distutils/command/build_src.py", line > 99, in build_sources > self.build_extension_sources(ext) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/distutils/command/build_src.py", line > 149, in build_extension_sources > sources = self.f2py_sources(sources, ext) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/distutils/command/build_src.py", line > 352, in f2py_sources > import numpy.f2py as f2py2e > File 
"/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/f2py/__init__.py", line 10, in ? > import f2py2e > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/f2py/f2py2e.py", line 43, in ? > __usage__ = """\ > NameError: name 'numpy_core_version' is not defined > Should be fixed now... Thank you. Simply remove core from any core_version you find... I missed that one... -Travis From pearu at scipy.org Wed Jan 4 17:29:45 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 16:29:45 -0600 (CST) Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <43BC52D5.6040104@ee.byu.edu> References: <43BC4EA9.9060500@ee.byu.edu> <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> <43BC52D5.6040104@ee.byu.edu> Message-ID: On Wed, 4 Jan 2006, Travis Oliphant wrote: > Full scipy is not ready yet... I have applied my script to full scipy to use numpy. It builds now after few fixes in numpy and things are looking good. I'd like to commit my changes to scipy svn soon but this may break scipy - some scipy imports need to be checked manually. Pearu From oliphant at ee.byu.edu Wed Jan 4 18:32:45 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 16:32:45 -0700 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: References: <43BC4EA9.9060500@ee.byu.edu> <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> <43BC52D5.6040104@ee.byu.edu> Message-ID: <43BC5B1D.9080907@ee.byu.edu> Pearu Peterson wrote: >On Wed, 4 Jan 2006, Travis Oliphant wrote: > > > >>Full scipy is not ready yet... >> >> > >I have applied my script to full scipy to use numpy. It builds now after >few fixes in numpy and things are looking good. I'd like to commit my >changes to scipy svn soon but this may break scipy - some scipy imports >need to be checked manually. > > Do it. People can always get an earlier revision if they need to.. Also. I seem to have broken the svn revison number magic for numpy. I liked having the svn revision number after the version name. Could you look into that? -Travis From oliphant at ee.byu.edu Wed Jan 4 17:54:26 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 15:54:26 -0700 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: References: <43BC4EA9.9060500@ee.byu.edu> Message-ID: <43BC5222.5060903@ee.byu.edu> Pearu Peterson wrote: >On Wed, 4 Jan 2006, Travis Oliphant wrote: > > > >>O.K. >> >>The new numpy svn is ready for testing. Check it out, and try it >>please... I'd like to make an official release of it soon... >> >>svn co http://svn.scipy.org/svn/numpy/trunk numpy >> >> > >Do yoy mean > >svn co http://svn.scipy.org/svn/numpy/branches/numpy numpy > >? trunk contains scipy/. > > Yes. I'm merging now. Give it a couple of minutes... -Travis From loredo at astro.cornell.edu Wed Jan 4 18:40:32 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Wed, 4 Jan 2006 18:40:32 -0500 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: References: Message-ID: <1136418032.43bc5cf0ae2eb@astrosun2.astro.cornell.edu> I installed it per Travis's instructions on FC3, Python 2.4.2. I'm unclear from the last batch of emails exactly what this installs where, but FWIW: >>> import numpy Failed to import fftpack No module named fftpack Failed to import signal No module named fftpack This import problem was fixed in a recent scipy_core svn checkout I've been using, so even if this is scipy_core, something is broken that wasn't broken a week ago. 
>>> numpy.test() Found 2 tests for numpy.core.umath Found 19 tests for numpy.core.ma Found 6 tests for numpy.core.records Found 3 tests for numpy.distutils.misc_util Found 3 tests for numpy.lib.getlimits Found 9 tests for numpy.lib.twodim_base Found 44 tests for numpy.lib.shape_base Found 4 tests for numpy.lib.index_tricks Found 42 tests for numpy.lib.type_check Found 3 tests for numpy.dft.helper Found 33 tests for numpy.lib.function_base Found 0 tests for __main__ ......................................................................................................................................................................... ---------------------------------------------------------------------- Ran 170 tests in 0.536s OK I won't try it on my OS X machine until the SVN location is cleared up. -Tom ------------------------------------------------- This mail sent through IMP: http://horde.org/imp/ From pearu at scipy.org Wed Jan 4 17:42:36 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 16:42:36 -0600 (CST) Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <43BC5B1D.9080907@ee.byu.edu> References: <43BC4EA9.9060500@ee.byu.edu> <067EF115-C779-40D8-B9C7-265CC0BDA5B1@trichech.us> <43BC5B1D.9080907@ee.byu.edu> Message-ID: On Wed, 4 Jan 2006, Travis Oliphant wrote: > Do it. People can always get an earlier revision if they need to.. I'll do it within 5-8 minutes after scipy build finishes here... > Also. I seem to have broken the svn revison number magic for numpy. > > I liked having the svn revision number after the version name. > > Could you look into that? Yes, it's fixed in svn now. Pearu From oliphant at ee.byu.edu Wed Jan 4 18:07:52 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 16:07:52 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready Message-ID: <43BC5548.5080604@ee.byu.edu> My previous post was pre-mature as the fixed numpy was only in the numpy branch. I have now finished *merging* the changes from the numpy branch into the trunk so svn co http://svn.scipy.org/svn/numpy/trunk numpy should not only check out something that compiles and installs *but also* intalls into the numpy namespace so that running in python >>> import numpy >>> numpy.test(1,1) should work. -Travis From chris at trichech.us Wed Jan 4 18:55:24 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Wed, 4 Jan 2006 18:55:24 -0500 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC5548.5080604@ee.byu.edu> References: <43BC5548.5080604@ee.byu.edu> Message-ID: On Jan 4, 2006, at 6:07 PM, Travis Oliphant wrote: > so that running in python > >>>> import numpy >>>> numpy.test(1,1) > > should work. No errors on OSX 10.4 running Python 2.4.2: In [1]: import numpy In [2]: numpy.test(1,1) Found 3 tests for numpy.distutils.misc_util Found 2 tests for numpy.core.umath Found 42 tests for numpy.lib.type_check Found 9 tests for numpy.lib.twodim_base Found 3 tests for numpy.dft.helper Found 19 tests for numpy.core.ma Found 3 tests for numpy.lib.getlimits Found 33 tests for numpy.lib.function_base Found 6 tests for numpy.core.records Found 4 tests for numpy.lib.index_tricks Found 44 tests for numpy.lib.shape_base Found 0 tests for __main__ ........................................................................ ........................................................................ .......................... 
---------------------------------------------------------------------- Ran 170 tests in 2.407s OK Also, PyMC's unit tests run error-free. Want that easy?! C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From pearu at scipy.org Wed Jan 4 18:00:16 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Jan 2006 17:00:16 -0600 (CST) Subject: [SciPy-dev] scipy from svn uses numpy Message-ID: Hi, Recent full scipy uses now numpy, the corresponding patches are in svn. There are several scipy tests failing but they can be easily fixed. It's rather late here, I am going to bed now.. Regards, Pearu From loredo at astro.cornell.edu Wed Jan 4 19:00:02 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Wed, 4 Jan 2006 19:00:02 -0500 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: References: Message-ID: <1136419202.43bc618210223@astrosun2.astro.cornell.edu> Okay, the import problems I just reported were for revision 1794 from trunk. I then installed revision 1797 from branches, with the same results. I have now just installed revision 1797 from trunk; again, the import problem from mid-December is back: >>> import numpy Failed to import fftpack No module named fftpack Failed to import signal No module named fftpack >>> numpy.test(1,1) Found 2 tests for numpy.core.umath Found 19 tests for numpy.core.ma Found 6 tests for numpy.core.records Found 3 tests for numpy.distutils.misc_util Found 3 tests for numpy.lib.getlimits Found 9 tests for numpy.lib.twodim_base Found 44 tests for numpy.lib.shape_base Found 4 tests for numpy.lib.index_tricks Found 42 tests for numpy.lib.type_check Found 3 tests for numpy.dft.helper Found 33 tests for numpy.lib.function_base Found 0 tests for __main__ ......................................................................................................................................................................... ---------------------------------------------------------------------- Ran 170 tests in 0.497s >>> numpy.version.version '0.9.2.1797' Do I perhaps need to remove something from my previous installs first? -Tom ------------------------------------------------- This mail sent through IMP: http://horde.org/imp/ From oliphant at ee.byu.edu Wed Jan 4 19:38:59 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 17:38:59 -0700 Subject: [SciPy-dev] New numpy svn ready In-Reply-To: <1136419202.43bc618210223@astrosun2.astro.cornell.edu> References: <1136419202.43bc618210223@astrosun2.astro.cornell.edu> Message-ID: <43BC6AA3.8080602@ee.byu.edu> Tom Loredo wrote: >Okay, the import problems I just reported were for revision 1794 from trunk. >I then installed revision 1797 from branches, with the same results. >I have now just installed revision 1797 from trunk; again, the import >problem from mid-December is back: > > > >>>>import numpy >>>> >>>> >Failed to import fftpack >No module named fftpack >Failed to import signal >No module named fftpack > > Curious... Make sure everything is deleted (scipy and numpy) and start again.... 
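A hypothetical helper for the delete-and-start-again step (Python 2.3/2.4 era); removing the two directories from site-packages by hand works just as well, and the paths deserve a check before anything is deleted:

    import os, shutil
    from distutils.sysconfig import get_python_lib

    # Remove stale scipy/numpy trees from site-packages before rebuilding.
    for name in ('scipy', 'numpy'):
        path = os.path.join(get_python_lib(), name)
        if os.path.isdir(path):
            shutil.rmtree(path)    # stale install left over from an earlier build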
>>>>numpy.test(1,1) >>>> >>>> > Found 2 tests for numpy.core.umath > Found 19 tests for numpy.core.ma > Found 6 tests for numpy.core.records > Found 3 tests for numpy.distutils.misc_util > Found 3 tests for numpy.lib.getlimits > Found 9 tests for numpy.lib.twodim_base > Found 44 tests for numpy.lib.shape_base > Found 4 tests for numpy.lib.index_tricks > Found 42 tests for numpy.lib.type_check > Found 3 tests for numpy.dft.helper > Found 33 tests for numpy.lib.function_base > Found 0 tests for __main__ >......................................................................................................................................................................... >---------------------------------------------------------------------- >Ran 170 tests in 0.497s > > > >>>>numpy.version.version >>>> >>>> >'0.9.2.1797' > > also numpy.__version__ should give you the same thing.... >Do I perhaps need to remove something from my previous installs first? > > Very likely. -Travis From oliphant at ee.byu.edu Wed Jan 4 19:39:41 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 04 Jan 2006 17:39:41 -0700 Subject: [SciPy-dev] scipy from svn uses numpy In-Reply-To: References: Message-ID: <43BC6ACD.7060008@ee.byu.edu> Pearu Peterson wrote: >Hi, > >Recent full scipy uses now numpy, the corresponding patches are in svn. >There are several scipy tests failing but they can be easily fixed. >It's rather late here, I am going to bed now.. > > > Amazing job Pearu. You are a hero.... I will look at the failing tests and try to fix them... -Travis From ivazquez at ivazquez.net Wed Jan 4 19:47:34 2006 From: ivazquez at ivazquez.net (Ignacio Vazquez-Abrams) Date: Wed, 04 Jan 2006 19:47:34 -0500 Subject: [SciPy-dev] numpy vs. numpy (was: Now new svn of numpy is ready) In-Reply-To: <43BC5548.5080604@ee.byu.edu> References: <43BC5548.5080604@ee.byu.edu> Message-ID: <1136422054.26954.2.camel@ignacio.lan> On Wed, 2006-01-04 at 16:07 -0700, Travis Oliphant wrote: > I have now finished *merging* the changes from the numpy > branch into the trunk so > > svn co http://svn.scipy.org/svn/numpy/trunk numpy > > should not only check out something that compiles and installs *but > also* intalls into the numpy namespace > > so that running in python > > >>> import numpy > >>> numpy.test(1,1) > > should work. Okay, call me an idiot but how does this "interact" with python-numeric? Is there somewhere where I can see the differences? I'd like to package The Package FKA scipy but the core of the distro I'm working with already has python-numeric and I want to avoid stepping on toes if possible. -- Ignacio Vazquez-Abrams http://fedora.ivazquez.net/ gpg --keyserver hkp://subkeys.pgp.net --recv-key 38028b72 -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part URL: From Fernando.Perez at colorado.edu Wed Jan 4 21:03:59 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 19:03:59 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC5548.5080604@ee.byu.edu> References: <43BC5548.5080604@ee.byu.edu> Message-ID: <43BC7E8F.7070309@colorado.edu> Travis Oliphant wrote: > My previous post was pre-mature as the fixed numpy was only in the numpy > branch. 
I have now finished *merging* the changes from the numpy > branch into the trunk so > > svn co http://svn.scipy.org/svn/numpy/trunk numpy > > should not only check out something that compiles and installs *but > also* intalls into the numpy namespace Am I doing something stupid? The build fails with: creating build/temp.linux-i686-2.3/numpy/random/mtrand compile options: '-Inumpy/core/include -Ibuild/src/numpy/core -Inumpy/core/src -Inumpy/lib/../core/include -I/usr/include/python2.3 -c' gcc: numpy/random/mtrand/randomkit.c gcc: numpy/random/mtrand/initarray.c gcc: mtrand/mtrand.c gcc: mtrand/mtrand.c: No such file or directory And indeed: abdul[mtrand]> svup At revision 1810. abdul[mtrand]> d /usr/local/installers/src/scipy/numpy/numpy/random/mtrand total 108 -rw-r--r-- 1 fperez 19567 Jan 4 16:44 distributions.c -rw-r--r-- 1 fperez 7261 Jan 4 16:44 distributions.h -rw-r--r-- 1 fperez 1411 Jan 4 16:44 generate_mtrand_c.py -rw-r--r-- 1 fperez 4803 Jan 4 16:44 initarray.c -rw-r--r-- 1 fperez 135 Jan 4 16:44 initarray.h -rw-r--r-- 1 fperez 33061 Jan 4 18:54 mtrand.pyx -rw-r--r-- 1 fperez 1339 Jan 4 16:44 numpy.pxi -rw-r--r-- 1 fperez 599 Jan 4 16:44 Python.pxi -rw-r--r-- 1 fperez 8950 Jan 4 16:44 randomkit.c -rw-r--r-- 1 fperez 5342 Jan 4 16:44 randomkit.h No mtrand.c in sight, though there's an mtrand.pyx. I suspect you forgot to include the generated C file and only committed the pyrex sources. Cheers, f From oliphant.travis at ieee.org Wed Jan 4 22:24:42 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 04 Jan 2006 20:24:42 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC7E8F.7070309@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> Message-ID: <43BC917A.7080701@ieee.org> Fernando Perez wrote: >Travis Oliphant wrote: > > >>My previous post was pre-mature as the fixed numpy was only in the numpy >>branch. I have now finished *merging* the changes from the numpy >>branch into the trunk so >> >>svn co http://svn.scipy.org/svn/numpy/trunk numpy >> >>should not only check out something that compiles and installs *but >>also* intalls into the numpy namespace >> >> > >Am I doing something stupid? The build fails with: > >creating build/temp.linux-i686-2.3/numpy/random/mtrand >compile options: '-Inumpy/core/include -Ibuild/src/numpy/core -Inumpy/core/src >-Inumpy/lib/../core/include -I/usr/include/python2.3 -c' >gcc: numpy/random/mtrand/randomkit.c >gcc: numpy/random/mtrand/initarray.c >gcc: mtrand/mtrand.c >gcc: mtrand/mtrand.c: No such file or directory > > Something is strange with the svn repo. I was having trouble committing that changed file. Supposedly it's there now. You should try... -Travis From Fernando.Perez at colorado.edu Wed Jan 4 22:30:06 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 20:30:06 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC917A.7080701@ieee.org> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> Message-ID: <43BC92BE.6050302@colorado.edu> Travis Oliphant wrote: > Something is strange with the svn repo. I was having trouble committing > that changed file. Supposedly it's there now. You should try... Better, much better: In [5]: numpy.test() ................................................................................................................................................................................ 
---------------------------------------------------------------------- Ran 176 tests in 0.683s OK Found 2 tests for numpy.core.umath Found 19 tests for numpy.core.ma Found 6 tests for numpy.core.records Found 3 tests for numpy.distutils.misc_util Found 3 tests for numpy.lib.getlimits Found 9 tests for numpy.lib.twodim_base Found 44 tests for numpy.lib.shape_base Found 4 tests for numpy.lib.index_tricks Found 42 tests for numpy.lib.type_check Found 3 tests for numpy.dft.helper Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 0 tests for __main__ Out[5]: On to testing scipy... (and happy from not having to say 'scipy full' :) Cheers, f From Fernando.Perez at colorado.edu Wed Jan 4 22:34:07 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 20:34:07 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC92BE.6050302@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> Message-ID: <43BC93AF.7070703@colorado.edu> Fernando Perez wrote: > On to testing scipy... (and happy from not having to say 'scipy full' :) Which currently doesn't build: [...] copying Lib/weave/base_info.py -> build/lib.linux-i686-2.3/scipy/weave running build_clib customize UnixCCompiler customize UnixCCompiler using build_clib customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_clib building 'dfftpack' library compiling Fortran sources Traceback (most recent call last): File "setup.py", line 42, in ? setup_package() File "setup.py", line 35, in setup_package setup( **config.todict() ) File "/home/fperez/usr/local/lib/python2.3/site-packages/numpy/distutils/core.py", line 93, in setup return old_setup(**new_attr) File "/usr/lib/python2.3/distutils/core.py", line 149, in setup dist.run_commands() File "/usr/lib/python2.3/distutils/dist.py", line 907, in run_commands self.run_command(cmd) File "/usr/lib/python2.3/distutils/dist.py", line 927, in run_command cmd_obj.run() File "/usr/lib/python2.3/distutils/command/build.py", line 107, in run self.run_command(cmd_name) File "/usr/lib/python2.3/distutils/cmd.py", line 333, in run_command self.distribution.run_command(command) File "/usr/lib/python2.3/distutils/dist.py", line 927, in run_command cmd_obj.run() File "/home/fperez/usr/local/lib/python2.3/site-packages/numpy/distutils/command/build_clib.py", line 88, in run self.build_libraries(self.libraries) File "/home/fperez/usr/local/lib/python2.3/site-packages/numpy/distutils/command/build_clib.py", line 171, in build_libraries extra_postargs=[]) File "/usr/lib/python2.3/site-packages/scipy_distutils/ccompiler.py", line 81, in CCompiler_compile ccomp = self.compiler_so AttributeError: GnuFCompiler instance has no attribute 'compiler_so' removed Lib/__svn_version__.py removed Lib/__svn_version__.pyc ========================= Should we expect this to work, or should we wait for a while while the repo gets reorganized? I'm willing to do testing and report back, but there's no point if you are in the middle of things and this becomes just noise. Just let me know... 
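A quick way to see which distutils implementation a failing build is actually importing, since a stale install elsewhere on sys.path can shadow the new one; that turns out to be exactly the cause a few messages further on:

    import numpy.distutils
    print(numpy.distutils.__file__)      # should point inside site-packages/numpy/

    try:
        import scipy_distutils           # old scipy_core-era package; ideally absent
        print(scipy_distutils.__file__)  # if this prints, an old install is still being picked up
    except ImportError:
        pass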
Best, f From chris at trichech.us Wed Jan 4 23:02:55 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Wed, 4 Jan 2006 23:02:55 -0500 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC93AF.7070703@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> Message-ID: On Jan 4, 2006, at 10:34 PM, Fernando Perez wrote: > Fernando Perez wrote: > >> On to testing scipy... (and happy from not having to say 'scipy >> full' :) > > Which currently doesn't build: Builds fine on OSX, but importing gives: In [1]: import scipy import misc -> failed: No module named Image import fftpack -> failed: No module named corefft.helper import stats -> failed: No module named umath C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From oliphant.travis at ieee.org Thu Jan 5 00:36:40 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 04 Jan 2006 22:36:40 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC93AF.7070703@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> Message-ID: <43BCB068.2020807@ieee.org> Fernando Perez wrote: >Fernando Perez wrote: > > > >>On to testing scipy... (and happy from not having to say 'scipy full' :) >> >> > >Which currently doesn't build: > >[...] > >Should we expect this to work, or should we wait for a while while the repo >gets reorganized? I'm willing to do testing and report back, but there's no >point if you are in the middle of things and this becomes just noise. Just >let me know... > > It should build, but there are some errors on install that I'm still fixing. -Travis From oliphant.travis at ieee.org Thu Jan 5 00:47:53 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 04 Jan 2006 22:47:53 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC93AF.7070703@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> Message-ID: <43BCB309.9040500@ieee.org> Fernando Perez wrote: >Fernando Perez wrote: > > > >>On to testing scipy... (and happy from not having to say 'scipy full' :) >> >> > >Which currently doesn't build: > > Be sure to delete the scipy tree from where you have it installed, before building again. For some reason, the build is picking up scipy.distutils. The same problem happened to me.. -Travis From oliphant.travis at ieee.org Thu Jan 5 00:53:07 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Wed, 04 Jan 2006 22:53:07 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC93AF.7070703@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> Message-ID: Fernando Perez wrote: > Fernando Perez wrote: > > >>On to testing scipy... 
(and happy from not having to say 'scipy full' :) > > > Which currently doesn't build: > "/home/fperez/usr/local/lib/python2.3/site-packages/numpy/distutils/command/build_clib.py", > line 171, in build_libraries > extra_postargs=[]) > File "/usr/lib/python2.3/site-packages/scipy_distutils/ccompiler.py", line > 81, in CCompiler_compile > ccomp = self.compiler_so > AttributeError: GnuFCompiler instance has no attribute 'compiler_so' Here's the problem. Notice this is picking up the old scipy_distutils/ccompiler (I'm not sure why, but it is...) -Travis From Fernando.Perez at colorado.edu Thu Jan 5 00:54:45 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 22:54:45 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCB309.9040500@ieee.org> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> Message-ID: <43BCB4A5.6000204@colorado.edu> Travis Oliphant wrote: > Fernando Perez wrote: > > >>Fernando Perez wrote: >> >> >> >> >>>On to testing scipy... (and happy from not having to say 'scipy full' :) >>> >>> >> >>Which currently doesn't build: >> >> > > Be sure to delete the scipy tree from where you have it installed, > before building again. For some reason, the build is picking up > scipy.distutils. The same problem happened to me.. Yup, I figured that might have happened. I went through and made sure that all possible references to scipy.distutils would be gone. Now the build passes, and I can install fine. Note, however, a rather odd behavior: abdul[~]> python -c 'import scipy' import stats -> failed: cannot import name dot abdul[~]> python -c 'import numpy' import fftpack -> failed: 'module' object has no attribute 'fftpack' import stats -> failed: 'module' object has no attribute 'linalg' Now, let's move scipy out of the way: abdul[~]> mv ~/usr/local/lib/python2.3/site-packages/scipy ~/usr/local/lib/python2.3/site-packages/scipy-TEMP Check that scipy really can't be imported: abdul[~]> python -c 'import scipy' Traceback (most recent call last): File "", line 1, in ? ImportError: No module named scipy And now, let's get numpy again: abdul[~]> python -c 'import numpy' abdul[~]> No errors... This is a sign that numpy is pulling in dependencies from scipy, since the _presence_ of scipy is causing differences in the behavior of numpy. I'll keep digging... f From Fernando.Perez at colorado.edu Thu Jan 5 00:57:55 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 22:57:55 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCB4A5.6000204@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> Message-ID: <43BCB563.9090008@colorado.edu> Fernando Perez wrote: > This is a sign that numpy is pulling in dependencies from scipy, since the > _presence_ of scipy is causing differences in the behavior of numpy. > > I'll keep digging... The culprits: abdul[numpy]> pwd /usr/local/installers/src/scipy/numpy abdul[numpy]> svup At revision 1812. 
abdul[numpy]> egrep -r scipy * | egrep -v 'Binary file|\.svn|numpy/doc' numpy/lib/polynomial.py: import scipy.signal numpy/dft/__init__.py: import scipy.fftpack numpy/dft/__init__.py: fft = scipy.fftpack.fft numpy/dft/__init__.py: ifft = scipy.fftpack.ifft numpy/dft/__init__.py: fftn = scipy.fftpack.fftn numpy/dft/__init__.py: ifftn = scipy.fftpack.ifftn numpy/dft/__init__.py: fft2 = scipy.fftpack.fft2 numpy/dft/__init__.py: ifft2 = scipy.fftpack.ifft2 numpy/linalg/__init__.py:# re-define duplicated functions if full scipy installed. numpy/linalg/__init__.py: import scipy.linalg numpy/linalg/__init__.py: inv = scipy.linalg.inv numpy/linalg/__init__.py: svd = scipy.linalg.svd numpy/linalg/__init__.py: solve = scipy.linalg.solve numpy/linalg/__init__.py: det = scipy.linalg.det numpy/linalg/__init__.py: eig = scipy.linalg.eig numpy/linalg/__init__.py: eigvals = scipy.linalg.eigvals numpy/linalg/__init__.py: lstsq = scipy.linalg.lstsq numpy/linalg/__init__.py: pinv = scipy.linalg.pinv numpy/linalg/__init__.py: cholesky = scipy.linalg.cholesky setup.py: maintainer_email = "scipy-dev at scipy.org", setup.py: url = "http://numeric.scipy.org", abdul[numpy]> All those references to scipy in numpy have to go, else we have funny cross-dependencies. Cheers, f From oliphant.travis at ieee.org Thu Jan 5 01:33:04 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 04 Jan 2006 23:33:04 -0700 Subject: [SciPy-dev] ***[Possible UCE]*** Re: Now new svn of numpy is ready In-Reply-To: <43BCB563.9090008@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> Message-ID: <43BCBDA0.4030809@ieee.org> Fernando Perez wrote: >Fernando Perez wrote: > > > >>This is a sign that numpy is pulling in dependencies from scipy, since the >>_presence_ of scipy is causing differences in the behavior of numpy. >> >>I'll keep digging... >> >> > >The culprits: > > >numpy/dft/__init__.py: import scipy.fftpack >numpy/dft/__init__.py: fft = scipy.fftpack.fft >numpy/dft/__init__.py: ifft = scipy.fftpack.ifft >numpy/dft/__init__.py: fftn = scipy.fftpack.fftn >numpy/dft/__init__.py: ifftn = scipy.fftpack.ifftn >numpy/dft/__init__.py: fft2 = scipy.fftpack.fft2 >numpy/dft/__init__.py: ifft2 = scipy.fftpack.ifft2 >numpy/linalg/__init__.py:# re-define duplicated functions if full scipy installed. >numpy/linalg/__init__.py: import scipy.linalg >numpy/linalg/__init__.py: inv = scipy.linalg.inv >numpy/linalg/__init__.py: svd = scipy.linalg.svd >numpy/linalg/__init__.py: solve = scipy.linalg.solve >numpy/linalg/__init__.py: det = scipy.linalg.det >numpy/linalg/__init__.py: eig = scipy.linalg.eig >numpy/linalg/__init__.py: eigvals = scipy.linalg.eigvals >numpy/linalg/__init__.py: lstsq = scipy.linalg.lstsq >numpy/linalg/__init__.py: pinv = scipy.linalg.pinv >numpy/linalg/__init__.py: cholesky = scipy.linalg.cholesky > > But these need to be there. We need some way to update the functions in numpy if scipy is installed (so that you can always call the numpy functions but get the scipy functions if they are there)... 
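(For concreteness, a rough sketch of the pattern in question -- this mirrors the try/except fallback that the grep above found in numpy/linalg/__init__.py, so treat it as illustrative rather than the exact source:)

    # Default to the numpy implementations, but silently re-bind the names
    # to the scipy.linalg versions whenever full scipy happens to be installed.
    from numpy.linalg import inv, svd, solve, det

    try:
        import scipy.linalg
        inv = scipy.linalg.inv
        svd = scipy.linalg.svd
        solve = scipy.linalg.solve
        det = scipy.linalg.det
    except ImportError:
        pass   # no full scipy -- keep the plain numpy versions

Whether that silent rebinding at import time is a feature or a liability is what the rest of this thread is about.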
-Travis From Fernando.Perez at colorado.edu Thu Jan 5 01:47:13 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 04 Jan 2006 23:47:13 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCBDA0.4030809@ieee.org> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> Message-ID: <43BCC0F1.2040103@colorado.edu> Travis Oliphant wrote: > But these need to be there. We need some way to update the functions in > numpy if scipy is installed (so that you can always call the numpy > functions but get the scipy functions if they are there)... Mmh, I'm not sure I agree with this approach. I think this may lead to surprising behavior, hard to find bugs, etc. I really don't think that the presence of scipy should change the behavior of numpy's internals, esp. when we're advertising the dependency chain to be numpy < scipy Finding out that in reality, there's a hidden |-> numpy scipy--| | | |------------------| cycle kind of threw me for a loop :) I would argue that users should know whether they want to call scipy's libraries or numpy's, and be clearly aware of the differences. For example, publishing a benchmark based on code that uses numpy on one system may be misleading: you write from numpy import foo foo() and publish that it takes 0.1s to run. Then somebody downloads your code, they run it and get significantly different results. Still, you swear that's what you get. It turns out the other user didn't have scipy installed, only numpy, but you had scipy on your system, and the difference in performance was being caused by this hidden overwriting. IMHO, we should follow here the whole 'explicit is better than implicit' thingie. If people want to get the benefits of scipy, they should say so explicitly and use scipy. I can also see the flip side of the argument: installing scipy can give you a 'magical' performance boost in your numpy code, without having to rewrite anything. But the more I write larger codes, the more I end up disliking hidden side-effects: in the long run, they always seem to cause more headaches than their apparent initial benefit. Perhaps a compromise solution would be to have something like numpy.scipy_enable() which would trigger the overwriting. At least this would force the users to make an explicit call, so they are aware of what's going on. This would obviously be a no-op if scipy isn't present, nicely wrapping the 'import scipy' in a try/except (as it is today). Well, it's ultimately your call, but I wanted to at least express my view. Cheers, f From pearu at scipy.org Thu Jan 5 01:05:44 2006 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 5 Jan 2006 00:05:44 -0600 (CST) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCC0F1.2040103@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> <43BCC0F1.2040103@colorado.edu> Message-ID: I agree with Fernando's analysis. If we keep numpy objects in scipy namespace then users can do from scipy import * to get fast fft, etc and so keeping numpy scipy-clean. 
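(A tiny sketch of the usage being proposed, taking the statement above at face value -- i.e. assuming scipy's top-level namespace re-exports the fast fft; illustrative only:)

    import numpy
    from scipy import *          # the user explicitly asks for scipy

    x = numpy.arange(8.0)
    y = fft(x)                   # resolves to scipy's fftpack-backed fft,
                                 # while "import numpy" on its own stays scipy-clean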
The current problems are due to the fact that scipy is being imported while numpy import is not finished. To resolve this, overwriting fft and other routines should be carried out at the end of numpy/__init__.py and this can be accomplished with calling scipy_enable style functions at the end of __init__.py. Pearu On Wed, 4 Jan 2006, Fernando Perez wrote: > Travis Oliphant wrote: > >> But these need to be there. We need some way to update the functions in >> numpy if scipy is installed (so that you can always call the numpy >> functions but get the scipy functions if they are there)... > > Mmh, I'm not sure I agree with this approach. I think this may lead to > surprising behavior, hard to find bugs, etc. I really don't think that the > presence of scipy should change the behavior of numpy's internals, esp. when > we're advertising the dependency chain to be > > numpy < scipy > > Finding out that in reality, there's a hidden > > |-> numpy scipy--| > | | > |------------------| > > cycle kind of threw me for a loop :) > > I would argue that users should know whether they want to call scipy's > libraries or numpy's, and be clearly aware of the differences. > > For example, publishing a benchmark based on code that uses numpy on one > system may be misleading: you write > > from numpy import foo > foo() > > and publish that it takes 0.1s to run. Then somebody downloads your code, > they run it and get significantly different results. Still, you swear that's > what you get. It turns out the other user didn't have scipy installed, only > numpy, but you had scipy on your system, and the difference in performance was > being caused by this hidden overwriting. > > IMHO, we should follow here the whole 'explicit is better than implicit' > thingie. If people want to get the benefits of scipy, they should say so > explicitly and use scipy. > > I can also see the flip side of the argument: installing scipy can give you a > 'magical' performance boost in your numpy code, without having to rewrite > anything. But the more I write larger codes, the more I end up disliking > hidden side-effects: in the long run, they always seem to cause more headaches > than their apparent initial benefit. > > Perhaps a compromise solution would be to have something like > > numpy.scipy_enable() > > which would trigger the overwriting. At least this would force the users to > make an explicit call, so they are aware of what's going on. This would > obviously be a no-op if scipy isn't present, nicely wrapping the 'import > scipy' in a try/except (as it is today). > > Well, it's ultimately your call, but I wanted to at least express my view. 
> > Cheers, > > f > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From strawman at astraw.com Thu Jan 5 02:13:54 2006 From: strawman at astraw.com (Andrew Straw) Date: Wed, 04 Jan 2006 23:13:54 -0800 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCC0F1.2040103@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> <43BCC0F1.2040103@colorado.edu> Message-ID: <43BCC732.4000004@astraw.com> >Perhaps a compromise solution would be to have something like > >numpy.scipy_enable() > > Or, even better (IMO), shift the overwriting to scipy: scipy.enhance_numpy_namespace() # or other suitable name Obviously, scipy would do the slightly naughty trick of modifying numpy's namespace. I think that's acceptable if it's done under such a function name. This would make numpy contain no references to scipy, which I think is a great idea. From arnd.baecker at web.de Thu Jan 5 02:15:00 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 08:15:00 +0100 (CET) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCC0F1.2040103@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> <43BCC0F1.2040103@colorado.edu> Message-ID: On Wed, 4 Jan 2006, Fernando Perez wrote: > Travis Oliphant wrote: > > > But these need to be there. We need some way to update the functions in > > numpy if scipy is installed (so that you can always call the numpy > > functions but get the scipy functions if they are there)... > > Mmh, I'm not sure I agree with this approach. I think this may lead to > surprising behavior, hard to find bugs, etc. I really don't think that the > presence of scipy should change the behavior of numpy's internals, esp. when > we're advertising the dependency chain to be > > numpy < scipy > > Finding out that in reality, there's a hidden > > |-> numpy scipy--| > | | > |------------------| > > cycle kind of threw me for a loop :) > > I would argue that users should know whether they want to call scipy's > libraries or numpy's, and be clearly aware of the differences. Valid point (as your other one below) > For example, publishing a benchmark based on code that uses numpy on one > system may be misleading: you write > > from numpy import foo > foo() > > and publish that it takes 0.1s to run. Then somebody downloads your code, > they run it and get significantly different results. Still, you swear that's > what you get. It turns out the other user didn't have scipy installed, only > numpy, but you had scipy on your system, and the difference in performance was > being caused by this hidden overwriting. I don't buy this argument about benchmarking, as numpy can be installed with and without ATLAS (ACML/MKL/scsl/...). Depending on this, very different results will show up. > IMHO, we should follow here the whole 'explicit is better than implicit' > thingie. If people want to get the benefits of scipy, they should say so > explicitly and use scipy. 
> > I can also see the flip side of the argument: installing scipy can give you a > 'magical' performance boost in your numpy code, without having to rewrite > anything. But the more I write larger codes, the more I end up disliking > hidden side-effects: in the long run, they always seem to cause more headaches > than their apparent initial benefit. So it boils down to a performance-vs-possible-side-effects issue. Performance is good, side-effects are evil - what is better? > Perhaps a compromise solution would be to have something like > > numpy.scipy_enable() > > which would trigger the overwriting. At least this would force the users to > make an explicit call, so they are aware of what's going on. This would > obviously be a no-op if scipy isn't present, nicely wrapping the 'import > scipy' in a try/except (as it is today). What about a `numpy.scipy_disable()`? Best, Arnd From Fernando.Perez at colorado.edu Thu Jan 5 02:22:15 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 00:22:15 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> <43BCC0F1.2040103@colorado.edu> Message-ID: <43BCC927.9000104@colorado.edu> Arnd Baecker wrote: > I don't buy this argument about benchmarking, as numpy can be > installed with and without ATLAS (ACML/MKL/scsl/...). > Depending on this, very different results will show up. Yes, but that's still something that only affects numpy, at install time: the installation conditions are a known, fixed quantity (just like the compiler used, optimization flags, etc). The other one means that on Monday your code runs one way, on Tuesday (when you de-install scipy, for whatever reason) it mysteriously runs slower, and then on Wednesday (when you reinstall scipy from SVN, which happens to have a subtle bug that day) it just segfaults. And yet nowhere are you calling scipy. It's even better when the above scipy changes were made by someone else in your lab with root access, and you didn't even know they were happening. Fun, eh? I know that the above sounds silly, but in real life a less absurd set of circumstances can still lead to hair-pulling frustration when faced with this kind of hidden side effect setup. Cheers, f From oliphant.travis at ieee.org Thu Jan 5 02:22:52 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 00:22:52 -0700 Subject: [SciPy-dev] _import_tools Message-ID: <43BCC94C.5050608@ieee.org> Pearu, what is wrong with having _import_tools in numpy? Putting the package loader stuff in numpy is nice. By removing it from numpy you've undid the change I just made to scipy (where I explicitly loaded import_tools) -Travis From pearu at scipy.org Thu Jan 5 01:26:23 2006 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 5 Jan 2006 00:26:23 -0600 (CST) Subject: [SciPy-dev] _import_tools In-Reply-To: <43BCC94C.5050608@ieee.org> References: <43BCC94C.5050608@ieee.org> Message-ID: On Thu, 5 Jan 2006, Travis Oliphant wrote: > > Pearu, > > what is wrong with having _import_tools in numpy? Putting the package loader > stuff in numpy is nice. > By removing it from numpy you've undid the change I just made to scipy (where > I explicitly loaded import_tools) Ok sorry. 
My intention was to remove _import_tools hooks from numpy/__init__.py and they were in scipy/__init__.py. But yes, PackageLoader could be in numpy. I'll restore _import_tools if you haven't yet. Pearu From arnd.baecker at web.de Thu Jan 5 02:27:33 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 08:27:33 +0100 (CET) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC5548.5080604@ee.byu.edu> References: <43BC5548.5080604@ee.byu.edu> Message-ID: On Wed, 4 Jan 2006, Travis Oliphant wrote: > > My previous post was pre-mature as the fixed numpy was only in the numpy > branch. I have now finished *merging* the changes from the numpy > branch into the trunk so > > svn co http://svn.scipy.org/svn/numpy/trunk numpy > > should not only check out something that compiles and installs *but > also* intalls into the numpy namespace > > so that running in python > > >>> import numpy > >>> numpy.test(1,1) > > should work. First many thanks for your hard work! Looks very good, just a minor problem: On the Opteron I get: ====================================================================== ERROR: check_basic (numpy.core.defmatrix.test_defmatrix.test_algebra) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS3/Build_113/inst_numpy/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", l ine 108, in check_basic Ainv = linalg.inv(A) File "/scr/python/lib/python2.4/site-packages/scipy/linalg/basic.py", line 176, in inv a1 = asarray_chkfinite(a) File "/scr/python/lib/python2.4/site-packages/scipy_base/function_base.py", line 27, in asarray_chkfinite if not all(_nx.isfinite(x)): TypeError: function not supported for these types, and can't coerce to supported types ====================================================================== ERROR: check_basic (numpy.core.defmatrix.test_defmatrix.test_properties) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS3/Build_113/inst_numpy/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", l ine 34, in check_basic assert allclose(linalg.inv(A), mA.I) File "/scr/python/lib/python2.4/site-packages/scipy/linalg/basic.py", line 176, in inv a1 = asarray_chkfinite(a) File "/scr/python/lib/python2.4/site-packages/scipy_base/function_base.py", line 27, in asarray_chkfinite if not all(_nx.isfinite(x)): TypeError: function not supported for these types, and can't coerce to supported types ---------------------------------------------------------------------- Ran 176 tests in 0.206s Clearly, it picks some previously installed scipy_base (from a separate, old install). Explicitely: In [1]: from numpy import * In [2]: A = array([[1., 2.],[3., 4.]]) In [3]: type(A) Out[3]: In [4]: A.itemsize Out[4]: 8 In [5]: A.dtype() Out[5]: 0.0 In [6]: A.dtypestr Out[6]: ' /scr/python/lib/python2.4/site-packages/scipy/linalg/basic.py in inv(a, overwrite_a) 174 Return inverse of square matrix a. 175 """ --> 176 a1 = asarray_chkfinite(a) 177 if len(a1.shape) != 2 or a1.shape[0] != a1.shape[1]: 178 raise ValueError, 'expected square matrix' /scr/python/lib/python2.4/site-packages/scipy_base/function_base.py in asarray_chkfinite(x) 25 """ 26 x = asarray(x) ---> 27 if not all(_nx.isfinite(x)): 28 raise ValueError, "Array must not contain infs or nans." 
29 return x TypeError: function not supported for these types, and can't coerce to supported types In [8]: import numpy In [9]: numpy.__version__ Out[9]: '0.9.2.1812' Many thanks, Arnd From arnd.baecker at web.de Thu Jan 5 02:33:45 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 08:33:45 +0100 (CET) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCC927.9000104@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> <43BCC0F1.2040103@colorado.edu> <43BCC927.9000104@colorado.edu> Message-ID: On Thu, 5 Jan 2006, Fernando Perez wrote: > Arnd Baecker wrote: > > > I don't buy this argument about benchmarking, as numpy can be > > installed with and without ATLAS (ACML/MKL/scsl/...). > > Depending on this, very different results will show up. > > Yes, but that's still something that only affects numpy, at install time: the > installation conditions are a known, fixed quantity (just like the compiler > used, optimization flags, etc). > > The other one means that on Monday your code runs one way, on Tuesday (when > you de-install scipy, for whatever reason) it mysteriously runs slower, and > then on Wednesday (when you reinstall scipy from SVN, which happens to have a > subtle bug that day) it just segfaults. And yet nowhere are you calling > scipy. It's even better when the above scipy changes were made by someone > else in your lab with root access, and you didn't even know they were > happening. Fun, eh? Things like that do happen, indeed - I remember that some time ago at a different university my-most-beloved-system-adminstrator-who-should-not-be-named silently upgraded the compilers and so we managed to compute crap for two months (needless to say that the crap even had the right orders of magnitude, so we only spotted this by an accidental run on some other machine ...) > I know that the above sounds silly, but in real life a less absurd set of > circumstances can still lead to hair-pulling frustration when faced with this > kind of hidden side effect setup. No, it does not sound silly - I am convinced. Best, Arnd From Fernando.Perez at colorado.edu Thu Jan 5 02:34:14 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 00:34:14 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: <43BCCBF6.2070103@colorado.edu> Arnd Baecker wrote: > In [1]: from numpy import * [...] > /scr/python/lib/python2.4/site-packages/scipy/linalg/basic.py in inv(a, > overwrite_a) [...] > TypeError: function not supported for these types, and can't coerce to > supported types You were saying... performance or side effects? I think you just made my case. ;-) Cheers, f From oliphant.travis at ieee.org Thu Jan 5 02:36:15 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 00:36:15 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: <43BCCC6F.5080709@ieee.org> Arnd Baecker wrote: >On Wed, 4 Jan 2006, Travis Oliphant wrote: > > > >>My previous post was pre-mature as the fixed numpy was only in the numpy >>branch. 
I have now finished *merging* the changes from the numpy >>branch into the trunk so >> >>svn co http://svn.scipy.org/svn/numpy/trunk numpy >> >>should not only check out something that compiles and installs *but >>also* intalls into the numpy namespace >> >>so that running in python >> >> >>> import numpy >> >>> numpy.test(1,1) >> >>should work. >> >> > >First many thanks for your hard work! Looks very good, >just a minor problem: > > You need to remove scipy_base from your system (or at least from the site-packages directory you are running for Python). I'm pretty sure the new scipy won't work with the old scipy (which is what scipy_base comes from). This will need to be emphasized on install of the new system.... Get rid of the old system first... -Travis From oliphant.travis at ieee.org Thu Jan 5 02:38:54 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 00:38:54 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCC732.4000004@astraw.com> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> <43BCC0F1.2040103@colorado.edu> <43BCC732.4000004@astraw.com> Message-ID: <43BCCD0E.5040201@ieee.org> Andrew Straw wrote: >>Perhaps a compromise solution would be to have something like >> >>numpy.scipy_enable() >> >> >> >> >Or, even better (IMO), shift the overwriting to scipy: > >scipy.enhance_numpy_namespace() # or other suitable name > >Obviously, scipy would do the slightly naughty trick of modifying >numpy's namespace. I think that's acceptable if it's done under such a >function name. > > I don't know why my posts take so long sometimes... The solution is that now to get at functions that are in both numpy and scipy and you want the scipy ones first and default to the numpy ones if scipy is not installed, there is a numpy.dual module that must be loaded separately that contains all the overlapping functions. -Travis From oliphant.travis at ieee.org Thu Jan 5 02:17:31 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 00:17:31 -0700 Subject: [SciPy-dev] SciPy --> NumPy transition complete In-Reply-To: <43BCB4A5.6000204@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> Message-ID: <43BCC80B.5020003@ieee.org> The scipy -> numpy transition is complete. I've factored out the code that uses dual numpy/scipy calls to a separate module: numpy.dual. This module must be imported by users who want to support numpy but use the scipy calls when available. This fixes the circular import problem in numpy/scipy Calls availabe are >>> import numpy.dual >>> print filter(lambda x: x[0] != '_', dir(numpy.dual)) ['cholesky', 'det', 'eig', 'eigvals', 'fft', 'fft2', 'fftn', 'ifft', 'ifft2', 'ifftn', 'inv', 'lstsq', 'pinv', 'solve', 'svd'] Please grab the new code base and play with it... I'd like to make a release of numpy tomorrow.... 
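(A hedged usage sketch of that opt-in module -- the names come from the dir() listing above, the array values are just made-up examples:)

    import numpy
    import numpy.dual            # explicit opt-in, as described above

    a = numpy.array([[2.0, 1.0],
                     [1.0, 3.0]])
    ainv = numpy.dual.inv(a)     # scipy.linalg.inv if scipy is installed,
                                 # otherwise the plain numpy implementation
    spec = numpy.dual.fft(numpy.arange(8.0))   # same idea for the fft

Code that only imports numpy never touches scipy; code that wants the faster routines when available imports numpy.dual and says so.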
-Travis From arnd.baecker at web.de Thu Jan 5 02:50:22 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 08:50:22 +0100 (CET) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCCC6F.5080709@ieee.org> References: <43BC5548.5080604@ee.byu.edu> <43BCCC6F.5080709@ieee.org> Message-ID: Hi, On Thu, 5 Jan 2006, Travis Oliphant wrote: > Arnd Baecker wrote: > > >On Wed, 4 Jan 2006, Travis Oliphant wrote: > > > > > > > >>My previous post was pre-mature as the fixed numpy was only in the numpy > >>branch. I have now finished *merging* the changes from the numpy > >>branch into the trunk so > >> > >>svn co http://svn.scipy.org/svn/numpy/trunk numpy > >> > >>should not only check out something that compiles and installs *but > >>also* intalls into the numpy namespace > >> > >>so that running in python > >> > >> >>> import numpy > >> >>> numpy.test(1,1) > >> > >>should work. > >> > >> > > > >First many thanks for your hard work! Looks very good, > >just a minor problem: > > > > > > You need to remove scipy_base from your system (or at least from the > site-packages directory you are running for Python). I'm pretty sure > the new scipy won't work with the old scipy (which is what scipy_base > comes from). > > This will need to be emphasized on install of the new system.... Get > rid of the old system first... Fair enough - *but*: I have a working (old) scipy in site-packages (the one which I use for real work. All testing installs of numpy and (new) new scipy go to some other place (~/BUILDS3/Build_113/inst_numpy, ~/BUILDS3/Build_113/inst_scipy) These two directories are pre-pended to my python path and if I had installed (new) new scipy then the import should have found the right one. Removing the `try: import scipy.linalg except ImportError:` from `site-packages/numpy/linalg/__init__.py` cures the problem (all tests pass). (Actually, this was already adressed in Fernando's mail before). Fernando wrote: > You were saying... performance or side effects? > > I think you just made my case. ;-) yes I did. (but you convinced me before ... ;-) best, Arnd From Fernando.Perez at colorado.edu Thu Jan 5 02:51:00 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 00:51:00 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCCD0E.5040201@ieee.org> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCB563.9090008@colorado.edu> <43BCBDA0.4030809@ieee.org> <43BCC0F1.2040103@colorado.edu> <43BCC732.4000004@astraw.com> <43BCCD0E.5040201@ieee.org> Message-ID: <43BCCFE4.3030801@colorado.edu> Travis Oliphant wrote: > The solution is that now to get at functions that are in both numpy and > scipy and you want the scipy ones first and default to the numpy ones if > scipy is not installed, there is a numpy.dual module that must be > loaded separately that contains all the overlapping functions. Great: abdul[~]> python -c 'import numpy;print numpy.__version__' 0.9.2.1818 Without scipy: abdul[~]> python -c 'import scipy' Traceback (most recent call last): File "", line 1, in ? 
ImportError: No module named scipy abdul[~]> python -c 'import numpy;numpy.test()' Found 3 tests for numpy.distutils.misc_util Found 2 tests for numpy.core.umath Found 3 tests for numpy.dft.helper Found 42 tests for numpy.lib.type_check Found 9 tests for numpy.lib.twodim_base Found 3 tests for numpy.lib.getlimits Found 20 tests for numpy.core.ma Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 6 tests for numpy.core.records Found 4 tests for numpy.lib.index_tricks Found 44 tests for numpy.lib.shape_base Found 0 tests for __main__ .................................................................... ....................................................................... ...................................... ---------------------------------------------------------------------- Ran 177 tests in 0.535s OK And with it: abdul[~]> mv ~/usr/local/lib/python2.3/site-packages/scipy-TEMP/ ~/usr/local/lib/python2.3/site-packages/scipy abdul[~]> python -c 'import scipy' import fftpack -> failed: No module named corefft.helper import stats -> failed: cannot import name dot abdul[~]> python -c 'import numpy;numpy.test()' Found 3 tests for numpy.distutils.misc_util Found 2 tests for numpy.core.umath Found 3 tests for numpy.dft.helper Found 42 tests for numpy.lib.type_check Found 9 tests for numpy.lib.twodim_base Found 3 tests for numpy.lib.getlimits Found 20 tests for numpy.core.ma Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 6 tests for numpy.core.records Found 4 tests for numpy.lib.index_tricks Found 44 tests for numpy.lib.shape_base Found 0 tests for __main__ ....................................................................... ....................................................................... ................................... ---------------------------------------------------------------------- Ran 177 tests in 0.506s OK This is excellent. Many, many thanks for your gigantic effort, and for humoring my never-ending pestering ;) g'night. Cheers, f From oliphant.travis at ieee.org Thu Jan 5 03:01:07 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 01:01:07 -0700 Subject: [SciPy-dev] _import_tools In-Reply-To: References: <43BCC94C.5050608@ieee.org> Message-ID: <43BCD243.10306@ieee.org> Pearu Peterson wrote: >On Thu, 5 Jan 2006, Travis Oliphant wrote: > > > >>Pearu, >> >>what is wrong with having _import_tools in numpy? Putting the package loader >>stuff in numpy is nice. >>By removing it from numpy you've undid the change I just made to scipy (where >>I explicitly loaded import_tools) >> >> > >Ok sorry. My intention was to remove _import_tools hooks from >numpy/__init__.py and they were in scipy/__init__.py. But yes, >PackageLoader could be in numpy. I'll restore _import_tools if you haven't >yet. > > Scipy does not build for me anymore.... something about missing core_config... -Travis From oliphant.travis at ieee.org Thu Jan 5 03:04:20 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 01:04:20 -0700 Subject: [SciPy-dev] Ignore previous complaint about scipy not building. Message-ID: <43BCD304.5070109@ieee.org> It was fixed after an update to numpy. 
-Travis From pearu at scipy.org Thu Jan 5 02:06:59 2006 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 5 Jan 2006 01:06:59 -0600 (CST) Subject: [SciPy-dev] _import_tools In-Reply-To: <43BCD243.10306@ieee.org> References: <43BCC94C.5050608@ieee.org> <43BCD243.10306@ieee.org> Message-ID: On Thu, 5 Jan 2006, Travis Oliphant wrote: > Pearu Peterson wrote: > >> On Thu, 5 Jan 2006, Travis Oliphant wrote: >> >> >> >>> Pearu, >>> >>> what is wrong with having _import_tools in numpy? Putting the package loader >>> stuff in numpy is nice. >>> By removing it from numpy you've undid the change I just made to scipy (where >>> I explicitly loaded import_tools) >>> >>> >> >> Ok sorry. My intention was to remove _import_tools hooks from >> numpy/__init__.py and they were in scipy/__init__.py. But yes, >> PackageLoader could be in numpy. I'll restore _import_tools if you haven't >> yet. >> >> > Scipy does not build for me anymore.... something about missing > core_config... Should be fixed in numpy svn. You may need to remove build/ dir before installing. Pearu From arnd.baecker at web.de Thu Jan 5 03:13:01 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 09:13:01 +0100 (CET) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: success! Installing numpy/scipy on the opteron: In [8]: numpy.__version__ Out[8]: '0.9.2.1818' numpy.test(10) Ran 178 tests in 0.171s In [7]: scipy.__numpy_version__ Out[7]: '0.9.2.1818' scipy.test(10) Ran 945 tests in 56.373s Best, Arnd From Fernando.Perez at colorado.edu Thu Jan 5 03:23:18 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 01:23:18 -0700 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: <43BCD776.3000207@colorado.edu> Arnd Baecker wrote: > success! > > Installing numpy/scipy on the opteron: > > In [8]: numpy.__version__ > Out[8]: '0.9.2.1818' > > numpy.test(10) > Ran 178 tests in 0.171s > > In [7]: scipy.__numpy_version__ > Out[7]: '0.9.2.1818' > > scipy.test(10) > Ran 945 tests in 56.373s On my Fedora 3 x86 box, using python2.3, gcc 3.4.4: abdul[scipy]> ./testpkg numpy [...] ---------------------------------------------------------------------- Ran 178 tests in 4.986s OK ******************************************************************************** Package: numpy Version: 0.9.2.1818 abdul[scipy]> ./testpkg scipy [...] 
====================================================================== ERROR: line-search Newton conjugate gradient optimization routine ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/tests/test_optimize.py", line 104, in check_ncg full_output=False, disp=False, retall=False) File "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/optimize.py", line 969, in fmin_ncg alphak, fc, gc, old_fval = line_search_BFGS(f,xk,pk,gfk,old_fval) File "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/optimize.py", line 556, in line_search_BFGS phi_a2 = apply(f,(xk+alpha2*pk,)+args) File "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/optimize.py", line 133, in function_wrapper return function(x, *args) File "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/tests/test_optimize.py", line 31, in func raise RuntimeError, "too many iterations in optimization routine" RuntimeError: too many iterations in optimization routine ---------------------------------------------------------------------- Ran 945 tests in 114.135s FAILED (errors=1) ******************************************************************************** Package: scipy Version: Traceback (most recent call last): File "/usr/local/lib/python2.3/site-packages/test.py", line 28, in ? AttributeError: 'module' object has no attribute '__version__' Could we please have scipy.__version__ as well, in addition to scipy.__numpy_version__? Cheers, f ps - I'm attaching the make/test scripts that I'm using, in case anyone finds them useful to do this over and over. They both work as scriptname [scipy/numpy] The make one does a make/install, just configure your preferred python version and install prefix. For scipy, we'll need scipy.__version__ to be added so it works without that last error. -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: makepkg URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: testpkg URL: From pearu at scipy.org Thu Jan 5 02:53:05 2006 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 5 Jan 2006 01:53:05 -0600 (CST) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BCD776.3000207@colorado.edu> References: <43BC5548.5080604@ee.byu.edu> <43BCD776.3000207@colorado.edu> Message-ID: On Thu, 5 Jan 2006, Fernando Perez wrote: > Arnd Baecker wrote: >> success! >> >> Installing numpy/scipy on the opteron: >> >> In [8]: numpy.__version__ >> Out[8]: '0.9.2.1818' >> >> numpy.test(10) >> Ran 178 tests in 0.171s >> >> In [7]: scipy.__numpy_version__ >> Out[7]: '0.9.2.1818' >> >> scipy.test(10) >> Ran 945 tests in 56.373s Try also scipy.pkgload() scipy.test(10) to run more tests. > On my Fedora 3 x86 box, using python2.3, gcc 3.4.4: > > abdul[scipy]> ./testpkg numpy > [...] > ---------------------------------------------------------------------- > Ran 178 tests in 4.986s > > OK > ******************************************************************************** > > Package: numpy > Version: 0.9.2.1818 > > abdul[scipy]> ./testpkg scipy > [...] 
> ====================================================================== > ERROR: line-search Newton conjugate gradient optimization routine > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/tests/test_optimize.py", > line 104, in check_ncg > full_output=False, disp=False, retall=False) > File > "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/optimize.py", > line 969, in fmin_ncg > alphak, fc, gc, old_fval = line_search_BFGS(f,xk,pk,gfk,old_fval) > File > "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/optimize.py", > line 556, in line_search_BFGS > phi_a2 = apply(f,(xk+alpha2*pk,)+args) > File > "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/optimize.py", > line 133, in function_wrapper > return function(x, *args) > File > "/home/fperez/usr/local/lib/python2.3/site-packages/scipy/optimize/tests/test_optimize.py", > line 31, in func > raise RuntimeError, "too many iterations in optimization routine" > RuntimeError: too many iterations in optimization routine I get that too occacionally. > > Package: scipy > Version: > Traceback (most recent call last): > File "/usr/local/lib/python2.3/site-packages/test.py", line 28, in ? > > AttributeError: 'module' object has no attribute '__version__' > > > Could we please have scipy.__version__ as well, in addition to > scipy.__numpy_version__? Yes, it's there now. Pearu From arnd.baecker at web.de Thu Jan 5 04:11:13 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 10:11:13 +0100 (CET) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: On Thu, 5 Jan 2006, Pearu Peterson wrote: > On Thu, 5 Jan 2006, Fernando Perez wrote: > > > Arnd Baecker wrote: > >> success! > >> > >> Installing numpy/scipy on the opteron: > >> > >> In [8]: numpy.__version__ > >> Out[8]: '0.9.2.1818' > >> > >> numpy.test(10) > >> Ran 178 tests in 0.171s > >> > >> In [7]: scipy.__numpy_version__ > >> Out[7]: '0.9.2.1818' > >> > >> scipy.test(10) > >> Ran 945 tests in 56.373s > > Try also > > scipy.pkgload() > scipy.test(10) > > to run more tests. Ran 1222 tests in 52.578s Excellent! (It seems that I'll have to familiarize myself with pkgload ... - though I am wondering, whether scipy.test(*10*) maybe should trigger that on its own? Just to make sure that really all available test are run?) A small glitch: First I accidentally started from the installation directory: In [1]: import numpy Running from numpy source directory. 
In [2]: import scipy --------------------------------------------------------------------------- exceptions.ImportError Traceback (most recent call last) /home/abaecker/BUILDS3/Build_114/numpy/ /home/abaecker/BUILDS3/Build_114/inst_numpy/lib/python2.4/site-packages/scipy/__init__.py 49 50 if show_numpy_config is not None: ---> 51 from numpy import __version__ as __numpy_version__ 52 __doc__ += __numpy_doc__ 53 pkgload(verbose=NUMPY_IMPORT_VERBOSE,postpone=True) ImportError: cannot import name __version__ Best, Arnd From strawman at astraw.com Thu Jan 5 04:13:05 2006 From: strawman at astraw.com (Andrew Straw) Date: Thu, 05 Jan 2006 01:13:05 -0800 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC5548.5080604@ee.byu.edu> References: <43BC5548.5080604@ee.byu.edu> Message-ID: <43BCE321.5000303@astraw.com> Dear Travis "Array" Oliphant, :) I'm happy to inform you that all numpy and scipy tests pass on my AMD64 machine. They even install as .eggs out-of-the-box. Thanks again! Andrew From oliphant.travis at ieee.org Thu Jan 5 04:26:54 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 02:26:54 -0700 Subject: [SciPy-dev] Ignore last numpy namespace mail --- scipy.pkgload() does the right thing... Message-ID: <43BCE65E.2020500@ieee.org> Big thank you's to all who have helped with the transition to numpy. It went relatively smoothly, I think. Special kudos to Pearu Peterson who helped tremendously with getting scipy converted to use numpy instead of scipy.base. Tests are passing for me and I've heard reports that tests are passing for others as well. I'll still wait until tomorrow to cut a release of numpy. Then Friday a release of scipy should be ready. -Travis From oliphant.travis at ieee.org Thu Jan 5 03:55:38 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 01:55:38 -0700 Subject: [SciPy-dev] Don't add numpy to the scipy name-space Message-ID: <43BCDF0A.3000901@ieee.org> Despite my recent playing with the scipy repo to the contrary, I think it is best that we don't add the numpy namespace to the scipy namespace by default. Let's let the user do that in their interactive scripts or in ipython -pylab. -Travis From pearu at scipy.org Thu Jan 5 04:12:51 2006 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 5 Jan 2006 03:12:51 -0600 (CST) Subject: [SciPy-dev] Don't add numpy to the scipy name-space In-Reply-To: <43BCDF0A.3000901@ieee.org> References: <43BCDF0A.3000901@ieee.org> Message-ID: On Thu, 5 Jan 2006, Travis Oliphant wrote: > > Despite my recent playing with the scipy repo to the contrary, I think > it is best that we don't add the numpy namespace to the scipy namespace > by default. > > Let's let the user do that in their interactive scripts or in ipython > -pylab. +1 Scipy users may be used to writing from scipy import * in their scripts. Such scripts need additional from numpy import * then. Pearu From arnd.baecker at web.de Thu Jan 5 05:24:03 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 11:24:03 +0100 (CET) Subject: [SciPy-dev] ReSt for new.scipy.org? Message-ID: Hi, would it be possible to add restructured text support for the new http://new.scipy.org/Wiki/ ? See http://new.scipy.org/Wiki/Moin_configuration According to http://moinmoin.wikiwikiweb.de/HelpOnParsers/ReStructuredText the `ReStructuredText Parser` is part of MoinMoin >=1.3.2 (and can be used in all versions of MoinMoin 1.2.X and 1.3.X). 
So it only needs the "Python docutils package (0.4.0 or newer, or a recent 0.3.10 snapshot)". I am eager to contribute, but all my notes are in ReSt Best, Arnd From prabhu_r at users.sf.net Thu Jan 5 05:46:34 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Thu, 5 Jan 2006 16:16:34 +0530 Subject: [SciPy-dev] SciPy --> NumPy transition complete In-Reply-To: <43BCC80B.5020003@ieee.org> References: <43BC5548.5080604@ee.byu.edu> <43BC7E8F.7070309@colorado.edu> <43BC917A.7080701@ieee.org> <43BC92BE.6050302@colorado.edu> <43BC93AF.7070703@colorado.edu> <43BCB309.9040500@ieee.org> <43BCB4A5.6000204@colorado.edu> <43BCC80B.5020003@ieee.org> Message-ID: <17340.63754.579005.71930@prpc.aero.iitb.ac.in> >>>>> "Travis" == Travis Oliphant writes: Travis> The scipy -> numpy transition is complete. [...] Travis> Please grab the new code base and play with it... I'd Travis> like to make a release of numpy tomorrow.... All tests pass for numpy and scipy on Debian stable on an i386. cheers, prabhu From faltet at carabos.com Thu Jan 5 07:07:43 2006 From: faltet at carabos.com (Francesc Altet) Date: Thu, 5 Jan 2006 13:07:43 +0100 Subject: [SciPy-dev] SciPy --> NumPy transition complete In-Reply-To: <43BCC80B.5020003@ieee.org> References: <43BC5548.5080604@ee.byu.edu> <43BCB4A5.6000204@colorado.edu> <43BCC80B.5020003@ieee.org> Message-ID: <200601051307.44712.faltet@carabos.com> A Dijous 05 Gener 2006 08:17, Travis Oliphant va escriure: > Please grab the new code base and play with it... I'd like to make a > release of numpy tomorrow.... Everything goes fine with Debian unstable on x86: faltet at inspiron:~/PyTables/pytables/trunk$ python -c 'import numpy;numpy.test()' Found 3 tests for numpy.distutils.misc_util Found 2 tests for numpy.core.umath Found 3 tests for numpy.dft.helper Found 42 tests for numpy.lib.type_check Found 9 tests for numpy.lib.twodim_base Found 3 tests for numpy.lib.getlimits Found 20 tests for numpy.core.ma Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 6 tests for numpy.core.records Found 4 tests for numpy.lib.index_tricks Found 44 tests for numpy.lib.shape_base Found 0 tests for __main__ ................................................................................................................................................................................. ---------------------------------------------------------------------- Ran 177 tests in 0.907s OK Congratulations for the great job to all the scipy_core^H*10 numpy community. You rock guys! -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From faltet at carabos.com Thu Jan 5 06:57:19 2006 From: faltet at carabos.com (Francesc Altet) Date: Thu, 5 Jan 2006 12:57:19 +0100 Subject: [SciPy-dev] Long live to numpy (and its list as well) Message-ID: <200601051257.20004.faltet@carabos.com> Hi, Now that numpy is dead and we all want a long life to numpy, what about moving discussions in this list into the venerable: "numpy-discussion " I think this would be more consistent with the new naming conventions and also would be a good way to publicly re-affirm that the numeric community has joined efforts again. Of course, dicussion on numpy should (and must) be welcomed in scipy lists, but the numpy-discussion would be *its* place. Cheers, -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. 
??Enjoy Data "-" From a.h.jaffe at gmail.com Thu Jan 5 07:14:33 2006 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Thu, 05 Jan 2006 12:14:33 +0000 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: Test error on OS X [test(1,1) succeeds] -- In [1]: import numpy In [2]: numpy.__version__ Out[2]: '0.9.2.1824' In [3]: numpy.test(10) Found 2 tests for numpy.core.umath Found 20 tests for numpy.core.ma Found 6 tests for numpy.core.records Found 3 tests for numpy.distutils.misc_util Found 4 tests for numpy.lib.getlimits Found 9 tests for numpy.lib.twodim_base Found 44 tests for numpy.lib.shape_base Found 4 tests for numpy.lib.index_tricks Found 42 tests for numpy.lib.type_check Found 3 tests for numpy.dft.helper Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 0 tests for __main__ .................................E................................................................................................................................................ ====================================================================== ERROR: check_singleton (numpy.lib.getlimits.test_getlimits.test_longdouble) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/tests/test_getlimits.py", line 33, in check_singleton ftype = finfo(longdouble) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/getlimits.py", line 47, in __new__ obj = object.__new__(cls)._init(dtype) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/getlimits.py", line 73, in _init "numpy longfloat precision floating "\ File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/machar.py", line 128, in __init__ raise RuntimeError, "could not determine machine tolerance " \ RuntimeError: could not determine machine tolerance for 'negep' ---------------------------------------------------------------------- Ran 178 tests in 2.651s FAILED (errors=1) Out[3]: From schofield at ftw.at Thu Jan 5 07:47:30 2006 From: schofield at ftw.at (Ed Schofield) Date: Thu, 5 Jan 2006 12:47:30 +0000 Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC Message-ID: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> Hi all, Congratulations to all who helped out with the naming transition! Thanks especially to Travis and Pearu. Here are some unit test results, with >>> numpy.__version__ '0.9.2.1824' >>> scipy.__version__ '0.4.4.1523' Pentium 4, Linux 2.6.13.4, gcc 3.4.3 All tests pass (numpy and scipy) PPC G4, OS X 10.4.3, gcc 3.3 All numpy tests pass 10 scipy test failures, all related to fftpack. I think these don't relate to the recent renaming, but rather date back several weeks. The build and test logs are too large to post here, but they're at: http://www.doc.ic.ac.uk/~ejs102/scipy_r1523_build_test_Jan_5.log These failures don't affect me, but if anyone has suggestions for isolating the problem before the imminent scipy release I'll see what I can do. 
-- Ed From pearu at scipy.org Thu Jan 5 06:53:14 2006 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 5 Jan 2006 05:53:14 -0600 (CST) Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: On Thu, 5 Jan 2006, Arnd Baecker wrote: >> Try also >> >> scipy.pkgload() >> scipy.test(10) >> >> to run more tests. > > Ran 1222 tests in 52.578s > > Excellent! > (It seems that I'll have to familiarize myself with pkgload ... - > though I am wondering, whether scipy.test(*10*) maybe > should trigger that on its own? Just to make sure that > really all available test are run?) from scipy import test test(10) will run all tests without calling pkgload(). But I agree with you, I'll look into.. > A small glitch: First I accidentally started from the > installation directory: > > In [1]: import numpy > Running from numpy source directory. > > In [2]: import scipy > --------------------------------------------------------------------------- > exceptions.ImportError Traceback (most This will be fixed by raising importerror with proper error message. Thanks, Pearu From chris at trichech.us Thu Jan 5 08:50:03 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Thu, 5 Jan 2006 08:50:03 -0500 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: On Jan 5, 2006, at 6:53 AM, Pearu Peterson wrote: > test(10) This runs pretty far on OSX, but ending with 16 errors from fftpack: **************************************************************** WARNING: cblas module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses fblas instead of cblas. **************************************************************** .................... ====================================================================== FAIL: bench_random (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 210, in bench_random assert_array_almost_equal(diff(f,1),direct_diff(f,1)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 96.0%): Array 1: [ 4.0000000e+00 4.4583823e+00 4.6585202e+00 4.5314873e+00 4.0154475e+00 3.0782890e+00 1.7457348e+00 1.22... Array 2: [ 4.0000000e+00 3.9029585e+00 3.6083502e+00 3.1083616e+00 2.3987172e+00 1.4909128e+00 4.2567472e-01 -7.18... 
====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 86, in check_definition assert_array_almost_equal(diff(sin(x),2),direct_diff(sin(x),2)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ -6.3519652e-15 -3.8268343e-01 -7.0710678e-01 -9.2387953e-01 -1.0000000e+00 -9.2387953e-01 -7.0710678e-01 -3.82... Array 2: [ -7.3854931e-15 6.5259351e-15 -2.4942634e-15 -7.5636114e-17 1.4745663e-15 -1.9133685e-15 2.2804788e-16 8.70... ====================================================================== FAIL: bench_random (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 356, in bench_random assert_array_almost_equal(hilbert(f),direct_hilbert(f)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 96.0%): Array 1: [ -1.0865211e+00 -1.1538025e+00 -1.1519434e+00 -1.0725580e+00 -9.1204050e-01 -6.7405401e-01 -3.7184092e-01 -2.91... Array 2: [ -1.0865211e+00 +0.0000000e+00j -1.0575653e+00 -5.3966502e-17j -9.7152085e-01 -3.8615213e-17j -8.3113061e-01 -1.544... ====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 332, in check_random_even assert_array_almost_equal(direct_hilbert(direct_ihilbert(f)),f) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ -4.3368087e-19+0.j 0.0000000e+00-0.j -4.0066888e-19-0.j 0.0000000e+00-0.j -3.0665868e-19-0.j 0.0000000e+00-0.... Array 2: [-0.3138695 -0.320617 -0.0202909 0.0488052 -0.373401 -0.1621773 -0.2607906 -0.1146876 -0.12861 0.3907841 0.09393... 
====================================================================== FAIL: bench_random (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 424, in bench_random assert_array_almost_equal(direct_shift(f,1),sf) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 98.0%): Array 1: [ 0.6015407 0.6432035 0.7632704 0.947396 1.1731603 1.4125187 1.635358 1.8137685 1.9262065 1.9605388 1.91521... Array 2: [ 0.6015407 0.5654719 0.6040595 0.7013589 0.8369433 0.9885575 1.1344713 1.2554716 1.336468 1.3676865 1.34542... ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 390, in check_definition assert_array_almost_equal(shift(sin(x),a),direct_shift(sin(x),a)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 88.8888888889%): Array 1: [ 0.0998334 0.4341242 0.7160532 0.9116156 0.9972237 0.9625519 0.8117822 0.5630995 0.2464987 -0.0998334 -0.43412... Array 2: [ 0.0998334 0.0938127 0.0764768 0.0499167 0.0173359 -0.0173359 -0.0499167 -0.0764768 -0.0938127 -0.0998334 -0.09381... ====================================================================== FAIL: bench_random (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 273, in bench_random assert_array_almost_equal(tilbert(f,1),direct_tilbert(f,1)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 96.0%): Array 1: [ -1.0896991e+00 -1.1569240e+00 -1.1548950e+00 -1.0752326e+00 -9.1434140e-01 -6.7589846e-01 -3.7316325e-01 -2.99... Array 2: [ -1.0896991e+00 +0.0000000e+00j -1.0606855e+00 -6.1149403e-17j -9.7447017e-01 -6.0785235e-17j -8.3380221e-01 +1.609... 
====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 240, in check_random_even assert_array_almost_equal(direct_tilbert(direct_itilbert(f,h),h),f) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ -1.5935877e-17+0.j -6.5549159e-18-0.j -6.0983963e-18-0.j 1.5380005e-18-0.j 1.1268367e-17-0.j 7.7320506e-18-0.... Array 2: [-0.4516442 -0.4788809 0.260043 -0.1347756 -0.2030805 0.1716475 -0.4098054 0.0137122 0.4516803 -0.2769733 -0.27983... ====================================================================== FAIL: bench_random (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 162, in bench_random assert_array_almost_equal(fft(x),y) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 5.1042851e+01+0.j 3.3063755e+00+1.7852228j -6.7669114e-01+2.3293438j -1.2238892e+00+1.0355247j 9.51554... Array 2: [ 5.1042851e+01 +4.5495051e+01j 4.3657884e-01 +2.1522035e+00j 1.9258532e+00 +5.7806139e-01j -1.3614140e+00 -3.418... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 98, in check_definition assert_array_almost_equal(y,y1) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 20.+0.j 0.+0.j -4.+4.j 0.+0.j -4.+0.j 0.-0.j -4.-4.j 0.-0.j] Array 2: [ 20. +3.j -0.7071068+0.7071068j -7. +4.j -0.7071068-0.7071068j -4. -3.j 0.707106... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 425, in check_definition assert_array_almost_equal(y,direct_dftn(x)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 45. +0.j -4.5+2.5980762j -4.5-2.5980762j] [-13.5+7.7942286j 0. +0.j 0. +0.j ] [-13.5-7.79... Array 2: [[ 45. +0.j -4.5+2.5980762j -4.5-2.5980762j] [-13.5+0.j 0. +0.j 0. -0.j ] [-13.5+0.j ... 
====================================================================== FAIL: bench_random (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 250, in bench_random assert_array_almost_equal(ifft(x),y) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.5398477 +0.0000000e+00j -0.0113129 +1.7200661e-02j 0.0226073 -1.0733876e-02j -0.0069369 -9.2497604e-03j -0.033912... Array 2: [ 5.3984769e-01 +5.2528836e-01j -1.8824526e-02 +2.8057247e-02j 2.7252891e-02 -1.4375254e-02j -2.9920605e-02 +1.643... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 183, in check_definition assert_array_almost_equal(y,y1) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.5+0.j 0. -0.j -0.5-0.5j 0. -0.j -0.5+0.j 0. +0.j -0.5+0.5j 0. +0.j ] Array 2: [ 2.5 +0.375j 0.0883883+0.0883883j -0.125 -0.5j 0.0883883-0.0883883j -0.5 -0.375j -0.0883883-0.0... ====================================================================== FAIL: check_random_real (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 217, in check_random_real assert_array_almost_equal (ifft(fft(x)),x) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 98.0392156863%): Array 1: [ 0.0599215 +0.0000000e+00j 0.4412427 -1.5238355e-17j 0.7672772 +8.2722500e-17j 0.3091665 -1.2772878e-17j 0.562163... Array 2: [ 0.0599215 0.8014683 0.9653611 0.3622346 0.22084 0.2799468 0.3725071 0.5772085 0.8525039 0.8912614 0.53216... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 594, in check_definition assert_array_almost_equal(y,direct_idftn(x)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] [-1.5-0.8660254j 0. +0.j 0. +0.j ] [-1.5+0.8660254j ... Array 2: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] [-1.5+0.j 0. -0.j 0. +0.j ] [-1.5+0.j ... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_irfft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 341, in check_definition assert_array_almost_equal(y,ifft(x1)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ python2.4/site-packages/numpy/testing/utils.py", line 182, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [ 2.625 -1.6856602 -0.375 -1.1856602 0.625 0.4356602 -0.375 0.9356602] Array 2: [ 2.625+0.j -0.375-0.j -0.375-0.j -0.375-0.j 0.625 +0.j -0.375+0.j -0.375+0.j -0.375+0.j] ---------------------------------------------------------------------- Ran 945 tests in 60.079s FAILED (failures=16) -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From dd55 at cornell.edu Thu Jan 5 09:18:22 2006 From: dd55 at cornell.edu (Darren Dale) Date: Thu, 5 Jan 2006 09:18:22 -0500 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: References: <43BC5548.5080604@ee.byu.edu> Message-ID: <200601050918.23185.dd55@cornell.edu> I also got several fft errors, but only when scipy uses fftw-3. Try commenting out the fftw-3 dictionary in system_info.fftw_info. I did so to force scipy to use fftw-2, and test(10) completes without errors. Darren On Thursday 05 January 2006 08:50, Christopher Fonnesbeck wrote: > On Jan 5, 2006, at 6:53 AM, Pearu Peterson wrote: > > test(10) > > This runs pretty far on OSX, but ending with 16 errors from fftpack: > > **************************************************************** > WARNING: cblas module is empty > ----------- > See scipy/INSTALL.txt for troubleshooting. > Notes: > * If atlas library is not found by numpy/distutils/system_info.py, > then scipy uses fblas instead of cblas. > **************************************************************** > > .................... > ====================================================================== > FAIL: bench_random > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 210, in bench_random > assert_array_almost_equal(diff(f,1),direct_diff(f,1)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 96.0%): > Array 1: [ 4.0000000e+00 4.4583823e+00 4.6585202e+00 > 4.5314873e+00 > 4.0154475e+00 3.0782890e+00 1.7457348e+00 1.22... > Array 2: [ 4.0000000e+00 3.9029585e+00 3.6083502e+00 > 3.1083616e+00 > 2.3987172e+00 1.4909128e+00 4.2567472e-01 -7.18... 
> > > ====================================================================== > FAIL: check_definition > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 86, in check_definition > assert_array_almost_equal(diff(sin(x),2),direct_diff(sin(x),2)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 87.5%): > Array 1: [ -6.3519652e-15 -3.8268343e-01 -7.0710678e-01 > -9.2387953e-01 > -1.0000000e+00 -9.2387953e-01 -7.0710678e-01 -3.82... > Array 2: [ -7.3854931e-15 6.5259351e-15 -2.4942634e-15 > -7.5636114e-17 > 1.4745663e-15 -1.9133685e-15 2.2804788e-16 8.70... > > > ====================================================================== > FAIL: bench_random > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 356, in bench_random > assert_array_almost_equal(hilbert(f),direct_hilbert(f)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 96.0%): > Array 1: [ -1.0865211e+00 -1.1538025e+00 -1.1519434e+00 > -1.0725580e+00 > -9.1204050e-01 -6.7405401e-01 -3.7184092e-01 -2.91... > Array 2: [ -1.0865211e+00 +0.0000000e+00j -1.0575653e+00 > -5.3966502e-17j > -9.7152085e-01 -3.8615213e-17j -8.3113061e-01 -1.544... > > > ====================================================================== > FAIL: check_random_even > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 332, in check_random_even > assert_array_almost_equal(direct_hilbert(direct_ihilbert(f)),f) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ -4.3368087e-19+0.j 0.0000000e+00-0.j > -4.0066888e-19-0.j > 0.0000000e+00-0.j -3.0665868e-19-0.j 0.0000000e+00-0.... > Array 2: [-0.3138695 -0.320617 -0.0202909 0.0488052 > -0.373401 -0.1621773 > -0.2607906 -0.1146876 -0.12861 0.3907841 0.09393... 
> > > ====================================================================== > FAIL: bench_random > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 424, in bench_random > assert_array_almost_equal(direct_shift(f,1),sf) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 98.0%): > Array 1: [ 0.6015407 0.6432035 0.7632704 0.947396 > 1.1731603 1.4125187 > 1.635358 1.8137685 1.9262065 1.9605388 1.91521... > Array 2: [ 0.6015407 0.5654719 0.6040595 0.7013589 > 0.8369433 0.9885575 > 1.1344713 1.2554716 1.336468 1.3676865 1.34542... > > > ====================================================================== > FAIL: check_definition > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 390, in check_definition > assert_array_almost_equal(shift(sin(x),a),direct_shift(sin(x),a)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 88.8888888889%): > Array 1: [ 0.0998334 0.4341242 0.7160532 0.9116156 > 0.9972237 0.9625519 > 0.8117822 0.5630995 0.2464987 -0.0998334 -0.43412... > Array 2: [ 0.0998334 0.0938127 0.0764768 0.0499167 > 0.0173359 -0.0173359 > -0.0499167 -0.0764768 -0.0938127 -0.0998334 -0.09381... > > > ====================================================================== > FAIL: bench_random > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 273, in bench_random > assert_array_almost_equal(tilbert(f,1),direct_tilbert(f,1)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 96.0%): > Array 1: [ -1.0896991e+00 -1.1569240e+00 -1.1548950e+00 > -1.0752326e+00 > -9.1434140e-01 -6.7589846e-01 -3.7316325e-01 -2.99... > Array 2: [ -1.0896991e+00 +0.0000000e+00j -1.0606855e+00 > -6.1149403e-17j > -9.7447017e-01 -6.0785235e-17j -8.3380221e-01 +1.609... 
> > > ====================================================================== > FAIL: check_random_even > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 240, in check_random_even > assert_array_almost_equal(direct_tilbert(direct_itilbert(f,h),h),f) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ -1.5935877e-17+0.j -6.5549159e-18-0.j > -6.0983963e-18-0.j > 1.5380005e-18-0.j 1.1268367e-17-0.j 7.7320506e-18-0.... > Array 2: [-0.4516442 -0.4788809 0.260043 -0.1347756 > -0.2030805 0.1716475 > -0.4098054 0.0137122 0.4516803 -0.2769733 -0.27983... > > > ====================================================================== > FAIL: bench_random (scipy.fftpack.basic.test_basic.test_fft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 162, > in bench_random > assert_array_almost_equal(fft(x),y) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ 5.1042851e+01+0.j 3.3063755e+00+1.7852228j > -6.7669114e-01+2.3293438j -1.2238892e+00+1.0355247j > 9.51554... > Array 2: [ 5.1042851e+01 +4.5495051e+01j 4.3657884e-01 > +2.1522035e+00j > 1.9258532e+00 +5.7806139e-01j -1.3614140e+00 -3.418... > > > ====================================================================== > FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 98, > in check_definition > assert_array_almost_equal(y,y1) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ 20.+0.j 0.+0.j -4.+4.j 0.+0.j -4.+0.j > 0.-0.j -4.-4.j 0.-0.j] > Array 2: [ 20. +3.j -0.7071068+0.7071068j > -7. +4.j > -0.7071068-0.7071068j -4. -3.j 0.707106... > > > ====================================================================== > FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fftn) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 425, > in check_definition > assert_array_almost_equal(y,direct_dftn(x)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 22.2222222222%): > Array 1: [[ 45. 
+0.j -4.5+2.5980762j -4.5-2.5980762j] > [-13.5+7.7942286j 0. +0.j 0. +0.j ] > [-13.5-7.79... > Array 2: [[ 45. +0.j -4.5+2.5980762j -4.5-2.5980762j] > [-13.5+0.j 0. +0.j 0. -0.j ] > [-13.5+0.j ... > > > ====================================================================== > FAIL: bench_random (scipy.fftpack.basic.test_basic.test_ifft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 250, > in bench_random > assert_array_almost_equal(ifft(x),y) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ 0.5398477 +0.0000000e+00j -0.0113129 +1.7200661e-02j > 0.0226073 -1.0733876e-02j -0.0069369 -9.2497604e-03j > -0.033912... > Array 2: [ 5.3984769e-01 +5.2528836e-01j -1.8824526e-02 > +2.8057247e-02j > 2.7252891e-02 -1.4375254e-02j -2.9920605e-02 +1.643... > > > ====================================================================== > FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 183, > in check_definition > assert_array_almost_equal(y,y1) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ 2.5+0.j 0. -0.j -0.5-0.5j 0. -0.j -0.5+0.j > 0. +0.j -0.5+0.5j > 0. +0.j ] > Array 2: [ 2.5 +0.375j 0.0883883+0.0883883j > -0.125 -0.5j > 0.0883883-0.0883883j -0.5 -0.375j -0.0883883-0.0... > > > ====================================================================== > FAIL: check_random_real (scipy.fftpack.basic.test_basic.test_ifft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 217, > in check_random_real > assert_array_almost_equal (ifft(fft(x)),x) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 98.0392156863%): > Array 1: [ 0.0599215 +0.0000000e+00j 0.4412427 -1.5238355e-17j > 0.7672772 +8.2722500e-17j 0.3091665 -1.2772878e-17j > 0.562163... > Array 2: [ 0.0599215 0.8014683 0.9653611 0.3622346 > 0.22084 0.2799468 > 0.3725071 0.5772085 0.8525039 0.8912614 0.53216... 
> > > ====================================================================== > FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifftn) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 594, > in check_definition > assert_array_almost_equal(y,direct_idftn(x)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 22.2222222222%): > Array 1: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] > [-1.5-0.8660254j 0. +0.j 0. +0.j ] > [-1.5+0.8660254j ... > Array 2: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] > [-1.5+0.j 0. -0.j 0. +0.j ] > [-1.5+0.j ... > > > ====================================================================== > FAIL: check_definition (scipy.fftpack.basic.test_basic.test_irfft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 341, > in check_definition > assert_array_almost_equal(y,ifft(x1)) > File "/Library/Frameworks/Python.framework/Versions/2.4/lib/ > python2.4/site-packages/numpy/testing/utils.py", line 182, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 50.0%): > Array 1: [ 2.625 -1.6856602 -0.375 -1.1856602 > 0.625 0.4356602 -0.375 > 0.9356602] > Array 2: [ 2.625+0.j -0.375-0.j -0.375-0.j -0.375-0.j 0.625 > +0.j -0.375+0.j > -0.375+0.j -0.375+0.j] > > > ---------------------------------------------------------------------- > Ran 945 tests in 60.079s > > FAILED (failures=16) > > > -- > Christopher J. Fonnesbeck > > Population Ecologist, Marine Mammal Section > Fish & Wildlife Research Institute (FWC) > St. Petersburg, FL > > Adjunct Assistant Professor > Warnell School of Forest Resources > University of Georgia > Athens, GA > > T: 727.235.5570 > E: chris at trichech.us -- Darren S. Dale, Ph.D. Cornell High Energy Synchrotron Source Cornell University 200L Wilson Lab Rt. 366 & Pine Tree Road Ithaca, NY 14853 dd55 at cornell.edu office: (607) 255-9894 fax: (607) 255-9001 From chris at trichech.us Thu Jan 5 09:45:13 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Thu, 5 Jan 2006 09:45:13 -0500 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <200601050918.23185.dd55@cornell.edu> References: <43BC5548.5080604@ee.byu.edu> <200601050918.23185.dd55@cornell.edu> Message-ID: <6D83A8E7-6556-498E-9457-EFF39CBDA1B2@trichech.us> On Jan 5, 2006, at 9:18 AM, Darren Dale wrote: > I also got several fft errors, but only when scipy uses fftw-3. Try > commenting > out the fftw-3 dictionary in system_info.fftw_info. I did so to > force scipy > to use fftw-2, and test(10) completes without errors. I only have fftw-2.1.5 on my system, as far as I can tell: fftw_info: fftw3 not found FOUND: libraries = ['rfftw', 'fftw'] library_dirs = ['/usr/local/lib'] define_macros = [('SCIPY_FFTW_H', None)] include_dirs = ['/usr/local/include'] -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. 
Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From jdhunter at ace.bsd.uchicago.edu Thu Jan 5 09:42:23 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 05 Jan 2006 08:42:23 -0600 Subject: [SciPy-dev] Now new svn of numpy is ready In-Reply-To: <43BC5548.5080604@ee.byu.edu> (Travis Oliphant's message of "Wed, 04 Jan 2006 16:07:52 -0700") References: <43BC5548.5080604@ee.byu.edu> Message-ID: <87irsyg2i8.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Travis" == Travis Oliphant writes: Travis> My previous post was pre-mature as the fixed numpy was Travis> only in the numpy branch. I have now finished *merging* Travis> the changes from the numpy branch into the trunk so Travis> svn co http://svn.scipy.org/svn/numpy/trunk numpy I just tried an install with a fresh svn co on solaris 8, and failed. There were some problems apparently linking with libatlas, I don't know if these were fatal, but the build eventually dies with File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 443, in add_data_dir self.add_data_files((ds,filenames)) File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 498, in add_data_files dist.data_files.extend(data_dict.items()) AttributeError: 'NoneType' object has no attribute 'extend' Below is my complete build and version info. Thanks, JDH Running from numpy source directory. Assuming default configuration (numpy/distutils/command/{setup_command,setup}.py was not found) Appending numpy.distutils.command configuration to numpy.distutils Assuming default configuration (numpy/distutils/fcompiler/{setup_fcompiler,setup}.py was not found) Appending numpy.distutils.fcompiler configuration to numpy.distutils Appending numpy.distutils configuration to numpy Appending numpy.testing configuration to numpy No module named __svn_version__ Creating numpy/f2py/__svn_version__.py (version='1825') F2PY Version 2_1825 Appending numpy.f2py configuration to numpy blas_opt_info: blas_mkl_info: NOT AVAILABLE atlas_blas_threads_info: Setting PTATLAS=ATLAS Setting PTATLAS=ATLAS Setting PTATLAS=ATLAS FOUND: libraries = ['ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/opt/lib'] language = c include_dirs = ['/opt/include'] running build_src building extension "atlas_version" sources creating build creating build/src adding 'build/src/atlas_version_0x4568501b.c' to sources. 
running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext building 'atlas_version' extension compiling C sources gcc options: '-fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' creating build/temp.solaris-2.8-i86pc-2.3 creating build/temp.solaris-2.8-i86pc-2.3/build creating build/temp.solaris-2.8-i86pc-2.3/build/src compile options: '-I/opt/include -Inumpy/core/include -I/opt/lang/python/include/python2.3 -c' gcc: build/src/atlas_version_0x4568501b.c gcc -shared build/temp.solaris-2.8-i86pc-2.3/build/src/atlas_version_0x4568501b.o -L/opt/lib -lptf77blas -lptcblas -latlas -o build/temp.solaris-2.8-i86pc-2.3/atlas_version.so Text relocation remains referenced against symbol offset in file 0x7 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xc /opt/lib/libatlas.a(ATL_buildinfo.o) 0x11 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x16 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x23 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x28 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x38 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x3d /opt/lib/libatlas.a(ATL_buildinfo.o) 0x4a /opt/lib/libatlas.a(ATL_buildinfo.o) 0x4f /opt/lib/libatlas.a(ATL_buildinfo.o) 0x5f /opt/lib/libatlas.a(ATL_buildinfo.o) 0x64 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x71 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x76 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x8b /opt/lib/libatlas.a(ATL_buildinfo.o) 0x95 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x9a /opt/lib/libatlas.a(ATL_buildinfo.o) 0x9f /opt/lib/libatlas.a(ATL_buildinfo.o) 0xa4 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xb4 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xb9 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xc3 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xc8 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xcd /opt/lib/libatlas.a(ATL_buildinfo.o) 0xd2 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xe2 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xe7 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xf1 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xf6 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xfb /opt/lib/libatlas.a(ATL_buildinfo.o) 0x100 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x110 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x115 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x1b /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x2d /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x42 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x54 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x69 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x7b /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x90 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xa9 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xbe /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xd7 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xec /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x105 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x11a /opt/lib/libatlas.a(ATL_buildinfo.o) ld: fatal: relocations remain against allocatable but non-writable sections collect2: ld returned 1 exit status Text relocation remains referenced against symbol offset in file 0x7 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xc /opt/lib/libatlas.a(ATL_buildinfo.o) 0x11 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x16 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x23 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x28 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x38 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x3d /opt/lib/libatlas.a(ATL_buildinfo.o) 0x4a /opt/lib/libatlas.a(ATL_buildinfo.o) 0x4f /opt/lib/libatlas.a(ATL_buildinfo.o) 0x5f /opt/lib/libatlas.a(ATL_buildinfo.o) 0x64 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x71 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x76 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x8b 
/opt/lib/libatlas.a(ATL_buildinfo.o) 0x95 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x9a /opt/lib/libatlas.a(ATL_buildinfo.o) 0x9f /opt/lib/libatlas.a(ATL_buildinfo.o) 0xa4 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xb4 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xb9 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xc3 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xc8 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xcd /opt/lib/libatlas.a(ATL_buildinfo.o) 0xd2 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xe2 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xe7 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xf1 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xf6 /opt/lib/libatlas.a(ATL_buildinfo.o) 0xfb /opt/lib/libatlas.a(ATL_buildinfo.o) 0x100 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x110 /opt/lib/libatlas.a(ATL_buildinfo.o) 0x115 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x1b /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x2d /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x42 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x54 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x69 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x7b /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x90 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xa9 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xbe /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xd7 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0xec /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x105 /opt/lib/libatlas.a(ATL_buildinfo.o) printf 0x11a /opt/lib/libatlas.a(ATL_buildinfo.o) ld: fatal: relocations remain against allocatable but non-writable sections collect2: ld returned 1 exit status ##### msg: error: Command "gcc -shared build/temp.solaris-2.8-i86pc-2.3/build/src/atlas_version_0x4568501b.o -L/opt/lib -lptf77blas -lptcblas -latlas -o build/temp.solaris-2.8-i86pc-2.3/atlas_version.so" failed with exit status 1 error: Command "gcc -shared build/temp.solaris-2.8-i86pc-2.3/build/src/atlas_version_0x4568501b.o -L/opt/lib -lptf77blas -lptcblas -latlas -o build/temp.solaris-2.8-i86pc-2.3/atlas_version.so" failed with exit status 1 FOUND: libraries = ['ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/opt/lib'] language = c define_macros = [('NO_ATLAS_INFO', 2)] include_dirs = ['/opt/include'] distutils distribution has been initialized, it may be too late to add an extension _dotblas Traceback (most recent call last): File "setup.py", line 73, in ? 
setup_package() File "setup.py", line 58, in setup_package config.add_subpackage('numpy') File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 399, in add_subpackage config = self.get_subpackage(subpackage_name,subpackage_path) File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 389, in get_subpackage config = setup_module.configuration(*args) File "/home/titan/johnh/python/svn/numpy/numpy/setup.py", line 10, in configuration config.add_subpackage('core') File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 399, in add_subpackage config = self.get_subpackage(subpackage_name,subpackage_path) File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 389, in get_subpackage config = setup_module.configuration(*args) File "numpy/core/setup.py", line 195, in configuration config.add_data_dir('tests') File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 443, in add_data_dir self.add_data_files((ds,filenames)) File "/home/titan/johnh/python/svn/numpy/numpy/distutils/misc_util.py", line 498, in add_data_files dist.data_files.extend(data_dict.items()) AttributeError: 'NoneType' object has no attribute 'extend' removed numpy/f2py/__svn_version__.py removed numpy/f2py/__svn_version__.pyc Python 2.3.4 (#12, Jul 2 2004, 09:48:10) [GCC 3.3.2] on sunos5 Type "help", "copyright", "credits" or "license" for more information. >>> import sys >>> sys.platform 'sunos5' >>> From arnd.baecker at web.de Thu Jan 5 10:29:20 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 5 Jan 2006 16:29:20 +0100 (CET) Subject: [SciPy-dev] version check for dual? Message-ID: Hi, I just tried to install numpy on another machine: ====================================================================== ERROR: check_basic (numpy.core.defmatrix.test_defmatrix.test_algebra) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/baecker/python2/scipy_lintst_n_scsl/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", line 111, in check_basic assert allclose((mA ** -i).A, B) File "/home/baecker/python2//scipy_lintst_n_scsl/lib/python2.4/site-packages/numpy/core/defmatrix.py", line 148, in __pow__ x = self.I File "/home/baecker/python2//scipy_lintst_n_scsl/lib/python2.4/site-packages/numpy/core/defmatrix.py", line 202, in getI from numpy.dual import inv File "/home/baecker/python2//scipy_lintst_n_scsl/lib/python2.4/site-packages/numpy/dual.py", line 32, in ? inv = linpkg.inv File "/home/baecker/python2/lib/python2.4/site-packages/scipy_base/ppimport.py", line 303, in __getattr__ module = self._ppimport_importer() File "/home/baecker/python2/lib/python2.4/site-packages/scipy_base/ppimport.py", line 273, in _ppimport_importer module = __import__(name,None,None,['*']) File "/home/baecker/python2/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 8, in ? from basic import * File "/home/baecker/python2/lib/python2.4/site-packages/scipy/linalg/basic.py", line 12, in ? from lapack import get_lapack_funcs File "/home/baecker/python2/lib/python2.4/site-packages/scipy/linalg/lapack.py", line 15, in ? import flapack ImportError: /home/baecker/python2/lib/python2.4/site-packages/scipy/linalg/flapack.so: undefined symbol: for_cpstr ---------------------------------------------------------------------- Ran 178 tests in 3.777s The reason is that an (old) new scipy from site-packages is found. 
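A quick way to confirm which installation actually gets imported (a minimal sketch; it only assumes that both packages are importable):

import sys
import numpy, scipy
# The first matching entry on sys.path wins, so a stale scipy in the
# system-wide site-packages can shadow a newer install elsewhere.
for mod in (numpy, scipy):
    print mod.__name__, getattr(mod, '__version__', 'unknown'), mod.__file__

If the reported __file__ paths point at the system-wide site-packages rather than the intended install prefix, that is the shadowing at work.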
In this particular case I *could* delete that stuff, but in general a user might want to test out the new numpy by installing it to a different place than the system-wide site-packages. In such a situation he could not remove the `site-packages/scipy/` stuff (unless he has root rights). I am not quite sure how to resolve this (apart from installing a recent scipy to the same place as numpy). Could maybe a version check help? Best, Arnd From chris at pseudogreen.org Thu Jan 5 10:29:58 2006 From: chris at pseudogreen.org (Christopher Stawarz) Date: Thu, 5 Jan 2006 10:29:58 -0500 Subject: [SciPy-dev] PATCH: numpy build problems on Solaris Message-ID: Hi, I've attached a patch to fix some problems I encountered when building numpy (latest from SVN) on Solaris using Sun's WorkShop 10 compiler. The important fix is that the files _signbit.c and _isnan.c weren't getting included in the source distribution, which caused my build to fail when _signbit.c was needed and not found. The rest are just a few minor tweaks to remove a mess of compiler warnings about empty structure members. Cheers, Chris -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: numpy_solaris_patch.txt URL: From travis at enthought.com Thu Jan 5 10:41:02 2006 From: travis at enthought.com (Travis N. Vaught) Date: Thu, 05 Jan 2006 09:41:02 -0600 Subject: [SciPy-dev] Congratulations In-Reply-To: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> Message-ID: <43BD3E0E.10704@enthought.com> Ed Schofield wrote: >Hi all, > >Congratulations to all who helped out with the naming transition! >Thanks especially to Travis and Pearu. > > > Congratulations indeed, and many thanks for the sacrifices of many. January 4, should be remembered as a true milestone in scientific computing with python. January 5th should also be noted as a day to remember the sacrifices of the Amy Oliphants of the world for putting up with their grumpy, sleep-deprived other-halves. With Gratitude and Best Regards, Travis (happy-to-be-confused-with-Oliphant) Vaught -- ........................ Travis N. Vaught CEO Enthought, Inc. http://www.enthought.com ........................ From jh at oobleck.astro.cornell.edu Thu Jan 5 11:25:44 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Thu, 5 Jan 2006 11:25:44 -0500 Subject: [SciPy-dev] Long live to numpy (and its list as well) In-Reply-To: (scipy-dev-request@scipy.net) References: Message-ID: <200601051625.k05GPixf021407@oobleck.astro.cornell.edu> Francesc Altet on scipy-dev: > Now that numpy is dead and we all want a long life to numpy, what > about moving discussions in this list into the venerable: > "numpy-discussion " We're one community (or want to be). Could we just have one mailing list per type of discussion (dev and user)? All the cross-posting and cross-monitoring is tedious. I don't care whether we keep scipy-* or numpy-* lists, but is there a functional reason to have four lists? Consider that soon we may have *-doc, *-newbie, *-announce, and others as well, if this takes off like we all hope. If the developers want separate lists because some are only working on one of the two packages, I can see that argument (in the future if not now). I don't see a need for two user lists, unless perhaps sorted by high and low traffic. 
I propose we either drop the numpy-* lists (if subscribers there agree), or leave them for ongoing discussion of the legacy packages, and discourage discussion of the new numpy/scipy there. Ok, flame me. --jh-- From Fernando.Perez at colorado.edu Thu Jan 5 12:57:14 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 10:57:14 -0700 Subject: [SciPy-dev] [SciPy-user] Long live to numpy (and its list as well) In-Reply-To: <200601051625.k05GPixf021407@oobleck.astro.cornell.edu> References: <200601051625.k05GPixf021407@oobleck.astro.cornell.edu> Message-ID: <43BD5DFA.20906@colorado.edu> Joe Harrington wrote: > Francesc Altet on scipy-dev: > > >>Now that numpy is dead and we all want a long life to numpy, what >>about moving discussions in this list into the venerable: > > >>"numpy-discussion " > > > We're one community (or want to be). Could we just have one mailing > list per type of discussion (dev and user)? All the cross-posting and > cross-monitoring is tedious. I don't care whether we keep scipy-* or > numpy-* lists, but is there a functional reason to have four lists? > Consider that soon we may have *-doc, *-newbie, *-announce, and others > as well, if this takes off like we all hope. If the developers want > separate lists because some are only working on one of the two > packages, I can see that argument (in the future if not now). I don't > see a need for two user lists, unless perhaps sorted by high and low > traffic. > > I propose we either drop the numpy-* lists (if subscribers there > agree), or leave them for ongoing discussion of the legacy packages, > and discourage discussion of the new numpy/scipy there. > > Ok, flame me. Uh, no. I'm actually with you on this one: I just don't think we are a large enough group to warrant the existence of separate numpy- and scipy- lists, especially when the overlap in topics is so high. Every scipy user is, by necessity, a numpy user as well. I think that, IF in the future: 1. the traffic on the scipy- lists becomes enormous, AND 2. a significant portion of that traffic is for users of numpy as a pure array library with no scientific concerns (if it really becomes a popular 'data exchange' system for Python-and-C libraries), THEN we can consider resuscitating the numpy lists. For now, I vote on leaving them dormant, and moving all numeric(abandoned), numarray(maintenance-transition) and numpy/scipy (new development) discussion to the scipy-* lists. I don't think the occasional post about Numeric or numarray is a major concern (at least it doesn't bother me). It is an issue also of friendliness to newbies: I'd like to tell newcomers "for information and discussion, just join scipy-user and matplotlib-user, and you should be set on all numerics and plotting in python". Telling them to subscribe to, or monitor via gmane, 8 different lists is annoying. Cheers, f From Fernando.Perez at colorado.edu Thu Jan 5 13:00:41 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 11:00:41 -0700 Subject: [SciPy-dev] Carrying the lists on gmane? Message-ID: <43BD5EC9.7010200@colorado.edu> Hi all, I propose subscribing scipy-user and scipy-dev to gmane, so they are carried as newsgroups: http://gmane.org/subscribe.php I do this with ipython: http://news.gmane.org/gmane.comp.python.ipython.user http://news.gmane.org/gmane.comp.python.ipython.devel and I think it's a very convenient system. 
People can monitor the lists without being subscribed, from the comfort of a newsreader or web interface which doesn't clog their inbox (that's how I read comp.lang.python and python-dev, and I love it). This also provides an excellent, fast search interface (better than the painfully slow plone search on scipy.org). If we do it, we should advertise this on the new scipy wiki, for newcomers. Anyone against? f From robert.kern at gmail.com Thu Jan 5 13:06:35 2006 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 05 Jan 2006 12:06:35 -0600 Subject: [SciPy-dev] Carrying the lists on gmane? In-Reply-To: <43BD5EC9.7010200@colorado.edu> References: <43BD5EC9.7010200@colorado.edu> Message-ID: <43BD602B.9030301@gmail.com> Fernando Perez wrote: > Hi all, > > I propose subscribing scipy-user and scipy-dev to gmane, so they are carried > as newsgroups: > > http://gmane.org/subscribe.php http://dir.gmane.org/gmane.comp.python.scientific.user http://dir.gmane.org/gmane.comp.python.scientific.devel -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From eric at enthought.com Thu Jan 5 13:04:40 2006 From: eric at enthought.com (eric jones) Date: Thu, 05 Jan 2006 12:04:40 -0600 Subject: [SciPy-dev] Carrying the lists on gmane? In-Reply-To: <43BD5EC9.7010200@colorado.edu> References: <43BD5EC9.7010200@colorado.edu> Message-ID: <43BD5FB8.2040702@enthought.com> Sounds good to me. Are there any drawbacks to this? Its pretty painless from an IT perspective right? We just subscribe and it watches our list... thanks, eric Fernando Perez wrote: >Hi all, > >I propose subscribing scipy-user and scipy-dev to gmane, so they are carried >as newsgroups: > >http://gmane.org/subscribe.php > >I do this with ipython: > >http://news.gmane.org/gmane.comp.python.ipython.user >http://news.gmane.org/gmane.comp.python.ipython.devel > >and I think it's a very convenient system. People can monitor the lists >without being subscribed, from the comfort of a newsreader or web interface >which doesn't clog their inbox (that's how I read comp.lang.python and >python-dev, and I love it). > >This also provides an excellent, fast search interface (better than the >painfully slow plone search on scipy.org). > >If we do it, we should advertise this on the new scipy wiki, for newcomers. > >Anyone against? > >f > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From Fernando.Perez at colorado.edu Thu Jan 5 13:12:51 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 11:12:51 -0700 Subject: [SciPy-dev] Carrying the lists on gmane? In-Reply-To: <43BD602B.9030301@gmail.com> References: <43BD5EC9.7010200@colorado.edu> <43BD602B.9030301@gmail.com> Message-ID: <43BD61A3.4030809@colorado.edu> Robert Kern wrote: > Fernando Perez wrote: > >>Hi all, >> >>I propose subscribing scipy-user and scipy-dev to gmane, so they are carried >>as newsgroups: >> >>http://gmane.org/subscribe.php > > > http://dir.gmane.org/gmane.comp.python.scientific.user > http://dir.gmane.org/gmane.comp.python.scientific.devel Oh. I searched their list for 'scipy' and it turned out empty, so I thought we weren't subscribed. Sorry. I guess then all we need is prominent mention of this fact on our shiny new wiki. 
Cheers, f From Fernando.Perez at colorado.edu Thu Jan 5 13:14:05 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 05 Jan 2006 11:14:05 -0700 Subject: [SciPy-dev] Carrying the lists on gmane? In-Reply-To: <43BD5FB8.2040702@enthought.com> References: <43BD5EC9.7010200@colorado.edu> <43BD5FB8.2040702@enthought.com> Message-ID: <43BD61ED.2010708@colorado.edu> eric jones wrote: > Sounds good to me. Are there any drawbacks to this? Its pretty > painless from an IT perspective right? We just subscribe and it watches > our list... Yup, it's so painless that we already had it (as Robert pointed out) and didn't even know it :) Cheers, f From oliphant.travis at ieee.org Thu Jan 5 14:48:27 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 12:48:27 -0700 Subject: [SciPy-dev] Congratulations In-Reply-To: <43BD3E0E.10704@enthought.com> References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <43BD3E0E.10704@enthought.com> Message-ID: <43BD780B.9060203@ieee.org> Travis N. Vaught wrote: >Congratulations indeed, and many thanks for the sacrifices of many. > >January 4, should be remembered as a true milestone in scientific >computing with python. January 5th should also be noted as a day to >remember the sacrifices of the Amy Oliphants of the world for putting up >with their grumpy, sleep-deprived other-halves. > > Very astute observation. Amy has been super-humanly supportive of all the time I've spent on this. Having even a little bit tangible from book sales has helped her I think (unless she actually sat down and figured out that my per-hour wage so far based on book sales is ridiculously low). Her email is apoliphant at gmail.com if anybody else wants to express appreciation... ;-) -Travis From chris at trichech.us Thu Jan 5 14:49:27 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Thu, 5 Jan 2006 14:49:27 -0500 Subject: [SciPy-dev] eggs and f2py extensions Message-ID: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> Are f2py extensions compatible with eggs? At the moment, if I try to import setup from setuptools rather than from numpy.distutils, it does not recognize fortran extensions. Thanks, C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From chanley at stsci.edu Thu Jan 5 15:49:00 2006 From: chanley at stsci.edu (Christopher Hanley) Date: Thu, 05 Jan 2006 15:49:00 -0500 Subject: [SciPy-dev] numpy build on Solaris 8 Message-ID: <43BD863C.6040502@stsci.edu> Hi Travis, Just wanted to let you know that I can build numpy revision 1829 on our Solaris 8 systems using the native (ancient) Solaris compilers. All of my regression tests pass. All is right with the world. 
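Coming back to the f2py/eggs question above: a minimal numpy.distutils-based setup.py sketch for a Fortran extension (the module and file names here are made up for illustration). Importing setup from plain setuptools instead of numpy.distutils bypasses numpy's Fortran-aware build commands, which is presumably why the extension is not recognized:

# setup.py -- hypothetical example module wrapped with f2py
from numpy.distutils.core import setup, Extension

ext = Extension(name='example', sources=['example.pyf', 'example.f'])

if __name__ == '__main__':
    setup(name='example', version='0.1', ext_modules=[ext])

Whether a build done this way can then be turned into an egg is a separate question.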
Thank you for all of your time and help, Chris From oliphant.travis at ieee.org Thu Jan 5 16:16:35 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 14:16:35 -0700 Subject: [SciPy-dev] [SciPy-user] scipy.basic to numpy In-Reply-To: <3EE37000-1673-4B3A-97DB-470221C9E051@trichech.us> References: <43BD899C.4060403@ieee.org> <3EE37000-1673-4B3A-97DB-470221C9E051@trichech.us> Message-ID: <43BD8CB3.80508@ieee.org> Christopher Fonnesbeck wrote: > On Jan 5, 2006, at 4:03 PM, Travis Oliphant wrote: > >> Please re-send any bug reports regarding fft on OSX. > > > Here is mine again, from earlier: > Thanks, but these are tests from full scipy, right? Important, but right now my pressing concern is numpy. Rob seemed to be saying that he was getting fft failures in numpy. Perhaps this makes the case for a separate numpy list.... -Travis From Norbert.Nemec.list at gmx.de Thu Jan 5 16:33:49 2006 From: Norbert.Nemec.list at gmx.de (Norbert Nemec) Date: Thu, 05 Jan 2006 22:33:49 +0100 Subject: [SciPy-dev] Intel MKL without Intel Fortran??!! Message-ID: <43BD90BD.8030403@gmx.de> Hi there, on my system, I have the Intel Fortran compiler installed along with Intel MKL, as well as the g77 and gfortran 4.0 and Atlas. The default configuration in this setup seems to be somewhat strange: the gnu compiler is preferred over the intel compiler, but still, the intel MKL is preferred over atlas. This means, that the intel MKL is linked in with GNU compiled libraries. If I'm not mistaken, such a setup is doomed to fail, because the intel MKL only works with Intel compiled programs and libraries. (Correct me if I'm wrong.) What I see is: ------------------------------------------- In [1]: import numpy In [2]: numpy.__version__ Out[2]: '0.9.2.1828' In [2]: numpy.test() Found 3 tests for numpy.distutils.misc_util Found 2 tests for numpy.core.umath Found 42 tests for numpy.lib.type_check Found 9 tests for numpy.lib.twodim_base Found 3 tests for numpy.dft.helper Found 20 tests for numpy.core.ma Found 3 tests for numpy.lib.getlimits Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 6 tests for numpy.core.records Found 4 tests for numpy.lib.index_tricks Found 44 tests for numpy.lib.shape_base Found 0 tests for __main__ ..................................................................................MKL FATAL ERROR: /usr/local/lib/libmkl_def.so: undefined symbol: __kmpc_global_thread_num ------------------------------------------- After that, the python interpreter itself crashes. I'm not sure whether my analysis above is correct or whether there is some other problem. In any case: the test breaks and it seems to be related to the MKL... Thanks for help, Greetings, Norbert From pearu at scipy.org Thu Jan 5 15:40:04 2006 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 5 Jan 2006 14:40:04 -0600 (CST) Subject: [SciPy-dev] Intel MKL without Intel Fortran??!! In-Reply-To: <43BD90BD.8030403@gmx.de> References: <43BD90BD.8030403@gmx.de> Message-ID: On Thu, 5 Jan 2006, Norbert Nemec wrote: > Hi there, > > on my system, I have the Intel Fortran compiler installed along with > Intel MKL, as well as the g77 and gfortran 4.0 and Atlas. > > The default configuration in this setup seems to be somewhat strange: > the gnu compiler is preferred over the intel compiler, but still, the > intel MKL is preferred over atlas. This means, that the intel MKL is > linked in with GNU compiled libraries. 
If I'm not mistaken, such a setup > is doomed to fail, because the intel MKL only works with Intel compiled > programs and libraries. (Correct me if I'm wrong.) In my debian box with g77-3.4.5 and gcc-4.0.3 I can use MKL-8.0.1 without intel fortran compiler. Anyway, I also find choosing MKL by default not the best solution. To disable MKL, define export MKL=None before building scipy. > What I see is: > > Found 0 tests for __main__ > ..................................................................................MKL > FATAL ERROR: /usr/local/lib/libmkl_def.so: undefined symbol: > __kmpc_global_thread_num > ------------------------------------------- What MKL version are you using? Pearu From faltet at carabos.com Thu Jan 5 16:40:57 2006 From: faltet at carabos.com (Francesc Altet) Date: Thu, 5 Jan 2006 22:40:57 +0100 Subject: [SciPy-dev] A proposal for dtype/dtypedescr in numpy objects Message-ID: <200601052240.58099.faltet@carabos.com> Hi, In my struggle for getting consistent behaviours with data types, I've ended with a new proposal for treating them. The basic thing is that I suggest to deprecate .dtype as being a first-class attribute and replace it instead by the descriptor type container, which I find quite more useful for end users. The current .dtype type will be still accessible (mainly for developers) but buried in .dtype.type. Briefly stated: current proposed ======= ======== .dtypedescr --> moved into .dtype .dtype --> moved into .dtype.type .dtype.dtypestr --> moved into .dtype.str new .dtype.name What is achieved with that? Well, not much, except easy of use and type comparison correctness. For example, with the next setup: >>> import numpy >>> a=numpy.arange(10,dtype='i') >>> b=numpy.arange(10,dtype='l') we have currently: >>> a.dtype >>> a.dtypedescr dtypedescr('>> a.dtypedescr.dtypestr '>> a.dtype.__name__[:-8] 'int32' >>> a.dtype == b.dtype False With the new proposal, we would have: >>> a.dtype.type >>> a.dtype dtype('>> a.dtype.str '>> a.dtype.name 'int32' >>> a.dtype == b.dtype True The advantages of the new proposal are: - No more .dtype and .dtypedescr lying together, just current .dtypedescr renamed to .dtype. I think that current .dtype does not provide more useful information than current .dtypedesc, and giving it a shorter name than .dtypedescr seems to indicate that it is more useful to users (and in my opinion, it isn't). - Current .dtype is still accessible, but specifying and extra name in path: .dtype.type (can be changed into .dtype.type_ or whatever). This should be useful mainly for developers. - Added a useful dtype(descr).name so that one can quickly access to the type name. - Comparison between data types works as it should now (without having to create a metaclass for PyType_Type). Drawbacks: - Backward incompatible change. However, provided the advantages are desirable, I think it is better changing now than later. - I don't specially like the string representation for the new .dtype class. For example, I'd find dtype('Int32') much better than dtype('0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" -------------- next part -------------- A non-text attachment was scrubbed... Name: dtype-dtypedescr.patch Type: text/x-diff Size: 28690 bytes Desc: not available URL: From Norbert.Nemec.list at gmx.de Thu Jan 5 16:48:02 2006 From: Norbert.Nemec.list at gmx.de (Norbert Nemec) Date: Thu, 05 Jan 2006 22:48:02 +0100 Subject: [SciPy-dev] Intel MKL without Intel Fortran??!! 
In-Reply-To: References: <43BD90BD.8030403@gmx.de> Message-ID: <43BD9412.2060709@gmx.de> Pearu Peterson wrote: >On Thu, 5 Jan 2006, Norbert Nemec wrote: > > > >>Hi there, >> >>on my system, I have the Intel Fortran compiler installed along with >>Intel MKL, as well as the g77 and gfortran 4.0 and Atlas. >> >>The default configuration in this setup seems to be somewhat strange: >>the gnu compiler is preferred over the intel compiler, but still, the >>intel MKL is preferred over atlas. This means, that the intel MKL is >>linked in with GNU compiled libraries. If I'm not mistaken, such a setup >>is doomed to fail, because the intel MKL only works with Intel compiled >>programs and libraries. (Correct me if I'm wrong.) >> >> > >In my debian box with g77-3.4.5 and gcc-4.0.3 I can use MKL-8.0.1 without >intel fortran compiler. > OK, then obviously, that was not the problem. > Anyway, I also find choosing MKL by default not >the best solution. To disable MKL, define > > export MKL=None > >before building scipy. > > Thanks, that would actually be the easiest solution in my case. >>What I see is: >> >> Found 0 tests for __main__ >>..................................................................................MKL >>FATAL ERROR: /usr/local/lib/libmkl_def.so: undefined symbol: >>__kmpc_global_thread_num >>------------------------------------------- >> >> > >What MKL version are you using? > > Something rather ancient: 6.1 - I guess I'll just throw it out. I haven't been using it in a long time anyway. From faltet at carabos.com Thu Jan 5 16:49:57 2006 From: faltet at carabos.com (Francesc Altet) Date: Thu, 5 Jan 2006 22:49:57 +0100 Subject: [SciPy-dev] A proposal for dtype/dtypedescr in numpy objects In-Reply-To: <200601052240.58099.faltet@carabos.com> References: <200601052240.58099.faltet@carabos.com> Message-ID: <200601052249.57811.faltet@carabos.com> Oops, re-reading my last message, I discovered a small errata: A Dijous 05 Gener 2006 22:40, Francesc Altet va escriure: > - I don't specially like the string representation for the new .dtype > class. For example, I'd find dtype('Int32') much better than ^^^^^ should read 'int32' (to follow numpy conventions) > dtype(' code, but they can be made later on (much less disruptive than the > proposed change). -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From oliphant.travis at ieee.org Thu Jan 5 15:58:45 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 13:58:45 -0700 Subject: [SciPy-dev] Just saw this Message-ID: <43BD8885.5080300@ieee.org> http://heim.ifi.uio.no/~kent-and/software/Instant/doc/Instant.html I'd like to fix weave so it can be used *more easily* in this way (creating an extension module on-the-fly). -Travis From faltet at carabos.com Thu Jan 5 17:03:03 2006 From: faltet at carabos.com (Francesc Altet) Date: Thu, 5 Jan 2006 23:03:03 +0100 Subject: [SciPy-dev] [Numpy-discussion] Re: [SciPy-user] Long live to numpy (and its list as well) In-Reply-To: <43BD7A99.8070002@noaa.gov> References: <43BD6109.9030402@sympatico.ca> <43BD7A99.8070002@noaa.gov> Message-ID: <200601052303.04502.faltet@carabos.com> A Dijous 05 Gener 2006 20:59, Christopher Barker va escriure: > Colin J. Williams wrote: > > It seems to me that NumPy is the better way to go, the archives and > > downloads are more readily available there, but it's up to Todd Miller > > and the folk who have been maintaining NumPy. > > > > NumPy has been around for a while, probably longer than SciPy. 
> > I, for one, am only subscribed to the NumPy list, and have been for > years. I don't know how much non-NumPy stuff there is on the SciPy list, > but while I do use NumPy for Scientific stuff, I don't use the rest of > SciPy (because, frankly, it's always been a pain in the @#! to install) > and I don't need another high-traffic list. Yeah, I was thinking in people like you. In fact, I'm myself in the same case than you: I'm very interested in a basic array module (Numeric/numarray/numpy) and not so much in more powerful (but complex) packages like scipy. And I think there can be a lot of people out there that can be in the same position. Accordingly, my vote is also: +1 numpy-discussion -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From strawman at astraw.com Thu Jan 5 17:22:09 2006 From: strawman at astraw.com (Andrew Straw) Date: Thu, 05 Jan 2006 14:22:09 -0800 Subject: [SciPy-dev] ReSt for new.scipy.org? In-Reply-To: References: Message-ID: <43BD9C11.8090605@astraw.com> Arnd Baecker wrote: >Hi, > >would it be possible to add restructured text support for the new >http://new.scipy.org/Wiki/ ? > > Done. Use this in the wiki: {{{ #!rst This is in ReStructuredText! }}} From robert.kern at gmail.com Thu Jan 5 17:24:54 2006 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 05 Jan 2006 16:24:54 -0600 Subject: [SciPy-dev] eggs and f2py extensions In-Reply-To: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> References: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> Message-ID: <43BD9CB6.50103@gmail.com> Christopher Fonnesbeck wrote: > Are f2py extensions compatible with eggs? At the moment, if I try to > import setup from setuptools rather than from numpy.distutils, it does > not recognize fortran extensions. setuptools does interfere with the Fortran extensions to build_ext a little bit if one isn't careful. Everything works fine if you *build* without setuptools, and then do the bdist_egg command separately with setuptools. E.g. $ python setup.py build $ python -c "import setuptools; execfile('setup.py')" bdist_egg -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant.travis at ieee.org Thu Jan 5 18:25:07 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 16:25:07 -0700 Subject: [SciPy-dev] ANN: Release of NumPy 0.9.2 Message-ID: <43BDAAD3.6070705@ieee.org> Numpy 0.9.2 is the successor to both Numeric and Numarray and builds and uses code from both. This release marks the first release using the new (but historical) Numpy name. The release notes are included below: Best regards, -Travis Oliphant Release Notes ================== NumPy 0.9.2 marks the first release of the new array package under its new name. This new name should reflect that the new package is a hybrid of the Numeric and Numarray packages. This release adds many more features and speed-enhancements from Numarray. Changes from (SciPy Core) 0.8.4: - Namespace and Python package name is now "numpy" and "numpy" instead of "scipy" and "scipy_core" respectively. This should help packagers and egg-builders. - The NumPy arrayobject now both exports and consumes the full array_descr protocol (including field information). - Removed NOTSWAPPED flag. The byteswapping information is handled by the data-type descriptor. - The faster sorting functions were brought over from numarray leading to a factor of 2-3 speed increase in sorting. 
Also changed .sort() method to be in-place like numarray and lists. - Polynomial division has been fixed. - basic.fft, basic.linalg, basic.random have been moved to dft, linalg, and random respectively (no extra basic sub-package layer). - Introduced numpy.dual to allow calling of functions that are both in SciPy and NumPy when it is desired to use the SciPy function if the user has it but otherwise use the NumPy function. - The "rtype" keyword used in a couple of places has been changed to "dtype" for consistency. - Fixes so that the standard array constructor can be used to construct record-arrays with fields. - Changed method .toscalar() to .item() (added to convertcode.py) - Added numpy.lib.mlab to be fully compatible with old MLab including the addition of a kaiser window even when full SciPy is not installed. - Arrays of nested records should behave better. - Fixed masked arrays buglets. - Added code so that strings can be converted to numbers using .astype() - Added a lexsort (lexigraphic) function so that sorting on multiple keys can be done -- very useful for record-arrays - Speed ups and bug-fixes for 1-d "fancy" indexing by going through the flattened array iterator when possible. - Added the ability to add docstrings to builtin objects "on-the-fly". Allows adding docstrings without re-compiling C-code. - Moved the weave subpackage to SciPy. - Changed the fields attribute of the dtypedescr object to return a "read-only" dictionary when accessed from Python. - Added a typeNA dictionary for the numarray types and added a compare function for dtypedescr objects so that equivalent types can be detected. Please not that all modules are imported using lower-case letters (so don't let the NumPy marketing name confuse you, the package to import is "numpy"). From chris at trichech.us Thu Jan 5 19:55:11 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Thu, 5 Jan 2006 19:55:11 -0500 Subject: [SciPy-dev] Updated OS X build instructions Message-ID: <05961580-3BBA-4F58-86A6-28F2888CA30D@trichech.us> Following today's release of Numpy, I have updated the SciPy installation instructions for OS X, and have added them to the installation page of the Wiki: http://new.scipy.org/Wiki/Installing_SciPy This made the page rather long, so I split it into a couple of pages. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From prabhu_r at users.sf.net Thu Jan 5 21:42:15 2006 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Fri, 6 Jan 2006 08:12:15 +0530 Subject: [SciPy-dev] Just saw this In-Reply-To: <43BD8885.5080300@ieee.org> References: <43BD8885.5080300@ieee.org> Message-ID: <17341.55559.291887.235802@monster.iitb.ac.in> >>>>> "Travis" == Travis Oliphant writes: Travis> http://heim.ifi.uio.no/~kent-and/software/Instant/doc/Instant.html Travis> I'd like to fix weave so it can be used *more easily* in Travis> this way (creating an extension module on-the-fly). It is quite easy to do this with weave but with 2-3 lines more code. Look at weave/examples/ramp2.py, fibonacci.py or increment_example.py. 
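For anyone who has not looked at those examples, the pattern is roughly the following; a minimal sketch, assuming full SciPy with the weave subpackage installed (weave was moved from numpy into SciPy, per the 0.9.2 release notes above):

from scipy import weave

a, b = 1, 2
code = "return_val = a + b;"            # plain C; argument conversion is handled automatically
print weave.inline(code, ['a', 'b'])    # builds and caches an extension module on the fly, prints 3

The first call pays the compile cost; later calls reuse the cached module, which is what makes the on-the-fly extension workflow practical.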
cheers, prabhu From strawman at astraw.com Thu Jan 5 22:38:04 2006 From: strawman at astraw.com (Andrew Straw) Date: Thu, 05 Jan 2006 19:38:04 -0800 Subject: [SciPy-dev] our wiki - it's getting closer... Message-ID: <43BDE61C.3040305@astraw.com> Hi All, I've made some progress getting the wiki prettier, more organized, and configured to our tastes. Well, actually, to my tastes, which is the point of this email. Unless you're happy to suffer at my whims, please adjust the website to suit your taste. If we don't like it, we'll change it back! :) That's the point of a wiki. For changing items in the menubar, I'll have to edit by hand the wikiconfig.py file on the server, so you'll have to let me know about requests for that. Other than that, everyone should be able to edit and make new pages. Members of the "EditorsGroup" (see http://new.scipy.org/Wiki/EditorsGroup ) are able to rename, delete, and revert pages. If anyone would like to be added to this group, just let anyone who is currently a member know, and they'll add you. (Chris Fonnesbeck -- I put you in this group because I think you may have left a couple pages called Building* that could be removed given the Install* pages. Also, I moved your Install* pages subpages of Installing_SciPy.) We're waiting on Enthought to install the "sinorca4moin" theme, which also includes CSS for the "SectionParser" Moin addon. This means that our website is about to get a lot prettier and have sidebars and "figures". (If anyone with permission at Enthought is listening, on new.scipy.org, please just copy the /home/scipy/homes/strawman/fakeroot/usr/share/moin/htdocs/sinorca4moin directory into /usr/share/moin/htdocs/). Once things settle down, we might tighten up privileges on the front page(s), but I'd really like to minimze that if necessary. We should start going through the old (current) Plone website and making sure we've got all the useful information on the wiki. One last question -- is Enthought backing this up? Otherwise, I think we should put some backup scheme in place. MoinMoin has some anti-spam features I've enabled, but I'd still like to know there's an off-site backup. Cheers! Andrew From robert.kern at gmail.com Thu Jan 5 22:48:30 2006 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 05 Jan 2006 21:48:30 -0600 Subject: [SciPy-dev] our wiki - it's getting closer... In-Reply-To: <43BDE61C.3040305@astraw.com> References: <43BDE61C.3040305@astraw.com> Message-ID: <43BDE88E.7060108@gmail.com> Andrew Straw wrote: > One last question -- is Enthought backing this up? Otherwise, I think we > should put some backup scheme in place. MoinMoin has some anti-spam > features I've enabled, but I'd still like to know there's an off-site > backup. We back up just about everything including scipy.org, I believe. Our off-site storage location for the backup disks is still in Austin, though. If Austin is ever a target of a city-wide zombie attack, we might be in trouble. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From robert.kern at gmail.com Thu Jan 5 22:55:29 2006 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 05 Jan 2006 21:55:29 -0600 Subject: [SciPy-dev] our wiki - it's getting closer... 
In-Reply-To: <43BDE61C.3040305@astraw.com> References: <43BDE61C.3040305@astraw.com> Message-ID: <43BDEA31.1020804@gmail.com> [private] Andrew Straw wrote: > We're waiting on Enthought to install the "sinorca4moin" theme, which > also includes CSS for the "SectionParser" Moin addon. This means that > our website is about to get a lot prettier and have sidebars and > "figures". (If anyone with permission at Enthought is listening, on > new.scipy.org, please just copy the > /home/scipy/homes/strawman/fakeroot/usr/share/moin/htdocs/sinorca4moin > directory into /usr/share/moin/htdocs/). I'll ping Joe on this. BTW, addressing Enthought on the Scipy list is probably not very efficient. Joe is usually the person who you need to talk to, and he doesn't read the list. Emailing Joe and CCing me (so I can make sure it gets done) is probably the most efficient way. I'll try to get some permissions to make changes myself so we don't have to wait for whenever Joe comes in to get things done on the website. I'm just about settled in to Austin, now, so we can probably move at a faster pace than we have previously. Thank you for all the work you've put into this. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant.travis at ieee.org Fri Jan 6 00:27:34 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 05 Jan 2006 22:27:34 -0700 Subject: [SciPy-dev] [Numpy-discussion] return type of ndarray + ma.array In-Reply-To: References: Message-ID: <43BDFFC6.4050200@ieee.org> Alexander Belopolsky wrote: > the following session demonstrates the difference between numpy.ma > and old MA behavior. > > Sum ndarray and ma.array is ndarray > > >>> type(numeric.array([1])+ma.array([1])) > > > However, taken in the different order, > >>> type(ma.array([1])+numeric.array([1])) > > > In the old MA module the sum is MaskedArray in any order: > >>> type(MA.array([1])+Numeric.array([1])) > > > >>> type(Numeric.array([1])+MA.array([1])) > > > I suspected that new ma module may not have __radd__ defined, but it > does. > > Can anyone explain this? I'm not sure. Because Numeric.add(Numeric.array([1]), MA.array([1])) works. I don't know why it's never called... But, the __array_wrap__ method should help here. I defined the __array_wrap__ method in ma.py so this will work now as expected. The ufunc does hand off calculation to an object that has an __radd__ method (but only if that object can only be converted to an *object* numpy array). We could change this, but I don't want to check for an __radd__ method all the time. This would slow ufuncs down. I don't think this is a problem, since the __array_wrap__ method uses the mask of self. This does not work quite right on ufuncs with more than 2 inputs, but I don't think masked array supports those anyway... -Travis From arnd.baecker at web.de Fri Jan 6 02:19:37 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 6 Jan 2006 08:19:37 +0100 (CET) Subject: [SciPy-dev] ReSt for new.scipy.org? In-Reply-To: <43BD9C11.8090605@astraw.com> References: <43BD9C11.8090605@astraw.com> Message-ID: Hi Andrew On Thu, 5 Jan 2006, Andrew Straw wrote: > Arnd Baecker wrote: > > >Hi, > > > >would it be possible to add restructured text support for the new > >http://new.scipy.org/Wiki/ ? > > > > > Done. Great - many thanks (for this and all your other work on the wiki)! > Use this in the wiki: > > {{{ > #!rst > > This is in ReStructuredText! 
> }}} Would a `#FORMAT rst` in the first line of a page (e.g. as I used here: http://new.scipy.org/Wiki/ArndBaecker ) be ok as well? Best, Arnd P.S.: Somehow the wiki seems a bit slow, not sure if it is the overall connection speed: ping new.scipy.org PING new.scipy.org (216.62.213.231) 56(84) bytes of data. 64 bytes from 216.62.213.231: icmp_seq=1 ttl=40 time=170 ms does not look that bad, or ? From eric at enthought.com Fri Jan 6 02:27:18 2006 From: eric at enthought.com (eric jones) Date: Fri, 06 Jan 2006 01:27:18 -0600 Subject: [SciPy-dev] Just saw this In-Reply-To: <43BD8885.5080300@ieee.org> References: <43BD8885.5080300@ieee.org> Message-ID: <43BE1BD6.6020404@enthought.com> Hey Travis, Hadn't seen this -- Thanks. The neat thing about Instant is that it uses SWIGs parser and type pattern matching to automatically generate the functions and arguments from the C code. Weave doesn't do that. The 2nd nice thing is that it is pure C (I think). One of the drawbacks is that it requires SWIG. It didn't find the installation on my windows machine (but I'm sure that can be remedied). But, are the examples below using weave so bad? For this case line count is about the same, and I might even argue that this is simpler, requiring the user to know less about include files, headers, init_code, etc. In weave, you do have to declare the functions explicitly in python. I've noticed that this has been a pain for large libraries (but not for things like these examples) and have thought about writing an auto-generator for them in weave. This wouldn't be so hard -- you just need something to parse C... Oh wait, that's hard. (its left as an exercise for the reader.) Dave Beazley and I have talked a couple of times about exposing the SWIG C++ parser as a python library so that other tools could parse C++ code. Weave could use this to autogenerate the function names. Dave raised an eye-brow though muttering something like " parser leaks like a sieve" -- not good for a scripting environment where it can be called over and over. Still, its a good idea. gcc-xml is another candidate here. Another comment. weave has a fairly nifty extension mechanism. The notion of "arrays" is not hard coded into the argument list of the extension creation method as in Instant. Instead it is a type spec that is registered with weave (array type specs are automatically registered). These type specs are fairly useful carrying around extra code, include information, etc. needed when a particular variable type is needed (arrays!). I prefer this because adding new type conversions doesn't require changing the libraries api. I think that supporting VTK objects might require editing the underlying library in Instant. Prabhu added vtk_spec.py and didn't have to touch any of ext_tools. Instant could probably benefit from a similar approach, but I haven't thought hard enough about whether it would work with SWIG that well. eric # example.py from Numeric import array, arange, sin, cos from weave.ext_tools import ext_function, ext_module a=b=c=array((1.0)) # type declaration of a,b, and c to double arrays mod = ext_module("test_ext") add = ext_function("add", """ for (int i=0; ihttp://heim.ifi.uio.no/~kent-and/software/Instant/doc/Instant.html > >I'd like to fix weave so it can be used *more easily* in this way >(creating an extension module on-the-fly). 
> >-Travis > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > From Fernando.Perez at colorado.edu Fri Jan 6 03:41:00 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 06 Jan 2006 01:41:00 -0700 Subject: [SciPy-dev] our wiki - it's getting closer... In-Reply-To: <43BDE61C.3040305@astraw.com> References: <43BDE61C.3040305@astraw.com> Message-ID: <43BE2D1C.50106@colorado.edu> Andrew Straw wrote: > Hi All, > > I've made some progress getting the wiki prettier, more organized, and > configured to our tastes. Well, actually, to my tastes, which is the > point of this email. Unless you're happy to suffer at my whims, please > adjust the website to suit your taste. If we don't like it, we'll change > it back! :) That's the point of a wiki. Excellent, many thanks for putting the time and energy behind this. Things are really shaping up very, very well... Cheers, f From pearu at scipy.org Fri Jan 6 03:29:30 2006 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 6 Jan 2006 02:29:30 -0600 (CST) Subject: [SciPy-dev] scipy trac, changing components In-Reply-To: <43BDEA31.1020804@gmail.com> References: <43BDE61C.3040305@astraw.com> <43BDEA31.1020804@gmail.com> Message-ID: Hi, I have a question: who is maintaining scipy/numpy trac pages? Joe? Robert? I'd like to change ticket components but I couldn't find how to do it. Thanks, Pearu From a.h.jaffe at gmail.com Fri Jan 6 04:40:19 2006 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Fri, 06 Jan 2006 09:40:19 +0000 Subject: [SciPy-dev] Updated OS X build instructions -- test failures. In-Reply-To: <05961580-3BBA-4F58-86A6-28F2888CA30D__20136.290023993$1136509062$gmane$org@trichech.us> References: <05961580-3BBA-4F58-86A6-28F2888CA30D__20136.290023993$1136509062$gmane$org@trichech.us> Message-ID: Hi All- I've found the following test(10) errors under OS X (Bob Ippolito's Python 2.4.1). Test(1) succeeds. Should I be worried? Actually, it's gotten worse since version 1834: the mask failure is new. In [1]: import numpy In [2]: numpy.__version__ Out[2]: '0.9.3.1837' In [3]: numpy.test(10) Found 2 tests for numpy.core.umath Found 21 tests for numpy.core.ma Found 6 tests for numpy.core.records Found 3 tests for numpy.distutils.misc_util Found 4 tests for numpy.lib.getlimits Found 9 tests for numpy.lib.twodim_base Found 44 tests for numpy.lib.shape_base Found 4 tests for numpy.lib.index_tricks Found 42 tests for numpy.lib.type_check Found 3 tests for numpy.dft.helper Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 0 tests for __main__ ............F.....................E................................................................................................................................................ 
====================================================================== ERROR: check_singleton (numpy.lib.getlimits.test_getlimits.test_longdouble) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/tests/test_getlimits.py", line 33, in check_singleton ftype = finfo(longdouble) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/getlimits.py", line 47, in __new__ obj = object.__new__(cls)._init(dtype) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/getlimits.py", line 73, in _init "numpy longfloat precision floating "\ File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/machar.py", line 128, in __init__ raise RuntimeError, "could not determine machine tolerance " \ RuntimeError: could not determine machine tolerance for 'negep' ====================================================================== FAIL: Test of masked element ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/core/tests/test_ma.py", line 508, in check_testMasked self.failUnlessRaises(Exception, lambda x,y: x+y, masked, masked) AssertionError: Exception not raised ---------------------------------------------------------------------- Ran 179 tests in 2.880s FAILED (failures=1, errors=1) Out[3]: From pearu at scipy.org Fri Jan 6 04:11:59 2006 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 6 Jan 2006 03:11:59 -0600 (CST) Subject: [SciPy-dev] Updated OS X build instructions -- test failures. In-Reply-To: References: <05961580-3BBA-4F58-86A6-28F2888CA30D__20136.290023993$1136509062$gmane$org@trichech.us> Message-ID: On Fri, 6 Jan 2006, Andrew Jaffe wrote: > Hi All- > > I've found the following test(10) errors under OS X (Bob Ippolito's > Python 2.4.1). Test(1) succeeds. Should I be worried? Actually, it's > gotten worse since version 1834: the mask failure is new. I'd say things a better as there are more unittests;), the mask tests are new. > In [1]: import numpy > > In [2]: numpy.__version__ > Out[2]: '0.9.3.1837' > ====================================================================== > ERROR: check_singleton (numpy.lib.getlimits.test_getlimits.test_longdouble) > ---------------------------------------------------------------------- > ftype = finfo(longdouble) > File > "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/getlimits.py", > line 47, in __new__ > obj = object.__new__(cls)._init(dtype) > File > "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/getlimits.py", > line 73, in _init > "numpy longfloat precision floating "\ > File > "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/lib/machar.py", > line 128, in __init__ > raise RuntimeError, "could not determine machine tolerance " \ > RuntimeError: could not determine machine tolerance for 'negep' >From the comment in the machar.py source it seems to be a gcc 4.0 issue (PPC). What compiler are you using? Could you resend the error report after updating numpy from svn? 
Or better file a ticket to http://projects.scipy.org/scipy/numpy Pearu From a.h.jaffe at gmail.com Fri Jan 6 05:34:06 2006 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Fri, 06 Jan 2006 10:34:06 +0000 Subject: [SciPy-dev] Updated OS X build instructions -- test failures. In-Reply-To: References: <05961580-3BBA-4F58-86A6-28F2888CA30D__20136.290023993$1136509062$gmane$org@trichech.us> Message-ID: Pearu Peterson wrote: > > >>From the comment in the machar.py source it seems to be a gcc 4.0 issue > (PPC). What compiler are you using? Could you resend the error report > after updating numpy from svn? Or better file a ticket to > Pearu, thanks for the reminder: the old gcc 3.3 vs 4.0 problem. The good news is that switching to 3.3 fixes the machar problem. However, there are new failures, at least with the version 1840 from svn. See further down for another, unrelated problem in linalg. I will keep this thread going, but also post tickets. In [5]: numpy.test(1,1) Found 2 tests for numpy.core.umath Found 21 tests for numpy.core.ma Found 6 tests for numpy.core.records Found 3 tests for numpy.distutils.misc_util Found 3 tests for numpy.lib.getlimits Found 9 tests for numpy.lib.twodim_base Found 44 tests for numpy.lib.shape_base Found 4 tests for numpy.lib.index_tricks Found 42 tests for numpy.lib.type_check Found 3 tests for numpy.dft.helper Found 6 tests for numpy.core.defmatrix Found 33 tests for numpy.lib.function_base Found 0 tests for __main__ ............F..............................................................................................................................F...FF................................. ====================================================================== FAIL: Test of masked element ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/core/tests/test_ma.py", line 508, in check_testMasked self.failUnlessRaises(Exception, lambda x,y: x+y, masked, masked) AssertionError: Exception not raised ====================================================================== FAIL: check_basic (numpy.core.defmatrix.test_defmatrix.test_algebra) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", line 111, in check_basic assert allclose((mA ** -i).A, B) AssertionError ====================================================================== FAIL: check_basic (numpy.core.defmatrix.test_defmatrix.test_properties) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", line 34, in check_basic assert allclose(linalg.inv(A), mA.I) AssertionError ====================================================================== FAIL: check_comparisons (numpy.core.defmatrix.test_defmatrix.test_properties) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", line 50, in check_comparisons assert all(mB == matrix(A+0.1)) AssertionError ---------------------------------------------------------------------- Ran 178 tests in 1.136s FAILED (failures=4) 
Out[5]: > http://projects.scipy.org/scipy/numpy > > Pearu Also, I've found another problem, not in the unittests, with linalg.cholesky_decomposition: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/linalg/linalg.py in cholesky_decomposition(a) 127 if results['info'] > 0: 128 raise LinAlgError, 'Matrix is not positive definite - Cholesky decomposition cannot be computed' --> 129 return transpose(triu(a,k=0)).copy() 130 131 NameError: global name 'triu' is not defined triu() is found in numpy/lib/twodim_base; I suppose this means it's not being imported correctly here. Thanks for the great and quick work to all, especially Travis! A From a.h.jaffe at gmail.com Fri Jan 6 06:11:19 2006 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Fri, 06 Jan 2006 11:11:19 +0000 Subject: [SciPy-dev] Updated OS X build instructions -- test failures. In-Reply-To: References: <05961580-3BBA-4F58-86A6-28F2888CA30D__20136.290023993$1136509062$gmane$org@trichech.us> Message-ID: Andrew Jaffe wrote: > Pearu Peterson wrote: >> >>>From the comment in the machar.py source it seems to be a gcc 4.0 issue >>(PPC). What compiler are you using? Could you resend the error report >>after updating numpy from svn? Or better file a ticket to >> > Pearu, thanks for the reminder: the old gcc 3.3 vs 4.0 problem. The good > news is that switching to 3.3 fixes the machar problem. However, there > are new failures, at least with the version 1840 from svn. > ... [[ error messages removed ]] > > Also, I've found another problem, not in the unittests, with > linalg.cholesky_decomposition: > > /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/linalg/linalg.py > in cholesky_decomposition(a) > --> 129 return transpose(triu(a,k=0)).copy() > NameError: global name 'triu' is not defined > > Thanks for the great and quick work to all, especially Travis! And, since he seems to have fixed all of these bugs within about 1/2 an hour of my posting them -- thanks also to Pearu!! Andrew From ndbecker2 at gmail.com Fri Jan 6 07:34:11 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Fri, 06 Jan 2006 07:34:11 -0500 Subject: [SciPy-dev] Patch for numpy-0.9.2 SRPM for x86_64 Message-ID: The following patch is needed to numpy-0.9.2 SPEC to build on Fedora x86_64. It is probably also needed on other linux x86_64. Note: on x86_64 the standard location for libraries is /usr/lib64!!! This is not unusual. diff -u numpy.spec numpy.spec.new --- numpy.spec 2006-01-05 15:06:26.000000000 -0500 +++ numpy.spec.new 2006-01-06 07:30:00.000000000 -0500 @@ -1,6 +1,6 @@ %define name numpy %define version 0.9.2 -%define release 1 +%define release 2 Summary: Numpy: array processing for numbers, strings, records, and objects. Name: %{name} @@ -30,7 +30,7 @@ %setup %build -env CFLAGS="$RPM_OPT_FLAGS" python setup.py build +env BLAS=%{_libdir} LAPACK=%{_libdir} ATLAS=%{_libdir} CFLAGS="$RPM_OPT_FLAGS" python setup.py build %install python setup.py install --root=$RPM_BUILD_ROOT --record=INSTALLED_FILES @@ -39,4 +39,12 @@ rm -rf $RPM_BUILD_ROOT %files -f INSTALLED_FILES +#for some reason this was missed! 
+%{python_sitearch}/numpy/core/_dotblas.so %defattr(-,root,root) + +%changelog + +* Fri Jan 6 2006 nbecker - 0.9.2-2 +- Fixes to build on Fedora x86_64 + From dd55 at cornell.edu Fri Jan 6 08:52:01 2006 From: dd55 at cornell.edu (Darren Dale) Date: Fri, 6 Jan 2006 08:52:01 -0500 Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC In-Reply-To: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> Message-ID: <200601060852.01160.dd55@cornell.edu> I have some failures to report this morning, after updating to version 0.9.3.1842 on an AMD-64. I dont know how to force scipy to use fftw-2 with the new code in distutils.system_info, but based on previous experience I believe these failures are unique to fftw-3. Darren ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 85, in check_definition assert_array_almost_equal(diff(sin(x)),direct_diff(sin(x))) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ 1.0000000e+00 9.2387953e-01 7.0710678e-01 3.8268343e-01 3.0795887e-16 -3.8268343e-01 -7.0710678e-01 -9.23... Array 2: [ -0.0000000e+00 5.0000000e-01 7.7662794e-17 2.0816682e-16 1.2490009e-16 6.9388939e-17 1.4972165e-16 1.21... ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 301, in check_definition assert_array_almost_equal (y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ -1.0000000e+00 -9.2387953e-01 -7.0710678e-01 -3.8268343e-01 -1.5692464e-16 3.8268343e-01 7.0710678e-01 9.23... Array 2: [ -0.0000000e+00 +0.0000000e+00j -5.0000000e-01 +8.2791349e-17j -3.8831397e-17 +2.4238461e-17j -6.9388939e-17 -1.197... ====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 332, in check_random_even assert_array_almost_equal(direct_hilbert(direct_ihilbert(f)),f) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.0000000e+00-0.j -1.1497558e-03-0.000552j 1.2207179e-03-0.0015674j -2.5498647e-04+0.001493j -1.5716442... Array 2: [ 0.192799 -0.4534091 0.0363647 0.0992749 0.2119616 0.2137716 0.0982894 -0.5380505 0.1383454 0.1184524 0.28301... 
====================================================================== FAIL: check_tilbert_relation (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 311, in check_tilbert_relation assert_array_almost_equal (y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 62.5%): Array 1: [ -1.0000000e+00 -6.5328148e-01 -2.2262235e-16 2.7059805e-01 1.0626020e-16 -2.7059805e-01 -1.4411772e-16 6.53... Array 2: [ -0.0000000e+00 +0.0000000e+00j -2.5000000e-01 +4.0189477e-17j 1.1905260e-18 -4.4747319e-17j -2.5000000e-01 +9.451... ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_ihilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 370, in check_definition assert_array_almost_equal (y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ 1.0000000e+00 9.2387953e-01 7.0710678e-01 3.8268343e-01 1.5692464e-16 -3.8268343e-01 -7.0710678e-01 -9.23... Array 2: [ 0.0000000e+00 -0.0000000e+00j 5.0000000e-01 -8.2791349e-17j 3.8831397e-17 -2.4238461e-17j 6.9388939e-17 +1.197... ====================================================================== FAIL: check_itilbert_relation (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_ihilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 380, in check_itilbert_relation assert_array_almost_equal (y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 62.5%): Array 1: [ 1.0000000e+00 6.5328148e-01 2.2262235e-16 -2.7059805e-01 -1.0626020e-16 2.7059805e-01 1.4411772e-16 -6.53... Array 2: [ 0.0000000e+00 -0.0000000e+00j 2.5000000e-01 -4.0189477e-17j -1.1905260e-18 +4.4747319e-17j 2.5000000e-01 -9.451... ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_itilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 288, in check_definition assert_array_almost_equal (y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ 9.9667995e-02 9.2081220e-02 7.0475915e-02 3.8141290e-02 2.8133730e-17 -3.8141290e-02 -7.0475915e-02 -9.20... Array 2: [ -0.0000000e+00 +0.0000000e+00j 4.9833997e-02 -8.2516478e-18j 7.6643594e-18 -4.7840739e-18j 2.0213873e-17 +3.487... 
====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 390, in check_definition assert_array_almost_equal(shift(sin(x),a),direct_shift(sin(x),a)) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.0998334 0.1968802 0.2920308 0.3843691 0.4730057 0.5570869 0.6358031 0.7083962 0.7741671 0.8324823 0.88278... Array 2: [ 1.2321851e-17 4.9916708e-02 2.4654903e-19 1.5449667e-17 5.9413099e-18 1.3257948e-17 1.4776802e-17 1.22... ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 227, in check_definition assert_array_almost_equal (y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ -1.0033311e+01 -9.2695708e+00 -7.0946223e+00 -3.8395819e+00 -1.4631040e-15 3.8395819e+00 7.0946223e+00 9.26... Array 2: [ -0.0000000e+00 +0.0000000e+00j -5.0166556e+00 +8.3067137e-16j -1.9673887e-16 +1.2280391e-16j -2.3819408e-16 -4.109... ====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 240, in check_random_even assert_array_almost_equal(direct_tilbert(direct_itilbert(f,h),h),f) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.0000000e+00 +0.0000000e+00j 2.7655784e-04 -6.3552645e-04j -2.2755949e-03 -1.8728675e-03j 1.1721599e-03 +1.363... Array 2: [-0.15641 -0.3708691 0.3433105 -0.1805961 0.3215564 -0.0533588 -0.195814 0.5770467 0.5615133 -0.0041373 0.11617... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 98, in check_definition assert_array_almost_equal(y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 1.+0.j 2.+0.j 3.+0.j 4.+1.j 1.+0.j 2.+0.j 3.+0.j 4. +2.j] Array 2: [ 20. +3.j -0.7071068+0.7071068j -7. +4.j -0.7071068-0.7071068j -4. -3.j 0.707106... 
====================================================================== FAIL: check_djbfft (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 129, in check_djbfft assert_array_almost_equal(y,y2) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.+0.j 1.+0.j 2.+0.j 3.+0.j] Array 2: [ 6.+0.j -2.+2.j -2.+0.j -2.-2.j] ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 429, in check_definition assert_array_almost_equal(fftn(x),direct_dftn(x)) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[[[ 5.9910836e+02 +0.0000000e+00j 7.3484733e+00 +1.6201391e+00j 3.8829030e+00 +1.1949615e+01j ..., -3.2999195... Array 2: [[[[ 1.4517008e+02 +0.j 3.9716331e+00 -2.235317j -1.2140614e+00 +4.3628705j ..., -9.6490138e+00 -4.37794... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 183, in check_definition assert_array_almost_equal(y,y1) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.125+0.j 0.25 +0.j 0.375+0.j 0.5 +0.125j 0.125+0.j 0.25 +0.j 0.375+0.j 0.5 +0.25j ] Array 2: [ 2.5 +0.375j 0.0883883+0.0883883j -0.125 -0.5j 0.0883883-0.0883883j -0.5 -0.375j -0.0883883-0.0... ====================================================================== FAIL: check_djbfft (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 203, in check_djbfft assert_array_almost_equal(y,y2) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0. 
+0.j 0.25+0.j 0.5 +0.j 0.75+0.j] Array 2: [ 1.5+0.j -0.5-0.5j -0.5+0.j -0.5+0.5j] ====================================================================== FAIL: check_random_complex (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 211, in check_random_complex assert_array_almost_equal (ifft(fft(x)),x) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.0085263+0.0142815j 0.0038842+0.0153712j 0.0048905+0.0104471j 0.009717 +0.0055459j 0.0053586+0.0041928j 0.01548... Array 2: [ 0.5456852+0.9140184j 0.2485875+0.9837593j 0.3129931+0.6686158j 0.6218878+0.3549405j 0.3429482+0.2683402j 0.99072... ====================================================================== FAIL: check_random_real (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 217, in check_random_real assert_array_almost_equal (ifft(fft(x)),x) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.4524224+0.j -0.0193056-0.0426763j -0.0353319-0.0063j -0.0416728+0.0321898j 0.0292662-0.0090132j 0.0097217-... Array 2: [ 0.6619749 0.5213937 0.0992234 0.8767844 0.0551595 0.1341829 0.2002927 0.0263737 0.6770359 0.3640361 0.94903... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 598, in check_definition assert_array_almost_equal(ifftn(x),direct_idftn(x)) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[[[ 5.0073681e-01 +0.0000000e+00j 4.6799892e-03 +6.3481336e-03j -1.7710963e-03 +1.3469510e-03j ..., -1.5314469... Array 2: [[[[ 1.2594826e-01 +0.0000000e+00j 4.0639870e-03 +1.3265000e-03j 2.1862697e-03 +7.3150114e-04j ..., -8.1837834... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_irfft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 341, in check_definition assert_array_almost_equal(y,ifft(x1)) File "/usr/lib64/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.625 -1.6856602 -0.375 -1.1856602 0.625 0.4356602 -0.375 0.9356602] Array 2: [ 0.125+0.j 0.25 +0.375j 0.5 +0.125j 0.25 +0.375j 0.5 +0.j 0.25 -0.375j 0.5 -0.125j 0.25 -0.375j] ---------------------------------------------------------------------- Ran 1066 tests in 2.196s From gruben at bigpond.net.au Fri Jan 6 09:23:28 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Sat, 07 Jan 2006 01:23:28 +1100 Subject: [SciPy-dev] Is pkgload() missing? In-Reply-To: <200601060852.01160.dd55@cornell.edu> References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <200601060852.01160.dd55@cornell.edu> Message-ID: <43BE7D60.1040808@bigpond.net.au> Just wondering what happened to the pkgload() method in _import_tools.py Was this implemented by Pearu and then removed? There's a comment [NotImplemented] in there. Gary R. From arnd.baecker at web.de Fri Jan 6 10:00:43 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 6 Jan 2006 16:00:43 +0100 (CET) Subject: [SciPy-dev] numpy - dot problem Message-ID: Hi, I have a weird problem which shows up indirectly in the tests of signm of scipy. After digging for quite a while (finally leading to funm and slice ;-), this short example demonstrates the problem import numpy a1=numpy.array([], dtype=numpy.complex128) a2=numpy.array([], dtype=numpy.complex128) numpy.dot(a1,a2) On an opteron, I get a nice Out[10]: 0j (no idea if this is the right result, but that's not the problem) But on an Itanium2 machine I get: In [1]: import numpy In [2]: numpy.__version__ Out[2]: '0.9.3.1842' In [3]: numpy.test(10) Ran 179 tests in 1.591s In [4]: a1=numpy.array([], dtype=numpy.complex128) In [5]: a2=numpy.array([], dtype=numpy.complex128) In [6]: numpy.dot(a1,a2) Out[6]: (2.681561587629943e+154+2.6815615859888243e+154j) Repeated calls give different numbers In [8]: numpy.dot(a1,a2) Out[8]: (2.6815615885274171e+154+2.6815615859888243e+154j) Clearly, this is not correct. I could not find anything suspicious in the build log... What should I do next on this one ? Best, Arnd P.S.: - gcc-3.4.5 - no BLAS library (no MKL, ATLAS, SCLS, ACML ...) The build log shows: gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' compile options: '-Inumpy/core/include -Ibuild/src/numpy/core -Inumpy/core/src -Inumpy/lib/../core/include -I/home/baecker/python2/include/python2.4 -c' gcc: build/src/numpy/core/src/_sortmodule.c /tmp/ccS5CjIG.s: Assembler messages: /tmp/ccS5CjIG.s:39713: Warning: Additional NOP may be necessary to workaround Itanium processor A/B step errata /tmp/ccS5CjIG.s:39852: Warning: Additional NOP may be necessary to workaround Itanium processor A/B step errata gcc -pthread -shared build/temp.linux-ia64-2.4/build/src/numpy/core/src/_sortmodule.o -lm -o build/lib.linux-ia64-2.4/numpy/core/_sort.so building 'numpy.lib._compiled_base' extension but that is irrelevant here, I suppose. 
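A minimal way to pin down what the result should be (assuming the same numpy build Arnd describes) is to compare against the empty-sum convention, under which the dot product of two length-zero arrays is exactly zero; on the Itanium2 build above this check fails, which is the bug:

import numpy

a1 = numpy.array([], dtype=numpy.complex128)
a2 = numpy.array([], dtype=numpy.complex128)

reference = complex(0)
for x, y in zip(a1, a2):      # loops zero times for empty inputs
    reference += x * y

result = numpy.dot(a1, a2)
print reference, result
assert result == reference, "dot of empty arrays should be 0, got %r" % (result,)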
From arnd.baecker at web.de Fri Jan 6 10:51:30 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 6 Jan 2006 16:51:30 +0100 (CET) Subject: [SciPy-dev] numpy - dot problem In-Reply-To: References: Message-ID: On Fri, 6 Jan 2006, Arnd Baecker wrote: [...] > But on an Itanium2 machine I get: > > In [1]: import numpy > In [2]: numpy.__version__ > Out[2]: '0.9.3.1842' > In [3]: numpy.test(10) > Ran 179 tests in 1.591s > In [4]: a1=numpy.array([], dtype=numpy.complex128) > In [5]: a2=numpy.array([], dtype=numpy.complex128) > In [6]: numpy.dot(a1,a2) > Out[6]: (2.681561587629943e+154+2.6815615859888243e+154j) > > Repeated calls give different numbers > In [8]: numpy.dot(a1,a2) > Out[8]: (2.6815615885274171e+154+2.6815615859888243e+154j) > > Clearly, this is not correct. In [1]: import numpy In [2]: a1=numpy.array([]) In [3]: a2=numpy.array([]) In [4]: numpy.dot(a1,a2) Out[4]: 2305843009217322368 In [5]: import numpy In [6]: a1=numpy.array([], dtype=numpy.int8) In [7]: a2=numpy.array([], dtype=numpy.int8) In [8]: numpy.dot(a1,a2) Out[8]: -112 So in spirit it does not depend on the type... From arnd.baecker at web.de Fri Jan 6 11:57:40 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 6 Jan 2006 17:57:40 +0100 (CET) Subject: [SciPy-dev] ruby narray vs. numpy Message-ID: Can numpy beat them (all)? ;-) http://narray.rubyforge.org/bench.html.en The code for the benchmarks is in the subdirectory speed (eg. of http://rubyforge.org/frs/download.php/7658/narray-0.5.8.tar.gz) Best, Arnd From perry at stsci.edu Fri Jan 6 12:34:13 2006 From: perry at stsci.edu (Perry Greenfield) Date: Fri, 6 Jan 2006 12:34:13 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: [SciPy-user] Long live to numpy (and its list as well) In-Reply-To: <200601052303.04502.faltet@carabos.com> References: <43BD6109.9030402@sympatico.ca> <43BD7A99.8070002@noaa.gov> <200601052303.04502.faltet@carabos.com> Message-ID: Perhaps all discussions of the core stuff could go to numpy-discussion and the applications and libraries go to scipy-discussion. That segregation often doesn't work and you still get all the cross-posting anyway. But I would think that it would be important to have the core element highlighted in the name since that user base will be bigger than the scipy one by necessity. Perry On Jan 5, 2006, at 5:03 PM, Francesc Altet wrote: > A Dijous 05 Gener 2006 20:59, Christopher Barker va escriure: >> Colin J. Williams wrote: >>> It seems to me that NumPy is the better way to go, the archives and >>> downloads are more readily available there, but it's up to Todd >>> Miller >>> and the folk who have been maintaining NumPy. >>> >>> NumPy has been around for a while, probably longer than SciPy. >> >> I, for one, am only subscribed to the NumPy list, and have been for >> years. I don't know how much non-NumPy stuff there is on the SciPy >> list, >> but while I do use NumPy for Scientific stuff, I don't use the rest of >> SciPy (because, frankly, it's always been a pain in the @#! to >> install) >> and I don't need another high-traffic list. > > Yeah, I was thinking in people like you. In fact, I'm myself in the > same case than you: I'm very interested in a basic array module > (Numeric/numarray/numpy) and not so much in more powerful (but > complex) packages like scipy. And I think there can be a lot of people > out there that can be in the same position. 
> > Accordingly, my vote is also: > > +1 numpy-discussion From strawman at astraw.com Fri Jan 6 12:54:06 2006 From: strawman at astraw.com (Andrew Straw) Date: Fri, 06 Jan 2006 09:54:06 -0800 Subject: [SciPy-dev] ReSt for new.scipy.org? In-Reply-To: References: <43BD9C11.8090605@astraw.com> Message-ID: <43BEAEBE.8070203@astraw.com> >Would a `#FORMAT rst` in the first line of a page >(e.g. as I used here: http://new.scipy.org/Wiki/ArndBaecker ) >be ok as well? > > Yes, clearly that works, too! > >P.S.: Somehow the wiki seems a bit slow, not sure >if it is the overall connection speed: > ping new.scipy.org > PING new.scipy.org (216.62.213.231) 56(84) bytes of data. > 64 bytes from 216.62.213.231: icmp_seq=1 ttl=40 time=170 ms >does not look that bad, or ? > > I'm a little concerned about this, too. The ping latency is probably nothing compared to the serious time-sink -- right now, the wiki is configured as a .cgi script, which means that every request starts a new Python process. There are several ways around this: FastCGI and mod_python, to name two. I've set up servers with both of these in other contexts, and either works fine but I'd recommend the FastCGI route simply because it separates the Python process from the apache process. When/if Enthought wants to tackle this, I'm happy to help! For more info, see here: http://moinmoin.wikiwikiweb.de/HelpOnInstalling/FastCgi From loredo at astro.cornell.edu Fri Jan 6 13:49:36 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Fri, 6 Jan 2006 13:49:36 -0500 (EST) Subject: [SciPy-dev] Updated OS X build instructions -- test Message-ID: <200601061849.k06InaB11979@laplace.astro.cornell.edu> > Thanks for the great and quick work to all, especially Travis! Amen! > And, since he seems to have fixed all of these bugs within about 1/2 an > hour of my posting them -- thanks also to Pearu!! Amen again! Everything is working pretty smoothly for me so far on FC3, and I really appreciate the hard effort by everyone over the holiday period. Just to clarify, since I'm about to try an install on OS X and have to provide tarballs to students in a few days---does this mean all OS X users should be using SVN numpy, not the most recent release? Is the problem only on Tiger (I use Panther as my G3 laptop won't support Tiger)? -Tom From chris at trichech.us Fri Jan 6 13:53:38 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Fri, 6 Jan 2006 13:53:38 -0500 Subject: [SciPy-dev] Updated OS X build instructions -- test In-Reply-To: <200601061849.k06InaB11979@laplace.astro.cornell.edu> References: <200601061849.k06InaB11979@laplace.astro.cornell.edu> Message-ID: <0F604999-D1F4-4F30-AD9B-EA844C0E90B9@trichech.us> On Jan 6, 2006, at 1:49 PM, Tom Loredo wrote: > Just to clarify, since I'm about to try an install on OS X and > have to provide tarballs to students in a few days---does this > mean all OS X users should be using SVN numpy, not the most recent > release? Is the problem only on Tiger (I use Panther as my > G3 laptop won't support Tiger)? I have Python eggs of both numpy and scipy from svn at http:// trichech.us, built with gcc 3.3 and Python 2.4.2, if this saves you any effort. I will be providing pretty regular updates for all these packages. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. 
Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From chanley at stsci.edu Fri Jan 6 15:18:07 2006 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 06 Jan 2006 15:18:07 -0500 Subject: [SciPy-dev] chararray array method has vanished Message-ID: <43BED07F.5070800@stsci.edu> Hi Travis, The following no longer works in numpy: In [117]: import numpy.core.chararray as nc In [118]: a = nc.array('abcdefg'*10,itemsize=10) --------------------------------------------------------------------------- exceptions.AttributeError Traceback (most recent call last) /data/sparty1/dev/numpy/numpy/core/tests/ AttributeError: type object 'chararray' has no attribute 'array' Chris From charlesr.harris at gmail.com Fri Jan 6 16:20:59 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 6 Jan 2006 14:20:59 -0700 Subject: [SciPy-dev] NumPy import problem. Message-ID: If I try to import numpy from python in my home directory I get the following error: >>> from numpy import * Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.3/site-packages/numpy/__init__.py", line 59, in ? from core import * File "/usr/lib/python2.3/site-packages/numpy/core/__init__.py", line 27, in ? test = ScipyTest().test File "/usr/lib/python2.3/site-packages/numpy/testing/numpytest.py", line 210, in __init__ from numpy.distutils.misc_util import get_frame File "/usr/lib/python2.3/site-packages/numpy/distutils/__init__.py", line 5, in ? import ccompiler File "/usr/lib/python2.3/site-packages/numpy/distutils/ccompiler.py", line 32, in ? CCompiler.spawn = new.instancemethod(CCompiler_spawn,None,CCompiler) AttributeError: 'module' object has no attribute 'instancemethod' Oddly enough, this doesn't happen in ipython or if I run python in /usr/lib/python2.3/site-packages/numpy/ . There must be a path problem someplace. Numpy was installed from the version in svn as of 1/2 hour ago. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Fri Jan 6 16:38:34 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 06 Jan 2006 15:38:34 -0600 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: References: Message-ID: <43BEE35A.3000100@gmail.com> Charles R Harris wrote: > If I try to import numpy from python in my home directory I get the > following error: > >>>> from numpy import * > Traceback (most recent call last): > File "", line 1, in ? > File "/usr/lib/python2.3/site-packages/numpy/__init__.py", line 59, in ? > from core import * > File "/usr/lib/python2.3/site-packages/numpy/core/__init__.py", line > 27, in ? > test = ScipyTest().test > File "/usr/lib/python2.3/site-packages/numpy/testing/numpytest.py", > line 210, in __init__ > from numpy.distutils.misc_util import get_frame > File "/usr/lib/python2.3/site-packages/numpy/distutils/__init__.py", > line 5, in ? > import ccompiler > File "/usr/lib/python2.3/site-packages/numpy/distutils/ccompiler.py", > line 32, in ? > CCompiler.spawn = new.instancemethod(CCompiler_spawn,None,CCompiler) > AttributeError: 'module' object has no attribute 'instancemethod' Do you have a module named "new" in your home directory? 
-- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charlesr.harris at gmail.com Fri Jan 6 16:46:51 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 6 Jan 2006 14:46:51 -0700 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: <43BEE35A.3000100@gmail.com> References: <43BEE35A.3000100@gmail.com> Message-ID: On 1/6/06, Robert Kern wrote: > > Charles R Harris wrote: > > If I try to import numpy from python in my home directory I get the > > following error > > Do you have a module named "new" in your home directory? > > Why, yes I did. I renamed it and things work fine now. I deduce that this problem has been seen before? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Fri Jan 6 16:55:11 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 06 Jan 2006 15:55:11 -0600 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: References: <43BEE35A.3000100@gmail.com> Message-ID: <43BEE73F.5080802@gmail.com> Charles R Harris wrote: > > On 1/6/06, *Robert Kern* > wrote: > > Charles R Harris wrote: > > If I try to import numpy from python in my home directory I get the > > following error > > Do you have a module named "new" in your home directory? > > Why, yes I did. I renamed it and things work fine now. I deduce that > this problem has been seen before? We've seen something similar once or twice before. When I get an error that says some module doesn't have suchAndSuch attribute that really ought to be there, I always check to make sure that the module I think is being imported is the module that is actually being imported. The "new" module is somewhat notorious for this kind of error. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant.travis at ieee.org Fri Jan 6 16:26:10 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 06 Jan 2006 14:26:10 -0700 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: References: Message-ID: <43BEE072.7000701@ieee.org> Charles R Harris wrote: > If I try to import numpy from python in my home directory I get the > following error: > > >>> from numpy import * > Traceback (most recent call last): > File "", line 1, in ? > File "/usr/lib/python2.3/site-packages/numpy/__init__.py", line 59, in ? > from core import * > File "/usr/lib/python2.3/site-packages/numpy/core/__init__.py", line > 27, in ? > test = ScipyTest().test > File "/usr/lib/python2.3/site-packages/numpy/testing/numpytest.py", > line 210, in __init__ > from numpy.distutils.misc_util import get_frame > File "/usr/lib/python2.3/site-packages/numpy/distutils/__init__.py", > line 5, in ? > import ccompiler > File > "/usr/lib/python2.3/site-packages/numpy/distutils/ccompiler.py", line > 32, in ? > CCompiler.spawn = new.instancemethod(CCompiler_spawn,None,CCompiler) > AttributeError: 'module' object has no attribute 'instancemethod' > > Oddly enough, this doesn't happen in ipython or if I run python in > /usr/lib/python2.3/site-packages/numpy/ . There must be a path problem > someplace. Numpy was installed from the version in svn as of 1/2 hour ago. > do import pdb pdb.pm() p type(new) p new.__file__ This might help figure it out... 
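The same check also works outside the debugger; a rough sketch, assuming the stray new.py sits in the directory Python was started from:

import sys
import new                  # Python 2 stdlib module that is being shadowed

print(new.__file__)         # stdlib: .../lib/python2.x/new.py(c); shadowed: ./new.py
print(sys.path[0])          # '' -- the working directory is searched before site-packages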
-Travis From oliphant.travis at ieee.org Fri Jan 6 16:09:06 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 06 Jan 2006 14:09:06 -0700 Subject: [SciPy-dev] chararray array method has vanished In-Reply-To: <43BED07F.5070800@stsci.edu> References: <43BED07F.5070800@stsci.edu> Message-ID: <43BEDC72.6050705@ieee.org> Christopher Hanley wrote: >Hi Travis, > >The following no longer works in numpy: > >In [117]: import numpy.core.chararray as nc > > You are getting the chararray object (not the chararray module --- perhaps a renaming is in order). The user is not really supposed to dive into the internals of numpy.core like you are doing here. Just do import numpy.core as nc nc.char.array -Travis From oliphant.travis at ieee.org Fri Jan 6 16:24:24 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 06 Jan 2006 14:24:24 -0700 Subject: [SciPy-dev] chararray array method has vanished In-Reply-To: <43BED07F.5070800@stsci.edu> References: <43BED07F.5070800@stsci.edu> Message-ID: <43BEE008.2090806@ieee.org> Christopher Hanley wrote: >Hi Travis, > >The following no longer works in numpy: > >In [117]: import numpy.core.chararray as nc > >In [118]: a = nc.array('abcdefg'*10,itemsize=10) >--------------------------------------------------------------------------- >exceptions.AttributeError Traceback (most >recent call last) > > I just renamed the module defchararray to avoid confusion. numpy.core contains the class object chararray as well and that's what you were getting... -Travis From oliphant.travis at ieee.org Fri Jan 6 17:17:06 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 06 Jan 2006 15:17:06 -0700 Subject: [SciPy-dev] No time for SciPy release today Message-ID: <43BEEC62.9060205@ieee.org> I'm not going to have time to make a release of (full) SciPy today. It builds find out of SVN. I'm starting classes on Monday (and have to get back to the seven other things I've left holding while putting NumPy out) so I'm not sure how much time I'll have for it over the next few weeks. If somebody else wants to take a stab at making a release of (full) SciPy, I can point you in the right direction. -Travis From Fernando.Perez at colorado.edu Fri Jan 6 17:17:12 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 06 Jan 2006 15:17:12 -0700 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: <43BEE73F.5080802@gmail.com> References: <43BEE35A.3000100@gmail.com> <43BEE73F.5080802@gmail.com> Message-ID: <43BEEC68.30107@colorado.edu> Robert Kern wrote: > Charles R Harris wrote: > >>On 1/6/06, *Robert Kern* >> wrote: >> >> Charles R Harris wrote: >> > If I try to import numpy from python in my home directory I get the >> > following error >> >> Do you have a module named "new" in your home directory? >> >>Why, yes I did. I renamed it and things work fine now. I deduce that >>this problem has been seen before? > > > We've seen something similar once or twice before. When I get an error that says > some module doesn't have suchAndSuch attribute that really ought to be there, I > always check to make sure that the module I think is being imported is the > module that is actually being imported. The "new" module is somewhat notorious > for this kind of error. This is precisely why pep-328, which I recently mentioned, is going to be at some point enforced. I've also been bitten more than once by this. 
Cheers, f From oliphant.travis at ieee.org Fri Jan 6 17:22:40 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 06 Jan 2006 15:22:40 -0700 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: <43BEEC68.30107@colorado.edu> References: <43BEE35A.3000100@gmail.com> <43BEE73F.5080802@gmail.com> <43BEEC68.30107@colorado.edu> Message-ID: <43BEEDB0.8070100@ieee.org> Fernando Perez wrote: >Robert Kern wrote: > > >>Charles R Harris wrote: >> >> >> >>>On 1/6/06, *Robert Kern* >>> wrote: >>> >>> Charles R Harris wrote: >>> > If I try to import numpy from python in my home directory I get the >>> > following error >>> >>> Do you have a module named "new" in your home directory? >>> >>>Why, yes I did. I renamed it and things work fine now. I deduce that >>>this problem has been seen before? >>> >>> >>We've seen something similar once or twice before. When I get an error that says >>some module doesn't have suchAndSuch attribute that really ought to be there, I >>always check to make sure that the module I think is being imported is the >>module that is actually being imported. The "new" module is somewhat notorious >>for this kind of error. >> >> > >This is precisely why pep-328, which I recently mentioned, is going to be at >some point enforced. I've also been bitten more than once by this. > > This wouldn't really fix this problem, though because if $HOME is still on your sys.path then the module would be picked up from there anyway, right? Regarding PEP-328, my understanding is that the -dot- notation needed for sub-packages only works in Python 2.4, so we can't use it now unless we drop support for Python 2.3 (or play ugly games with version checking...) which I don't think we should until PEP-328 actually does get enforced. -Travis From robert.kern at gmail.com Fri Jan 6 17:26:10 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 06 Jan 2006 16:26:10 -0600 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: <43BEEC68.30107@colorado.edu> References: <43BEE35A.3000100@gmail.com> <43BEE73F.5080802@gmail.com> <43BEEC68.30107@colorado.edu> Message-ID: <43BEEE82.2080307@gmail.com> Fernando Perez wrote: > Robert Kern wrote: > >>Charles R Harris wrote: >> >> >>>On 1/6/06, *Robert Kern* >>> wrote: >>> >>> Charles R Harris wrote: >>> > If I try to import numpy from python in my home directory I get the >>> > following error >>> >>> Do you have a module named "new" in your home directory? >>> >>>Why, yes I did. I renamed it and things work fine now. I deduce that >>>this problem has been seen before? >> >>We've seen something similar once or twice before. When I get an error that says >>some module doesn't have suchAndSuch attribute that really ought to be there, I >>always check to make sure that the module I think is being imported is the >>module that is actually being imported. The "new" module is somewhat notorious >>for this kind of error. > > This is precisely why pep-328, which I recently mentioned, is going to be at > some point enforced. I've also been bitten more than once by this. I'm not sure PEP 328 affects this case. The "new" module is in the standard library, and Chuck's new.py file wasn't in a package. The problem would still exist in a PEP-328 world, I think. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From Fernando.Perez at colorado.edu Fri Jan 6 17:28:04 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 06 Jan 2006 15:28:04 -0700 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: <43BEEDB0.8070100@ieee.org> References: <43BEE35A.3000100@gmail.com> <43BEE73F.5080802@gmail.com> <43BEEC68.30107@colorado.edu> <43BEEDB0.8070100@ieee.org> Message-ID: <43BEEEF4.90109@colorado.edu> Travis Oliphant wrote: >>This is precisely why pep-328, which I recently mentioned, is going to be at >>some point enforced. I've also been bitten more than once by this. >> >> > > This wouldn't really fix this problem, though because if $HOME is still > on your sys.path then the module would be picked up from there anyway, > right? Right, the problem of 'import foo' acidentally picking up some other foo from the current directory can still bite you. Pep 328 helps, by requiring always fully qualified imports, but doesn't eliminate all possible cases. > Regarding PEP-328, my understanding is that the -dot- notation needed > for sub-packages only works in Python 2.4, so we can't use it now unless > we drop support for Python 2.3 (or play ugly games with version > checking...) which I don't think we should until PEP-328 actually does > get enforced. No, the only thing we should do right now is, even _inside_ of numpy, use import numpy.this_or_that instead of import this_or_that This is valid in all versions of python, and will continue to work when unqualified imports aren't supported anymore. When pep 328 is enforced and they settle on a syntax, this will still be OK. Cheers, f From Fernando.Perez at colorado.edu Fri Jan 6 17:29:43 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 06 Jan 2006 15:29:43 -0700 Subject: [SciPy-dev] NumPy import problem. In-Reply-To: <43BEEE82.2080307@gmail.com> References: <43BEE35A.3000100@gmail.com> <43BEE73F.5080802@gmail.com> <43BEEC68.30107@colorado.edu> <43BEEE82.2080307@gmail.com> Message-ID: <43BEEF57.2000401@colorado.edu> Robert Kern wrote: >>This is precisely why pep-328, which I recently mentioned, is going to be at >>some point enforced. I've also been bitten more than once by this. > > > I'm not sure PEP 328 affects this case. The "new" module is in the standard > library, and Chuck's new.py file wasn't in a package. The problem would still > exist in a PEP-328 world, I think. It would help if 'new' was in his $PWD, which is currently picked up by 'import'. But you are right, even after PEP 328, conflicts in PYTHONPATH are still possible, as I mentioned in my other post, that PEP only kills _some_ forms of this confusion, not all. Cheers, f From charlesr.harris at gmail.com Fri Jan 6 17:46:52 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 6 Jan 2006 15:46:52 -0700 Subject: [SciPy-dev] No time for SciPy release today In-Reply-To: <43BEEC62.9060205@ieee.org> References: <43BEEC62.9060205@ieee.org> Message-ID: On 1/6/06, Travis Oliphant wrote: > > > I'm not going to have time to make a release of (full) SciPy today. > > It builds find out of SVN. I'm starting classes on Monday (and have to > get back to the seven other things I've left holding while putting NumPy > out) so I'm not sure how much time I'll have for it over the next few > weeks. Travis, Thanks for all the great work over the holidays (ho, ho, ho). NumPy is looking great! Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gruben at bigpond.net.au Fri Jan 6 18:58:37 2006 From: gruben at bigpond.net.au (Gary Ruben) Date: Sat, 07 Jan 2006 10:58:37 +1100 Subject: [SciPy-dev] No time for SciPy release today In-Reply-To: References: <43BEEC62.9060205@ieee.org> Message-ID: <43BF042D.8060309@bigpond.net.au> I second that. Much praise and lauding is in order. Gary R. Charles R Harris wrote: > Travis, > > Thanks for all the great work over the holidays (ho, ho, ho). NumPy is > looking great! > > Chuck From oliphant.travis at ieee.org Fri Jan 6 18:17:01 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 06 Jan 2006 16:17:01 -0700 Subject: [SciPy-dev] numpy lists In-Reply-To: References: <200601051625.k05GPixf021407@oobleck.astro.cornell.edu> <43BD5DFA.20906@colorado.edu> <43BD6109.9030402@sympatico.ca> <43BD7A99.8070002@noaa.gov> Message-ID: <43BEFA6D.7090305@ieee.org> Bruce Southey wrote: > I am also +1 to keep numpy for the same reasons as Colin and Chris. So > far there has been nothing in alpha SciPy versions that offer any > advantage over Numarray for what I use or develop. There are two new mailing lists for NumPy numpy-devel at lists.sourceforge.net numpy-user at lists.sourceforge.net These are for developers and users to talk about only NumPy The SciPy lists can be for SciPy itself. Two packages deserve separate lists. -Travis From schofield at ftw.at Fri Jan 6 19:42:25 2006 From: schofield at ftw.at (Ed Schofield) Date: Sat, 7 Jan 2006 00:42:25 +0000 Subject: [SciPy-dev] No time for SciPy release today In-Reply-To: <43BEEC62.9060205@ieee.org> References: <43BEEC62.9060205@ieee.org> Message-ID: On 06/01/2006, at 10:17 PM, Travis Oliphant wrote: > I'm not going to have time to make a release of (full) SciPy today. > > It builds find out of SVN. I'm starting classes on Monday (and > have to > get back to the seven other things I've left holding while putting > NumPy > out) so I'm not sure how much time I'll have for it over the next few > weeks. > > If somebody else wants to take a stab at making a release of (full) > SciPy, I can point you in the right direction. I'll take a stab at it :) -- Ed From strawman at astraw.com Fri Jan 6 20:09:58 2006 From: strawman at astraw.com (Andrew Straw) Date: Fri, 06 Jan 2006 17:09:58 -0800 Subject: [SciPy-dev] [SciPy-user] numpy lists In-Reply-To: <43BEFA6D.7090305@ieee.org> References: <200601051625.k05GPixf021407@oobleck.astro.cornell.edu> <43BD5DFA.20906@colorado.edu> <43BD6109.9030402@sympatico.ca> <43BD7A99.8070002@noaa.gov> <43BEFA6D.7090305@ieee.org> Message-ID: <43BF14E6.8070507@astraw.com> Travis Oliphant wrote: >There are two new mailing lists for NumPy > >numpy-devel at lists.sourceforge.net >numpy-user at lists.sourceforge.net > >These are for developers and users to talk about only NumPy > > You can subscribe to these lists from http://sourceforge.net/mail/?group_id=1369 From strawman at astraw.com Sat Jan 7 00:12:49 2006 From: strawman at astraw.com (Andrew Straw) Date: Fri, 06 Jan 2006 21:12:49 -0800 Subject: [SciPy-dev] [Numpy-discussion] Installation of numpy header files In-Reply-To: References: Message-ID: <43BF4DD1.2050908@astraw.com> Alexander Belopolsky wrote: >Numpy include files get installed in /lib/pythonX.X/site- >packages/numpy/base/include/numpy instead of /include/ >pythonX.X/numpy. > > >Does anyone rely on the current behavior? > > I do. There was discussion about this a couple of months ago. 
To find the headers, use the following: import numpy numpy.get_numpy_include() >PS: Using scipy core svn r1847 on a Linux system. > > I guess you're not actually using scipy core, but numpy. > The attached patch fixes the problem. Actually, the attached patch seems entire unrelated. :) From ndarray at mac.com Sat Jan 7 00:24:25 2006 From: ndarray at mac.com (Sasha) Date: Sat, 7 Jan 2006 00:24:25 -0500 Subject: [SciPy-dev] Installation of numpy header files In-Reply-To: References: Message-ID: Sorry for the wrong attachment. Installing include files under lib breaks modules that use python distutils. Is it recommended that modules that depend on numpy use numpy distutils instead? Thanks. -- sasha On 1/6/06, Alexander Belopolsky wrote: > Numpy include files get installed in /lib/pythonX.X/site- > packages/numpy/base/include/numpy instead of /include/ > pythonX.X/numpy. > > The attached patch fixes the problem. Does anyone rely on the current behavior? > > -- sasha > > PS: Using scipy core svn r1847 on a Linux system. > > > -------------- next part -------------- A non-text attachment was scrubbed... Name: numpy-core-setup-patch Type: application/octet-stream Size: 1505 bytes Desc: not available URL: From ndarray at mac.com Sat Jan 7 00:46:23 2006 From: ndarray at mac.com (Sasha) Date: Sat, 7 Jan 2006 00:46:23 -0500 Subject: [SciPy-dev] [Numpy-discussion] Installation of numpy header files In-Reply-To: <43BF4DD1.2050908@astraw.com> References: <43BF4DD1.2050908@astraw.com> Message-ID: On 1/7/06, Andrew Straw wrote: > ... There was discussion about this a couple of months ago. Could you, please give me some keywords for this discussion? I searched the archives and the only relevant thread was from 2002 (http://aspn.activestate.com/ASPN/Mail/Message/scipy-dev/1592700). I understand that at that time there was a solution involving sitecustomize.py, but that was replaced with site.cfg some time ago. The message that I cited above has a much better description of the problem that I am experiencing than what I wrote: > (1) include directories. > Distutils knows to include files from /usr/include/python2.2 (or > wherever it is installed) whenever building extension modules. > Numeric installs its header files inside this directory when installed > as root. However, when I install Numeric in /home/eric/linux, the > header files are installed in /home/eric/linux/python2.2. Distutils > doesn't know to look in hear for headers. To solve this, I'd have to > hack all the setup.py files for modules that rely on Numeric to use my > include_dirs. This isn't so nice. Thanks. -- sasha From ndarray at mac.com Sat Jan 7 01:05:17 2006 From: ndarray at mac.com (Sasha) Date: Sat, 7 Jan 2006 01:05:17 -0500 Subject: [SciPy-dev] [Numpy-discussion] Installation of numpy header files In-Reply-To: References: <43BF4DD1.2050908@astraw.com> Message-ID: Ok, I've found it: http://www.scipy.org/documentation/mailman?fn=scipy-dev/2005-September/003238.html Sorry for the extra traffic. Let me paraphrase the solution here to hopefully make it more discoverable: """ Extension modules that need to compile against numpy should use get_numpy_include function to locate the appropriate include directory. Using distutils: import numpy Extension('extension_name', ... include_dirs=[numpy.get_numpy_include()]) """ -- sasha On 1/7/06, Sasha wrote: > On 1/7/06, Andrew Straw wrote: > > ... There was discussion about this a couple of months ago. > > Could you, please give me some keywords for this discussion? 
I > searched the archives and the only relevant thread was from 2002 > (http://aspn.activestate.com/ASPN/Mail/Message/scipy-dev/1592700). I > understand that at that time there was a solution involving > sitecustomize.py, but that was replaced with site.cfg some time ago. > > The message that I cited above has a much better description of the > problem that I am experiencing than what I wrote: > > > (1) include directories. > > Distutils knows to include files from /usr/include/python2.2 (or > > wherever it is installed) whenever building extension modules. > > Numeric installs its header files inside this directory when installed > > as root. However, when I install Numeric in /home/eric/linux, the > > header files are installed in /home/eric/linux/python2.2. Distutils > > doesn't know to look in hear for headers. To solve this, I'd have to > > hack all the setup.py files for modules that rely on Numeric to use my > > include_dirs. This isn't so nice. > > Thanks. > > -- sasha > From arnd.baecker at web.de Sat Jan 7 01:55:46 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 7 Jan 2006 07:55:46 +0100 (CET) Subject: [SciPy-dev] numpy - dual.py problems Message-ID: Hi, I have now been hit the fifth time by what I would call the `dual.py dilemma/problem`. Let my try to explain, why I think that using `dual.py` in numpy (at least in the way it is done right now) causes problems: A typical scenario for "end-users" is the following: - people will have Numeric/numarray + old scipy/old new scipy on their machines. In many cases this is the system-wide installation as done by the root-user (eg. via some package manager) The "end-user" has no root rights. - The "end-user" hears about the great progress wrt numpy/scipy combo and wants to test it out. He downloads numpy and installs it to some place in his homedirectory via python setup.py install --prefix=<~/somewhere> and sets his PYTHONPATH accordingly - Then `import numpy` will work, but a `numpy.test(10)` will fail because `dual.py` picks his old scipy (which will be visible from the traceback, if he looks carefully at the path names). - Consequence, the "end-user" will either ask a question on the mailing list or just quit his experiment and continue to work with his old installation. I have just re-read the discussion on the mailing-list on this: Travis' remark on this was: """You need to remove scipy_base from your system (or at least from the site-packages directory you are running for Python). I'm pretty sure the new scipy won't work with the old scipy (which is what scipy_base comes from). This will need to be emphasized on install of the new system.... Get rid of the old system first... """ Here I disagree: the "end-user" is in general a) not able to remove the old install b) and also does not want to remove his old install (usually he will have a lot of code floating around which relies on the old behaviour). 
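(For readers who have not opened dual.py: the pattern it implements amounts to the sketch below -- prefer the scipy routine whenever scipy is importable, fall back to numpy's otherwise -- which is exactly how an old scipy sitting on PYTHONPATH leaks into numpy's own internals and test suite.)

# rough outline of what numpy.dual does, shown for one function;
# the real module covers inv, eig, svd, eigvals, lstsq, i0, ...
try:
    from scipy.linalg import inv    # taken whenever any scipy is on the path
except ImportError:
    from numpy.linalg import inv    # fallback when only numpy is installed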
Later on Travis' wrote: """The solution is that now to get at functions that are in both numpy and scipy and you want the scipy ones first and default to the numpy ones if scipy is not installed, there is a numpy.dual module that must be loaded separately that contains all the overlapping functions.""" I think this is fine, if a user does this in his own code, but I have found the following `from numpy.dual import`s within numpy core/defmatrix.py: from numpy.dual import inv lib/polynomial.py: from numpy.dual import eigvals, lstsq lib/mlab.py:from numpy.dual import eig, svd lib/function_base.py: from numpy.dual import i0 I think Fernando's analysis is indeed very true (FWIW, yes, originally I had a different opinion on that ;-): """Travis Oliphant wrote: > But these need to be there. We need some way to update the functions in > numpy if scipy is installed (so that you can always call the numpy > functions but get the scipy functions if they are there)... Mmh, I'm not sure I agree with this approach. I think this may lead to surprising behavior, hard to find bugs, etc. I really don't think that the presence of scipy should change the behavior of numpy's internals, esp. when we're advertising the dependency chain to be numpy < scipy Finding out that in reality, there's a hidden |-> numpy scipy--| | | |------------------| cycle kind of threw me for a loop :) I would argue that users should know whether they want to call scipy's libraries or numpy's, and be clearly aware of the differences. """ In conclusion, I think that `dual.py` should not be used *inside* of `numpy`. Best, Arnd P.S.: Fernando, I am sure you are reading all this with a smile - yes I am converted... From nwagner at mecha.uni-stuttgart.de Sat Jan 7 07:06:50 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 07 Jan 2006 13:06:50 +0100 Subject: [SciPy-dev] 0.9.3.1848 Message-ID: Is this a known bug (see below) ? I was away from my desk for two weeks. So I missed many changes... Nils ====================================================================== ERROR: check_basic (numpy.core.defmatrix.test_defmatrix.test_algebra) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", line 111, in check_basic assert allclose((mA ** -i).A, B) File "/usr/local/lib/python2.4/site-packages/numpy/core/defmatrix.py", line 154, in __pow__ x = self.I File "/usr/local/lib/python2.4/site-packages/numpy/core/defmatrix.py", line 208, in getI from numpy.dual import inv File "/usr/local/lib/python2.4/site-packages/numpy/dual.py", line 11, in ? import scipy.linalg as linpkg File "/usr/local/lib/python2.4/site-packages/scipy/__init__.py", line 39, in ? 
pkgload(verbose=SCIPY_IMPORT_VERBOSE,postpone=True) File "/usr/local/lib/python2.4/site-packages/numpy/_import_tools.py", line 196, in __call__ self.warn('Overwriting %s=%s (was %s)' \ AttributeError: PackageLoader instance has no attribute '_obj2str' ====================================================================== ERROR: check_basic (numpy.core.defmatrix.test_defmatrix.test_properties) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/core/tests/test_defmatrix.py", line 34, in check_basic assert allclose(linalg.inv(A), mA.I) File "/usr/local/lib/python2.4/site-packages/numpy/core/defmatrix.py", line 208, in getI from numpy.dual import inv File "/usr/local/lib/python2.4/site-packages/numpy/dual.py", line 11, in ? import scipy.linalg as linpkg File "/usr/local/lib/python2.4/site-packages/scipy/__init__.py", line 39, in ? pkgload(verbose=SCIPY_IMPORT_VERBOSE,postpone=True) File "/usr/local/lib/python2.4/site-packages/numpy/_import_tools.py", line 196, in __call__ self.warn('Overwriting %s=%s (was %s)' \ AttributeError: PackageLoader instance has no attribute '_obj2str' ---------------------------------------------------------------------- Ran 183 tests in 4.979s FAILED (errors=2) >>> numpy.__version__ '0.9.3.1848' From pearu at scipy.org Sat Jan 7 08:37:01 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 7 Jan 2006 07:37:01 -0600 (CST) Subject: [SciPy-dev] 0.9.3.1848 In-Reply-To: References: Message-ID: On Sat, 7 Jan 2006, Nils Wagner wrote: > Is this a known bug (see below) ? I was away from my desk > for two weeks. > So I missed many changes... > > ====================================================================== > ERROR: check_basic > (numpy.core.defmatrix.test_defmatrix.test_algebra) > ---------------------------------------------------------------------- > AttributeError: PackageLoader instance has no attribute > '_obj2str' The bug is now fixed in svn. Pearu From nwagner at mecha.uni-stuttgart.de Sat Jan 7 11:31:38 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 07 Jan 2006 17:31:38 +0100 Subject: [SciPy-dev] Matrix functions in new scipy Message-ID: Hi all, How can I use matrix funtions (e.g. expm, signm, logm,...) with new scipy ? Nils From byrnes at bu.edu Sat Jan 7 15:54:14 2006 From: byrnes at bu.edu (John Byrnes) Date: Sat, 7 Jan 2006 15:54:14 -0500 Subject: [SciPy-dev] Updated OS X build instructions In-Reply-To: <05961580-3BBA-4F58-86A6-28F2888CA30D@trichech.us> References: <05961580-3BBA-4F58-86A6-28F2888CA30D@trichech.us> Message-ID: <20060107205414.GA14279@localhost.localdomain> Is Pearu's comment on the OSX build instructions still valid? "At the moment scipy.fftpack does not support FFTW 3. Forcing scipy.fftpack to use FFTW 3 as described above may crash Python when trying to use fftpack functions." Regards, John -- The human race has one really effective weapon, and that is laughter. -- Mark Twain From loredo at astro.cornell.edu Sat Jan 7 16:06:23 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Sat, 7 Jan 2006 16:06:23 -0500 Subject: [SciPy-dev] OS X: Things missing from scipy namespace; fft issues Message-ID: <1136667983.43c02d4f78fb3@astrosun2.astro.cornell.edu> Hi folks, I've installed the current SVN numpy and scipy. I also just installed both the last numarray release, and numarray from CVS, both with the same errors. This is on OS X. It seems some imports are missing. 
If I import scipy, basic numpy stuff that was always imported into scipy's namespace is missing. Thus I get AttributeError exceptions if I try to use anything like scipy.zeros, scipy.arange, scipy.pi, scipy.cos, scipy.sin... you get the idea. Numarray's last suite of tests gets 16 failures in 17 tests. This suite tests conversion between scipy and numarray. The errors are due to AttributeErrors as I've just described; in particular, numarray tries to use scipy.array and scipy.zeros a lot and raises an exception every time. There is also a problem with scipy's fft tests. The errors are copied below. A problem can be verified "by hand" by fft/ifft of a short sin(x) sequence; the real part of the inverse doesn't match the original. I presume these problems are OS X specific, or someone else would've seen them. I do have FFTW2 installed. I recall a similar problem appearing back in December that was quickly fixed. Perhaps that will jog someone's memory about it. Let me know if there is something I may have screwed up in the install, but an essentially identical install process with scipy_core and scipy last week didn't have these issues. I did upgrade from MacPython 2.4.1 to ActivePython 2.4.2 since then. -Tom FSkipping check_djbfft (failed to import FFT) ..F..F..FF.FSkipping check_djbfft (failed to import FFT) ..Skipping check_djbfft (failed to import FFT: No module named FFT) ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_basic.py", line 98, in check_definition assert_array_almost_equal(y,y1) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 20.+0.j 0.+0.j -4.+4.j 0.+0.j -4.+0.j 0.-0.j -4.-4.j 0.-0.j] Array 2: [ 20. +3.j -0.7071068+0.7071068j -7. +4.j -0.7071068-0.7071068j -4. -3.j 0.707106... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_basic.py", line 425, in check_definition assert_array_almost_equal(y,direct_dftn(x)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 45. +0.j -4.5+2.5980762j -4.5-2.5980762j] [-13.5+7.7942286j 0. +0.j 0. +0.j ] [-13.5-7.79... Array 2: [[ 45. +0.j -4.5+2.5980762j -4.5-2.5980762j] [-13.5+0.j -0. +0.j -0. -0.j ] [-13.5+0.j ... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_basic.py", line 183, in check_definition assert_array_almost_equal(y,y1) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.5+0.j 0. -0.j -0.5-0.5j 0. -0.j -0.5+0.j 0. +0.j -0.5+0.5j 0. +0.j ] Array 2: [ 2.5 +0.375j 0.0883883+0.0883883j -0.125 -0.5j 0.0883883-0.0883883j -0.5 -0.375j -0.0883883-0.0... ====================================================================== FAIL: check_random_real (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_basic.py", line 217, in check_random_real assert_array_almost_equal (ifft(fft(x)),x) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 98.0392156863%): Array 1: [ 0.4084044 +0.0000000e+00j 0.3922044 +1.5238355e-17j 0.5022578 +1.0884539e-18j 0.2854401 -4.4163695e-18j 0.771945... Array 2: [ 0.4084044 0.6151218 0.1139266 0.2933692 0.8680635 0.3443912 0.7993071 0.360068 0.4277811 0.1723109 0.16786... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_basic.py", line 594, in check_definition assert_array_almost_equal(y,direct_idftn(x)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] [-1.5-0.8660254j 0. +0.j 0. +0.j ] [-1.5+0.8660254j ... Array 2: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] [-1.5+0.j -0. -0.j -0. +0.j ] [-1.5+0.j ... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_irfft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_basic.py", line 341, in check_definition assert_array_almost_equal(y,ifft(x1)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [ 2.625 -1.6856602 -0.375 -1.1856602 0.625 0.4356602 -0.375 0.9356602] Array 2: [ 2.625+0.j -0.375-0.j -0.375-0.j -0.375-0.j 0.625+0.j -0.375+0.j -0.375+0.j -0.375+0.j] ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_pseudo_diffs.py", line 86, in check_definition assert_array_almost_equal(diff(sin(x),2),direct_diff(sin(x),2)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ -6.3519652e-15 -3.8268343e-01 -7.0710678e-01 -9.2387953e-01 -1.0000000e+00 -9.2387953e-01 -7.0710678e-01 -3.82... Array 2: [ -7.3854931e-15 6.5259351e-15 -2.4942634e-15 -7.5636114e-17 1.4745663e-15 -1.9133685e-15 2.2804788e-16 8.70... ====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_pseudo_diffs.py", line 332, in check_random_even assert_array_almost_equal(direct_hilbert(direct_ihilbert(f)),f) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 8.1315163e-19+0.j 0.0000000e+00-0.j 1.5541025e-18-0.j 0.0000000e+00-0.j 2.1849431e-18-0.j 0.0000000e+00-0.... Array 2: [-0.1272037 -0.1873667 -0.0639312 -0.029223 0.058697 0.1326041 -0.0476338 -0.4124686 -0.0012274 -0.2744091 -0.28491... 
====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_pseudo_diffs.py", line 390, in check_definition assert_array_almost_equal(shift(sin(x),a),direct_shift(sin(x),a)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 88.8888888889%): Array 1: [ 0.0998334 0.4341242 0.7160532 0.9116156 0.9972237 0.9625519 0.8117822 0.5630995 0.2464987 -0.0998334 -0.43412... Array 2: [ 0.0998334 0.0938127 0.0764768 0.0499167 0.0173359 -0.0173359 -0.0499167 -0.0764768 -0.0938127 -0.0998334 -0.09381... ====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/ test_pseudo_diffs.py", line 240, in check_random_even assert_array_almost_equal(direct_tilbert(direct_itilbert(f,h),h),f) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.+0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.... Array 2: [ 0.3665402 -0.106089 0.2154525 -0.1109742 0.0947835 0.408259 0.4949894 0.2839439 0.5027918 -0.0850018 -0.453880... ------------------------------------------------- This mail sent through IMP: http://horde.org/imp/ From nwagner at mecha.uni-stuttgart.de Sat Jan 7 16:26:42 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 07 Jan 2006 22:26:42 +0100 Subject: [SciPy-dev] Matrix functions in new scipy In-Reply-To: References: Message-ID: On Sat, 07 Jan 2006 17:31:38 +0100 "Nils Wagner" wrote: > Hi all, > > How can I use matrix funtions (e.g. expm, signm, >logm,...) > with new scipy ? > For example linalg.signm(a) results in AttributeError: 'module' object has no attribute 'signm' >>> a array([[ 29.2, -24.2, 69.5, 49.8, 7. ], [ -9.2, 5.2, -18. , -16.8, -2. ], [-10. , 6. , -20. , -18. , -2. ], [ -9.6, 9.6, -25.5, -15.4, -2. ], [ 9.8, -4.8, 18. , 18.2, 2. ]]) > Nils > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From chris at trichech.us Sat Jan 7 16:36:23 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Sat, 7 Jan 2006 16:36:23 -0500 Subject: [SciPy-dev] Updated OS X build instructions In-Reply-To: <20060107205414.GA14279@localhost.localdomain> References: <05961580-3BBA-4F58-86A6-28F2888CA30D@trichech.us> <20060107205414.GA14279@localhost.localdomain> Message-ID: On Jan 7, 2006, at 3:54 PM, John Byrnes wrote: > Is Pearu's comment on the OSX build instructions still valid? > > "At the moment scipy.fftpack does not support FFTW 3. Forcing > scipy.fftpack to use FFTW 3 as described above may crash Python when > trying to use fftpack functions." I am trying to figure this out myself. 
I know of at least one person who has been able to build scipy using both gcc4 and fftw3, but I have yet to replicate it. There ends up being errors in the cephes module that I cannot seem to eradicate. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From stefan at sun.ac.za Sat Jan 7 17:55:28 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Sun, 8 Jan 2006 00:55:28 +0200 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <43BA190C.7050809@colorado.edu> References: <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> <43B9E369.1070701@enthought.com> <43B9ED37.7050207@colorado.edu> <43BA0564.8030402@enthought.com> <43BA14A4.7050303@colorado.edu> <43BA17C9.80005@ieee.org> <43BA190C.7050809@colorado.edu> Message-ID: <20060107225528.GE5824@alpha> On Mon, Jan 02, 2006 at 11:26:20PM -0700, Fernando Perez wrote: > Travis Oliphant wrote: > > > Doh... Weave *was* working, but I changed the C-API a couple of > > iterations ago for consistency and this function is called something > > else. This is an example of why I'm uncomfortable with it in the core: > > it's just not on my radar... > > Putting even something as simple as > > weave.inline('std::cout << "hello\\n";') > > and other one-liners like > > x=linspace(0,1,10) > weave.blitz('x+1;') # this doesn't work either What is needed here exactly? If you only want the weave examples updated/fixed, I can take a look at it this coming week. That being said, I am not familiar with the current code, but I'm willing to give it a go. St?fan From chris at trichech.us Sat Jan 7 18:03:27 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Sat, 7 Jan 2006 18:03:27 -0500 Subject: [SciPy-dev] Updated OS X build instructions In-Reply-To: References: <05961580-3BBA-4F58-86A6-28F2888CA30D@trichech.us> <20060107205414.GA14279@localhost.localdomain> Message-ID: On Jan 7, 2006, at 4:36 PM, Christopher Fonnesbeck wrote: > On Jan 7, 2006, at 3:54 PM, John Byrnes wrote: > >> Is Pearu's comment on the OSX build instructions still valid? >> >> "At the moment scipy.fftpack does not support FFTW 3. Forcing >> scipy.fftpack to use FFTW 3 as described above may crash Python when >> trying to use fftpack functions." > > I am trying to figure this out myself. I know of at least one > person who has been able to build scipy using both gcc4 and fftw3, > but I have yet to replicate it. There ends up being errors in the > cephes module that I cannot seem to eradicate. I was able to build working numpy/scipy from svn with fftw3, and will post them on http://trichech.us. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... 
Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From hgamboa at gmail.com Sat Jan 7 18:11:43 2006 From: hgamboa at gmail.com (Hugo Gamboa) Date: Sat, 7 Jan 2006 23:11:43 +0000 Subject: [SciPy-dev] Example of power of new data-type descriptors. In-Reply-To: <43AFB124.9060507@ieee.org> References: <43AFB124.9060507@ieee.org> Message-ID: <86522b1a0601071511h2ae3a21es79876b47ba1181c7@mail.gmail.com> I was looking at these new datatype mechanisms, and was unable to make an array string compare. example: (using Travis code) a['name'] Out[49]: array([Bill, Fred], dtype=(string,30)) In [50]: a['name']=='Bill' Out[50]: False In [51]: a['name'].__eq__('Bill') Out[51]: NotImplemented I expected that a['name']=='Bill' would return [True, False] Am I trying something in a wrong way? Is this related to chararray and I should use methods from that class? Hugo Gamboa On 12/26/05, Travis Oliphant wrote: > > > I'd like more people to know about the new power that is in scipy core > due to the general data-type descriptors that can now be used to define > numeric arrays. Towards that effort here is a simple example (be sure > to use latest SVN -- there were a coupld of minor changes that improve > usability made recently). Notice this example does not use a special > "record" array subclass. This is just a regular array. I'm kind of > intrigued (though not motivated to pursue) the possibility of accessing > (or defining) databases directly into scipy_core arrays using the record > functionality. > > # Define a new data-type descriptor > >>> import scipy > > >>> dtype = scipy.dtypedescr({'names': ['name', 'age', 'weight'], > 'formats': ['S30', 'i2', 'f4']}) > >>> a = scipy.array([('Bill',31,260),('Fred',15,135)], dtype=dtype) > # the argument to dtypedescr could have also been placed here as the > argument to dtype > > >>> print a['name'] > [Bill Fred] > > >>> print a['age'] > [31 15] > > >>> print a['weight'] > [ 260. 135.] > > >>> print a[0] > ('Bill', 31, 260.0) > > >>> print a[1] > ('Fred', 15, 135.0) > > It seems to me there are some very interesting possibilities with this > new ability. The record array subclass adds an improved scalar type > (the record) and attribute access to get at the fields: (e.g. a.name, > a.age, and a.weight). But, if you don't need attribute access you can > use regular arrays to do a lot of what you might need a record array to > accomplish for you. I'd love to see what people come up with using this > new facility. > > The new array PEP for Python basically proposes adding a very simple > array object (just the basic PyArrayObject * of Numeric with a > bare-bones type-object table) plus this new data-type descriptor object > to Python and a very few builtin data-type descriptors (perhaps just > object initially). This would basically add the array interface to > Python directly and allow people to start using it generally. The PEP > is slow going because it is not on my priority list right now because it > is not essential to making scipy_core work well. But, I would love to > have more people ruminating on the basic ideas which I think are > crystallizing. 
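A workaround sketch for the comparison being asked about, assuming the current numpy spelling (numpy.dtype where the quoted example says scipy.dtypedescr) and Python 2 byte strings: view the field as a chararray, which implements rich comparisons in Python, until string support reaches the ufunc machinery.

import numpy

dtype = numpy.dtype({'names': ['name', 'age', 'weight'],
                     'formats': ['S30', 'i2', 'f4']})
a = numpy.array([('Bill', 31, 260), ('Fred', 15, 135)], dtype=dtype)

mask = a['name'].view(numpy.chararray) == 'Bill'
print(mask)                 # [True False], elementwise as hoped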
> > Best wishes for a new year, > > -Travis Oliphant > > > > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From eric at enthought.com Sat Jan 7 18:32:11 2006 From: eric at enthought.com (eric jones) Date: Sat, 07 Jan 2006 17:32:11 -0600 Subject: [SciPy-dev] ANN: Robert Kern joins Enthought (yee haw.) Message-ID: <43C04F7B.9060001@enthought.com> Hey folks, I just wanted to announce something that I am extremely excited about. Robert Kern joined Enthought at the beginning of the year. If I could dance a jig, I would. You'll have to settle for a smile. :-) We're amazingly lucky to have him around, and I'm looking forward to working with him even more closely than we have in the past. Most importantly, now there's someone around here who can explain Nils questions to me... :-) Robert will be working on a variety of things including SciPy/NumPy. He'll be as active as he ever has in the SciPy community and in a great position to help push the project forward even more. Here's to hoping your new year went as well as ours, eric From oliphant.travis at ieee.org Sat Jan 7 22:20:28 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 07 Jan 2006 20:20:28 -0700 Subject: [SciPy-dev] not *more* lists! In-Reply-To: <200601070358.k073wMgd028198@oobleck.astro.cornell.edu> References: <200601070358.k073wMgd028198@oobleck.astro.cornell.edu> Message-ID: <43C084FC.3070308@ieee.org> Joe Harrington wrote: >[this is not on the lists] > >Hi Travis, > > >Second, why start two *more* lists? We have two too many now. > This is the way I see it. 1) There is a need for list separation between NumPy and SciPy (there are enough people not interested in full SciPy who want to be able to discuss NumPy). 2) I'd like to keep Numeric / Numarray traffic on the list they are on now. So, I see only 4 lists of forward-going value numpy-user numpy-devel for NumPy development and use scipy-user scipy-dev for SciPy development and use The numpy-discussions list is for traffic about Numarray and Numeric as it always has been. I think this will let people filter what they read a bit better. I think we should also inform that if you are subscribed to the dev lists then you should also be subscribed to the user list so that traffic that should go to both only goes to the user lists (i.e. no cross-posting). I think some clear guidelines are definitely in order. Still waiting for that new website to start to get used.... From robert.kern at gmail.com Sat Jan 7 22:33:23 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 07 Jan 2006 21:33:23 -0600 Subject: [SciPy-dev] not *more* lists! In-Reply-To: <43C084FC.3070308@ieee.org> References: <200601070358.k073wMgd028198@oobleck.astro.cornell.edu> <43C084FC.3070308@ieee.org> Message-ID: <43C08803.8040508@gmail.com> Travis Oliphant wrote: > I think some clear guidelines are definitely in order. Still waiting > for that new website to start to get used.... I think guidelines would be much simpler if we had only 3 lists: 1) numpy 2) scipy 3) numpy-discussion (for Numeric and numarray; unfortunate name) I've never seen the -user/-devel split as particularly useful for scipy, nor have I found the single list for Numeric users and developers as harmful. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From oliphant.travis at ieee.org Sat Jan 7 22:36:04 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 07 Jan 2006 20:36:04 -0700 Subject: [SciPy-dev] Example of power of new data-type descriptors. In-Reply-To: <86522b1a0601071511h2ae3a21es79876b47ba1181c7@mail.gmail.com> References: <43AFB124.9060507@ieee.org> <86522b1a0601071511h2ae3a21es79876b47ba1181c7@mail.gmail.com> Message-ID: <43C088A4.6090208@ieee.org> Hugo Gamboa wrote: >I was looking at these new datatype mechanisms, and was unable to make >an array string compare. > >example: (using Travis code) > >a['name'] >Out[49]: array([Bill, Fred], dtype=(string,30)) > >In [50]: a['name']=='Bill' >Out[50]: False > >In [51]: a['name'].__eq__('Bill') >Out[51]: NotImplemented > > >I expected that a['name']=='Bill' would return [True, False] > > Problem is nobody has implemented that yet for strings. Comparisons go through universal functions. And there is no support for flexible-length arrays in the universal functions (ufuncs) right now. We could special-case the (rich) comparisons for strings and unicodes rather easily (and I think we should), but that hasn't been done yet. Right now, the chararray does implement equality testing (in Python so more slowly). Use a.view(numpy.chararray) to get a chararray. But, note the chararray has not been well-tested, yet. -Travis From oliphant.travis at ieee.org Sat Jan 7 22:38:38 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 07 Jan 2006 20:38:38 -0700 Subject: [SciPy-dev] not *more* lists! In-Reply-To: <43C08803.8040508@gmail.com> References: <200601070358.k073wMgd028198@oobleck.astro.cornell.edu> <43C084FC.3070308@ieee.org> <43C08803.8040508@gmail.com> Message-ID: <43C0893E.5050104@ieee.org> Robert Kern wrote: >Travis Oliphant wrote: > > > >>I think some clear guidelines are definitely in order. Still waiting >>for that new website to start to get used.... >> >> > >I think guidelines would be much simpler if we had only 3 lists: > >1) numpy >2) scipy >3) numpy-discussion (for Numeric and numarray; unfortunate name) > >I've never seen the -user/-devel split as particularly useful for scipy, nor >have I found the single list for Numeric users and developers as harmful. > > > I don't have a problem with that. I suppose we could just post to the single lists and keep a separate dev list silent until such time that developers are overwhelemed with the number of postings to the list ;-) (that seems to be what Python itself did). -Travis From robert.kern at gmail.com Sat Jan 7 22:50:05 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 07 Jan 2006 21:50:05 -0600 Subject: [SciPy-dev] not *more* lists! In-Reply-To: <43C0893E.5050104@ieee.org> References: <200601070358.k073wMgd028198@oobleck.astro.cornell.edu> <43C084FC.3070308@ieee.org> <43C08803.8040508@gmail.com> <43C0893E.5050104@ieee.org> Message-ID: <43C08BED.6030401@gmail.com> Travis Oliphant wrote: > Robert Kern wrote: >>I think guidelines would be much simpler if we had only 3 lists: >> >>1) numpy >>2) scipy >>3) numpy-discussion (for Numeric and numarray; unfortunate name) >> >>I've never seen the -user/-devel split as particularly useful for scipy, nor >>have I found the single list for Numeric users and developers as harmful. > > I don't have a problem with that. 
> > I suppose we could just post to the single lists and keep a separate dev > list silent until such time that developers are overwhelemed with the > number of postings to the list ;-) (that seems to be what Python itself > did). Lists that exist will be posted to by somebody, so if we do go with a 3-list strategy, we probably need to prune the extras. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant.travis at ieee.org Sat Jan 7 22:32:41 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 07 Jan 2006 20:32:41 -0700 Subject: [SciPy-dev] poll for renaming scipy_core online In-Reply-To: <20060107225528.GE5824@alpha> References: <43B9AC19.1050606@astraw.com> <43B9B00E.5040806@colorado.edu> <43B9BB54.60103@astraw.com> <43B9BE80.7000901@colorado.edu> <43B9E369.1070701@enthought.com> <43B9ED37.7050207@colorado.edu> <43BA0564.8030402@enthought.com> <43BA14A4.7050303@colorado.edu> <43BA17C9.80005@ieee.org> <43BA190C.7050809@colorado.edu> <20060107225528.GE5824@alpha> Message-ID: <43C087D9.4040606@ieee.org> Stefan van der Walt wrote: >On Mon, Jan 02, 2006 at 11:26:20PM -0700, Fernando Perez wrote: > > >>Travis Oliphant wrote: >> >> >> >>>Doh... Weave *was* working, but I changed the C-API a couple of >>>iterations ago for consistency and this function is called something >>>else. This is an example of why I'm uncomfortable with it in the core: >>>it's just not on my radar... >>> >>> >>Putting even something as simple as >> >>weave.inline('std::cout << "hello\\n";') >> >>and other one-liners like >> >>x=linspace(0,1,10) >>weave.blitz('x+1;') # this doesn't work either >> >> > >What is needed here exactly? If you only want the weave examples >updated/fixed, I can take a look at it this coming week. That being >said, I am not familiar with the current code, but I'm willing to give >it a go. > > I think Fernando is suggesting some unit tests for weave that actually try to run and use some of the examples. This is a good idea... But, note that weave is now in SciPy (not NumPy). -travis From ndarray at mac.com Sat Jan 7 23:10:03 2006 From: ndarray at mac.com (Sasha) Date: Sat, 7 Jan 2006 23:10:03 -0500 Subject: [SciPy-dev] How to handle a[...] in numpy? Message-ID: In Numeric a[...] would return an array unless a was 0-rank and a python type othewise. What is the right way to do the same in numpy? The problem is that in numpy subscripting always fails on 0-rank arrays: >>> array(0)[...] Traceback (most recent call last): File "", line 1, in ? IndexError: 0-d arrays can't be indexed. Note that in numpy I would like to get numpy scalars rather than python type from a 0-rank array. A related question and a proposal: Queston: Why do ascalar() and item() return python types rather than numpy scalar types? Proposal: Although I like a lot that 0-rank arrays and numpy scalar types non-iterable, it may be reasonable to allow a[...]. This way ellipsis can be interpereted as any number of ":"s including zero. Another subscript operation that makes sense for scalars would be a[...,newaxis] or even a[{newaxis, }* ..., {newaxis,}*], where {newaxis,}* stands for any number of comma-separated newaxis tokens. This will allow one to use ellipsis in generic code that would work on any numpy type. I will contribute code if there is any interest. 
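For concreteness, here is roughly the behaviour I have in mind. The helper below is hypothetical (nothing like it exists yet); it only pins down the proposed semantics:

from numpy import asarray

def ellipsis_view(a):
    # Proposed meaning of a[...]: "any number of ':'s, including zero",
    # so a 0-rank array passes through unchanged instead of raising
    # IndexError, while higher-rank arrays behave exactly as today.
    a = asarray(a)
    if len(a.shape) == 0:
        return a
    return a[...]

# ellipsis_view(array(0)).shape       would be ()
# ellipsis_view(array([0, 0])).shape  would be (2,)
# and array(0)[..., newaxis].shape    would be (1,) under the proposal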
-- sasha PS: I've just realized that ellipsis does include "zero number of ':'s", only not for zero rank objects: >>> array([0,0])[...,:] array([0, 0]) From oliphant.travis at ieee.org Sat Jan 7 22:58:32 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 07 Jan 2006 20:58:32 -0700 Subject: [SciPy-dev] numpy - dual.py problems In-Reply-To: References: Message-ID: <43C08DE8.6000909@ieee.org> Arnd Baecker wrote: >Hi, > > >A typical scenario for "end-users" is the following: >- people will have Numeric/numarray + old scipy/old new scipy > on their machines. > > In many cases this is the system-wide installation as done by > the root-user (eg. via some package manager) > > The "end-user" has no root rights. >- The "end-user" hears about the great progress wrt numpy/scipy combo > and wants to test it out. > > He downloads numpy and installs it to some place in his > homedirectory via > > python setup.py install --prefix=<~/somewhere> > > and sets his PYTHONPATH accordingly >- Then `import numpy` will work, but a > `numpy.test(10)` will fail because `dual.py` > picks his old scipy (which will be visible from the > traceback, if he looks carefully at the path names). > >- Consequence, the "end-user" will either ask a question on the > mailing list or just quit his experiment and continue > to work with his old installation. > > This has been fixed now so that it will only use scipy if it can find version 0.4.4 or higher... >Later on Travis' wrote: >"""The solution is that now to get at functions that are in both numpy and >scipy and you want the scipy ones first and default to the numpy ones if >scipy is not installed, there is a numpy.dual module that must be >loaded separately that contains all the overlapping functions.""" > >I think this is fine, if a user does this in his own code, >but I have found the following `from numpy.dual import`s within numpy > core/defmatrix.py: from numpy.dual import inv > lib/polynomial.py: from numpy.dual import eigvals, lstsq > lib/mlab.py:from numpy.dual import eig, svd > lib/function_base.py: from numpy.dual import i0 > > > BTW, these are all done inside of a function call. I want to be able to use the special.i0 method when its available inside numpy (for kaiser window). I want to be able to use a different inverse for matrix inverses and better eig and svd for polynomial root finding. So, I don't see this concept of enhancing internal functions going away. Now, I don't see the current numpy.dual approach as the *be-all*. I think it can be improved on. In fact, I suppose some mechanism for registering replacement functions should be created instead of giving special place to SciPy. SciPy could then call these functions. This could all be done inside of numpy.dual. So, I think the right structure is there.... -Travis From oliphant.travis at ieee.org Sat Jan 7 23:11:02 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 07 Jan 2006 21:11:02 -0700 Subject: [SciPy-dev] No time for SciPy release today In-Reply-To: References: <43BEEC62.9060205@ieee.org> Message-ID: <43C090D6.9010108@ieee.org> Ed Schofield wrote: >On 06/01/2006, at 10:17 PM, Travis Oliphant wrote: > > > >>I'm not going to have time to make a release of (full) SciPy today. >> >>It builds find out of SVN. I'm starting classes on Monday (and >>have to >>get back to the seven other things I've left holding while putting >>NumPy >>out) so I'm not sure how much time I'll have for it over the next few >>weeks. 
>> >>If somebody else wants to take a stab at making a release of (full) >>SciPy, I can point you in the right direction. >> >> > >I'll take a stab at it :) > > This is what I do (check this set of instructions for errors, it's off the top of my head). 1) Tag the tree by copying to a new directory in the tags svn cp http://svn.scipy.org/svn/scipy/trunk http://svn.scipy.org/svn/scipy/tags/version.major.minor 2) Switch to that new tag branch svn switch http://svn.scipy.org/svn/scipy/tags/version.major.minor 3) Remove the svn versioning magic from the version.py file (may not be necessary???) 4) Build source distribution python setup.py sdist 5) *Go to dist directory* and untar the created distribution... Do some binary distribution from this freshly untarred package (this can help pick up errors in the setup.py files) e.g. python setup.py bdist_rpm. You should also install to your machine from this build and run tests... 6) Copy the resulting packages back up to the main scipy/dist directory 7) Check-out the SVN release tag on a windows box. 8) Build for Python 2.4 and Python 2.3 using python setup.py bdist_wininst # with appropriately setup distutils.cfg file for the compiler you will be using. 9) Install the resulting windows binaries and run the tests... 10) Put the resulting files somewhere (probably the new scipy.org site...) --- sourceforge is actually a pain to release files to.... All of this takes anywhere from 1-3 hours, depending on problems. Perhaps somebody clever could right a series of programs to do it all ;-) -Travis From ndarray at mac.com Sun Jan 8 01:12:59 2006 From: ndarray at mac.com (Sasha) Date: Sun, 8 Jan 2006 01:12:59 -0500 Subject: [SciPy-dev] Shoud ma.filled change dtype? Message-ID: In the current version of numpy.core.ma, masked values can be filled by arbitrary value. If the fill value cannot be converted to an appropriate type, an object array is returned: >>> from numpy.core.ma import * >>> x = array([1],mask=[1]) >>> x.filled(1j) array([1j], dtype=object) Note that in the example the resulting array is an object array with complex numbers in it. This creature is very similar in behavior to a complex array, but is slower and takes many times more memory. If real x was filled with a complex number by mistake, this mistake is likely to go unnoticed. I don't think this feature is useful and I've just spent a day chasing a bug related to this feature. If one really wants to create an object array from a masked array of numbers, it is easy to convert the array explicitely. (With the current implementation this will also be a faster option.) There is probably some history behind this feature. If so, can someone educate me? Thanks. -- sasha From ndarray at mac.com Sun Jan 8 01:22:59 2006 From: ndarray at mac.com (Sasha) Date: Sun, 8 Jan 2006 01:22:59 -0500 Subject: [SciPy-dev] Fancy indexing curiosity Message-ID: >>> x = array([1]) >>> m = array([1], bool) >>> x[m] = '' >>> x array([12998744]) >>> x[m] = '' >>> x array([12998752]) It looks like x gets the address of the '' object (note the number change). This "feature" is present in 0.9.2 release and in svn 0.9.3.1831 -- sasha From oliphant.travis at ieee.org Sun Jan 8 01:10:51 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sat, 07 Jan 2006 23:10:51 -0700 Subject: [SciPy-dev] SciPy now loads all of numpy namespace... 
Message-ID: <43C0ACEB.20002@ieee.org> Given some valid comments that (old) scipy subsumed the Numeric namespace and the recent separation undid that, I've re-added back to SciPy the new NumPy namespace. This should fix code that relied on that behavior. I think that behavior should probably be deprecated, but I don't think we can just cut it for the moment. -Travis From oliphant.travis at ieee.org Sun Jan 8 02:09:28 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 08 Jan 2006 00:09:28 -0700 Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: References: Message-ID: <43C0BAA8.3060704@ieee.org> Sasha wrote: >>>>x = array([1]) >>>>m = array([1], bool) >>>>x[m] = '' >>>>x >>>> >>>> >array([12998744]) > > >>>>x[m] = '' >>>>x >>>> >>>> >array([12998752]) > >It looks like x gets the address of the '' object (note the number change). >This "feature" is present in 0.9.2 release and in svn 0.9.3.1831 > > Thanks for this corner case. Actually, it's getting whatever is in the memory allocated. Try b = array('',x.dtype) b should be a size-0 array, (there is actually 1 elementsize of memory allocated for it though). The mapping code is iterating over b and resetting b whenever it runs out of elements of b (every time in this case), except the data pointed to by the iterator is never valid so this should not be allowed.... I suppose instead, creating an iterator from a size-0 array could raise an error. This would be a great and simple place to catch this in the code as well. A size-0 array has nothing to iterate over... -Travis From oliphant.travis at ieee.org Sun Jan 8 02:14:46 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 08 Jan 2006 00:14:46 -0700 Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: <43C0BAA8.3060704@ieee.org> References: <43C0BAA8.3060704@ieee.org> Message-ID: <43C0BBE6.9070204@ieee.org> Travis Oliphant wrote: >Sasha wrote: > > > >>>>>x = array([1]) >>>>>m = array([1], bool) >>>>>x[m] = '' >>>>>x >>>>> >>>>> >>>>> >>>>> >>array([12998744]) >> >> >> >> >>>>>x[m] = '' >>>>>x >>>>> >>>>> >>>>> >>>>> >>array([12998752]) >> >>It looks like x gets the address of the '' object (note the number change). >>This "feature" is present in 0.9.2 release and in svn 0.9.3.1831 >> >> >> >> > >Thanks for this corner case. Actually, it's getting whatever is in the >memory allocated. > >Try > >b = array('',x.dtype) > >b should be a size-0 array, (there is actually 1 elementsize of memory >allocated for it though). > >The mapping code is iterating over b and resetting b whenever it runs >out of elements of b (every time in this case), except the data pointed >to by the iterator is never valid so this should not be allowed.... > >I suppose instead, creating an iterator from a size-0 array could raise >an error. This would be a great and simple place to catch this in the >code as well. > > > This now raises an error.... -Travis From pearu at scipy.org Sun Jan 8 02:52:17 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 8 Jan 2006 01:52:17 -0600 (CST) Subject: [SciPy-dev] SciPy now loads all of numpy namespace... In-Reply-To: <43C0ACEB.20002@ieee.org> References: <43C0ACEB.20002@ieee.org> Message-ID: On Sat, 7 Jan 2006, Travis Oliphant wrote: > > Given some valid comments that (old) scipy subsumed the Numeric > namespace and the recent separation undid that, I've re-added back to > SciPy the new NumPy namespace. This should fix code that relied on > that behavior. 
> > I think that behavior should probably be deprecated, but I don't think > we can just cut it for the moment. I think this behavior will then last forever. The usage of `from scipy import *` to get numpy symbols can only get "worse" unless such an import will emit some depreciation warning (this would be possible via scipy.__all__ list that contains a name to a module that only emits a warning when imported). If we don't cut it now, we can never cut it. I'd vote for 0 on cutting. Pearu From nwagner at mecha.uni-stuttgart.de Sun Jan 8 04:52:20 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sun, 08 Jan 2006 10:52:20 +0100 Subject: [SciPy-dev] 0.9.3.1853 FAILED (failures=1, errors=6) Message-ID: ====================================================================== ERROR: Test of other odd features ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/core/tests/test_ma.py", line 382, in check_testOddFeatures assert eq(atest,ctest) File "/usr/local/lib/python2.4/site-packages/numpy/core/tests/test_ma.py", line 6, in eq result = allclose(v,w) File "/usr/local/lib/python2.4/site-packages/numpy/core/ma.py", line 1396, in allclose d2 = filled(b) File "/usr/local/lib/python2.4/site-packages/numpy/core/ma.py", line 216, in filled return a.filled(value) File "/usr/local/lib/python2.4/site-packages/numpy/core/ma.py", line 1221, in filled result[m] = value ValueError: Cannot iterate over a size-0 array ====================================================================== ERROR: check_index_split_high_bound (numpy.lib.shape_base.test_shape_base.test_array_split) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 120, in check_index_split_high_bound compare_results(res,desired) File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 361, in compare_results assert_array_equal(res[i],desired[i]) File "/usr/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 159, in assert_array_equal raise ValueError, msg ValueError: Arrays are not equal ====================================================================== ERROR: check_index_split_low_bound (numpy.lib.shape_base.test_shape_base.test_array_split) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 113, in check_index_split_low_bound compare_results(res,desired) File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 361, in compare_results assert_array_equal(res[i],desired[i]) File "/usr/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 159, in assert_array_equal raise ValueError, msg ValueError: Arrays are not equal ====================================================================== ERROR: check_integer_split (numpy.lib.shape_base.test_shape_base.test_array_split) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 79, in check_integer_split compare_results(res,desired) File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 361, in compare_results assert_array_equal(res[i],desired[i]) File 
"/usr/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 159, in assert_array_equal raise ValueError, msg ValueError: Arrays are not equal ====================================================================== ERROR: This will fail if we change default axis ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 98, in check_integer_split_2D_default compare_results(res,desired) File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 361, in compare_results assert_array_equal(res[i],desired[i]) File "/usr/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 159, in assert_array_equal raise ValueError, msg ValueError: Arrays are not equal ====================================================================== ERROR: check_integer_split_2D_rows (numpy.lib.shape_base.test_shape_base.test_array_split) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 84, in check_integer_split_2D_rows compare_results(res,desired) File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_shape_base.py", line 361, in compare_results assert_array_equal(res[i],desired[i]) File "/usr/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 159, in assert_array_equal raise ValueError, msg ValueError: Arrays are not equal ====================================================================== FAIL: Doctest: numpy.lib.polynomial.test_polynomial ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/doctest.py", line 2152, in runTest raise self.failureException(self.format_failure(new.getvalue())) AssertionError: Failed doctest test for numpy.lib.polynomial.test_polynomial File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_polynomial.py", line 0, in test_polynomial ---------------------------------------------------------------------- File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_polynomial.py", line 60, in numpy.lib.polynomial.test_polynomial Failed example: p.deriv(2) Exception raised: Traceback (most recent call last): File "/usr/local/lib/python2.4/doctest.py", line 1243, in __run compileflags, 1) in test.globs File "", line 1, in ? p.deriv(2) File "/usr/local/lib/python2.4/site-packages/numpy/lib/polynomial.py", line 535, in deriv return poly1d(polyder(self.coeffs, m=m)) File "/usr/local/lib/python2.4/site-packages/numpy/lib/polynomial.py", line 163, in polyder val = polyder(y, m-1) File "/usr/local/lib/python2.4/site-packages/numpy/lib/polynomial.py", line 163, in polyder val = polyder(y, m-1) File "/usr/local/lib/python2.4/site-packages/numpy/lib/polynomial.py", line 157, in polyder y = p[:-1] * NX.arange(n, 0, -1) ValueError: Cannot iterate over a size-0 array ---------------------------------------------------------------------- File "/usr/local/lib/python2.4/site-packages/numpy/lib/tests/test_polynomial.py", line 72, in numpy.lib.polynomial.test_polynomial Failed example: polydiv(poly1d([1,0,-1]), poly1d([1,1])) Exception raised: Traceback (most recent call last): File "/usr/local/lib/python2.4/doctest.py", line 1243, in __run compileflags, 1) in test.globs File "", line 1, in ? 
polydiv(poly1d([1,0,-1]), poly1d([1,1])) File "/usr/local/lib/python2.4/site-packages/numpy/lib/polynomial.py", line 386, in __repr__ vals = repr(self.coeffs) File "/usr/local/lib/python2.4/site-packages/numpy/core/numeric.py", line 223, in array_repr ', ', "array(") File "/usr/local/lib/python2.4/site-packages/numpy/core/arrayprint.py", line 196, in array2string separator, prefix) File "/usr/local/lib/python2.4/site-packages/numpy/core/arrayprint.py", line 158, in _array2string format = _floatFormat(data, precision, suppress_small) File "/usr/local/lib/python2.4/site-packages/numpy/core/arrayprint.py", line 273, in _floatFormat non_zero = _uf.absolute(data.compress(_uf.not_equal(data, 0))) ValueError: Cannot iterate over a size-0 array ---------------------------------------------------------------------- Ran 183 tests in 2.905s From pearu at scipy.org Sun Jan 8 03:58:08 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 8 Jan 2006 02:58:08 -0600 (CST) Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: <43C0BBE6.9070204@ieee.org> References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: On Sun, 8 Jan 2006, Travis Oliphant wrote: >> Thanks for this corner case. Actually, it's getting whatever is in the >> memory allocated. >> >> Try >> >> b = array('',x.dtype) >> >> b should be a size-0 array, (there is actually 1 elementsize of memory >> allocated for it though). >> >> The mapping code is iterating over b and resetting b whenever it runs >> out of elements of b (every time in this case), except the data pointed >> to by the iterator is never valid so this should not be allowed.... >> >> I suppose instead, creating an iterator from a size-0 array could raise >> an error. This would be a great and simple place to catch this in the >> code as well. >> > This now raises an error.... I guess the following behaviour is related to recent changes on svn, as well as causing some of the unittests to fail: In [1]: from numpy import * In [2]: a=array([]) In [3]: equal(a,a) --------------------------------------------------------------------------- exceptions.ValueError Traceback (most recent call last) /home/pearu/ ValueError: Cannot iterate over a size-0 array In [5]: a==a Out[5]: False In [6]: a is a Out[6]: True I would expect that empty sets are always equal to each other. Pearu From arnd.baecker at web.de Sun Jan 8 06:38:30 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 8 Jan 2006 12:38:30 +0100 (CET) Subject: [SciPy-dev] numpy - dual.py problems In-Reply-To: <43C08DE8.6000909@ieee.org> References: <43C08DE8.6000909@ieee.org> Message-ID: Hi Travis, On Sat, 7 Jan 2006, Travis Oliphant wrote: > Arnd Baecker wrote: > > >Hi, > > > >A typical scenario for "end-users" is the following: > >- people will have Numeric/numarray + old scipy/old new scipy > > on their machines. > > > > In many cases this is the system-wide installation as done by > > the root-user (eg. via some package manager) > > > > The "end-user" has no root rights. > >- The "end-user" hears about the great progress wrt numpy/scipy combo > > and wants to test it out. > > > > He downloads numpy and installs it to some place in his > > homedirectory via > > > > python setup.py install --prefix=<~/somewhere> > > > > and sets his PYTHONPATH accordingly > >- Then `import numpy` will work, but a > > `numpy.test(10)` will fail because `dual.py` > > picks his old scipy (which will be visible from the > > traceback, if he looks carefully at the path names). 
> > > >- Consequence, the "end-user" will either ask a question on the > > mailing list or just quit his experiment and continue > > to work with his old installation. > > > This has been fixed now so that it will only use scipy if it can find > version 0.4.4 or higher... Great - I think this will prevent a lot of problems! > >Later on Travis' wrote: > >"""The solution is that now to get at functions that are in both numpy and > >scipy and you want the scipy ones first and default to the numpy ones if > >scipy is not installed, there is a numpy.dual module that must be > >loaded separately that contains all the overlapping functions.""" > > > >I think this is fine, if a user does this in his own code, > >but I have found the following `from numpy.dual import`s within numpy > > core/defmatrix.py: from numpy.dual import inv > > lib/polynomial.py: from numpy.dual import eigvals, lstsq > > lib/mlab.py:from numpy.dual import eig, svd > > lib/function_base.py: from numpy.dual import i0 > > > BTW, these are all done inside of a function call. Yes - I saw that. > I want to be able > to use the special.i0 method when its available inside numpy (for kaiser > window). I want to be able to use a different inverse for matrix > inverses and better eig and svd for polynomial root finding. Are these numerically better, or just faster? I very much understand your point of view on this (until Fernando's mail I would have silently agreed ;-). On the other I think that Fernando's point, that the mere installation of scipy will change the behaviour of numpy implicitely, without the user being aware of this or having asked for the change. Now, it could be that this works fine in 99.9% of the cases, but if it does not, it might be very hard to track down. So I am still thinking that something like a numpy.enable_scipy_functions() might be a better approach. Let me give another example: when we use highly optimized routines for parallel computers (namely `scsl`) for linear algebra, the whole things screws up (just hangs), if the number of CPUs is not set before by the environment variable OMP_NUM_THREADS. Assume we have some code which should run on just one cpu and should just use numpy. Then the implicit imports brings in the other library and will hang. > So, I don't see this concept of enhancing internal functions going > away. Now, I don't see the current numpy.dual approach as the > *be-all*. I think it can be improved on. In fact, I suppose some > mechanism for registering replacement functions should be created > instead of giving special place to SciPy. SciPy could then call these > functions. This could all be done inside of numpy.dual. So, I think > the right structure is there.... Anyway, sorry if I am wasting your time with this discussion, I don't feel too strongly about this point (especially after the version check), maybe Fernando would like to add something - also I have to move on to other stuff (but whom do I tell that ;-). Best, Arnd From a.h.jaffe at gmail.com Sun Jan 8 07:28:33 2006 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Sun, 08 Jan 2006 12:28:33 +0000 Subject: [SciPy-dev] not *more* lists! 
In-Reply-To: <43C08BED.6030401@gmail.com> References: <200601070358.k073wMgd028198@oobleck.astro.cornell.edu> <43C084FC.3070308@ieee.org> <43C08803.8040508@gmail.com> <43C0893E.5050104@ieee.org> <43C08BED.6030401@gmail.com> Message-ID: Robert Kern wrote: > Travis Oliphant wrote: >>Robert Kern wrote: > >>>I think guidelines would be much simpler if we had only 3 lists: >>>1) numpy >>>2) scipy >>>3) numpy-discussion (for Numeric and numarray; unfortunate name) >>> >>>I've never seen the -user/-devel split as particularly useful for scipy, nor >>>have I found the single list for Numeric users and developers as harmful. >> >>I don't have a problem with that. >> >>I suppose we could just post to the single lists and keep a separate dev >>list silent until such time that developers are overwhelemed with the >>number of postings to the list ;-) (that seems to be what Python itself >>did). > > Lists that exist will be posted to by somebody, so if we do go with a 3-list > strategy, we probably need to prune the extras. Just to reiterate part of a previous, related, discussion: it would be great if all of the lists were accesible as newsgroups through (e.g.) gmane. Andrew From ndarray at mac.com Sun Jan 8 09:54:39 2006 From: ndarray at mac.com (Sasha) Date: Sun, 8 Jan 2006 09:54:39 -0500 Subject: [SciPy-dev] 0.9.3.1853 FAILED (failures=1, errors=6) Message-ID: I've fixed the first error in svn. Will look is the others could be related to ma . -- sasha From ndarray at mac.com Sun Jan 8 10:41:06 2006 From: ndarray at mac.com (ndarray at mac.com) Date: Sun, 8 Jan 2006 10:41:06 -0500 Subject: [SciPy-dev] 0.9.3.1853 FAILED (failures=1, errors=6) In-Reply-To: References: Message-ID: <27725A75-0AF3-49B9-A73C-BC72E78ECCB4@mac.com> I was too quick accepting the blame for the test_ma failure. The following session log demonstrates the underlying problem in fancy subscript assignment: >>> x = array([[1]]) >>> m = array([[0]], bool) >>> x[m] = 2 Traceback (most recent call last): File "", line 1, in ? ValueError: Cannot iterate over a size-0 array Travis, I can look into this if your hands are full. -- sasha On Jan 8, 2006, at 9:54 AM, Sasha wrote: > I've fixed the first error in svn. Will look is the others could be > related to ma . > > -- sasha > From schofield at ftw.at Sun Jan 8 11:37:03 2006 From: schofield at ftw.at (Ed Schofield) Date: Sun, 08 Jan 2006 16:37:03 +0000 Subject: [SciPy-dev] not *more* lists! In-Reply-To: <43C084FC.3070308@ieee.org> References: <200601070358.k073wMgd028198@oobleck.astro.cornell.edu> <43C084FC.3070308@ieee.org> Message-ID: <43C13FAF.30205@ftw.at> Travis Oliphant wrote: >This is the way I see it. > >1) There is a need for list separation between NumPy and SciPy (there >are enough people not interested in full SciPy who want to be able to >discuss NumPy). > >2) I'd like to keep Numeric / Numarray traffic on the list they are on now. > > Why not merge the numpy-user and numpy-discussion lists? This would be another step towards unifying the Numeric/numarray/numpy communities. Then the 'numpy-discussions' name would be an assert, rather than an unfortunate by-product of a bygone era. The traffic on numpy-discussions is quite light, and some of it is already about comparisons and distinctions between the three packages. 
I doubt many of the current numpy-discussions subscribers would mind if the discussion gradually became more focused on numpy, especially since Numeric is now dead and, well, the list *is* called numpy-discussion ;) -- Ed From Fernando.Perez at colorado.edu Sun Jan 8 11:44:25 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 08 Jan 2006 09:44:25 -0700 Subject: [SciPy-dev] numpy - dual.py problems In-Reply-To: References: <43C08DE8.6000909@ieee.org> Message-ID: <43C14169.3070707@colorado.edu> Arnd Baecker wrote: > Hi Travis, > > On Sat, 7 Jan 2006, Travis Oliphant wrote: >>>Later on Travis' wrote: >>>"""The solution is that now to get at functions that are in both numpy and >>>scipy and you want the scipy ones first and default to the numpy ones if >>>scipy is not installed, there is a numpy.dual module that must be >>>loaded separately that contains all the overlapping functions.""" >>> >>>I think this is fine, if a user does this in his own code, >>>but I have found the following `from numpy.dual import`s within numpy >>> core/defmatrix.py: from numpy.dual import inv >>> lib/polynomial.py: from numpy.dual import eigvals, lstsq >>> lib/mlab.py:from numpy.dual import eig, svd >>> lib/function_base.py: from numpy.dual import i0 >>> >> >>BTW, these are all done inside of a function call. > > > Yes - I saw that. > > >>I want to be able >>to use the special.i0 method when its available inside numpy (for kaiser >>window). I want to be able to use a different inverse for matrix >>inverses and better eig and svd for polynomial root finding. > > > Are these numerically better, or just faster? > > I very much understand your point of view on this > (until Fernando's mail I would have silently agreed ;-). > On the other I think that Fernando's point, > that the mere installation of scipy > will change the behaviour of numpy implicitely, > without the user being aware of this > or having asked for the change. > > Now, it could be that this works fine in 99.9% of > the cases, but if it does not, it might > be very hard to track down. > > So I am still thinking that something like a > numpy.enable_scipy_functions() > might be a better approach. [...] >>So, I don't see this concept of enhancing internal functions going >>away. Now, I don't see the current numpy.dual approach as the >>*be-all*. I think it can be improved on. In fact, I suppose some >>mechanism for registering replacement functions should be created >>instead of giving special place to SciPy. SciPy could then call these >>functions. This could all be done inside of numpy.dual. So, I think >>the right structure is there.... > > > Anyway, sorry if I am wasting your time with this > discussion, I don't feel too strongly about this point > (especially after the version check), > maybe Fernando would like to add something - > also I have to move on to other stuff > (but whom do I tell that ;-). Well, I do think that having code like (current SVN): abdul[numpy]> egrep -r 'from numpy.dual' * | grep -v '\.svn/' core/defmatrix.py: from numpy.dual import inv dual.py:# Usage --- from numpy.dual import fft, inv lib/function_base.py: from numpy.dual import i0 lib/polynomial.py: from numpy.dual import eigvals, lstsq lib/mlab.py:from numpy.dual import eig, svd random/mtrand/mtrand.pyx: from numpy.dual import svd sort of defeats the whole purpose of dual, doesn't it? dual is meant to isolate the contributions from full scipy, so that the _existence_ of scipy isn't a hidden side-effect for numpy. 
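Schematically, the pattern that dual-style dispatch implements is something like this (not the actual numpy.dual code, just the shape of it):

# illustrative only -- the point is the silent branch on scipy's presence
try:
    from scipy.linalg import inv    # picked up automatically if scipy is installed
except ImportError:
    from numpy.linalg import inv    # plain numpy fallback otherwise

So the mere act of installing scipy changes which `inv` numpy's own internals end up calling, which is exactly the hidden side effect I'm worried about.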
If this is the case, then I think there shouldn't be code in numpy ever doing 'from dual import...'. Otherwise, we might as well go back to simply writing 'from scipy import ...' as before, no? Just like in scipy not all packages are auto-loaded (see Pearu's response on that today) and you have to call scipy.pkgload() if you want the whole thing in memory, I do think that numpy should be strict about the no-hidden side-effects policy suggested by dual. Providing a numpy.load_dual() or numpy.enable_scipy() or something would be OK, but I think it should be done explicitly. Note that if an explicit call is made, it should set a global (numpy-level) flag, so that any code can check for this condition: In numpy.__init__, we should have scipy_dual_loaded = False def dual_load(): global scipy_dual_loaded from dual import ... scipy_dual_loaded = True This will at least let you check whether this thing was called by other libraries you may be importing. I am trying to ensure that we have a mechanism for tracking this kind of side effect, because the call could be made by code you didn't write yourself. With this, at least you can do something like: if _something_weird and numpy.scipy_dual_loaded: print 'scipy effects, check for conflicts in that direction' Ultimately, I think that this should be reversible, with a dual_unload() matching routine, but that's icing on the cake. I do feel that at least the explicit dual_load() is the technically correct solution, even at a (minor) loss of convenience. Cheers, f From Fernando.Perez at colorado.edu Sun Jan 8 11:50:46 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 08 Jan 2006 09:50:46 -0700 Subject: [SciPy-dev] =?windows-1252?q?=5BSlightly_OT=5D_Karl_Fogel=27s_=93?= =?windows-1252?q?Producing_Open_Source_Software=94?= Message-ID: <43C142E6.3020605@colorado.edu> Hi all, a few months ago, Robert Kern recommended this book: Producing Open Source Software How to Run a Successful Free Software Project By Karl Fogel http://producingoss.com/ and I've found it to be a very useful collection of tips and ideas regarding the running of an OSS project. A groklaw.net review came out today http://www.groklaw.net/article.php?story=20060105010817642 which made me think this might be of interest to some in our group. While I think that scipy in general is running great, there are many good ideas in that book in terms of things to keep in mind to make the environment around a project operate better. Back to your regularly scheduled namespace discussions :) Cheers, f From chris at trichech.us Sun Jan 8 12:07:12 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Sun, 8 Jan 2006 12:07:12 -0500 Subject: [SciPy-dev] CBLAS on OSX Message-ID: I was under the impression that both CBLAS and LAPACK were provided by the Apple Developer Tools. However, during a test of scipy, I get the following message: **************************************************************** WARNING: cblas module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses fblas instead of cblas. **************************************************************** Is this message in error, or is something really missing? Thanks, C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. 
Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From ndarray at mac.com Sun Jan 8 10:56:35 2006 From: ndarray at mac.com (Sasha) Date: Sun, 8 Jan 2006 15:56:35 +0000 (UTC) Subject: [SciPy-dev] Fancy indexing curiosity References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: Pearu Peterson scipy.org> writes: ... > > In [5]: a==a > Out[5]: False > > In [6]: a is a > Out[6]: True > > I would expect that empty sets are always equal to each other. I would think that since "==" is element-wise, it should return an empty array when applied to two empty arrays, not "True". - sasha From ndarray at mac.com Sun Jan 8 13:58:44 2006 From: ndarray at mac.com (Sasha) Date: Sun, 8 Jan 2006 13:58:44 -0500 Subject: [SciPy-dev] 0.9.3.1853 FAILED (failures=1, errors=6) In-Reply-To: <27725A75-0AF3-49B9-A73C-BC72E78ECCB4@mac.com> References: <27725A75-0AF3-49B9-A73C-BC72E78ECCB4@mac.com> Message-ID: Attached patch fixes the problem and passes numpy.test(). -- sasha On 1/8/06, ndarray at mac.com wrote: > I was too quick accepting the blame for the test_ma failure. The > following session log demonstrates the underlying problem in fancy > subscript assignment: > > >>> x = array([[1]]) > >>> m = array([[0]], bool) > >>> x[m] = 2 > Traceback (most recent call last): > File "", line 1, in ? > ValueError: Cannot iterate over a size-0 array > > Travis, I can look into this if your hands are full. > > -- sasha > > On Jan 8, 2006, at 9:54 AM, Sasha wrote: > > > I've fixed the first error in svn. Will look is the others could be > > related to ma . > > > > -- sasha > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: iterate-over-size-0-patch.txt URL: From pearu at scipy.org Sun Jan 8 13:10:31 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 8 Jan 2006 12:10:31 -0600 (CST) Subject: [SciPy-dev] 0.9.3.1853 FAILED (failures=1, errors=6) In-Reply-To: References: <27725A75-0AF3-49B9-A73C-BC72E78ECCB4@mac.com> Message-ID: On Sun, 8 Jan 2006, Sasha wrote: > Attached patch fixes the problem and passes numpy.test(). Sure. But this just undoes Travis recent addition: http://projects.scipy.org/scipy/numpy/changeset/1853 I'm not familiar with the implications but shouldn't PyArray_IterNew return an "empty" iterator for empty arrays. Pearu From ndarray at mac.com Sun Jan 8 14:18:23 2006 From: ndarray at mac.com (Sasha) Date: Sun, 8 Jan 2006 14:18:23 -0500 Subject: [SciPy-dev] 0.9.3.1853 FAILED (failures=1, errors=6) In-Reply-To: References: <27725A75-0AF3-49B9-A73C-BC72E78ECCB4@mac.com> Message-ID: On 1/8/06, Pearu Peterson wrote: > > Sure. But this just undoes Travis recent addition: > > http://projects.scipy.org/scipy/numpy/changeset/1853 > I know. That's why I did not commit the change. Note, however that my patch undoes only part of Travis' change to arrayobject.c . The if (it->size == 0) condition if triggered also seems to introduce a memory leak. I guess we should wait to hear from Travis. 
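For reference, this is the behaviour the patch restores -- assigning through an all-false mask is simply a no-op (session sketch, assuming the fix is applied):

>>> x = array([[1]])
>>> m = array([[0]], bool)
>>> x[m] = 2
>>> x
array([[1]])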
-- sasha From loredo at astro.cornell.edu Sun Jan 8 15:18:03 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Sun, 8 Jan 2006 15:18:03 -0500 (EST) Subject: [SciPy-dev] Advice on numpy/scipy namespace issue (ipython etc.) Message-ID: <200601082018.k08KI3v14362@laplace.astro.cornell.edu> Hi folks, Okay, as one who complained about the lack of numpy top-level objects in scipy's top-level namespace, I'm pursuaded by Travis and Pearu that a separation should be seriously considered, and that now is the time. I'd like some advice about handling the namespace split properly. I know that "from ... import *" is bad behavior, but particularly for interactive use it's sometimes irresistable. If I do from numpy import * from scipy import * is there any namespace collision I should be aware of? Are there any scipy modules or top-level functions with the same name as numpy's (or vice versa)? Both packages have "fft" at the top level, for example. "ipython -p scipy" imports top-level scipy into its namespace, but not top-level numpy. This doesn't seem right to me---surely one will be using array and zeros more often than fftpack---but the "import *" issue has to be settled if we want "-p scipy" to have all the handy numpy stuff at its top level. As a more specific example, what exactly is the difference between numpy.fft and scipy.fft? As a quick exploration of the issue, I ran this: import numpy, scipy nall = numpy.__all__ for name in scipy.__all__: if name in nall: print name The result was only 3 names: lib, fft, ifft. lib in particular is quite different in scipy vs. numpy. If the common name is kept, perhaps some convention should be recommended for * use, as in ipython, so users can count on lib meaning the same thing at an interactive prompt. It would probably be better to rename one of the libs (e.g., "mlib" for scipy's since it's a matrix library, vs. numpy's more general-purpose lib), but I suppose it's late for that. I don't know if the fft name collision is an issue. Thanks, Tom From schofield at ftw.at Sun Jan 8 15:23:54 2006 From: schofield at ftw.at (Ed Schofield) Date: Sun, 08 Jan 2006 20:23:54 +0000 Subject: [SciPy-dev] SciPy 0.4.4 release -- final fixes Message-ID: <43C174DA.6060004@ftw.at> Hi all, I've been updating the text file docs before the 0.4.4 release. I'd like to request that someone knowledgeable (Pearu? :) go over the instructions, particularly in INSTALL.txt and DEVELOPERS.txt, to make sure they're okay now. Now's also the time for any last-minute documentation updates before the release :) -- Ed From pearu at scipy.org Sun Jan 8 14:42:59 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 8 Jan 2006 13:42:59 -0600 (CST) Subject: [SciPy-dev] Advice on numpy/scipy namespace issue (ipython etc.) In-Reply-To: <200601082018.k08KI3v14362@laplace.astro.cornell.edu> References: <200601082018.k08KI3v14362@laplace.astro.cornell.edu> Message-ID: On Sun, 8 Jan 2006, Tom Loredo wrote: > Okay, as one who complained about the lack of numpy top-level > objects in scipy's top-level namespace, I'm pursuaded by Travis > and Pearu that a separation should be seriously considered, and > that now is the time. I'd like some advice about handling > the namespace split properly. > > I know that "from ... import *" is bad behavior, but particularly > for interactive use it's sometimes irresistable. If I do > > from numpy import * > from scipy import * > > is there any namespace collision I should be aware of? The above import statements are safe (at least in theory). 
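For the record, the full overlap can be listed directly:

import numpy, scipy
print [name for name in scipy.__all__ if name in numpy.__all__]
# before the recent change that pulls all numpy names into scipy,
# this gives just: fft, ifft, linalg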
> Are there any scipy modules or top-level functions with the same name > as numpy's (or vice versa)? Both packages have "fft" at the top level, > for example. > > "ipython -p scipy" imports top-level scipy into its namespace, but not > top-level numpy. This doesn't seem right to me---surely one will be > using array and zeros more often than fftpack---but the "import *" > issue has to be settled if we want "-p scipy" to have all the > handy numpy stuff at its top level. ipython profiles should be here updated for numpy-0.9/scipy-0.4.4. > As a more specific example, what exactly is the difference > between numpy.fft and scipy.fft? > > As a quick exploration of the issue, I ran this: > > import numpy, scipy > > nall = numpy.__all__ > > for name in scipy.__all__: > if name in nall: print name > > The result was only 3 names: lib, fft, ifft. lib in particular is > quite different in scipy vs. numpy. If the common name is kept, > perhaps some convention should be recommended for * use, as in ipython, > so users can count on lib meaning the same thing at an interactive > prompt. It would probably be better to rename one of the libs (e.g., > "mlib" for scipy's since it's a matrix library, vs. numpy's more > general-purpose lib), but I suppose it's late for that. I don't know > if the fft name collision is an issue. You must be using older numpy/scipy, numpy.__all__ does not contain 'lib' anymore. In addtion, numpy namespace (as defined by numpy.__all__) *-imports numpy.core and numpy.lib names. So, one should not use numpy.lib. but numpy. instead. So, the name `lib` in numpy will not cause any problems. numpy and scipy have the following public objects with the same name (before Travis patch on importing numpy symbols to scipy name space): fft, ifft, linalg. I think numpy.linalg should be renamed, say, to numpy.corelinalg (need better name here), similar as numpy.fftpack was renamed to numpy.dft. It appears that with the Travis patch, scipy.fft is actually numpy.dft.fft. This is a bug and has a easy fix (I am starting to think that importing numpy to scipy namespace is still a bad idea). Othervise fft, ifft should not cause any problems if both scipy and numpy define them. Thanks, Pearu From pearu at scipy.org Sun Jan 8 15:17:28 2006 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 8 Jan 2006 14:17:28 -0600 (CST) Subject: [SciPy-dev] SciPy 0.4.4 release -- final fixes In-Reply-To: <43C174DA.6060004@ftw.at> References: <43C174DA.6060004@ftw.at> Message-ID: On Sun, 8 Jan 2006, Ed Schofield wrote: > Hi all, > > I've been updating the text file docs before the 0.4.4 release. I'd > like to request that someone knowledgeable (Pearu? :) go over the > instructions, particularly in INSTALL.txt and DEVELOPERS.txt, to make > sure they're okay now. I quickly went over scipy/*.txt files, removed some old text and fixed comments here and there.. 
I'm sure there are bugs but hopefully they will reveal themselfs;) > Now's also the time for any last-minute documentation updates before the > release :) Thanks, Pearu From ndarray at mac.com Sun Jan 8 18:20:02 2006 From: ndarray at mac.com (Sasha) Date: Sun, 8 Jan 2006 18:20:02 -0500 Subject: [SciPy-dev] Core dump in PyArray_Return Message-ID: I cannot reproduce the problem with a small example, but the following change fixes it: Index: numpy/core/src/arrayobject.c =================================================================== --- numpy/core/src/arrayobject.c (revision 1855) +++ numpy/core/src/arrayobject.c (working copy) @@ -915,8 +915,7 @@ Py_XDECREF(mp); return NULL; } - - if (mp->nd == 0) { + if (mp->nd == 0 && !PyArray_IsScalar(mp, Generic)) { PyObject *ret; ret = PyArray_ToScalar(mp->data, mp); Py_DECREF(mp); Apparently in my program PyArray_Return somehow gets an object that is already a scalar and calls PyArray_ToScalar on it. Is it a legitimate scenario, or a symptom of a problem elsewhere? If PyArray_Return can legitimately receive scalars, maybe the code should be changed as above. Alternatively PyArray_IsScalar could be modified to pass scalars unchanged. -- sasha From schofield at ftw.at Sun Jan 8 20:29:56 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 9 Jan 2006 01:29:56 +0000 Subject: [SciPy-dev] SciPy 0.4.4 release -- final fixes In-Reply-To: References: <43C174DA.6060004@ftw.at> Message-ID: <69CDA0EC-1E39-4F6C-B0F2-EFAD1693AE57@ftw.at> On 08/01/2006, at 8:17 PM, Pearu Peterson wrote: > On Sun, 8 Jan 2006, Ed Schofield wrote: > >> Hi all, >> >> I've been updating the text file docs before the 0.4.4 release. I'd >> like to request that someone knowledgeable (Pearu? :) go over the >> instructions, particularly in INSTALL.txt and DEVELOPERS.txt, to make >> sure they're okay now. > > I quickly went over scipy/*.txt files, removed some old text and fixed > comments here and there.. I'm sure there are bugs but hopefully > they will > reveal themselfs;) Thanks, Pearu :) -- Ed From chris at trichech.us Sun Jan 8 21:45:08 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Sun, 8 Jan 2006 21:45:08 -0500 Subject: [SciPy-dev] f2py numpy array error Message-ID: <07C3DDA6-83FB-4214-8F8F-DFE16F267153@trichech.us> I have some f2py code that takes arrays as arguments. Unfortunately, the didnt seem to survive the move to numpy: (Pdb) p p array([ 0.1, 0.1]) (Pdb) _binomial(x, n, p) *** TypeError: flib.binomial() 3rd argument (p) can't be converted to float As you can see, the third argument (p) is clearly a float, yet it does not work. Here is the corresponding FORTRAN: SUBROUTINE binomial(x,n,p,m,like) c Binomial log-likelihood function cf2py integer dimension(m),intent(in) :: x cf2py integer intent(in) :: n cf2py real intent(in) :: p cf2py integer intent(hide),depend(x) :: m=len(x) cf2py real intent(out) :: like REAL like,p INTEGER n,m,i INTEGER x(m) like = 0.0 do i=1,m like = like + x(i)*log(p) + (n-x(i))*log(1.-p) like = like + factln(n)-factln(x(i))-factln(n-x(i)) enddo return END -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... 
Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From oliphant.travis at ieee.org Sun Jan 8 22:03:53 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 08 Jan 2006 20:03:53 -0700 Subject: [SciPy-dev] Core dump in PyArray_Return In-Reply-To: References: Message-ID: <43C1D299.6050306@ieee.org> Sasha wrote: >I cannot reproduce the problem with a small example, but the following >change fixes it: > >Index: numpy/core/src/arrayobject.c >=================================================================== >--- numpy/core/src/arrayobject.c (revision 1855) >+++ numpy/core/src/arrayobject.c (working copy) >@@ -915,8 +915,7 @@ > Py_XDECREF(mp); > return NULL; > } >- >- if (mp->nd == 0) { >+ if (mp->nd == 0 && !PyArray_IsScalar(mp, Generic)) { > PyObject *ret; > ret = PyArray_ToScalar(mp->data, mp); > Py_DECREF(mp); > >Apparently in my program PyArray_Return somehow gets an object that is >already a scalar and calls PyArray_ToScalar on it. > I've seen this before during development. It typically is due to a PyArray_Return on something that already returns a scalar. It would be nice to know where the core dump occurred. I think a check like this is probably in order though... just added it. I also undid the size-0 check. It was added to fix the "problem" that several places in the code where an array can be copied into another array many times. That is a size-10 array can be used to fill a size-100 array by repeating over it 10 times. The code uses iterators to do it, and there are a few places where you can even put a size-20 array into a size-13 array and a fraction of it will be iterated over. I guess the universal check is to stringent and the size-0 check should be placed in those places where copying data from a size-0 array makes no sense... -Travis From loredo at astro.cornell.edu Sun Jan 8 22:08:25 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Sun, 8 Jan 2006 22:08:25 -0500 Subject: [SciPy-dev] f2py numpy array error In-Reply-To: <371B1F87-EFA4-45AF-B5C4-982AD5D884CA@trichech.us> References: <200601081956.k08JuAj14354@laplace.astro.cornell.edu> <371B1F87-EFA4-45AF-B5C4-982AD5D884CA@trichech.us> Message-ID: <1136776105.43c1d3a92e9c4@astrosun2.astro.cornell.edu> > I have some f2py code that takes arrays as arguments. Unfortunately, > the didnt seem to survive the move to numpy: I'm not sure if this is the same problem, or even whether it is a "real" problem or a result of my own misunderstanding, but through all the recent revisions of scipy_core I have found that I cannot send a Float *scalar array* to an f2py-wrapped function that requires a real*8 scalar argument. I have to manually convert it via "float(value)". It seems to me the wrapper should automatically handle this, but perhaps I'm missing something. If I just use "float(value)" the wrapped function works fine through the latest revision of scipy_core. I'm just about to try all this stuff on the new numpy.... -Tom ------------------------------------------------- This mail sent through IMP: http://horde.org/imp/ From oliphant.travis at ieee.org Sun Jan 8 23:00:26 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 08 Jan 2006 21:00:26 -0700 Subject: [SciPy-dev] f2py numpy array error In-Reply-To: <07C3DDA6-83FB-4214-8F8F-DFE16F267153@trichech.us> References: <07C3DDA6-83FB-4214-8F8F-DFE16F267153@trichech.us> Message-ID: <43C1DFDA.5000002@ieee.org> Christopher Fonnesbeck wrote: > I have some f2py code that takes arrays as arguments. 
Unfortunately, > the didnt seem to survive the move to numpy: > > (Pdb) p p > array([ 0.1, 0.1]) > (Pdb) _binomial(x, n, p) > *** TypeError: flib.binomial() 3rd argument (p) can't be converted to > float > > As you can see, the third argument (p) is clearly a float, yet it > does not work. Here is the corresponding FORTRAN: The third argument is not a float it is an *array* of floats. Did you expect an array of floats to be interpreted as a single float. What value did you expect it to be interpreted as? > > SUBROUTINE binomial(x,n,p,m,like) > > c Binomial log-likelihood function > > cf2py integer dimension(m),intent(in) :: x > cf2py integer intent(in) :: n > cf2py real intent(in) :: p It looks like p should be a float, *not* an array. I'm confused as to what the problem is here... -Travis From oliphant.travis at ieee.org Sun Jan 8 23:02:27 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 08 Jan 2006 21:02:27 -0700 Subject: [SciPy-dev] f2py numpy array error In-Reply-To: <1136776105.43c1d3a92e9c4@astrosun2.astro.cornell.edu> References: <200601081956.k08JuAj14354@laplace.astro.cornell.edu> <371B1F87-EFA4-45AF-B5C4-982AD5D884CA@trichech.us> <1136776105.43c1d3a92e9c4@astrosun2.astro.cornell.edu> Message-ID: <43C1E053.2010904@ieee.org> Tom Loredo wrote: >>I have some f2py code that takes arrays as arguments. Unfortunately, >>the didnt seem to survive the move to numpy: >> >> > >I'm not sure if this is the same problem, or even whether it is >a "real" problem or a result of my own misunderstanding, but >through all the recent revisions of scipy_core I have found >that I cannot send a Float *scalar array* to an f2py-wrapped >function that requires a real*8 scalar argument. I have >to manually convert it via "float(value)". It seems to me >the wrapper should automatically handle this, but perhaps >I'm missing something. > > This should work. Please post the error you are seeing and give details about the function to be wrapped... -Travis From schofield at ftw.at Sun Jan 8 23:31:56 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 9 Jan 2006 04:31:56 +0000 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43C090D6.9010108@ieee.org> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> Message-ID: <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> On 08/01/2006, at 4:11 AM, Travis Oliphant wrote: > This is what I do (check this set of instructions for errors, it's off > the top of my head). > > ... Hi Travis, Thanks for the tips! They were very helpful, and I'm getting there. I still have some questions: 1. I uninstalled Python 2.4 and installed Python 2.3, figuring that SciPy binaries built against Python 2.3 would be upwardly compatible, but perhaps not vice versa. Is this true? I noticed you built separate binaries for NumPy for Python 2.3 and 2.4 on Windows but just one set on Linux -- why? 2. I've been getting various LAPACK errors since I moved away my custom-rolled ATLAS/LAPACK libraries, but things are looking better now. I presume that the Linux binaries should be built without an ATLAS dependency? If I link against /ed's/own/atlas/library.so, the binaries will be useless to anyone but me, right? ... It makes me wonder if we need Linux binaries at all. I've noticed that numarray only distributes Windows binaries, probably since the source tarball is generally far more useful, and because Linux distributors will soon package SciPy themselves anyway. Could we do this too? 3. Are there any Windows linking traps to be aware of? 
I presume we link SciPy against an ATLAS DLL that gets rolled up into the .exe file ... ---------- Just out of interest, it seems Mandrake 10 ships with an incomplete LAPACK (4.3 MB, rather than Debian's 5.3 MB), with various missing symbols. Why do they *do* this?! And the NumPy / SciPy setup scripts are really very smart -- good job, guys! Witness this: atlas_blas_info: FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/home/schofield/Tools/lib/atlas_moved_away'] language = c Doh! ;) -- Ed From robert.kern at gmail.com Sun Jan 8 23:42:12 2006 From: robert.kern at gmail.com (Robert Kern) Date: Sun, 08 Jan 2006 22:42:12 -0600 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> Message-ID: <43C1E9A4.30508@gmail.com> Ed Schofield wrote: > 1. I uninstalled Python 2.4 and installed Python 2.3, figuring that > SciPy binaries built against Python 2.3 would be upwardly compatible, > but perhaps not vice versa. Is this true? I noticed you built > separate binaries for NumPy for Python 2.3 and 2.4 on Windows but > just one set on Linux -- why? You need to compile separate versions for Python 2.3 and 2.4. Python 2.x binaries are not compatible for different "x". Python 2.x.y binaries *are* compatible for the same "x" but different "y". > 2. I've been getting various LAPACK errors since I moved away my > custom-rolled ATLAS/LAPACK libraries, but things are looking better > now. I presume that the Linux binaries should be built without an > ATLAS dependency? If I link against /ed's/own/atlas/library.so, the > binaries will be useless to anyone but me, right? ... Pretty much. If you use static libraries, then they would be useful, but generally Linux binaries aren't terribly useful if they aren't tailored for a specific distribution. > It makes me wonder if we need Linux binaries at all. I've noticed > that numarray only distributes Windows binaries, probably since the > source tarball is generally far more useful, and because Linux > distributors will soon package SciPy themselves anyway. Could we do > this too? Probably. > 3. Are there any Windows linking traps to be aware of? I presume we > link SciPy against an ATLAS DLL that gets rolled up into the .exe > file ... You should use the static libraries provided here: http://www.scipy.org/download/atlasbinaries/ -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant.travis at ieee.org Mon Jan 9 01:38:20 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Sun, 08 Jan 2006 23:38:20 -0700 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> Message-ID: <43C204DC.2030902@ieee.org> Ed Schofield wrote: >On 08/01/2006, at 4:11 AM, Travis Oliphant wrote: > > > >>This is what I do (check this set of instructions for errors, it's off >>the top of my head). >> >>... >> >> > >Hi Travis, >Thanks for the tips! They were very helpful, and I'm getting there. >I still have some questions: > >1. I uninstalled Python 2.4 and installed Python 2.3, figuring that >SciPy binaries built against Python 2.3 would be upwardly compatible, >but perhaps not vice versa. Is this true? 
I noticed you built >separate binaries for NumPy for Python 2.3 and 2.4 on Windows but >just one set on Linux -- why? > > Most people on Linux build it themselves or one of the distribution packagers builds it for the distribution, I just put the binary rpm up because I need to build a binary to test the distribution so it's there. I'm not sure how useful it is... >2. I've been getting various LAPACK errors since I moved away my >custom-rolled ATLAS/LAPACK libraries, but things are looking better >now. I presume that the Linux binaries should be built without an >ATLAS dependency? If I link against /ed's/own/atlas/library.so, the >binaries will be useless to anyone but me, right? ... > > Actually, ATLAS is linked statically (AFAIK) so it is actually one benefit of a binary... >It makes me wonder if we need Linux binaries at all. I've noticed >that numarray only distributes Windows binaries, probably since the >source tarball is generally far more useful, and because Linux >distributors will soon package SciPy themselves anyway. Could we do >this too? > > Yes... I wouldn't stress about the Linux binary. But, I do think you need to test the binary build from the newly-created source distribution before uploading the tarball. >3. Are there any Windows linking traps to be aware of? I presume we >link SciPy against an ATLAS DLL that gets rolled up into the .exe >file ... > > Again, I think it's statically linked. I just do a bdist_wininst command. If you don't have a distutils.cfg file, you do need to make sure and use config --compiler=mingw32 and build --compiler=mingw32 (unless you have the same compiler that built Python, then great...) Best, -travis From Fernando.Perez at colorado.edu Mon Jan 9 02:24:44 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 09 Jan 2006 00:24:44 -0700 Subject: [SciPy-dev] Advice on numpy/scipy namespace issue (ipython etc.) In-Reply-To: <200601082018.k08KI3v14362@laplace.astro.cornell.edu> References: <200601082018.k08KI3v14362@laplace.astro.cornell.edu> Message-ID: <43C20FBC.1090201@colorado.edu> Tom Loredo wrote: > "ipython -p scipy" imports top-level scipy into its namespace, but not > top-level numpy. This doesn't seem right to me---surely one will be > using array and zeros more often than fftpack---but the "import *" > issue has to be settled if we want "-p scipy" to have all the > handy numpy stuff at its top level. Note that, while Pearu did address this issue already, it's worth mentioning that profiles live in a user-writeable directory (~/.ipython), so you can customize the default scipy profile (~/.ipython/ipythonrc-scipy) to do exactly what you need it to. I will release ipython 0.7.0 on Tuesday, but I don't think I'll change the scipy profile yet, as this is still far too experimental. Hopefully when 0.7.1 rolls around with the inevitable bugfixes (0.7.0 has a TON of new features), numpy/scipy will have settled enough that I'll switch the default profile to use them. This isn't really much of a restriction, though, since with the namespace update things work again much as they used to. Best, f From nwagner at mecha.uni-stuttgart.de Mon Jan 9 04:43:23 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 09 Jan 2006 10:43:23 +0100 Subject: [SciPy-dev] Import strategy Message-ID: <43C2303B.7000409@mecha.uni-stuttgart.de> Hi all, I am still confused by the new import method. Who can explain the difference between the "old" core/scipy and the latest numpy/scipy ? 
And where can I find some documentation wrt this topic ? In the past I have simply used from scipy import *. How should I modify my programs such that I can still use linalg.inv linalg.signm etc. Nils From pearu at scipy.org Mon Jan 9 03:47:31 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Jan 2006 02:47:31 -0600 (CST) Subject: [SciPy-dev] Import strategy In-Reply-To: <43C2303B.7000409@mecha.uni-stuttgart.de> References: <43C2303B.7000409@mecha.uni-stuttgart.de> Message-ID: On Mon, 9 Jan 2006, Nils Wagner wrote: > Hi all, > > I am still confused by the new import method. > Who can explain the difference between the "old" core/scipy and the > latest numpy/scipy ? > And where can I find some documentation wrt this topic ? See scipy.pkgload.__doc__. > In the past I have simply used > > from scipy import *. > > How should I modify my programs such that I can still use > > linalg.inv > linalg.signm etc. You still can use simple from scipy import * to get all symbols that scipy packages provide + numpy symbols. If not, then it is a bug. Pearu From nwagner at mecha.uni-stuttgart.de Mon Jan 9 04:54:28 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 09 Jan 2006 10:54:28 +0100 Subject: [SciPy-dev] Import strategy In-Reply-To: References: <43C2303B.7000409@mecha.uni-stuttgart.de> Message-ID: <43C232D4.1060400@mecha.uni-stuttgart.de> Pearu Peterson wrote: >On Mon, 9 Jan 2006, Nils Wagner wrote: > > >>Hi all, >> >>I am still confused by the new import method. >>Who can explain the difference between the "old" core/scipy and the >>latest numpy/scipy ? >>And where can I find some documentation wrt this topic ? >> > >See scipy.pkgload.__doc__. > > >>In the past I have simply used >> >>from scipy import *. >> >>How should I modify my programs such that I can still use >> >>linalg.inv >>linalg.signm etc. >> > >You still can use simple > > from scipy import * > >to get all symbols that scipy packages provide + numpy symbols. If not, >then it is a bug. > >Pearu > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > And how can I suspend all the messages beginning with Overwriting ... ? Nils From pearu at scipy.org Mon Jan 9 03:57:53 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Jan 2006 02:57:53 -0600 (CST) Subject: [SciPy-dev] Import strategy In-Reply-To: <43C232D4.1060400@mecha.uni-stuttgart.de> References: <43C2303B.7000409@mecha.uni-stuttgart.de> <43C232D4.1060400@mecha.uni-stuttgart.de> Message-ID: On Mon, 9 Jan 2006, Nils Wagner wrote: > And how can I suspend all the messages beginning with Overwriting ... ? Define environment variable export SCIPY_IMPORT_VERBOSE=-1 to suspend import warnings. 
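For instance, a minimal sketch of using that variable (the shell form is exactly what is described above; setting it from inside Python before the import is an assumption about when the package loader reads it):

    import os
    # Must be set before scipy is imported; the "Overwriting ..." messages
    # are printed while the subpackages are being loaded.
    os.environ['SCIPY_IMPORT_VERBOSE'] = '-1'

    from scipy import *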
Pearu From cimrman3 at ntc.zcu.cz Mon Jan 9 05:03:17 2006 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 09 Jan 2006 11:03:17 +0100 Subject: [SciPy-dev] [SciPy-user] numpy lists In-Reply-To: <43BF14E6.8070507@astraw.com> References: <200601051625.k05GPixf021407@oobleck.astro.cornell.edu> <43BD5DFA.20906@colorado.edu> <43BD6109.9030402@sympatico.ca> <43BD7A99.8070002@noaa.gov> <43BEFA6D.7090305@ieee.org> <43BF14E6.8070507@astraw.com> Message-ID: <43C234E5.4090601@ntc.zcu.cz> Andrew Straw wrote: > Travis Oliphant wrote: > > >>There are two new mailing lists for NumPy >> >>numpy-devel at lists.sourceforge.net >>numpy-user at lists.sourceforge.net >> >>These are for developers and users to talk about only NumPy >> >> > > You can subscribe to these lists from > http://sourceforge.net/mail/?group_id=1369 (Dizzy after going through tons of messages of this weekend,) I do not see where to subscribe to numpy-devel using the link above - all I see is numpy-discussion and numpy-user. Otherwise I would support the Robert Kern's idea of having only numpy, scipy and numpy-discussion trio. r. From arnd.baecker at web.de Mon Jan 9 05:11:38 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 9 Jan 2006 11:11:38 +0100 (CET) Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: <43C0BBE6.9070204@ieee.org> References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: On Sun, 8 Jan 2006, Travis Oliphant wrote: > Travis Oliphant wrote: > > >Sasha wrote: > > > >>>>>x = array([1]) > >>>>>m = array([1], bool) > >>>>>x[m] = '' > >>>>>x > >>>>> > >>array([12998744]) > >>>>>x[m] = '' > >>>>>x > >>array([12998752]) > >> > >>It looks like x gets the address of the '' object (note the number change). > >>This "feature" is present in 0.9.2 release and in svn 0.9.3.1831 > > > >Thanks for this corner case. Actually, it's getting whatever is in the > >memory allocated. > > > >Try > > > >b = array('',x.dtype) > > > >b should be a size-0 array, (there is actually 1 elementsize of memory > >allocated for it though). > > > >The mapping code is iterating over b and resetting b whenever it runs > >out of elements of b (every time in this case), except the data pointed > >to by the iterator is never valid so this should not be allowed.... > > > >I suppose instead, creating an iterator from a size-0 array could raise > >an error. This would be a great and simple place to catch this in the > >code as well. > > > This now raises an error.... Is the above related to the following? In [7]: import numpy In [8]: a1=numpy.array([], dtype=numpy.complex128) In [9]: a2=numpy.array([], dtype=numpy.complex128) In [10]: numpy.dot(a1,a2) Out[10]: (2.6815615888703731e+154+2.6815615859888243e+154j) In [11]: import numpy In [12]: a1=numpy.array([]) In [13]: a2=numpy.array([]) In [14]: numpy.dot(a1,a2) Out[14]: 6917529027645027104 In [15]: import numpy In [16]: a1=numpy.array([], dtype=numpy.int8) In [17]: a2=numpy.array([], dtype=numpy.int8) In [18]: numpy.dot(a1,a2) Out[18]: 0 In [19]: numpy.dot(a1,a2) Out[19]: 48 In [20]: numpy.dot(a1,a2) Out[20]: 0 In [21]: numpy.dot(a1,a2) Out[21]: 32 I have no idea, what dot(a1,a2) should return in this case, but the dot of empty slices happens in `scipy.linalg.signm` and is revealed by FAIL: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_signm) for full scipy. This is on an Itanium2, gcc 3.4.4, numpy, no linear algebra packages selected, numpy.__version__ '0.9.3.1857' I also see this on an Opteron - it could be a 64 Bit issue? 
Best, Arnd > -Travis > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From nwagner at mecha.uni-stuttgart.de Mon Jan 9 05:21:27 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 09 Jan 2006 11:21:27 +0100 Subject: [SciPy-dev] Import strategy In-Reply-To: <43C232D4.1060400@mecha.uni-stuttgart.de> References: <43C2303B.7000409@mecha.uni-stuttgart.de> <43C232D4.1060400@mecha.uni-stuttgart.de> Message-ID: <43C23927.7070509@mecha.uni-stuttgart.de> Nils Wagner wrote: >Pearu Peterson wrote: > >>On Mon, 9 Jan 2006, Nils Wagner wrote: >> >> >> >>>Hi all, >>> >>>I am still confused by the new import method. >>>Who can explain the difference between the "old" core/scipy and the >>>latest numpy/scipy ? >>>And where can I find some documentation wrt this topic ? >>> >>> >>See scipy.pkgload.__doc__. >> >> >> >>>In the past I have simply used >>> >>> >>>from scipy import *. >> >>>How should I modify my programs such that I can still use >>> >>>linalg.inv >>>linalg.signm etc. >>> >>> >>You still can use simple >> >> from scipy import * >> >>to get all symbols that scipy packages provide + numpy symbols. If not, >>then it is a bug. >> >> Am I missing something ? Traceback (most recent call last): File "import.py", line 5, in ? a_logm = linalg.logm(a) AttributeError: 'module' object has no attribute 'logm' Nils >>Pearu >> >>_______________________________________________ >>Scipy-dev mailing list >>Scipy-dev at scipy.net >>http://www.scipy.net/mailman/listinfo/scipy-dev >> >> >And how can I suspend all the messages beginning with Overwriting ... ? > >Nils > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > -------------- next part -------------- A non-text attachment was scrubbed... Name: test_import.py Type: text/x-python Size: 92 bytes Desc: not available URL: From pearu at scipy.org Mon Jan 9 04:27:51 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Jan 2006 03:27:51 -0600 (CST) Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: On Mon, 9 Jan 2006, Arnd Baecker wrote: > In [7]: import numpy > In [8]: a1=numpy.array([], dtype=numpy.complex128) > In [9]: a2=numpy.array([], dtype=numpy.complex128) > In [10]: numpy.dot(a1,a2) > Out[10]: (2.6815615888703731e+154+2.6815615859888243e+154j) > In [11]: import numpy > In [12]: a1=numpy.array([]) > In [13]: a2=numpy.array([]) > In [14]: numpy.dot(a1,a2) > Out[14]: 6917529027645027104 > In [15]: import numpy > In [16]: a1=numpy.array([], dtype=numpy.int8) > In [17]: a2=numpy.array([], dtype=numpy.int8) > In [18]: numpy.dot(a1,a2) > Out[18]: 0 > In [19]: numpy.dot(a1,a2) > Out[19]: 48 > In [20]: numpy.dot(a1,a2) > Out[20]: 0 > In [21]: numpy.dot(a1,a2) > Out[21]: 32 > > I have no idea, what dot(a1,a2) should return in this > case, but the dot of empty slices happens in `scipy.linalg.signm` > and is revealed by > FAIL: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_signm) > for full scipy. > > This is on an Itanium2, gcc 3.4.4, numpy, no linear algebra packages > selected, numpy.__version__ '0.9.3.1857' > > I also see this on an Opteron - it could be a 64 Bit issue? Here numpy.dot returns 0 on empty arrays. I think it should raise an exception (ValueError?) instead like ddot in scipy.linalg.blas.fblas does. 
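As an illustration of the semantics being suggested here, a small user-level sketch (this is not the actual change to multiarray.dot, and the wrapper name is made up):

    import numpy

    def checked_dot(a, b):
        # Refuse empty operands, like ddot in scipy.linalg.blas.fblas does,
        # instead of silently returning 0.
        a = numpy.asarray(a)
        b = numpy.asarray(b)
        if a.size == 0 or b.size == 0:
            raise ValueError("dot of empty array(s)")
        return numpy.dot(a, b)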
Pearu From pearu at scipy.org Mon Jan 9 04:32:17 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Jan 2006 03:32:17 -0600 (CST) Subject: [SciPy-dev] Import strategy In-Reply-To: <43C23927.7070509@mecha.uni-stuttgart.de> References: <43C2303B.7000409@mecha.uni-stuttgart.de> <43C232D4.1060400@mecha.uni-stuttgart.de> <43C23927.7070509@mecha.uni-stuttgart.de> Message-ID: On Mon, 9 Jan 2006, Nils Wagner wrote: > Traceback (most recent call last): > File "import.py", line 5, in ? > a_logm = linalg.logm(a) > AttributeError: 'module' object has no attribute 'logm' What linalg module are you using in import.py? May be it comes from numpy? How do you import scipy/numpy in import.py? Otherwise >>> from scipy import * >>> linalg.logm works fine here. Pearu From nwagner at mecha.uni-stuttgart.de Mon Jan 9 05:40:23 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 09 Jan 2006 11:40:23 +0100 Subject: [SciPy-dev] Import strategy In-Reply-To: References: <43C2303B.7000409@mecha.uni-stuttgart.de> <43C232D4.1060400@mecha.uni-stuttgart.de> <43C23927.7070509@mecha.uni-stuttgart.de> Message-ID: <43C23D97.5090601@mecha.uni-stuttgart.de> Pearu Peterson wrote: >On Mon, 9 Jan 2006, Nils Wagner wrote: > > >>Traceback (most recent call last): >> File "import.py", line 5, in ? >> a_logm = linalg.logm(a) >>AttributeError: 'module' object has no attribute 'logm' >> > >What linalg module are you using in import.py? May be it comes from >numpy? How do you import scipy/numpy in import.py? > >Otherwise > > >>>>from scipy import * >>>>linalg.logm >>>> > > >works fine here. > >Pearu > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > Strange. Here I get from scipy import * ... Overwriting linalg= from /usr/local/lib/python2.4/site-packages/scipy/basic/linalg.pyc (was from /usr/local/lib/python2.4/site-packages/scipy/linalg/__init__.pyc) Overwriting fftpack= from /usr/local/lib/python2.4/site-packages/scipy/basic/fftpack.pyc (was from /usr/local/lib/python2.4/site-packages/scipy/fftpack/__init__.pyc) >>> linalg.logm Traceback (most recent call last): File "", line 1, in ? AttributeError: 'module' object has no attribute 'logm' >>> import numpy >>> numpy.__version__ '0.9.3.1860' >>> import scipy >>> scipy.__version__ '0.4.4.1540' BTW, I haven't uninstalled the "old" core but installed numpy. Nils From pearu at scipy.org Mon Jan 9 04:48:23 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Jan 2006 03:48:23 -0600 (CST) Subject: [SciPy-dev] Import strategy In-Reply-To: <43C23D97.5090601@mecha.uni-stuttgart.de> References: <43C2303B.7000409@mecha.uni-stuttgart.de> <43C232D4.1060400@mecha.uni-stuttgart.de> <43C23D97.5090601@mecha.uni-stuttgart.de> Message-ID: On Mon, 9 Jan 2006, Nils Wagner wrote: > Strange. Here I get > > from scipy import * > ... 
> Overwriting linalg= '/usr/local/lib/python2.4/site-packages/scipy/basic/linalg.pyc'> from > /usr/local/lib/python2.4/site-packages/scipy/basic/linalg.pyc (was > '/usr/local/lib/python2.4/site-packages/scipy/linalg/__init__.pyc'> from > /usr/local/lib/python2.4/site-packages/scipy/linalg/__init__.pyc) > Overwriting fftpack= '/usr/local/lib/python2.4/site-packages/scipy/basic/fftpack.pyc'> from > /usr/local/lib/python2.4/site-packages/scipy/basic/fftpack.pyc (was > '/usr/local/lib/python2.4/site-packages/scipy/fftpack/__init__.pyc'> > from /usr/local/lib/python2.4/site-packages/scipy/fftpack/__init__.pyc) >>>> linalg.logm > Traceback (most recent call last): > File "", line 1, in ? > AttributeError: 'module' object has no attribute 'logm' >>>> import numpy >>>> numpy.__version__ > '0.9.3.1860' >>>> import scipy >>>> scipy.__version__ > '0.4.4.1540' > > > BTW, I haven't uninstalled the "old" core but installed numpy. That could be it, all kind of strange things can happen when not removing (hiding, whatever) old scipy before installing recent numpy/scipy packages. I suggest also using scipy from recent svn (1542 and up). Pearu From schofield at ftw.at Mon Jan 9 05:59:21 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 9 Jan 2006 10:59:21 +0000 Subject: [SciPy-dev] Import strategy In-Reply-To: <43C23D97.5090601@mecha.uni-stuttgart.de> References: <43C2303B.7000409@mecha.uni-stuttgart.de> <43C232D4.1060400@mecha.uni-stuttgart.de> <43C23927.7070509@mecha.uni-stuttgart.de> <43C23D97.5090601@mecha.uni-stuttgart.de> Message-ID: <08B1B9D9-3F39-4AFB-9F58-E5753DDC7504@ftw.at> On 09/01/2006, at 10:40 AM, Nils Wagner wrote: > BTW, I haven't uninstalled the "old" core but installed numpy. Do this! Blast away site-packages/numpy, site-packages/scipy, and all your build directories, then start again. If you're tracking SVN, doing this every time in your build scripts will save you lots of time. -- Ed From pearu at scipy.org Mon Jan 9 05:18:52 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Jan 2006 04:18:52 -0600 (CST) Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: On Mon, 9 Jan 2006, Pearu Peterson wrote: > Here numpy.dot returns 0 on empty arrays. I think it should raise an > exception (ValueError?) instead like ddot in scipy.linalg.blas.fblas does. Ok, I can reproduce dot returning nonsense also here: In [2]: from numpy import core In [4]: core.multiarray.dot([],[]) Out[4]: 1075591804 So, it's a multiarray.dot bug. Pearu From faltet at carabos.com Mon Jan 9 06:54:09 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 9 Jan 2006 12:54:09 +0100 Subject: [SciPy-dev] How to handle a[...] in numpy? In-Reply-To: References: Message-ID: <200601091254.10861.faltet@carabos.com> A Diumenge 08 Gener 2006 05:10, Sasha va escriure: > In Numeric a[...] would return an array unless a was 0-rank and a > python type othewise. What is the right way to do the same in numpy? [snip] > Proposal: Although I like a lot that 0-rank arrays and numpy scalar > types non-iterable, it may be reasonable to allow a[...]. This way > ellipsis can be interpereted as any number of ":"s including zero. > Another subscript operation that makes sense for scalars would be > a[...,newaxis] or even a[{newaxis, }* ..., {newaxis,}*], where > {newaxis,}* stands for any number of comma-separated newaxis tokens. > This will allow one to use ellipsis in generic code that would work on > any numpy type. 
I will contribute code if there is any interest. +1 More specifically, provided that: In [65]: type(numpy.array([0])[...]) Out[65]: In [66]: type(numpy.array([0])[0]) Out[66]: In [67]: type(numpy.array([0]).item()) Out[67]: I'd propose the next behaviour for 0-rank arrays: In [65]: type(numpy.array(0)[...]) Out[65]: In [66]: type(numpy.array(0)[()]) # Indexing a la numarray Out[66]: In [67]: type(numpy.array(0).item()) # already works Out[67]: -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From pearu at scipy.org Mon Jan 9 06:34:25 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Jan 2006 05:34:25 -0600 (CST) Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: On Mon, 9 Jan 2006, Arnd Baecker wrote: > In [7]: import numpy > In [8]: a1=numpy.array([], dtype=numpy.complex128) > In [9]: a2=numpy.array([], dtype=numpy.complex128) > In [10]: numpy.dot(a1,a2) > Out[10]: (2.6815615888703731e+154+2.6815615859888243e+154j) I have commited a patch to numpy svn so that numpy.core.multiarray.dot([],[]) -> 0 Note that currently numpy.core.multiarray.dot([],2) -> [] for instance. Is this desired behaviour? Pearu From cookedm at physics.mcmaster.ca Mon Jan 9 07:36:14 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Mon, 9 Jan 2006 07:36:14 -0500 Subject: =?WINDOWS-1252?Q?Re:_[SciPy-dev]_[Slightly_OT]_Karl_Fogel's_=93P?= =?WINDOWS-1252?Q?roducing_Open_Source_Software=94?= In-Reply-To: <43C142E6.3020605@colorado.edu> References: <43C142E6.3020605@colorado.edu> Message-ID: <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> On Jan 8, 2006, at 11:50 , Fernando Perez wrote: > Hi all, > > a few months ago, Robert Kern recommended this book: > > Producing Open Source Software > How to Run a Successful Free Software Project > By Karl Fogel > http://producingoss.com/ > > and I've found it to be a very useful collection of tips and ideas > regarding > the running of an OSS project. A groklaw.net review came out today > > http://www.groklaw.net/article.php?story=20060105010817642 > > which made me think this might be of interest to some in our > group. While I > think that scipy in general is running great, there are many good > ideas in > that book in terms of things to keep in mind to make the > environment around a > project operate better. Is there anything in particular you'd suggest? I've had a quick look before, and the thing that stuck to me was code review. I think something like a checkins list would help cover that. I'd also like an RSS feed of checkins, b/c I check those a lot :). I think there's a plugin for Trac (0.9 at least) that'll do that... -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cimrman3 at ntc.zcu.cz Mon Jan 9 07:42:38 2006 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 09 Jan 2006 13:42:38 +0100 Subject: [SciPy-dev] ANN: arraysetops added to numpy.lib Message-ID: <43C25A3E.1030408@ntc.zcu.cz> Dear developers, 'arraysetops' was just transfered from the scipy.sandbox to numpy.lib (with blessings of the great TEO). You can now try and test this module in numpy proper. It is the initial release, so expect the usual amount of bugs. What is it? ----------- 1D array set operations ======================= Set operations for 1D numeric arrays based on sort() function. 
ediff1d -- Array difference (auxiliary function). unique1d -- Unique elements of 1D array. intersect1d -- Intersection of 1D arrays with unique elements. intersect1d_nu -- Intersection of 1D arrays with any elements. setxor1d -- Set exclusive-or of 1D arrays with unique elements. setmember1d -- Return an array of shape of ar1 containing 1 where the elements of ar1 are in ar2 and 0 otherwise. union1d -- Union of 1D arrays with unique elements. setdiff1d -- Set difference of 1D arrays with unique elements. Why? ---- 1. speed - e.g. unique1d() is about 30 times faster than the standard dictionary-based unique(); try the code below import numpy.lib.arraysetops as aset aset.test_unique1d_speed( plotResults = False ) # set to True if you have pylab 2. added functions present in other popular environments (matlab, ...) but missing in scipy r. From aisaac at american.edu Mon Jan 9 08:55:09 2006 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 9 Jan 2006 08:55:09 -0500 Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: > On Mon, 9 Jan 2006, Arnd Baecker wrote: >> In [7]: import numpy >> In [8]: a1=numpy.array([], dtype=numpy.complex128) >> In [9]: a2=numpy.array([], dtype=numpy.complex128) >> In [10]: numpy.dot(a1,a2) >> Out[10]: (2.6815615888703731e+154+2.6815615859888243e+154j) On Mon, 9 Jan 2006, (CST) Pearu Peterson apparently wrote: > I have commited a patch to numpy svn so that > numpy.core.multiarray.dot([],[]) -> 0 > Note that currently > numpy.core.multiarray.dot([],2) -> [] > for instance. Is this desired behaviour? Is there is a hurry to fix this situation without a clear statement of the underlying principle? Maybe I just overlook the principle? This seems right: numpy.core.multiarray.dot([],2) -> [] It is a natural corner case for a scalar product, which 'dot' already supports. The principle is that scalar multiplication returns the array of the same dimensions containing scalar multiple of each element of the original array. This seems quite wrong: numpy.core.multiarray.dot([],[]) -> 0 I expect an exception. Otherwise two empty arrays are judged orthogonal, which seems extremely odd. (Not as odd as the previous behavior, of course.) If returning 0 is the right thing to do, what is the reason for this being right? fwiw, Alan Isaaac From chris at trichech.us Mon Jan 9 09:01:35 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Mon, 9 Jan 2006 09:01:35 -0500 Subject: [SciPy-dev] numpy changes generator behaviour Message-ID: I'm not sure this is intended, but it appears that numpy changes the behaviour of the new python 2.4 generators. Here is what is supposed to happen: >>> sum(x*x for x in range(10)) 285 But if I import numpy, and try again I get: >>> from numpy import * >>> sum(x*x for x in range(10)) Why should it no longer sum? C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... 
Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From cookedm at physics.mcmaster.ca Mon Jan 9 11:10:55 2006 From: cookedm at physics.mcmaster.ca (David M.Cooke) Date: Mon, 9 Jan 2006 11:10:55 -0500 Subject: [SciPy-dev] numpy changes generator behaviour In-Reply-To: References: Message-ID: On Jan 9, 2006, at 09:01 , Christopher Fonnesbeck wrote: > I'm not sure this is intended, but it appears that numpy changes > the behaviour of the new python 2.4 generators. Here is what is > supposed to happen: > > >>> sum(x*x for x in range(10)) > 285 > > But if I import numpy, and try again I get: > > >>> from numpy import * > >>> sum(x*x for x in range(10)) > > > Why should it no longer sum? Because the function 'sum' is now the one from numpy, not the builtin one. (Numeric had 'sum' before it was added to Python.) You could do something like bsum = sum from numpy import * bsum(x*x for x in range(10)) Or, to get the builtin sum after overwriting it: import __builtin__ __builtin__.sum(x*x for x in range(10)) -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ndarray at mac.com Mon Jan 9 11:24:50 2006 From: ndarray at mac.com (Sasha) Date: Mon, 9 Jan 2006 11:24:50 -0500 Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: ---------- Forwarded message ---------- From: Alexander Belopolsky Date: Jan 9, 2006 11:21 AM Subject: Re: [SciPy-dev] Fancy indexing curiosity To: aisaac at american.edu, SciPy Developers List On 1/9/06, Alan G Isaac wrote: > ... > This seems quite wrong: > numpy.core.multiarray.dot([],[]) -> 0 > I expect an exception. The dot product of two rank-1, length n vectors A and B is in python notation sum(a*b for (a,b) in zip(A, B)) If n = 0, the sum of zero elements is zero. > Otherwise two empty arrays are judged orthogonal, > which seems extremely odd. Why is this odd? These empty arrays are not scalars, they have dimension 0 and rank 1. They don't have to be orthogonal to have zero dot product because they have zero norm. > (Not as odd as the previous behavior, of course.) > If returning 0 is the right thing to do, > what is the reason for this being right? The reason is to avoid a special case. The same definition that works for non-zero length arrays can be applied to empty sequences and will yield 0 as I explained above. -- sasha From aisaac at american.edu Mon Jan 9 12:37:19 2006 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 9 Jan 2006 12:37:19 -0500 Subject: [SciPy-dev] Fancy indexing curiosity In-Reply-To: References: <43C0BAA8.3060704@ieee.org> <43C0BBE6.9070204@ieee.org> Message-ID: On Mon, 9 Jan 2006, Alexander Belopolsky apparently wrote: > The dot product of two rank-1, length n vectors A and B is > in python notation > sum(a*b for (a,b) in zip(A, B)) > If n = 0, the sum of zero elements is zero. OK. I'm sold. Algorithmically it's an obvious and consistent choice. And against my initial intuition, it seems to work fine as an inner product on a vector space (with a single element [], which is the additive identity). Apologies for the noise, and thanks. 
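A tiny check of the convention described above (my own illustration, assuming a multiarray.dot that returns 0 for empty inputs, as in Pearu's patch):

    import numpy

    A = numpy.array([], dtype=numpy.float64)
    B = numpy.array([], dtype=numpy.float64)

    # The pure-Python definition: a sum over zero terms is 0.
    print sum(a * b for (a, b) in zip(A, B))   # -> 0

    # The array version should agree once dot handles empty inputs.
    print numpy.dot(A, B)                      # expected: 0.0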
Alan From robert.kern at gmail.com Mon Jan 9 13:13:05 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 09 Jan 2006 12:13:05 -0600 Subject: [SciPy-dev] [Slightly OT] Karl Fogel's =?UTF-8?B?4oCcUHJvZHVj?= =?UTF-8?B?aW5nIE9wZW4gU291cmNlIFNvZnR3YXJl4oCd?= In-Reply-To: <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> References: <43C142E6.3020605@colorado.edu> <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> Message-ID: <43C2A7B1.7050107@gmail.com> David M. Cooke wrote: > Is there anything in particular you'd suggest? I've had a quick look > before, and the thing that stuck to me was code review. I think > something like a checkins list would help cover that. I'd also like > an RSS feed of checkins, b/c I check those a lot :). I think there's > a plugin for Trac (0.9 at least) that'll do that... Look at the bottom of http://projects.scipy.org/scipy/scipy/timeline http://projects.scipy.org/scipy/numpy/timeline Hopefully these URLs will be stable. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From ndarray at mac.com Mon Jan 9 13:19:18 2006 From: ndarray at mac.com (Sasha) Date: Mon, 9 Jan 2006 13:19:18 -0500 Subject: [SciPy-dev] How to handle a[...] in numpy? In-Reply-To: <200601091254.10861.faltet@carabos.com> References: <200601091254.10861.faltet@carabos.com> Message-ID: On 1/9/06, Francesc Altet wrote: > ... > I'd propose the next behaviour for 0-rank arrays: > > In [65]: type(numpy.array(0)[...]) > Out[65]: > > In [66]: type(numpy.array(0)[()]) # Indexing a la numarray > Out[66]: > I like the idea of supporting [()] on zero rank ndarrays, but I think it should be equivalent to [...]. I view [...] as [(slice(None),) * rank], and thus for rank=0, [...] is the same as [()]. Furthermore, I don't see any use for [...] that always returns the same array. As far as I remember in some old version of Numeric, [...] was a way to make a contiguous copy, but in numpy this is not the case (one needs to use copy method for that). >>> x[::2][...].flags {'WRITEABLE': True, 'UPDATEIFCOPY': False, 'CONTIGUOUS': False, 'FORTRAN': False, 'ALIGNED': True, 'OWNDATA': False} -- sasha From cookedm at physics.mcmaster.ca Mon Jan 9 13:42:18 2006 From: cookedm at physics.mcmaster.ca (David M.Cooke) Date: Mon, 9 Jan 2006 13:42:18 -0500 Subject: =?WINDOWS-1252?Q?Re:_[SciPy-dev]_[Slightly_OT]_Karl_Fogel's_=93P?= =?WINDOWS-1252?Q?roducing_Open_Source_Software=94?= In-Reply-To: <43C2A7B1.7050107@gmail.com> References: <43C142E6.3020605@colorado.edu> <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> <43C2A7B1.7050107@gmail.com> Message-ID: On Jan 9, 2006, at 13:13 , Robert Kern wrote: > David M. Cooke wrote: > >> Is there anything in particular you'd suggest? I've had a quick look >> before, and the thing that stuck to me was code review. I think >> something like a checkins list would help cover that. I'd also like >> an RSS feed of checkins, b/c I check those a lot :). I think there's >> a plugin for Trac (0.9 at least) that'll do that... > > Look at the bottom of > > http://projects.scipy.org/scipy/scipy/timeline > http://projects.scipy.org/scipy/numpy/timeline > > Hopefully these URLs will be stable. Thanks! Not perfect -- you only get the change comment in the feed, not the diff or at least the files changed, but it's better than nothing :-) -- |>|\/|< /------------------------------------------------------------------\ |David M. 
Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From Fernando.Perez at colorado.edu Mon Jan 9 13:55:28 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 09 Jan 2006 11:55:28 -0700 Subject: =?windows-1252?Q?Re=3A_=5BSciPy-dev=5D_=5BSlightly_OT=5D_?= =?windows-1252?Q?Karl_Fogel=27s_=93Producing_Open_Source_Softw?= =?windows-1252?Q?are=94?= In-Reply-To: References: <43C142E6.3020605@colorado.edu> <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> <43C2A7B1.7050107@gmail.com> Message-ID: <43C2B1A0.3010908@colorado.edu> David M.Cooke wrote: > On Jan 9, 2006, at 13:13 , Robert Kern wrote: > > >>David M. Cooke wrote: >> >> >>>Is there anything in particular you'd suggest? I've had a quick look >>>before, and the thing that stuck to me was code review. I think >>>something like a checkins list would help cover that. I'd also like >>>an RSS feed of checkins, b/c I check those a lot :). I think there's >>>a plugin for Trac (0.9 at least) that'll do that... >> >>Look at the bottom of >> >>http://projects.scipy.org/scipy/scipy/timeline >>http://projects.scipy.org/scipy/numpy/timeline >> >>Hopefully these URLs will be stable. > > > Thanks! Not perfect -- you only get the change comment in the feed, > not the diff or at least the files changed, but it's better than > nothing :-) Yes, we could also enable a post-commit hook for svn which sends emails, and create a pair of numpy-svn and a scipy-svn mailing lists as the target. Those with a need for staying really on top of the development can then subscribe to these lists. I did that for ipython a while ago, so I can help here if need be (ipython is hosted on the same setup that numpy and scipy are, we are all under the Virtualmin host system). Other good things I found in the book: - make sure wikis are well organized: wikis, as a side effect of the very flexibility that keeps makes them so useful, have a tendency to grow in a rather 'organic' fashion. This is nice-speak for 'turning into a disorganized, un-navigable mess'. So good super-structure and navigational aids (Table of Contents wiki macros help a lot) are key for a wiki to be truly useful. - avoid private discussions, keep things on-list: I am _very_ guilty of this with ipython. While I'm sure I'll continue doing this (it's just too convenient and I'm lazy), it's true that keeping the lists 'in the loop' is an important practice for the long run. - In general, I think that chapter 6 (Communications) had a fair amount of common sense, but still well organized and thought out, advice. At least I found it useful. Regards, f From faltet at carabos.com Mon Jan 9 14:03:35 2006 From: faltet at carabos.com (Francesc Altet) Date: Mon, 9 Jan 2006 20:03:35 +0100 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> Message-ID: <200601092003.35687.faltet@carabos.com> A Dilluns 09 Gener 2006 19:19, Sasha va escriure: > On 1/9/06, Francesc Altet wrote: > > ... > > I'd propose the next behaviour for 0-rank arrays: > > > > In [65]: type(numpy.array(0)[...]) > > Out[65]: > > > > In [66]: type(numpy.array(0)[()]) # Indexing a la numarray > > Out[66]: > > I like the idea of supporting [()] on zero rank ndarrays, but I think > it should be equivalent to [...]. I view [...] as [(slice(None),) * > rank], and thus for rank=0, [...] is the same as [()]. However, the original aim of the "..." 
(ellipsis) operator is [taken for the numarray manual]: """ One final way of slicing arrays is with the keyword `...' This keyword is somewhat complicated. It stands for "however many `:' I need depending on the rank of the object I'm indexing, so that the indices I do specify are at the end of the index list as opposed to the usual beginning". So, if one has a rank-3 array A, then A[...,0] is the same thing as A[:,:,0], but if B is rank-4, then B[...,0] is the same thing as: B[:,:,:,0]. Only one `...' is expanded in an index expression, so if one has a rank-5 array C, then C[...,0,...] is the same thing as C[:,:,:,0,:]. """ > Furthermore, I don't see any use for [...] that always returns the > same array. As far as I remember in some old version of Numeric, > [...] was a way to make a contiguous copy, but in numpy this is not > the case (one needs to use copy method for that). I don't know for old versions of Numeric, but my impression is that the ellipsis meaning is clearly stated above. In fact, in a 4-dimensional array, say a, a[...] should be equivalent to a[:,:,:,:] and this does not necessarily implies a copy. Cheers, -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From Fernando.Perez at colorado.edu Mon Jan 9 14:07:40 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 09 Jan 2006 12:07:40 -0700 Subject: [SciPy-dev] numpy changes generator behaviour In-Reply-To: References: Message-ID: <43C2B47C.3020602@colorado.edu> David M.Cooke wrote: > On Jan 9, 2006, at 09:01 , Christopher Fonnesbeck wrote: > > >>I'm not sure this is intended, but it appears that numpy changes >>the behaviour of the new python 2.4 generators. Here is what is >>supposed to happen: >> >> >>>>>sum(x*x for x in range(10)) >> >>285 >> >>But if I import numpy, and try again I get: >> >> >>>>>from numpy import * >>>>>sum(x*x for x in range(10)) >> >> >> >>Why should it no longer sum? > > > Because the function 'sum' is now the one from numpy, not the builtin > one. (Numeric had 'sum' before it was added to Python.) I know that in this case Numeric has historical precedence, but I'm wondering if it wouldn't be a good idea to rename numpy.{sum,product} to asum/aproduct. While I find it a bit unpleasant (and it is backwards-incompatible), I think that staying clear of the python builtins is probably a good idea. This would also allow people to easily use both the builtin sum and the numpy one when they use 'import *', without having to jump through __builtin__ hoops or pre-binding 'sum' to a local name. We do, after all, live on top of the python language, so my take on this is that we should be willing to accomodate our practices a little as the language evolves. Since we are in this case making a big break, I think it's an unusually good opportunity to do some compatibility-breaking cleanup work. Along these lines, do we really need both prod and product? In [8]: N.prod?? Type: function Base Class: String Form: Namespace: Interactive File: /home/fperez/usr/local/lib/python2.3/site-packages/numpy/core/oldnumeric.py Definition: N.prod(a, axis=0) Source: def prod(a, axis=0): """Return the product of the elements along the given axis """ return asarray(a).prod(axis) In [9]: N.product?? 
Type: function Base Class: String Form: Namespace: Interactive File: /home/fperez/usr/local/lib/python2.3/site-packages/numpy/core/oldnumeric.py Definition: N.product(x, axis=0, dtype=None) Source: def product (x, axis=0, dtype=None): """Product of the array elements over the given axis.""" return asarray(x).prod(axis, dtype) These two are nearly identical, let's get rid of one of them (less namespace noise). Cheers, f From cookedm at physics.mcmaster.ca Mon Jan 9 14:31:39 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Mon, 09 Jan 2006 14:31:39 -0500 Subject: [SciPy-dev] [Slightly OT] Karl Fogel's =?utf-8?Q?=C2=93Produc?= =?utf-8?Q?ing?= Open Source =?utf-8?Q?Software=C2=94?= In-Reply-To: <43C2B1A0.3010908@colorado.edu> (Fernando Perez's message of "Mon, 09 Jan 2006 11:55:28 -0700") References: <43C142E6.3020605@colorado.edu> <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> <43C2A7B1.7050107@gmail.com> <43C2B1A0.3010908@colorado.edu> Message-ID: Fernando Perez writes: > David M.Cooke wrote: >> On Jan 9, 2006, at 13:13 , Robert Kern wrote: >> >>>David M. Cooke wrote: >>> >>> >>>>Is there anything in particular you'd suggest? I've had a quick look >>>>before, and the thing that stuck to me was code review. I think >>>>something like a checkins list would help cover that. I'd also like >>>>an RSS feed of checkins, b/c I check those a lot :). I think there's >>>>a plugin for Trac (0.9 at least) that'll do that... >>> >>>Look at the bottom of >>> >>>http://projects.scipy.org/scipy/scipy/timeline >>>http://projects.scipy.org/scipy/numpy/timeline >>> >>>Hopefully these URLs will be stable. >> >> >> Thanks! Not perfect -- you only get the change comment in the feed, >> not the diff or at least the files changed, but it's better than >> nothing :-) > > Yes, we could also enable a post-commit hook for svn which sends emails, and > create a pair of numpy-svn and a scipy-svn mailing lists as the target. Those > with a need for staying really on top of the development can then subscribe to > these lists. > > I did that for ipython a while ago, so I can help here if need be (ipython is > hosted on the same setup that numpy and scipy are, we are all under the > Virtualmin host system). This could also be useful: http://cheeseshop.python.org/pypi/SvnReporter/ > Other good things I found in the book: > > - make sure wikis are well organized: wikis, as a side effect of the very > flexibility that keeps makes them so useful, have a tendency to grow in a > rather 'organic' fashion. This is nice-speak for 'turning into a > disorganized, un-navigable mess'. So good super-structure and navigational > aids (Table of Contents wiki macros help a lot) are key for a wiki to be truly > useful. Should we start pushing to fill the trac wiki with developer-like stuff -- reasons behind decisions, rough areas that need to be worked on, what file does what, that kind of thing? -- |>|\/|< /--------------------------------------------------------------------------\ |David M. 
Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From robert.kern at gmail.com Mon Jan 9 14:36:59 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 09 Jan 2006 13:36:59 -0600 Subject: [SciPy-dev] [Slightly OT] Karl Fogel's =?UTF-8?B?wpNQcm9kdWNp?= =?UTF-8?B?bmcgT3BlbiBTb3VyY2UgU29mdHdhcmXClA==?= In-Reply-To: References: <43C142E6.3020605@colorado.edu> <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> <43C2A7B1.7050107@gmail.com> <43C2B1A0.3010908@colorado.edu> Message-ID: <43C2BB5B.2060008@gmail.com> David M. Cooke wrote: > Should we start pushing to fill the trac wiki with developer-like > stuff -- reasons behind decisions, rough areas that need to be worked > on, what file does what, that kind of thing? That would be excellent. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Mon Jan 9 14:39:44 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 09 Jan 2006 12:39:44 -0700 Subject: [SciPy-dev] [Slightly OT] Karl Fogel's =?UTF-8?B?wpNQcm9kdWNp?= =?UTF-8?B?bmcgT3BlbiBTb3VyY2UgU29mdHdhcmXClA==?= In-Reply-To: <43C2BB5B.2060008@gmail.com> References: <43C142E6.3020605@colorado.edu> <3C638B74-EBAC-4704-8F68-8A784A861DED@physics.mcmaster.ca> <43C2A7B1.7050107@gmail.com> <43C2B1A0.3010908@colorado.edu> <43C2BB5B.2060008@gmail.com> Message-ID: <43C2BC00.8070200@colorado.edu> Robert Kern wrote: > David M. Cooke wrote: > > >>Should we start pushing to fill the trac wiki with developer-like >>stuff -- reasons behind decisions, rough areas that need to be worked >>on, what file does what, that kind of thing? > > > That would be excellent. +1 Cheers, f From loredo at astro.cornell.edu Mon Jan 9 15:58:38 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Mon, 9 Jan 2006 15:58:38 -0500 (EST) Subject: [SciPy-dev] FC3 successes and OS X issues with current revisions Message-ID: <200601092058.k09KwcL15376@laplace.astro.cornell.edu> Just installed the current SVN revisions (numpy 1862, scipy 1544) on FC3 and OS X (10.3.9), with FFTW2 on both platforms. On both platforms, numpy.test(1,1) runs without errors---yippee! On both platforms, "import scipy" complains about overwriting fft and ifft. >>> import scipy scipy.Overwriting fft= from scipy.fftpack.basic (was from numpy.dft.fftpack) Overwriting ifft= from scipy.fftpack.basic (was from numpy.dft.fftpack) On FC3, scipy.test() showed one failure with stats, in test_gamma. The failure message indicated it was attempting a small pvalue calculation. I thought it could be a statistical fluke (do enough significance tests and you expect one to fail); a repeated scipy.test() has no failures. Yippee! Perhaps this slight complexity with stats tests should be mentioned somewhere in the install docs. I've seen it once before. On OS X, scipy.test() gives the "empty cblas, clapack" warnings Chris mentioned earlier (in comparison, numarray verifies it is using "external linear algebra support). It gives 10 failures, copied below. The fft failures are notable because of the import complaint. A simple "by hand" test shows that numpy.fft and numpy.ifft appear to work fine, but scipy.fft and scipy.ifft have problems (the real part of ifft(fft(x)) does not equal x; largest differences are order unity). scipy passed this "by hand" test yesterday on OS X; it also passes it today on FC3. I also installed numarray on both platforms from CVS. 
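(Returning to the "by hand" fft check mentioned a few lines up: the exact test is not shown in the message, so the following is an assumed reconstruction, using the top-level numpy.fft/ifft and scipy.fft/ifft functions named in the Overwriting messages.)

    import numpy
    import scipy

    x = numpy.arange(64.0)
    for name, fft, ifft in [("numpy", numpy.fft, numpy.ifft),
                            ("scipy", scipy.fft, scipy.ifft)]:
        # A real signal should come back (to rounding error) after fft + ifft.
        err = abs(ifft(fft(x)).real - x).max()
        print name, "max |ifft(fft(x)).real - x| =", err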
On FC3, the numarray tests stall at "fft" . Ctl-C gets no response; I had to stop the job and kill it via "kill". The tests all ran an passed last week. I don't know if there is any interaction with numpy/scipy in this regard. In contrast, the tests all *pass* on OS X, even the fft test. It's weird that on FC 3, scipy passes but numarray fails fft tests, while on OS X the reverse is true. FWIW I reinstalled FFTW2 yesterday on OS X (*before* I installed yesterday's scipy svn, which *passed* my "by hand" fft test). I also installed mpl-0.86 from PyPI on FC3. Apart from the fft overwrite complaints, it appears to work fine on a selection of example files. Even the specgram example works, producing a plot looking like it always has. The boxplot demo fails, but I haven't looked into the problem; I doubt it is a scipy issue. The OS X install is still running (it's an old, slow G3!). -Tom ~~~~~~~~~~~~~~~~~ OS X scipy.test() output (selected) ..FSkipping check_djbfft (failed to import FFT) ...F..F..FF.FSkipping check_djbfft (failed to import FFT) ...Skipping check_djbfft (failed to import FFT: No module named FFT) ====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 86, in check_definition assert_array_almost_equal(diff(sin(x),2),direct_diff(sin(x),2)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 87.5%): Array 1: [ -6.3519652e-15 -3.8268343e-01 -7.0710678e-01 -9.2387953e-01 -1.0000000e+00 -9.2387953e-01 -7.0710678e-01 -3.82... Array 2: [ -7.3854931e-15 6.5259351e-15 -2.4942634e-15 -7.5636114e-17 1.4745663e-15 -1.9133685e-15 2.2804788e-16 8.70... ====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 332, in check_random_even assert_array_almost_equal(direct_hilbert(direct_ihilbert(f)),f) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 6.9388939e-18+0.j 0.0000000e+00-0.j 4.9065389e-18-0.j 0.0000000e+00-0.j 0.0000000e+00-0.j 0.0000000e+00-0.... Array 2: [-0.3331645 -0.0799056 0.0523087 0.1059192 0.0973615 0.3480509 0.0293455 0.1978315 -0.2975818 0.1814041 0.47725... 
====================================================================== FAIL: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 390, in check_definition assert_array_almost_equal(shift(sin(x),a),direct_shift(sin(x),a)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 88.8888888889%): Array 1: [ 0.0998334 0.4341242 0.7160532 0.9116156 0.9972237 0.9625519 0.8117822 0.5630995 0.2464987 -0.0998334 -0.43412... Array 2: [ 0.0998334 0.0938127 0.0764768 0.0499167 0.0173359 -0.0173359 -0.0499167 -0.0764768 -0.0938127 -0.0998334 -0.09381... ====================================================================== FAIL: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 240, in check_random_even assert_array_almost_equal(direct_tilbert(direct_itilbert(f,h),h),f) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0.+0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.j 0.-0.... Array 2: [-0.0340787 0.1267954 0.1694414 -0.2523478 -0.5171198 0.0171597 0.428051 -0.4324979 0.3333228 -0.0747354 -0.18897... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 98, in check_definition assert_array_almost_equal(y,y1) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 20.+0.j 0.+0.j -4.+4.j 0.+0.j -4.+0.j 0.-0.j -4.-4.j 0.-0.j] Array 2: [ 20. +3.j -0.7071068+0.7071068j -7. +4.j -0.7071068-0.7071068j -4. -3.j 0.707106... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 425, in check_definition assert_array_almost_equal(y,direct_dftn(x)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 45. +0.j -4.5+2.5980762j -4.5-2.5980762j] [-13.5+7.7942286j 0. +0.j 0. +0.j ] [-13.5-7.79... Array 2: [[ 45. 
+0.j -4.5+2.5980762j -4.5-2.5980762j] [-13.5+0.j -0. +0.j -0. -0.j ] [-13.5+0.j ... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 183, in check_definition assert_array_almost_equal(y,y1) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.5+0.j 0. -0.j -0.5-0.5j 0. -0.j -0.5+0.j 0. +0.j -0.5+0.5j 0. +0.j ] Array 2: [ 2.5 +0.375j 0.0883883+0.0883883j -0.125 -0.5j 0.0883883-0.0883883j -0.5 -0.375j -0.0883883-0.0... ====================================================================== FAIL: check_random_real (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 217, in check_random_real assert_array_almost_equal (ifft(fft(x)),x) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 98.0392156863%): Array 1: [ 0.9217905 +0.0000000e+00j 0.3038689 +8.7076316e-18j 0.7451523 +1.0449158e-16j 0.3900691 -1.6948749e-17j 0.501015... Array 2: [ 0.9217905 0.3507541 0.5222262 0.2261792 0.886283 0.0956462 0.3059122 0.2456351 0.9575797 0.4694345 0.32742... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 594, in check_definition assert_array_almost_equal(y,direct_idftn(x)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] [-1.5-0.8660254j 0. +0.j 0. +0.j ] [-1.5+0.8660254j ... Array 2: [[ 5. +0.j -0.5-0.2886751j -0.5+0.2886751j] [-1.5+0.j -0. -0.j -0. +0.j ] [-1.5+0.j ... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_irfft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 341, in check_definition assert_array_almost_equal(y,ifft(x1)) File "/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [ 2.625 -1.6856602 -0.375 -1.1856602 0.625 0.4356602 -0.375 0.9356602] Array 2: [ 2.625+0.j -0.375-0.j -0.375-0.j -0.375-0.j 0.625+0.j -0.375+0.j -0.375+0.j -0.375+0.j] ---------------------------------------------------------------------- Ran 1208 tests in 28.166s FAILED (failures=10) From schofield at ftw.at Mon Jan 9 16:24:35 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 09 Jan 2006 21:24:35 +0000 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43C204DC.2030902@ieee.org> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> Message-ID: <43C2D493.8060108@ftw.at> Hi Travis, I've now built the tarball and Windows binaries for the 0.4.4 SciPy release. They seem to pass all the unit tests under Ubuntu 5.10 and Windows 2000. I could release them on Sourceforge and/or new.scipy.org tonight, but I still need access. Thanks for your guidance. You were right about ATLAS being statically linkable. That makes life much easier ;) -- Ed From dalcinl at gmail.com Mon Jan 9 16:46:31 2006 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Mon, 9 Jan 2006 18:46:31 -0300 Subject: [SciPy-dev] bug in 'frombuffer' Message-ID: In latest numpy from SF, file multiarraymodule.c static char doc_frombuffer[] = \ "frombuffer(buffer=, dtype=int, count=-1, offset=0)\n"\ /* ... */ static PyObject * array_frombuffer(PyObject *ignored, PyObject *args, PyObject *keywds) { /* ... */ static char *kwlist[] = {"buffer", "dtype", "count", NULL}; /* ... */ if (!PyArg_ParseTupleAndKeywords(args, keywds, "O|O&LL", /* .. */ It seems that 'kwlist' is missing an "offset" entry. This causes an exception in Python: >>> a = frombuffer(buff) Traceback (most recent call last): File "", line 1, in ? RuntimeError: more argument specifiers than keyword list entries -- Lisandro Dalc?n --------------- Centro Internacional de M?todos Computacionales en Ingenier?a (CIMEC) Instituto de Desarrollo Tecnol?gico para la Industria Qu?mica (INTEC) Consejo Nacional de Investigaciones Cient?ficas y T?cnicas (CONICET) PTLC - G?emes 3450, (3000) Santa Fe, Argentina Tel/Fax: +54-(0)342-451.1594 From eric at enthought.com Mon Jan 9 18:28:51 2006 From: eric at enthought.com (eric jones) Date: Mon, 09 Jan 2006 17:28:51 -0600 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43C2D493.8060108@ftw.at> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> Message-ID: <43C2F1B3.8040903@enthought.com> Are you talking about access to our machines? If so, Joe or Travis, could you please get this set up? thanks, eric Ed Schofield wrote: >Hi Travis, > >I've now built the tarball and Windows binaries for the 0.4.4 SciPy >release. 
They seem to pass all the unit tests under Ubuntu 5.10 and >Windows 2000. I could release them on Sourceforge and/or new.scipy.org >tonight, but I still need access. > >Thanks for your guidance. You were right about ATLAS being statically >linkable. That makes life much easier ;) > >-- Ed > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From oliphant at ee.byu.edu Mon Jan 9 18:32:36 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 09 Jan 2006 16:32:36 -0700 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43C2D493.8060108@ftw.at> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> Message-ID: <43C2F294.2090706@ee.byu.edu> Ed Schofield wrote: >Hi Travis, > >I've now built the tarball and Windows binaries for the 0.4.4 SciPy >release. They seem to pass all the unit tests under Ubuntu 5.10 and >Windows 2000. I could release them on Sourceforge and/or new.scipy.org >tonight, but I still need access. > > Great. What do you need access to? I'm not sure how to grant it. Aren't you already a user on the new.scipy.org site (if you have SVN access you are)? Can you edit the wiki? Thanks for doing the release. Very, very helpful... -Travis From schofield at ftw.at Mon Jan 9 20:01:41 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 10 Jan 2006 01:01:41 +0000 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43C2F294.2090706@ee.byu.edu> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> Message-ID: On 09/01/2006, at 11:32 PM, Travis Oliphant wrote: > Ed Schofield wrote: > >> Hi Travis, >> >> I've now built the tarball and Windows binaries for the 0.4.4 SciPy >> release. They seem to pass all the unit tests under Ubuntu 5.10 and >> Windows 2000. I could release them on Sourceforge and/or >> new.scipy.org >> tonight, but I still need access. >> >> > Great. What do you need access to? I'm not sure how to grant it. > Aren't you already a user on the new.scipy.org site (if you have SVN > access you are)? Can you edit the wiki? Well, to upload to the SourceForge project site I need admin access, which one of scipy's current SF administrators has to grant. My SF login name is 'edschofield'. I can edit the wiki fine, but I have no idea how to upload file attachments with MoinMoin. In theory it should be possible, right? ;) But I may as well release scipy onto sourceforge like you did for numpy ... > Thanks for doing the release. Very, very helpful... You're welcome. It was a bit frustrating, mainly with the Windows compilations, because it's so slooow, especially over a remote filesystem ... ;) -- Ed From schofield at ftw.at Mon Jan 9 20:32:56 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 10 Jan 2006 01:32:56 +0000 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> Message-ID: On 10/01/2006, at 1:01 AM, Ed Schofield wrote: > I can edit the wiki fine, but I have no > idea how to upload file attachments with MoinMoin. In theory it > should be possible, right? 
;) Okay, I worked out how to include attachments. But it doesn't seem a satisfactory way to distribute files ... it seems to require users to click on the "More actions" list box to actually see the files (and doesn't have SF's nice mirroring system). So I suppose SF would be better, unless there's some other file release facility on the new website I'm missing ... -- Ed From schofield at ftw.at Mon Jan 9 20:49:43 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 10 Jan 2006 01:49:43 +0000 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> Message-ID: <6A9D648C-3818-4329-8B01-F0AE81D38142@ftw.at> On 10/01/2006, at 1:01 AM, Ed Schofield wrote: > Well, to upload to the SourceForge project site I need admin access, > which one of scipy's current SF administrators has to grant. No, wait, that's wrong. 'Developer' status should suffice, with the appropriate 'file release' permissions ... -- Ed From ndarray at mac.com Mon Jan 9 22:22:08 2006 From: ndarray at mac.com (Sasha) Date: Mon, 9 Jan 2006 22:22:08 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> Message-ID: Attached patch implements indexing of zero rank arrays: >>> x = array(42) >>> x[...] 42 >>> x[()] 42 >>> x[()].dtype >>> type(x[()]) Ellipsis and empty tuple are the only two valid indices. So far there was only one +1 vote for the feature (from Francesc Altet, who also suggested that x[()] be allowed). Does anyone object to this feature? I have also proposed to support x[...,newaxis] (see comment by sdementen at http://www.scipy.org/wikis/numdesign/). I will add this feature if there is interest. Finally, if we make x[...] valid for rank-0 arrays, should x[...] = value be allowed as well? I think it should, applying the principle of least surprized. An additional consideration is that rank-0 arrays are unhashable, but there is no obvious way to change their values. The following example show that rank-0 arrays are in fact mutable (in a non-obvious way): >>> x = array(1) >>> x.shape=(1,); x[0] = 42; x.shape=() >>> x Please comment on the following proposals: 1. x[...] 2. x[...] = value 3. x[..., newaxis] -- sasha From ndarray at mac.com Mon Jan 9 22:28:11 2006 From: ndarray at mac.com (Sasha) Date: Mon, 9 Jan 2006 22:28:11 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> Message-ID: Sorry, forgot to attach. On 1/9/06, Sasha wrote: > Attached patch implements indexing of zero rank arrays: > > >>> x = array(42) > >>> x[...] > 42 > >>> x[()] > 42 > >>> x[()].dtype > > >>> type(x[()]) > > > Ellipsis and empty tuple are the only two valid indices. > > So far there was only one +1 vote for the feature (from Francesc > Altet, who also suggested that x[()] be allowed). Does anyone object > to this feature? > > I have also proposed to support x[...,newaxis] (see comment by > sdementen at http://www.scipy.org/wikis/numdesign/). I will add this > feature if there is interest. > > Finally, if we make x[...] valid for rank-0 arrays, should x[...] = > value be allowed as well? I think it should, applying the principle of > least surprized. 
An additional consideration is that rank-0 arrays > are unhashable, but there is no obvious way to change their values. > > The following example show that rank-0 arrays are in fact mutable (in > a non-obvious way): > > >>> x = array(1) > >>> x.shape=(1,); x[0] = 42; x.shape=() > >>> x > > Please comment on the following proposals: > > 1. x[...] > 2. x[...] = value > 3. x[..., newaxis] > > > -- sasha > -------------- next part -------------- Index: numpy/core/src/arrayobject.c =================================================================== --- numpy/core/src/arrayobject.c (revision 1863) +++ numpy/core/src/arrayobject.c (working copy) @@ -1796,6 +1796,9 @@ return NULL; } if (self->nd == 0) { + if (op == Py_Ellipsis || + PyTuple_Check(op) && 0 == PyTuple_GET_SIZE(op)) + return PyArray_ToScalar(self->data, self); PyErr_SetString(PyExc_IndexError, "0-d arrays can't be indexed."); return NULL; Index: numpy/core/tests/test_multiarray.py =================================================================== --- numpy/core/tests/test_multiarray.py (revision 1863) +++ numpy/core/tests/test_multiarray.py (working copy) @@ -69,5 +69,32 @@ d2 = dtypedescr('f8') assert_equal(d2, dtypedescr(float64)) +class test_zero_rank(ScipyTestCase): + def setUp(self): + self.d = array(0), array('x', object) - + def check_ellipsis_subscript(self): + a,b = self.d + + self.failUnlessEqual(a[...], 0) + self.failUnlessEqual(b[...].item(), 'x') + self.failUnless(type(a[...]) is a.dtype) + self.failUnless(type(b[...]) is b.dtype) + + def check_empty_subscript(self): + a,b = self.d + + self.failUnlessEqual(a[()], 0) + self.failUnlessEqual(b[()].item(), 'x') + self.failUnless(type(a[()]) is a.dtype) + self.failUnless(type(b[()]) is b.dtype) + + def check_invalid_subscript(self): + a,b = self.d + self.failUnlessRaises(IndexError, lambda x: x[0], a) + self.failUnlessRaises(IndexError, lambda x: x[0], b) + self.failUnlessRaises(IndexError, lambda x: x[array([], int)], a) + self.failUnlessRaises(IndexError, lambda x: x[array([], int)], b) + +if __name__ == "__main__": + ScipyTest('numpy.core.multiarray').run() From strawman at astraw.com Mon Jan 9 23:20:27 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 09 Jan 2006 20:20:27 -0800 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> Message-ID: <43C3360B.1000909@astraw.com> Ed Schofield wrote: >On 10/01/2006, at 1:01 AM, Ed Schofield wrote: > > > >>I can edit the wiki fine, but I have no >>idea how to upload file attachments with MoinMoin. In theory it >>should be possible, right? ;) >> >> > >Okay, I worked out how to include attachments. But it doesn't seem a >satisfactory way to distribute files ... it seems to require users to >click on the "More actions" list box to actually see the files (and >doesn't have SF's nice mirroring system). So I suppose SF would be >better, unless there's some other file release facility on the new >website I'm missing ... > > On the new wiki, you should be able to make a link to an attachment with "attachment:scipy-0.4.4.tar.gz". But I'm not sure how good the wiki software (MoinMoin)is for serving large files. It's probably best to make a link to something posted elsewhere like "[http://fast.server.somewhere.org/scipy/scipy-0.4.4.tar.gz scipy-0.4.4.tar.gz]" Cheers! 
Andrew From sebastien.dementen at alumni.insead.edu Tue Jan 10 01:12:42 2006 From: sebastien.dementen at alumni.insead.edu (DE MENTEN Sebastien) Date: Tue, 10 Jan 2006 14:12:42 +0800 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] innumpy? Message-ID: <521836FA9A33D147A2C81D0B95480CA70B68D9@MEDUSA3.SGP.insead.intra> > Please comment on the following proposals: > > 1. x[...] > 2. x[...] = value > 3. x[..., newaxis] +1 for all of them as they provide a more generic way to write algorithms (n-dimensional FFT/interpolation/etc...) without worrying about corner cases. From faltet at carabos.com Tue Jan 10 03:00:48 2006 From: faltet at carabos.com (Francesc Altet) Date: Tue, 10 Jan 2006 09:00:48 +0100 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601092003.35687.faltet@carabos.com> Message-ID: <200601100900.50025.faltet@carabos.com> A Dilluns 09 Gener 2006 21:36, Alexander Belopolsky va escriure: > On 1/9/06, Francesc Altet wrote: > > However, the original aim of the "..." (ellipsis) operator is [taken > > for the numarray manual]: > > > > """ > > One final way of slicing arrays is with the keyword `...' This keyword > > is somewhat complicated. It stands for "however many `:' I need > > This is precisely my motivation for making a[...] the same as a[()] for > zero rank a. In this case "however many" is zero. In other words, a[...] > is a short-hand > for a[(slice(None),)*len(a.shape)]. Specifically, if a.shape = (), > then a[...] = a[()]. I see your point. I was just proposing a way to return different objects for [...] and [()] that were consistent with higher dimensionality. So if one has a non-0 dimensional array (tensor) called a, then a[...] will return exactly the same object, so the same should be done with 0-d arrays (tensors) IMO. Now, the () index in 0d objects is just a way to substitute integer indexes in higher dimensional objects, and should return a scalar type, as it does in the later case. I recognize that this is my interpretation of the facts and chances are that I'm completely wrong with this. Anyway, sometimes I tend to think that this kind of issues are not very important, but from time to time they will certainly bit us, so perhaps this discussion would be useful for the community in the future. > > I don't know for old versions of Numeric, but my impression is that > > the ellipsis meaning is clearly stated above. In fact, in a > > 4-dimensional array, say a, a[...] should be equivalent to a[:,:,:,:] > > and this does not necessarily implies a copy. > > I am not proposing any change for rank > 0 arrays, nor for the new numpy > scalars. For a = array(0), why would you want a[...] have different > type from a[()]? > If as for rank-4 array a, a[...] should be equivalent to a[:,:,:,:] > why would you expect > a[...] for a rank-0 a be different from a[()]? For the reasons mentioned above: just to be consistent with higher dimensions (although as it turns out to be, the interpretation of "consistency" is different between both of us) and to have two different objects returned from [...] and [()]. > PS: There seems to be a terminological difficulty discussing this type > of things. You call an array that takes 4 indices a 4-dimensional > array, but in algebra 4-dimensional vector is a sequence of 4 numbers > (array of shape (4,)). An object that is indexed by 4 numbers is a > tensor of rank 4 (array of shape (n1, n2, n3, n4)). That's true. 
However, in the Numeric/numarray/numpy literature I think N-dimensional arrays are generally understood as tensors of rank-N. From the scipy manual ("Object Essentials" chapter): """ NumPy provides two fundamental objects: an N-dimensional array object (ndarray) and... """ But anyway, I think that, provided the context is clearly defined, the probability of being misled is (hopefully) quite low. Regards, -- >0,0< Francesc Altet ? ? http://www.carabos.com/ V V C?rabos Coop. V. ??Enjoy Data "-" From stefan at sun.ac.za Tue Jan 10 07:31:49 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 10 Jan 2006 14:31:49 +0200 Subject: [SciPy-dev] interpolate Message-ID: <20060110123149.GB1165@alpha> In scipy, interpolate.interp2d is currently broken. The scipy tutorial shows an interpolation example which uses bisplev en bisplrep. Is there any reason why these functions can't be used in interp2d? The attached patch calls those functions like in the tutorial. St?fan -------------- next part -------------- --- /home/stefan/work/scipy/scipy/Lib/interpolate/interpolate.py 2005-12-06 11:34:43.000000000 +0200 +++ interpolate.py 2005-12-06 15:30:27.000000000 +0200 @@ -42,7 +42,7 @@ Input: x,y - 1-d arrays defining 2-d grid (or 2-d meshgrid arrays) z - 2-d array of grid values - kind - interpolation type ('nearest', 'linear', 'cubic', 'spline') + kind - interpolation type ('linear', 'cubic', 'spline') copy - if true then data is copied into class, otherwise only a reference is held. bounds_error - if true, then when out_of_bounds occurs, an error is @@ -51,6 +51,10 @@ fill_value - if None, then NaN, otherwise the value to fill in outside defined region. """ + degree = {'linear' : 1, + 'cubic' : 2, + 'spline' : 3} + self.x = atleast_1d(x).copy() self.y = atleast_1d(y).copy() if rank(self.x) > 2 or rank(self.y) > 2: @@ -62,9 +66,10 @@ self.z = array(z,copy=True) if rank(z) != 2: raise ValueError, "Grid values is not a 2-d array." + if kind not in degree: + raise ValueError, "Invalid kind of interpolation requested." - - + self.tck = fitpack.bisplrep(x, y, z, kx=degree[kind], ky=degree[kind]) def __call__(self,x,y,dx=0,dy=0): """ From arnd.baecker at web.de Tue Jan 10 09:35:35 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 10 Jan 2006 15:35:35 +0100 (CET) Subject: [SciPy-dev] cephes.tandg test Message-ID: Hi, all numpy/scipy tests pass on the Itanium2, apart from the following error: ====================================================================== FAIL: check_tandg (scipy.special.basic.test_basic.test_cephes) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/baecker/python2/scipy_lintst_n_nothing/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 436, in check_tandg assert_equal(cephes.tandg(45),1.0) File "/home/baecker/python2//scipy_lintst_n_nothing/lib/python2.4/site-packages/numpy/testing/utils.py", line 81, in assert_equal assert desired == actual, msg AssertionError: Items are not equal: DESIRED: 1.0 ACTUAL: 1.0000000000000002 ---------------------------------------------------------------------- The is caused by the strict equality test: def check_tandg(self): assert_equal(cephes.tandg(45),1.0) Is there a reason, why one should get 1.0 up to the last bit? If not an `assert_almost_equal` might be more appropriate? 
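For reference, a relaxed form of that check might look like the sketch below. This is only an illustration, not the actual test file: the import path for cephes follows the traceback quoted above, and the choice of 7 decimal places is an assumption rather than anything taken from the scipy sources.

    from numpy.testing import assert_almost_equal
    from scipy.special import cephes   # assumed importable, as in test_basic.py above

    def check_tandg():
        # tandg(45) should be 1.0 up to rounding error, so compare to a
        # tolerance instead of demanding bit-for-bit equality on Itanium2
        assert_almost_equal(cephes.tandg(45), 1.0, decimal=7)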
Best, Arnd From oliphant.travis at ieee.org Tue Jan 10 10:45:20 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 10 Jan 2006 08:45:20 -0700 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> Message-ID: <43C3D690.5050600@ieee.org> Ed Schofield wrote: >On 09/01/2006, at 11:32 PM, Travis Oliphant wrote: > > > >>Great. What do you need access to? I'm not sure how to grant it. >>Aren't you already a user on the new.scipy.org site (if you have SVN >>access you are)? Can you edit the wiki? >> >> > >Well, to upload to the SourceForge project site I need admin access, >which one of scipy's current SF administrators has to grant. My SF >login name is 'edschofield'. I can edit the wiki fine, but I have no >idea how to upload file attachments with MoinMoin. In theory it >should be possible, right? ;) But I may as well release scipy onto >sourceforge like you did for numpy ... > > Unfortunately, I'm not an administrator on the scipy.org site. We need Eric or the other Travis to either set that up or grant you access. -Travis From oliphant.travis at ieee.org Tue Jan 10 10:47:25 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 10 Jan 2006 08:47:25 -0700 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> Message-ID: <43C3D70D.7090709@ieee.org> Sasha wrote: >Attached patch implements indexing of zero rank arrays: > > > >>>>x = array(42) >>>>x[...] >>>> >>>> >42 > > >>>>x[()] >>>> >>>> >42 > > >>>>x[()].dtype >>>> >>>> > > > >>>>type(x[()]) >>>> >>>> > > >Ellipsis and empty tuple are the only two valid indices. > >So far there was only one +1 vote for the feature (from Francesc >Altet, who also suggested that x[()] be allowed). Does anyone object >to this feature? > >I have also proposed to support x[...,newaxis] (see comment by >sdementen at http://www.scipy.org/wikis/numdesign/). I will add this >feature if there is interest. > >Finally, if we make x[...] valid for rank-0 arrays, should x[...] = >value be allowed as well? I think it should, applying the principle of >least surprized. An additional consideration is that rank-0 arrays >are unhashable, but there is no obvious way to change their values. > >The following example show that rank-0 arrays are in fact mutable (in >a non-obvious way): > > > >>>>x = array(1) >>>>x.shape=(1,); x[0] = 42; x.shape=() >>>>x >>>> >>>> > > > Generally, the array scalars are supposed to replace rank-0 arrays. However, this may be too ambitious and as long as the rank-0 arrays are there, they may as well be useful. >Please comment on the following proposals: > >1. x[...] > > +1 >2. x[...] = value > > +1 >3. x[..., newaxis] > > +1 > > > From ndarray at mac.com Tue Jan 10 11:17:15 2006 From: ndarray at mac.com (Sasha) Date: Tue, 10 Jan 2006 11:17:15 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: <43C3D70D.7090709@ieee.org> References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> Message-ID: I've commited the change (see http://projects.scipy.org/scipy/numpy/changeset/1864) that supports x[...] and x[()]. I will implement x[...] = value soon. 
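(The assignment form mentioned here is not in svn at this point in the thread; the session below is only a sketch of how it would be expected to behave once implemented, by analogy with the read-only case already committed.)

    >>> from numpy import array
    >>> x = array(1)
    >>> x[...] = 42      # proposed: assign through Ellipsis on a 0-d array
    >>> int(x)
    42
    >>> x[()] = 7        # the empty-tuple index is expected to work the same way
    >>> int(x)
    7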
The remaining feature x[..., newaxis] may require some refactoring, so no promises here. I have created a Wiki page to document the progress. (See http://projects.scipy.org/scipy/numpy/wiki/ZeroRankArray). -- sasha On 1/10/06, Travis Oliphant wrote: > >1. x[...] > +1 > >2. x[...] = value > +1 > >3. x[..., newaxis] > +1 From dd55 at cornell.edu Tue Jan 10 11:43:55 2006 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 10 Jan 2006 11:43:55 -0500 Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC In-Reply-To: <200601060852.01160.dd55@cornell.edu> References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <200601060852.01160.dd55@cornell.edu> Message-ID: <200601101143.56010.dd55@cornell.edu> On Friday 06 January 2006 08:52, Darren Dale wrote: > I have some failures to report this morning, after updating to version > 0.9.3.1842 on an AMD-64. I dont know how to force scipy to use fftw-2 with > the new code in distutils.system_info, but based on previous experience I > believe these failures are unique to fftw-3. I have observed these fftw-3 errors on both AMD64 and x86 architectures now. I temporarily removed fftw-3 from my system, and can confirm that all tests pass when scipy is built using fftw-2. Unfortunately, I have other programs that require fftw-3, so this wont do as a workaround. Has anyone successfully built and tested the scipy-0.4.4 with fftw-3? Does anyone know how to tell scipy which fftw version to use, or is anyone working on the fftw-3 problems? If not, maybe fftw-3 support should be masked for the time being. Thanks, Darren From pearu at scipy.org Tue Jan 10 10:58:13 2006 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 10 Jan 2006 09:58:13 -0600 (CST) Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC In-Reply-To: <200601101143.56010.dd55@cornell.edu> References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <200601060852.01160.dd55@cornell.edu> <200601101143.56010.dd55@cornell.edu> Message-ID: On Tue, 10 Jan 2006, Darren Dale wrote: > On Friday 06 January 2006 08:52, Darren Dale wrote: >> I have some failures to report this morning, after updating to version >> 0.9.3.1842 on an AMD-64. I dont know how to force scipy to use fftw-2 with >> the new code in distutils.system_info, but based on previous experience I >> believe these failures are unique to fftw-3. > > I have observed these fftw-3 errors on both AMD64 and x86 architectures now. > > I temporarily removed fftw-3 from my system, and can confirm that all tests > pass when scipy is built using fftw-2. Unfortunately, I have other programs > that require fftw-3, so this wont do as a workaround. > > Has anyone successfully built and tested the scipy-0.4.4 with fftw-3? Yes. > Does anyone know how to tell scipy which fftw version to use, or is > anyone working on the fftw-3 problems? Define environment variable FFTW3=None to disable fftw-3 detection. > If not, maybe fftw-3 support should be masked for the time being. No need, it should be fixed instead for PPC.. 
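A minimal sketch of that workaround for anyone rebuilding from source: the variable has to be defined before numpy.distutils.system_info runs (i.e. before the scipy build starts), and the empty result shown for the lookup is an assumption about what a disabled backend reports.

    >>> import os
    >>> os.environ['FFTW3'] = 'None'   # same effect as FFTW3=None on the shell command line
    >>> from numpy.distutils.system_info import get_info
    >>> get_info('fftw3')              # fftw-3 detection is skipped; the build falls back to fftw-2/fftpack
    {}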
Pearu From dd55 at cornell.edu Tue Jan 10 12:08:12 2006 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 10 Jan 2006 12:08:12 -0500 Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC In-Reply-To: References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <200601101143.56010.dd55@cornell.edu> Message-ID: <200601101208.12985.dd55@cornell.edu> On Tuesday 10 January 2006 10:58, Pearu Peterson wrote: > On Tue, 10 Jan 2006, Darren Dale wrote: > > On Friday 06 January 2006 08:52, Darren Dale wrote: > >> I have some failures to report this morning, after updating to version > >> 0.9.3.1842 on an AMD-64. I dont know how to force scipy to use fftw-2 > >> with the new code in distutils.system_info, but based on previous > >> experience I believe these failures are unique to fftw-3. > > > > I have observed these fftw-3 errors on both AMD64 and x86 architectures > > now. > > > > I temporarily removed fftw-3 from my system, and can confirm that all > > tests pass when scipy is built using fftw-2. Unfortunately, I have other > > programs that require fftw-3, so this wont do as a workaround. > > > > Has anyone successfully built and tested the scipy-0.4.4 with fftw-3? > > Yes. Really? Any suggestions then? All my tests were passing with scipy/fftw-3 for, up until about a week ago. > > > Does anyone know how to tell scipy which fftw version to use, or is > > anyone working on the fftw-3 problems? > > Define environment variable > > FFTW3=None > > to disable fftw-3 detection. > > > If not, maybe fftw-3 support should be masked for the time being. > > No need, it should be fixed instead for PPC.. For PPC? I'm seeing these problems with x86 and AMD64. From pearu at scipy.org Tue Jan 10 11:43:12 2006 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 10 Jan 2006 10:43:12 -0600 (CST) Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC In-Reply-To: <200601101208.12985.dd55@cornell.edu> References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <200601101208.12985.dd55@cornell.edu> Message-ID: On Tue, 10 Jan 2006, Darren Dale wrote: >>> Has anyone successfully built and tested the scipy-0.4.4 with fftw-3? >> >> Yes. > > Really? Any suggestions then? All my tests were passing with scipy/fftw-3 for, > up until about a week ago. What scipy/numpy versions are you using? Here (debian sid linux,fftw-3.0.1): In [10]: scipy.show_config() fft_opt_info: libraries = ['fftw3'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW3_H', None)] include_dirs = ['/usr/include'] In [11]: scipy.__version__ Out[11]: '0.4.4.1544' In [12]: scipy.__numpy_version__ Out[12]: '0.9.3.1863' In [7]: scipy.fftpack.test(10) ---------------------------------------------------------------------- Ran 51 tests in 127.203s OK >> >>> Does anyone know how to tell scipy which fftw version to use, or is >>> anyone working on the fftw-3 problems? >> >> Define environment variable >> >> FFTW3=None >> >> to disable fftw-3 detection. >> >>> If not, maybe fftw-3 support should be masked for the time being. >> >> No need, it should be fixed instead for PPC.. > > For PPC? I'm seeing these problems with x86 and AMD64. May be these problems are not related to different architectures. So far only OSX users have reported problems with fftw3. Although there are also performance problems with fftw3 under Opteron, fftpack tests still pass ok there. 
Pearu From chris at trichech.us Tue Jan 10 12:43:31 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Tue, 10 Jan 2006 12:43:31 -0500 Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC In-Reply-To: References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <200601060852.01160.dd55@cornell.edu> <200601101143.56010.dd55@cornell.edu> Message-ID: On Jan 10, 2006, at 10:58 AM, Pearu Peterson wrote: > >> If not, maybe fftw-3 support should be masked for the time being. > > No need, it should be fixed instead for PPC.. I have built both numpy and scipy with fftw3 for PPC; binaries are available: http://trichech.us Its gcc4 that still does not work. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From ndarray at mac.com Tue Jan 10 13:17:13 2006 From: ndarray at mac.com (Sasha) Date: Tue, 10 Jan 2006 13:17:13 -0500 Subject: [SciPy-dev] Nested arrays Message-ID: In Numeric nested arrays could be indexed using multiple subscripts: >>> from Numeric import * >>> x = empty(2,'O') >>> x[0] = array([1,2]) >>> x[1] = array([1,2,3]) >>> x array([[1 2] , [1 2 3] ],'O') >>> x[1][2] 3 This feature is lost in NumPy: >>> x = empty(2, object) >>> x[0] = array([1,2]) >>> x[1] = array([1,2,3]) >>> x[1][2] Traceback (most recent call last): File "", line 1, in ? TypeError: unsubscriptable object This is quite surprising because x[1] prints as an array >>> x[1] array([1, 2, 3]) The problem is that x[1] is in fact a numpy scalar object holding an array: >>> x[1].size 1 >>> type(x[1]) I can work around this problem as follows: >>> x[1].item()[2] 3 but this is quite ugly and requires a lot of changes for the legacy code. I propose to add ndarray to a list of possible dtypes and not to wrap ndarrays in object_ instances. There are many other possible solutions to this problem, but I think that nested arrays a common enough to deserve special treatment. -- sasha From dd55 at cornell.edu Tue Jan 10 14:05:06 2006 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 10 Jan 2006 14:05:06 -0500 Subject: [SciPy-dev] NumPy/SciPy test results, x86 and PPC In-Reply-To: References: <4D73BEED-0547-4A89-A307-F05E95344AB4@ftw.at> <200601101208.12985.dd55@cornell.edu> Message-ID: <200601101405.07591.dd55@cornell.edu> On Tuesday 10 January 2006 11:43, Pearu Peterson wrote: > On Tue, 10 Jan 2006, Darren Dale wrote: > >>> Has anyone successfully built and tested the scipy-0.4.4 with fftw-3? > >> > >> Yes. > > > > Really? Any suggestions then? All my tests were passing with scipy/fftw-3 > > for, up until about a week ago. > > What scipy/numpy versions are you using? 
numpy-0.9.3.1864 scipy-0.4.4.1544 > Here (debian sid > linux,fftw-3.0.1): > > In [10]: scipy.show_config() > > fft_opt_info: > libraries = ['fftw3'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW3_H', None)] > include_dirs = ['/usr/include'] > > In [11]: scipy.__version__ > Out[11]: '0.4.4.1544' > > In [12]: scipy.__numpy_version__ > Out[12]: '0.9.3.1863' > > In [7]: scipy.fftpack.test(10) > > ---------------------------------------------------------------------- > Ran 51 tests in 127.203s > > OK fft_opt_info: libraries = ['fftw3', 'djbfft'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW3_H', None), ('SCIPY_DJBFFT_H', None)] include_dirs = ['/usr/include'] ah, here it is. It looks like scipy doesnt like my djbfft. * I can build scipy against a dynamically linked djbfft, and get failures. * I can't build scipy with a statically linked djbfft (I dont know why): /usr/lib/gcc/x86_64-pc-linux-gnu/3.4.5/../../../../x86_64-pc-linux-gnu/bin/ld: /usr/lib/libdjbfft.a(8u5.o): relocation R_X86_64_32 against `fftc8_roots4096' can not be used when making a shared object; recompile with -fPIC /usr/lib/libdjbfft.a: could not read symbols: Bad value collect2: ld returned 1 exit status /usr/lib/gcc/x86_64-pc-linux-gnu/3.4.5/../../../../x86_64-pc-linux-gnu/bin/ld: /usr/lib/libdjbfft.a(8u5.o): relocation R_X86_64_32 against `fftc8_roots4096' can not be used when making a shared object; recompile with -fPIC /usr/lib/libdjbfft.a: could not read symbols: Bad value collect2: ld returned 1 exit status error: Command "/usr/bin/g77 -shared build/temp.linux-x86_64-2.4/build/src/Lib/fftpack/_fftpackmodule.o build/temp.linux-x86_64-2.4/Lib/fftpack/src/zfft.o build/temp.linux-x86_64-2.4/Lib/fftpack/src/drfft.o build/temp.linux-x86_64-2.4/Lib/fftpack/src/zrfft.o build/temp.linux-x86_64-2.4/Lib/fftpack/src/zfftnd.o build/temp.linux-x86_64-2.4/build/src/fortranobject.o -L/usr/lib -Lbuild/temp.linux-x86_64-2.4 -ldfftpack -lfftw3 -ldjbfft -lg2c -o build/lib.linux-x86_64-2.4/scipy/fftpack/_fftpack.so" failed with exit status 1 * I can revert the changes we made to distutils/system_info.py so scipy doesnt find libdjbfft.so, then scipy builds, and tests pass. I dont know enough about linking to diagnose the static djbfft problem on my own. It may be specific to gentoo. From robert.kern at gmail.com Tue Jan 10 15:18:53 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 10 Jan 2006 14:18:53 -0600 Subject: [SciPy-dev] Nested arrays In-Reply-To: References: Message-ID: <43C416AD.8060304@gmail.com> Sasha wrote: > In Numeric nested arrays could be indexed using multiple subscripts: > > >>>>from Numeric import * >>>>x = empty(2,'O') >>>>x[0] = array([1,2]) >>>>x[1] = array([1,2,3]) >>>>x > > array([[1 2] , [1 2 3] ],'O') > >>>>x[1][2] > > 3 > > This feature is lost in NumPy: > > >>>>x = empty(2, object) >>>>x[0] = array([1,2]) >>>>x[1] = array([1,2,3]) >>>>x[1][2] > > Traceback (most recent call last): > File "", line 1, in ? > TypeError: unsubscriptable object > > This is quite surprising because x[1] prints as an array > >>>>x[1] > > array([1, 2, 3]) > > The problem is that x[1] is in fact a numpy scalar object holding an array: > >>>>x[1].size > > 1 > >>>>type(x[1]) > > I think we should reconsider the decision to return scalar array objects when indexing into object arrays. The concerns for indexing into numeric arrays are much different than for object arrays. There are no type casting issues with object arrays. 
People use object arrays expecting to be able to access the objects themselves, not some proxy. I'm not sure that the genericity benefits (e.g. always being able to get the .shape attribute) one gets with scalar arrays carry over very well to the use cases of object arrays. Object arrays have always really been a collection of special cases, so I'm not terribly concerned about giving them exceptional behavior. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From loredo at astro.cornell.edu Tue Jan 10 16:10:02 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Tue, 10 Jan 2006 16:10:02 -0500 (EST) Subject: [SciPy-dev] Question about fft overwrites; still scipy.fft problem on OS X Message-ID: <200601102110.k0ALA2s17947@laplace.astro.cornell.edu> Hi folks, The current scipy checkout (1544) and numpy 1863 and 1864 both produce "Overwriting fft" and "Overwriting ifft" warnings when scipy is imported. This happens both on FC3 and on OS X (10.3.9), so I presume it's not unique to my machines. Is this anything I need to worry about? I have to freeze a numpy/scipy tarball tomorrow for a sysadmin who has to install them on student machines; I'll just tell him and the students that the warning is innocuous, if it really is. scipy's fft/ifft (scipy.fft, scipy.ifft) still give erroneous results on OS X, though not on FC3 (both with recent installs of FFTW2). numpy.fft works fine on both platforms, so I'll have my students stick with that. scipy.fft worked fine on OS X last week.... from scipy import * x = arange(16)*2*pi/16 sx = sin(x) sfx = fft(sx) sfix = ifft(sfx) print max(abs(sx-sfix.real)) On OS X Panther this prints "1.0"; change "scipy" to "numpy" and it prints "1.11e-16" (or thereabouts). On FC3, both work fine. FYI, the bug I reported earlier regarding f2py-wrapped functions with float arguments having trouble accepting scalar arrays has vanished over the weekend (on both platforms). I don't know what changed to fix this, but I'm glad of it---thanks to whomever! -Tom From ndarray at mac.com Tue Jan 10 16:25:07 2006 From: ndarray at mac.com (Sasha) Date: Tue, 10 Jan 2006 16:25:07 -0500 Subject: [SciPy-dev] Nested arrays In-Reply-To: <43C416AD.8060304@gmail.com> References: <43C416AD.8060304@gmail.com> Message-ID: "Always being able to get the .shape attribute" is a nice feature and I would rather not loose it without a compelling reason. Note that making ndarray a valid dtype will not have an effect of a[i] loosing array API because ndarrays already have it by definition. For objects that don't have their own array interface one can entertain an idea of creating a scalar class dynamically that inherits from the type of the object and mixes in array API using muliple inheritance or some metaclass magic. Making ndarray a valid dtype will also allow constructing arrays from ragged lists array([[1],[1,2,3]], ndarray) -> array([array([1]), array([1,2,3])], dtype=ndarray) Currently this construct is equivalent to array([[1],[1,2,3]],object) and returns a strange result: >>> array([[1],[1,2,3]],ndarray) array([[1, 1, 1], [1, 2, 3]], dtype=object) >>> array([[1],[1,2,3]],ndarray).shape (2, 3) Interestingly, >>> array([[1,2],[1,2,3]],ndarray).shape () -- sasha On 1/10/06, Robert Kern wrote: > ... > I think we should reconsider the decision to return scalar array objects when > indexing into object arrays. The concerns for indexing into numeric arrays are > much different than for object arrays. 
There are no type casting issues with > object arrays. People use object arrays expecting to be able to access the > objects themselves, not some proxy. I'm not sure that the genericity benefits > (e.g. always being able to get the .shape attribute) one gets with scalar arrays > carry over very well to the use cases of object arrays. Object arrays have > always really been a collection of special cases, so I'm not terribly concerned > about giving them exceptional behavior. > > -- > Robert Kern > robert.kern at gmail.com > > "In the fields of hell where the grass grows high > Are the graves of dreams allowed to die." > -- Richard Harter > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From pearu at scipy.org Tue Jan 10 15:50:53 2006 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 10 Jan 2006 14:50:53 -0600 (CST) Subject: [SciPy-dev] Question about fft overwrites; still scipy.fft problem on OS X In-Reply-To: <200601102110.k0ALA2s17947@laplace.astro.cornell.edu> References: <200601102110.k0ALA2s17947@laplace.astro.cornell.edu> Message-ID: On Tue, 10 Jan 2006, Tom Loredo wrote: > The current scipy checkout (1544) and numpy 1863 and 1864 both > produce "Overwriting fft" and "Overwriting ifft" warnings > when scipy is imported. This happens both on FC3 and on OS X > (10.3.9), so I presume it's not unique to my machines. Is > this anything I need to worry about? I have to freeze > a numpy/scipy tarball tomorrow for a sysadmin who has to > install them on student machines; I'll just tell him and > the students that the warning is innocuous, if it really is. Yes, they are. You can disable these warnings by defining env. variable SCIPY_IMPORT_VERBOSE=-1 > scipy's fft/ifft (scipy.fft, scipy.ifft) still give erroneous > results on OS X, though not on FC3 (both with recent installs > of FFTW2). numpy.fft works fine on both platforms, so I'll > have my students stick with that. scipy.fft worked fine on > OS X last week.... > > from scipy import * > x = arange(16)*2*pi/16 > sx = sin(x) > sfx = fft(sx) > sfix = ifft(sfx) > print max(abs(sx-sfix.real)) > > On OS X Panther this prints "1.0"; change "scipy" to "numpy" and > it prints "1.11e-16" (or thereabouts). On FC3, both work fine. By any change, do you have djbfft installed? You can disable detecting djbfft by defining env. variable DJBFFT=None Pearu From oliphant at ee.byu.edu Tue Jan 10 17:18:03 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 10 Jan 2006 15:18:03 -0700 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> Message-ID: <43C4329B.5030208@ee.byu.edu> Sasha wrote: >I've commited the change (see >http://projects.scipy.org/scipy/numpy/changeset/1864) that supports >x[...] and x[()]. I will implement x[...] = value soon. The >remaining feature x[..., newaxis] may require some >refactoring, so no promises here. I have created a Wiki page to >document the progress. > > I would probably just handle this as a special case and not try to fit it in to the general index-parsing code which is a bit complicated because of all the functionality. Of course any improvements to that code are always welcome... 
-Travis From schofield at ftw.at Tue Jan 10 19:29:41 2006 From: schofield at ftw.at (Ed Schofield) Date: Wed, 11 Jan 2006 00:29:41 +0000 Subject: [SciPy-dev] Question about fft overwrites ... In-Reply-To: <200601102110.k0ALA2s17947@laplace.astro.cornell.edu> References: <200601102110.k0ALA2s17947@laplace.astro.cornell.edu> Message-ID: On 10/01/2006, at 9:10 PM, Tom Loredo wrote: > ... I have to freeze > a numpy/scipy tarball tomorrow for a sysadmin who has to > install them on student machines; ... Well, the 0.4.4 tarball is now accessible from the http:// new.scipy.org/Wiki/Download, so you can use that if you like. It's based on SVN revision 1544 from yesterday. I haven't made a release announcement yet, since I'm still having trouble uploading the Windows binaries to the new.scipy.org site, but we'll upload them all to SourceForge soon anyway ... -- Ed From ndarray at mac.com Tue Jan 10 19:34:00 2006 From: ndarray at mac.com (Sasha) Date: Tue, 10 Jan 2006 19:34:00 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: <43C4329B.5030208@ee.byu.edu> References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> Message-ID: Zero rank array assignments are in (http://projects.scipy.org/scipy/numpy/changeset/1866). I will follow Travis' suggestion and implement x[..., newaxis] as a special case. I still believe that the code can be refactored so that no explicit checks for nd==0 are necessary, but this can be done later. Also, before attempting any refactoring I would like to have a comprehansive test_multiarray that covers all the branches. Any volunteers? -- sasha On 1/10/06, Travis Oliphant wrote: > Sasha wrote: > > >I've commited the change (see > >http://projects.scipy.org/scipy/numpy/changeset/1864) that supports > >x[...] and x[()]. I will implement x[...] = value soon. The > >remaining feature x[..., newaxis] may require some > >refactoring, so no promises here. I have created a Wiki page to > >document the progress. > > > > > I would probably just handle this as a special case and not try to fit > it in to the general index-parsing code which is a bit complicated > because of all the functionality. Of course any improvements to that > code are always welcome... > > -Travis > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From oliphant.travis at ieee.org Tue Jan 10 20:12:38 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 10 Jan 2006 18:12:38 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: References: <43C416AD.8060304@gmail.com> Message-ID: Sasha wrote: > "Always being able to get the .shape attribute" is a nice feature and > I would rather > not loose it without a compelling reason. Note that making ndarray a > valid dtype > will not have an effect of a[i] loosing array API because ndarrays > already have it by > definition. For objects that don't have their own array interface one > can entertain an > idea of creating a scalar class dynamically that inherits from the > type of the object and > mixes in array API using muliple inheritance or some metaclass magic. We can also special-case the object-array typeobject so that we pass off all behavior in the function table to the underlying object. This is what I did, for example, to support attribute getting and setting. 
It would be a straightforward thing to do it as well for the other function pointers in the type-object table. Asking whether the object is a sub-class of a certain type would certainly fail still (although there may be a work around for this if we use a different meta-class). > > Making ndarray a valid dtype will also allow constructing arrays from > ragged lists > Originally an ndarray was a valid dtype, but it was removed to help in detecting the common error zeros(3,4) when you mean zeros((3,4)) of course the use-case would be zeros(dim1,dim2) where dim2 was an ndarray that should be interpreted as an integer. I suppose, that because ndarray is now a valid data-type as a typeobject, then we could easily special case it to make it more meaningful. -Travis From oliphant at ee.byu.edu Tue Jan 10 20:22:00 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 10 Jan 2006 18:22:00 -0700 Subject: [SciPy-dev] [SciPy-user] NumPy On OpenBSD In-Reply-To: <34090E25C2327C4AA5D276799005DDE0E34B90@SMDMX0501.mds.mdsinc.com> References: <34090E25C2327C4AA5D276799005DDE0E34B90@SMDMX0501.mds.mdsinc.com> Message-ID: <43C45DB8.1050203@ee.byu.edu> LATORNELL, Doug wrote: >Thanks, Travis. > >I've found BSD and OpenBSD #defines in my sys/param.h file. Or-ing one >of those in seems like it should work, but I think fpgetsticky() is >returning 0. I'm trying to dig into that now and understand it. > >BTW, the OpenBSD porting guide page >(http://www.openbsd.org/porting.html#Generic) is kind of adamant about >*not* using __OpenBSD__. Their take is test for features, not specific >OSes... > > Yes, that is probably a better way to do it. I followed numarray's lead in how they were handling the IEEE floating-point stuff platform-to-platform. But, we could take advantage of the decent configuration system already present where we detect quite a few things at build time in the setup.py file. Tests for the existence of different IEEE functions could be defined there as well. The presence of specific functions would tell us which method is being used and we could then define a UFUNC_IEEE_FPMETHOD variable that specified which type was in use. The only other problem is that different platforms need different header files for this capability, so we would probably need to define another variable to pick-up the correct headers which would complicate matters. -Travis From ndarray at mac.com Tue Jan 10 22:37:48 2006 From: ndarray at mac.com (Sasha) Date: Tue, 10 Jan 2006 22:37:48 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> Message-ID: As promised, newaxis support is in svn. I am reasonably confident that I've got memory management issues right, but I would appreciate if someone reviews the code. (http://projects.scipy.org/scipy/numpy/changeset/1871). I will explain the current state of affairs in Wiki (http://projects.scipy.org/scipy/numpy/changeset/1866). Is this the right Wiki for this type of stuff? I don't have access to new.scipy.org/Wiki. -- sasha On 1/10/06, Sasha wrote: > Zero rank array assignments are in > (http://projects.scipy.org/scipy/numpy/changeset/1866). I will follow > Travis' suggestion and implement x[..., newaxis] as a special case. > I still believe that the code can be refactored so that no explicit > checks for nd==0 are > necessary, but this can be done later. 
Also, before attempting any > refactoring I would like to have a comprehansive test_multiarray that > covers all the branches. Any volunteers? > > -- sasha > > On 1/10/06, Travis Oliphant wrote: > > Sasha wrote: > > > > >I've commited the change (see > > >http://projects.scipy.org/scipy/numpy/changeset/1864) that supports > > >x[...] and x[()]. I will implement x[...] = value soon. The > > >remaining feature x[..., newaxis] may require some > > >refactoring, so no promises here. I have created a Wiki page to > > >document the progress. > > > > > > > > I would probably just handle this as a special case and not try to fit > > it in to the general index-parsing code which is a bit complicated > > because of all the functionality. Of course any improvements to that > > code are always welcome... > > > > -Travis > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-dev > > > From ndarray at mac.com Tue Jan 10 22:41:54 2006 From: ndarray at mac.com (Sasha) Date: Tue, 10 Jan 2006 22:41:54 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> Message-ID: Correction: The Wiki page is at http://projects.scipy.org/scipy/numpy/wiki/ZeroRankArray Sorry for the extra traffic. -- sasha On 1/10/06, Sasha wrote: > As promised, newaxis support is in svn. I am reasonably confident that > I've got memory management issues right, but I would appreciate if someone > reviews the code. (http://projects.scipy.org/scipy/numpy/changeset/1871). > > I will explain the current state of affairs in Wiki > (http://projects.scipy.org/scipy/numpy/changeset/1866). Is this the > right Wiki for this type of stuff? I don't have access to > new.scipy.org/Wiki. > > -- sasha > > > On 1/10/06, Sasha wrote: > > Zero rank array assignments are in > > (http://projects.scipy.org/scipy/numpy/changeset/1866). I will follow > > Travis' suggestion and implement x[..., newaxis] as a special case. > > I still believe that the code can be refactored so that no explicit > > checks for nd==0 are > > necessary, but this can be done later. Also, before attempting any > > refactoring I would like to have a comprehansive test_multiarray that > > covers all the branches. Any volunteers? > > > > -- sasha > > > > On 1/10/06, Travis Oliphant wrote: > > > Sasha wrote: > > > > > > >I've commited the change (see > > > >http://projects.scipy.org/scipy/numpy/changeset/1864) that supports > > > >x[...] and x[()]. I will implement x[...] = value soon. The > > > >remaining feature x[..., newaxis] may require some > > > >refactoring, so no promises here. I have created a Wiki page to > > > >document the progress. > > > > > > > > > > > I would probably just handle this as a special case and not try to fit > > > it in to the general index-parsing code which is a bit complicated > > > because of all the functionality. Of course any improvements to that > > > code are always welcome... > > > > > > -Travis > > > > > > _______________________________________________ > > > Scipy-dev mailing list > > > Scipy-dev at scipy.net > > > http://www.scipy.net/mailman/listinfo/scipy-dev > > > > > > From strawman at astraw.com Wed Jan 11 00:24:49 2006 From: strawman at astraw.com (Andrew Straw) Date: Tue, 10 Jan 2006 21:24:49 -0800 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? 
In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> Message-ID: <43C496A1.6050802@astraw.com> Sasha wrote: >I will explain the current state of affairs in Wiki >(http://projects.scipy.org/scipy/numpy/changeset/1866). Is this the >right Wiki for this type of stuff? > I think it's good to be concerned with what goes on which wiki right now because, as both Joe Harrington and Fernando have pointed out, we *don't* want our wiki to grow by accretion into an unusable blob. The main point, which I agree with, is that a little bit of work on structuring the wiki(s) now can save a lot of pain later. We did envision a separation between the user (newbie) -friendly welcome site/wiki now played by MoinMoin and the developer "hard hat zone" wiki played by Trac. And I think the content you wrote at http://projects.scipy.org/scipy/numpy/wiki/ZeroRankArray definitely falls into the hard hat category, so I think its fine to put there. (Maybe once the dust has settled, though, this would be good to write a mini-tutorial in the cookbook about.) I did make a link to the Trac instances and the svn repositories from http://new.scipy.org/Wiki/Developer_Zone . Certainly some more text on that page to explain the multi-wiki separation -- I just haven't had time to do it. >I don't have access to >new.scipy.org/Wiki. > Why do you say that? What happens when you try to login? (In fact, I think you should be able to edit pages without even logging in.) Let me know if you still think you don't have access -- I will definitely fix it ASAP. Cheers! Andrew From oliphant.travis at ieee.org Wed Jan 11 00:29:04 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 10 Jan 2006 22:29:04 -0700 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> Message-ID: <43C497A0.2000507@ieee.org> Sasha wrote: >As promised, newaxis support is in svn. I am reasonably confident that >I've got memory management issues right, but I would appreciate if someone >reviews the code. (http://projects.scipy.org/scipy/numpy/changeset/1871). > > It looks good so far. Thanks for this effort. Sasha is doing an epecially great job of adding unit tests when changes are made... An example to us all. -Travis From ndarray at mac.com Wed Jan 11 00:39:30 2006 From: ndarray at mac.com (Sasha) Date: Wed, 11 Jan 2006 00:39:30 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: <43C496A1.6050802@astraw.com> References: <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> <43C496A1.6050802@astraw.com> Message-ID: > >I don't have access to > >new.scipy.org/Wiki. > > > Why do you say that? What happens when you try to login? "Unknown user name: "sasha". Please enter user name and password." I use the same name/password as for svn and Trac Wiki. -- sasha From strawman at astraw.com Wed Jan 11 00:44:19 2006 From: strawman at astraw.com (Andrew Straw) Date: Tue, 10 Jan 2006 21:44:19 -0800 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? 
In-Reply-To: References: <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> <43C496A1.6050802@astraw.com> Message-ID: <43C49B33.6000706@astraw.com> Sasha wrote: >>>I don't have access to >>>new.scipy.org/Wiki. >>> >>> >>> >>Why do you say that? What happens when you try to login? >> >> > >"Unknown user name: "sasha". Please enter user name and password." > >I use the same name/password as for svn and Trac Wiki. > > The MoinMoin logins and passwords are totally separate from the other stuff. You'll have to create a new one. From oliphant.travis at ieee.org Wed Jan 11 02:30:17 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 11 Jan 2006 00:30:17 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: References: Message-ID: <43C4B409.6070606@ieee.org> Sasha wrote: >In Numeric nested arrays could be indexed using multiple subscripts: > > > >>>>from Numeric import * >>>>x = empty(2,'O') >>>>x[0] = array([1,2]) >>>>x[1] = array([1,2,3]) >>>>x >>>> >>>> >array([[1 2] , [1 2 3] ],'O') > > >>>>x[1][2] >>>> >>>> >3 > >This feature is lost in NumPy: > > > >>>>x = empty(2, object) >>>>x[0] = array([1,2]) >>>>x[1] = array([1,2,3]) >>>>x[1][2] >>>> >>>> >Traceback (most recent call last): > File "", line 1, in ? >TypeError: unsubscriptable object > > > I've committed two changes 1) Now all object array-scalars look to the proxied object for it's mapping, sequence, and buffer behavior 2) An ndarray (or subclass) is never wrapped in an object array. It already has the needed methods so why wrap it... This will allow ragged arrays to be defined and used more easily I believe. -Travis From oliphant.travis at ieee.org Wed Jan 11 02:34:13 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 11 Jan 2006 00:34:13 -0700 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: References: <200601091254.10861.faltet@carabos.com> <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> Message-ID: <43C4B4F5.4060006@ieee.org> Sasha wrote: >As promised, newaxis support is in svn. I am reasonably confident that >I've got memory management issues right, but I would appreciate if someone >reviews the code. (http://projects.scipy.org/scipy/numpy/changeset/1871). > > I made just a couple of changes to the new checkin: now only 1 ellipsis is allowed. In the n-d case multiple ellipsis always treats elipsis after the first as ':' (full slice objects). But 0-d arrays can't use the ':' object and so multiple ellipses should not be allowed. -Travis From Fernando.Perez at colorado.edu Wed Jan 11 03:00:14 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 11 Jan 2006 01:00:14 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: <43C4B409.6070606@ieee.org> References: <43C4B409.6070606@ieee.org> Message-ID: <43C4BB0E.8010609@colorado.edu> Travis Oliphant wrote: > I've committed two changes > > 1) Now all object array-scalars look to the proxied object for it's > mapping, sequence, and buffer behavior I wonder if the scalars returned by object arrays should't just be the plain, 'unboxed' underlying object. Here, I agree with Robert that object arrays are special anyway, and the main purpose of using ndarrays of 'O' type is so you have the convenient indexing, slicing and arithmetic syntactic support of ndarrays. I see object arrays basically as a way of 'carrying around' a collection of arbitrary python objects. 
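A small, made-up example of what that "carrying around arbitrary objects" usage looks like in practice, assuming element access hands back the stored Python object directly (the behaviour argued for here, and what later releases do); it also shows the ragged, nested indexing mentioned earlier in the thread:

from numpy import empty

x = empty(3, dtype=object)     # a container for arbitrary Python objects
x[0] = [1, 2]
x[1] = [1, 2, 3]               # ragged: elements need not share a length
x[2] = 'spam'

assert len(x[1]) == 3          # the stored list comes back and is directly usable
assert x[1][2] == 3            # double indexing reaches into the nested object
assert [type(e).__name__ for e in x[:2]] == ['list', 'list']   # slicing still works
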
But having scalar indexing return an opaque proxy breaks almost completely this kind of usage, because you can't get back, in a straightforward manner, the things you put in the container to begin with. This should illustrate my point: In [20]: x = empty(2, object) In [21]: class foo: ....: def hi(self): ....: return 'hi' ....: ....: In [22]: x[0] = foo() In [23]: x[1] = foo() In [24]: x[0].hi() --------------------------------------------------------------------------- exceptions.AttributeError Traceback (most recent call last) /home/fperez/ AttributeError: 'object_arrtype' object has no attribute 'hi' It gets better: In [45]: x[0] Out[45]: <__main__.foo instance at 0x40a38f4c> In [46]: isinstance(x[0],foo) Out[46]: False These two look positively bizarre, and almost contradictory. Of course, I understand what's happening under the hood, but I think it's an excellent example of the 'law of leaky abstractions' at play. For the regular datatypes, I understand the reasons behind array scalars and can see value in that. But I really think that, if we keep the same behavior for object arrays, the opaque proxies that come out end up making the construction, in practice, useless. Perhaps I'm missing something, but I'd like to at least understand your reasoning better here, as I otherwise see this as costing us a lot. Cheers, f From oliphant.travis at ieee.org Wed Jan 11 03:26:12 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 11 Jan 2006 01:26:12 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: <43C4BB0E.8010609@colorado.edu> References: <43C4B409.6070606@ieee.org> <43C4BB0E.8010609@colorado.edu> Message-ID: <43C4C124.7040300@ieee.org> Fernando Perez wrote: >Travis Oliphant wrote: > > > >>I've committed two changes >> >>1) Now all object array-scalars look to the proxied object for it's >>mapping, sequence, and buffer behavior >> >> > >I wonder if the scalars returned by object arrays should't just be the plain, >'unboxed' underlying object. Here, I agree with Robert that object arrays are >special anyway, and the main purpose of using ndarrays of 'O' type is so you >have the convenient indexing, slicing and arithmetic syntactic support of >ndarrays. I see object arrays basically as a way of 'carrying around' a >collection of arbitrary python objects. > >But having scalar indexing return an opaque proxy breaks almost completely >this kind of usage, because you can't get back, in a straightforward manner, >the things you put in the container to begin with. This should illustrate my >point: > > Frankly, I'm willing to get rid of them. I suggested that a couple of months ago, but a few people said that having the attributes even on object scalars was going to be useful, so I capitulated and just tried to improve the proxying... I suppose a more thorough discussion is warranted. I think with some effort one could figure out how to make it always appear that the object array-scalar type is the actual underlying object but now with the attributes of arrays. Repeat your example with a new-style class, to see what you get however. In [1]: from scipy import * In [2]: x = empty(2, object) In [3]: class foo(object): ...: def hi(self): ...: return 'hi' ...: In [4]: x[0] = foo() In [5]: x[1] = foo() In [6]: x[0].hi() Out[6]: 'hi' In [7]: x[0] Out[7]: <__main__.foo object at 0x47ff345c> In [8]: isinstance(x[0], foo) Out[8]: True In [9]: type(x[0]) Out[9]: So, at least with new-style classes we are getting something useful. 
Notice that also In [10]: x[0].shape Out[10]: () In [11]: x[0].strides Out[11]: () In [12]: x[0].flags Out[12]: {'ALIGNED': True, 'CONTIGUOUS': True, 'FORTRAN': True, 'OWNDATA': True, 'UPDATEIFCOPY': False, 'WRITEABLE': False} So, I'm still undecided. Perhaps we can figure out why old-style classes aren't working this way... -Travis From oliphant.travis at ieee.org Wed Jan 11 03:37:53 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 11 Jan 2006 01:37:53 -0700 Subject: [SciPy-dev] numpy changes generator behaviour In-Reply-To: References: Message-ID: <43C4C3E1.7020107@ieee.org> Christopher Fonnesbeck wrote: > I'm not sure this is intended, but it appears that numpy changes the > behaviour of the new python 2.4 generators. Here is what is supposed > to happen: > > >>> sum(x*x for x in range(10)) > 285 > > But if I import numpy, and try again I get: > > >>> from numpy import * > >>> sum(x*x for x in range(10)) > > > Why should it no longer sum? As mentioned the sum command from NumPy is being used which was inherited from Numeric. I just improved it so that it handles generators like this, however. So, your example should work in current svn. -Travis From Fernando.Perez at colorado.edu Wed Jan 11 03:47:54 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 11 Jan 2006 01:47:54 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: <43C4C124.7040300@ieee.org> References: <43C4B409.6070606@ieee.org> <43C4BB0E.8010609@colorado.edu> <43C4C124.7040300@ieee.org> Message-ID: <43C4C63A.9040507@colorado.edu> Travis Oliphant wrote: > Fernando Perez wrote: >>But having scalar indexing return an opaque proxy breaks almost completely >>this kind of usage, because you can't get back, in a straightforward manner, >>the things you put in the container to begin with. This should illustrate my >>point: >> >> > > > Frankly, I'm willing to get rid of them. I suggested that a couple of > months ago, but a few people said that having the attributes even on > object scalars was going to be useful, so I capitulated and just tried > to improve the proxying... > > I suppose a more thorough discussion is warranted. I think with some > effort one could figure out how to make it always appear that the object > array-scalar type is the actual underlying object but now with the > attributes of arrays. > > Repeat your example with a new-style class, to see what you get however. Well, let me put a mean twist on this: In [10]: class foo(object): ....: flags = 'they come in all kinds of pretty colors' ....: def hi(self): ....: return 'hi' ....: ....: In [11]: x = empty(2, object) In [12]: x[0] = foo() In [13]: x[0].hi() Out[13]: 'hi' In [14]: x[0].shape Out[14]: () In [15]: x[0].flags Out[15]: 'they come in all kinds of pretty colors' Again, I think there's too strong of a clash between the abstraction of array scalar proxies and whatever object is being boxed in. For the regular datatypes I think this is OK, as you normally don't go around calling 3.dosomething() in Python. But for a container meant to accept arbitrary objects, I'm not convinced that the cost of this kind of shadowing and blending of internal and proxy attributes is any good. I really don't claim to understand one tenth of the (often contradictory) constraints that have led you to all these design decisions, and I trust your instinct a lot. But in this case, my (perhaps naive) vote would be for x[0] to return the raw, un-proxied object always. 
I can also see the flip side of the argument (not having to special-case code, so you can always assume that x[0].shape exists). However, I think that my example above shows that this 'guarantee' is so weak as to be useless: In [17]: if x[0].flags['ALIGNED']: ....: print 'the stars are smiling' ....: --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /home/fperez/ TypeError: string indices must be integers Basically, if you get fed an 'O' array, today all bets are off as to what's going to come out of indexing a scalar element. So you might as well, at least, have the guarantee that you get what you put in there to begin with, instead of the bizarre hybrid that comes out today. OK, I've managed to convince myself that this _is_ the right thing to do (TM), and that 'O' arrays should return un-proxied values. I vote +1 on that until proven wrong (in 3 minutes, I'm sure :) > So, I'm still undecided. Perhaps we can figure out why old-style > classes aren't working this way... (I think) it's because only new-style classes implement the full descriptor protocol. I imagine you are using property() for all these special attributes, and property() fails in silent, mysterious ways with old-style classes (I know because it was precisely this issue that forced me to make the main ipython class new-style a week ago). Cheers, f From oliphant.travis at ieee.org Wed Jan 11 04:16:24 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 11 Jan 2006 02:16:24 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: <43C4C63A.9040507@colorado.edu> References: <43C4B409.6070606@ieee.org> <43C4BB0E.8010609@colorado.edu> <43C4C124.7040300@ieee.org> <43C4C63A.9040507@colorado.edu> Message-ID: <43C4CCE8.70700@ieee.org> >Well, let me put a mean twist on this: > >In [10]: class foo(object): > ....: flags = 'they come in all kinds of pretty colors' > ....: def hi(self): > ....: return 'hi' > ....: > ....: > >In [11]: x = empty(2, object) > >In [12]: x[0] = foo() > >In [13]: x[0].hi() >Out[13]: 'hi' > >In [14]: x[0].shape >Out[14]: () > >In [15]: x[0].flags >Out[15]: 'they come in all kinds of pretty colors' > > I'm about +0.5 for returning the underlying object all the time. Actually, up until a few checkins ago this would have always returned the array attribute because they were looked for first. And it could go back to this too. >Again, I think there's too strong of a clash between the abstraction of array >scalar proxies and whatever object is being boxed in. For the regular >datatypes I think this is OK, as you normally don't go around calling >3.dosomething() in Python. But for a container meant to accept arbitrary >objects, I'm not convinced that the cost of this kind of shadowing and >blending of internal and proxy attributes is any good. > > I'm not convinced either, but I'm also not (quite) convinced that they should disappear. It would actually be quite simple to leave the object array scalars in place but never return them. In fact, we could make it a user settable option ;-) just like the default array type ;-) >I can also see the flip side of the argument (not having to special-case code, >so you can always assume that x[0].shape exists). However, I think that my >example above shows that this 'guarantee' is so weak as to be useless: > > I think that's why I originally chose to look for the array attributes first....so this guarantee would always be there. 
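Fernando's remark about property() and classic classes is easy to demonstrate in isolation. The sketch below is Python 2 (where a class that does not derive from object is old-style), the class names are invented, and whether this mechanism is the actual cause of the proxy behaviour above is his conjecture rather than something shown here; the snippet only illustrates the general pitfall:

# Requires Python 2: classes not deriving from object are "classic" classes.

class Old:                           # old-style class
    def _get_x(self):
        return self._x
    def _set_x(self, value):
        self._x = value * 2          # setter is supposed to double the value
    x = property(_get_x, _set_x)

class New(object):                   # new-style class, identical body
    def _get_x(self):
        return self._x
    def _set_x(self, value):
        self._x = value * 2
    x = property(_get_x, _set_x)

n = New()
n.x = 5
assert n.x == 10                     # the property setter really ran

o = Old()
o.x = 5                              # silently bypasses the setter...
assert o.x == 5                      # ...and the getter: o.__dict__['x'] now shadows the property
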
>Basically, if you get fed an 'O' array, today all bets are off as to what's >going to come out of indexing a scalar element. > >So you might as well, at least, have the guarantee that you get what you put >in there to begin with, instead of the bizarre hybrid that comes out today. > > The basic OBJECT type ndarray has high likelihood of going into Python and you can bet that no strange hybrid will be acceptable on return. It would be literally one-line of code to turn off the returning of the hybrid object arrays, so I would not hesitate to do it and then see what kind of complaints we get :-) -Travis From cookedm at physics.mcmaster.ca Wed Jan 11 06:44:56 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Wed, 11 Jan 2006 06:44:56 -0500 Subject: [SciPy-dev] cephes.tandg test In-Reply-To: References: Message-ID: <6DC8FBB8-B714-4A2C-8DA8-5EE5E70C66F6@physics.mcmaster.ca> On Jan 10, 2006, at 09:35 , Arnd Baecker wrote: > Hi, > > all numpy/scipy tests pass on the Itanium2, apart from > the following error: > > ====================================================================== > FAIL: check_tandg (scipy.special.basic.test_basic.test_cephes) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/baecker/python2/scipy_lintst_n_nothing/lib/python2.4/site- > packages/scipy/special/tests/test_basic.py", > line 436, in check_tandg > assert_equal(cephes.tandg(45),1.0) > File > "/home/baecker/python2//scipy_lintst_n_nothing/lib/python2.4/site- > packages/numpy/testing/utils.py", > line 81, in assert_equal > assert desired == actual, msg > AssertionError: > Items are not equal: > DESIRED: 1.0 > ACTUAL: 1.0000000000000002 > ---------------------------------------------------------------------- > > The is caused by the strict equality test: > > def check_tandg(self): > assert_equal(cephes.tandg(45),1.0) > > Is there a reason, why one should get 1.0 up to the last bit? > If not an `assert_almost_equal` might be more appropriate? Well, I'd argue that if you've gone to the trouble of making a version of tan that takes degrees, it *should* return exact values for those that it can (tan(0) = 0, tan(+-45) = +-1). The cephes code was weird enough that it could miss that, so I added a special case test http://projects.scipy.org/scipy/scipy/changeset/1545 See if it works now. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From arnd.baecker at web.de Wed Jan 11 07:46:12 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 11 Jan 2006 13:46:12 +0100 (CET) Subject: [SciPy-dev] cephes.tandg test In-Reply-To: <6DC8FBB8-B714-4A2C-8DA8-5EE5E70C66F6@physics.mcmaster.ca> References: <6DC8FBB8-B714-4A2C-8DA8-5EE5E70C66F6@physics.mcmaster.ca> Message-ID: Hi, On Wed, 11 Jan 2006, David M. Cooke wrote: > On Jan 10, 2006, at 09:35 , Arnd Baecker wrote: [...] > > DESIRED: 1.0 > > ACTUAL: 1.0000000000000002 > > ---------------------------------------------------------------------- > > > > The is caused by the strict equality test: > > > > def check_tandg(self): > > assert_equal(cephes.tandg(45),1.0) > > > > Is there a reason, why one should get 1.0 up to the last bit? > > If not an `assert_almost_equal` might be more appropriate? 
> > Well, I'd argue that if you've gone to the trouble of making a > version of tan that takes degrees, it *should* return exact values > for those that it can (tan(0) = 0, tan(+-45) = +-1). > > The cephes code was weird enough that it could miss that, so I added > a special case test > http://projects.scipy.org/scipy/scipy/changeset/1545 > > See if it works now. Well: ====================================================================== FAIL: check_cotdg (scipy.special.basic.test_basic.test_cotdg) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/baecker/python2/scipy_lintst_n_nothing/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 771, in check_cotdg assert_almost_equal(ct,ctrl,8) File "/home/baecker/python2//scipy_lintst_n_nothing/lib/python2.4/site-packages/numpy/testing/utils.py", line 101, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 1.7320508075688774 ACTUAL: nan ====================================================================== FAIL: check_specialpoints (scipy.special.basic.test_basic.test_tandg) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/baecker/python2/scipy_lintst_n_nothing/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 2016, in check_specialpoints assert_equal(tandg(-45), -1.0) File "/home/baecker/python2//scipy_lintst_n_nothing/lib/python2.4/site-packages/numpy/testing/utils.py", line 81, in assert_equal assert desired == actual, msg AssertionError: Items are not equal: DESIRED: -1.0 ACTUAL: 1.0 ====================================================================== FAIL: check_tandg (scipy.special.basic.test_basic.test_tandg) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/baecker/python2/scipy_lintst_n_nothing/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 2003, in check_tandg assert_almost_equal(tn,tnrl,8) File "/home/baecker/python2//scipy_lintst_n_nothing/lib/python2.4/site-packages/numpy/testing/utils.py", line 101, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 0.57735026918962573 ACTUAL: 0.0 ====================================================================== FAIL: check_tandgmore (scipy.special.basic.test_basic.test_tandg) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/baecker/python2/scipy_lintst_n_nothing/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 2011, in check_tandgmore assert_almost_equal(tnm1,tnmrl1,8) File "/home/baecker/python2//scipy_lintst_n_nothing/lib/python2.4/site-packages/numpy/testing/utils.py", line 101, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 1.7320508075688767 ACTUAL: 1.0 Clearly, something weird is going on ... 
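For readers skimming the failures below, the behaviour being aimed for can be sketched in a few lines of Python. This is only an illustration of the special-casing idea, not the actual cephes patch; the function name mirrors the one under discussion, but the argument reduction and the ValueError at the poles are arbitrary choices for the sketch:

import math

def tandg(x):
    # Tangent of x given in degrees, exact at the special angles.
    r = math.fmod(x, 180.0)            # tan() repeats every 180 degrees
    if r == 0.0:
        return 0.0
    if r in (45.0, -135.0):
        return 1.0
    if r in (-45.0, 135.0):
        return -1.0
    if r in (90.0, -90.0):
        raise ValueError("tandg is singular at odd multiples of 90 degrees")
    return math.tan(math.radians(x))

assert tandg(45) == 1.0                # exact, with no 1.0000000000000002 residue
assert tandg(-45) == -1.0
assert tandg(225) == 1.0
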
In [1]: import numpy In [2]: numpy.__version__ Out[2]: '0.9.3.1863' In [3]: import scipy Overwriting fft= from scipy.fftpack.basic (was from numpy.dft.fftpack) Overwriting ifft= from scipy.fftpack.basic (was from numpy.dft.fftpack) ._ In [4]: scipy.__version__ Out[4]: '0.4.4.1545' (BTW: I really don't like those "Overwriting" messages, and I also have reservations about steering that with an environment variable - but that's a different subject). What can I do to find out what is going wrong with tandg (not that I need that function at all, but a failing unit test could point to something deeper ...) Best, Arnd From cookedm at physics.mcmaster.ca Wed Jan 11 08:44:21 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Wed, 11 Jan 2006 08:44:21 -0500 Subject: [SciPy-dev] cephes.tandg test In-Reply-To: References: <6DC8FBB8-B714-4A2C-8DA8-5EE5E70C66F6@physics.mcmaster.ca> Message-ID: On Jan 11, 2006, at 07:46 , Arnd Baecker wrote: > Hi, > > On Wed, 11 Jan 2006, David M. Cooke wrote: > >> On Jan 10, 2006, at 09:35 , Arnd Baecker wrote: > [...] >>> DESIRED: 1.0 >>> ACTUAL: 1.0000000000000002 >>> -------------------------------------------------------------------- >>> -- >>> >>> The is caused by the strict equality test: >>> >>> def check_tandg(self): >>> assert_equal(cephes.tandg(45),1.0) >>> >>> Is there a reason, why one should get 1.0 up to the last bit? >>> If not an `assert_almost_equal` might be more appropriate? >> >> Well, I'd argue that if you've gone to the trouble of making a >> version of tan that takes degrees, it *should* return exact values >> for those that it can (tan(0) = 0, tan(+-45) = +-1). >> >> The cephes code was weird enough that it could miss that, so I added >> a special case test >> http://projects.scipy.org/scipy/scipy/changeset/1545 >> >> See if it works now. > > Well: > [snip test failures] Oh, foo. I get those too, now that I made sure it remade the file. (I wish distutils did proper dependency checking!!!!!!) > (BTW: I really don't like those "Overwriting" messages, > and I also have reservations about steering that with an environment > variable - but that's a different subject). me neither... > What can I do to find out what is going wrong with tandg > (not that I need that function at all, but a failing unit > test could point to something deeper ...) Well, at the moment, these are just pointing to my messing up of tandg :) Since I can reproduce your failures, I'll be able to fix it. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cookedm at physics.mcmaster.ca Wed Jan 11 09:14:10 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Wed, 11 Jan 2006 09:14:10 -0500 Subject: [SciPy-dev] cephes.tandg test In-Reply-To: References: <6DC8FBB8-B714-4A2C-8DA8-5EE5E70C66F6@physics.mcmaster.ca> Message-ID: <610A4DD3-4B70-4C18-AD7F-F7C0CD5F10F7@physics.mcmaster.ca> On Jan 11, 2006, at 08:44 , David M. Cooke wrote: > On Jan 11, 2006, at 07:46 , Arnd Baecker wrote: > >> Hi, >> >> On Wed, 11 Jan 2006, David M. Cooke wrote: >> >>> On Jan 10, 2006, at 09:35 , Arnd Baecker wrote: >> [...] >>>> DESIRED: 1.0 >>>> ACTUAL: 1.0000000000000002 >>>> ------------------------------------------------------------------- >>>> - >>>> -- >>>> >>>> The is caused by the strict equality test: >>>> >>>> def check_tandg(self): >>>> assert_equal(cephes.tandg(45),1.0) >>>> >>>> Is there a reason, why one should get 1.0 up to the last bit? 
>>>> If not an `assert_almost_equal` might be more appropriate? >>> > >> What can I do to find out what is going wrong with tandg >> (not that I need that function at all, but a failing unit >> test could point to something deeper ...) > > Well, at the moment, these are just pointing to my messing up of > tandg :) Since I can reproduce your failures, I'll be able to fix it. ... and it should be fixed now, with revision 1546. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From arnd.baecker at web.de Wed Jan 11 09:45:51 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 11 Jan 2006 15:45:51 +0100 (CET) Subject: [SciPy-dev] cephes.tandg test In-Reply-To: <610A4DD3-4B70-4C18-AD7F-F7C0CD5F10F7@physics.mcmaster.ca> References: <6DC8FBB8-B714-4A2C-8DA8-5EE5E70C66F6@physics.mcmaster.ca> <610A4DD3-4B70-4C18-AD7F-F7C0CD5F10F7@physics.mcmaster.ca> Message-ID: On Wed, 11 Jan 2006, David M. Cooke wrote: > ... and it should be fixed now, with revision 1546. Yep! Great - many thanks! Arnd From ndarray at mac.com Wed Jan 11 12:18:45 2006 From: ndarray at mac.com (Sasha) Date: Wed, 11 Jan 2006 12:18:45 -0500 Subject: [SciPy-dev] [Numpy-discussion] Re: How to handle a[...] in numpy? In-Reply-To: <43C4B4F5.4060006@ieee.org> References: <200601092003.35687.faltet@carabos.com> <43C3D70D.7090709@ieee.org> <43C4329B.5030208@ee.byu.edu> <43C4B4F5.4060006@ieee.org> Message-ID: On 1/11/06, Travis Oliphant wrote: > ... > I made just a couple of changes to the new checkin: now only 1 ellipsis > is allowed. In the n-d case multiple ellipsis always treats elipsis > after the first as ':' (full slice objects). But 0-d arrays can't use > the ':' object and so multiple ellipses should not be allowed. As I mentioned on the Wiki page, I did not see any reason to allow multiple Ellipses and only added them for consistency with the N-d case, so I like what you did. However, N-d arrays do allow a mix of multiple Ellipses and newaxes: >>> from numpy import * >>> x = array([[1,2,3],[4,5,6]]) >>> x.shape (2, 3) >>> x[newaxis,...,newaxis,...].shape (1, 2, 1, 3) >>> import numpy; numpy.__version__ '0.9.3.1831' I think this should be disallowed in favor of more explicit >>> x[newaxis,:,newaxis,...].shape (1, 2, 1, 3) -- sasha From jonathan.taylor at utoronto.ca Wed Jan 11 16:43:47 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Wed, 11 Jan 2006 16:43:47 -0500 Subject: [SciPy-dev] Can we make scipy.org redirect to new.scipy.org/Wiki Message-ID: <463e11f90601111343l28804b6jdc7168d6b499f1e6@mail.gmail.com> Can we make scipy.org redirect to new.scipy.org/Wiki? Or maybe better is to move the actual wiki to scipy.org? It's a shame to let so many people running into scipy.org go on oblivious to all this work you gals and guys are doing. Cheers. Jon. From oliphant.travis at ieee.org Wed Jan 11 23:08:09 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 11 Jan 2006 21:08:09 -0700 Subject: [SciPy-dev] [SciPy-user] numpy's math library? In-Reply-To: References: <6B2C5019-4EB3-4636-974C-74521749DDBE@physics.mcmaster.ca> <43C5CAAD.2010900@gmail.com> Message-ID: <43C5D629.7090401@ieee.org> David M. Cooke wrote: >Robert Kern writes: > > > >>David M. Cooke wrote: >> >> >> >>>I've also exposed log1p(x) = log(1+x) and expm1(x) = exp(x)-1 as >>>ufuncs, since those are quite useful if you're worrying about >>>cancellation errors. 
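The cancellation problem that motivates log1p and expm1 is easy to reproduce. A minimal illustration using Python's math module (which exposes both functions in recent versions; the C functions of the same names behave the same way), with approximate results in the comments:

import math

x = 1e-12
print(math.log(1.0 + x))      # ~1.0000889e-12: forming 1.0 + x first rounds away most of x
print(math.log1p(x))          # ~9.999999999995e-13: accurate to full precision (x - x**2/2)

print(math.exp(x) - 1.0)      # ~1.0000889e-12: the same cancellation in the other direction
print(math.expm1(x))          # ~1.0000000000005e-12: accurate (x + x**2/2)
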
>>> >>> >>Of course, they're useful, but they're also in scipy.special. Let's try not to >>migrate more things from scipy to numpy than we strictly have to. So, I'm -1 on >>exposing log1p() and expm1() in numpy. >> >> > >They're also part of the C99 standard, so I'd say there is some >argument for making them part of numpy: exposing functions already >defined by the C library. Mind you, we're also missing exp10, pow10, >exp2, log2, cbrt, erf, erfc, lgamma, tgamma, and a few others. > > > Will we really have to test for all of these if we add them? It would be nice to have a single HAVE_C99 defined that we could use to test for the presence and/or absence of these functions. -Travis From charlesr.harris at gmail.com Thu Jan 12 00:07:39 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 11 Jan 2006 22:07:39 -0700 Subject: [SciPy-dev] [SciPy-user] numpy's math library? In-Reply-To: <43C5D629.7090401@ieee.org> References: <6B2C5019-4EB3-4636-974C-74521749DDBE@physics.mcmaster.ca> <43C5CAAD.2010900@gmail.com> <43C5D629.7090401@ieee.org> Message-ID: On 1/11/06, Travis Oliphant wrote > > David M. Cooke wrote: > > >Robert Kern writes: > > > > > > > >>David M. Cooke wrote: > >> > >> > >> > >>>I've also exposed log1p(x) = log(1+x) and expm1(x) = exp(x)-1 as > >>>ufuncs, since those are quite useful if you're worrying about > >>>cancellation errors. > >>> > >>> > >>Of course, they're useful, but they're also in scipy.special. Let's try > not to > >>migrate more things from scipy to numpy than we strictly have to. So, > I'm -1 on > >>exposing log1p() and expm1() in numpy. > >> > >> > > > >They're also part of the C99 standard, so I'd say there is some > >argument for making them part of numpy: exposing functions already > >defined by the C library. Mind you, we're also missing exp10, pow10, > >exp2, log2, cbrt, erf, erfc, lgamma, tgamma, and a few others. > > > > > > > Will we really have to test for all of these if we add them? It would > be nice to have a single HAVE_C99 defined that we could use to test for > the presence and/or absence of these functions. I use erf and erfc pretty often. If only every c compiler were c99 compliant we could bring in all the standard functions. What I'd really like to do is add all of scipy.special into the numpy package space, but unfortunately significant parts are in fortran. Maybe we could have a small project to bring a selected bunch of functions over to c and then into numpy? Do we need complex versions of them too? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant.travis at ieee.org Thu Jan 12 00:28:48 2006 From: oliphant.travis at ieee.org (Travis E. Oliphant) Date: Wed, 11 Jan 2006 22:28:48 -0700 Subject: [SciPy-dev] [SciPy-user] numpy's math library? In-Reply-To: References: <6B2C5019-4EB3-4636-974C-74521749DDBE@physics.mcmaster.ca> <43C5CAAD.2010900@gmail.com> <43C5D629.7090401@ieee.org> Message-ID: Charles R Harris wrote: > > > > > > > > Will we really have to test for all of these if we add them? It would > be nice to have a single HAVE_C99 defined that we could use to test for > the presence and/or absence of these functions. > > > I use erf and erfc pretty often. If only every c compiler were c99 > compliant we could bring in all the standard functions. I think what we are proposing is a small project to bring just the functions defined in the C99 standard over to numpy. Yes, they would have to be in C to do that. Complex-versions would be nice but not necessary for all of them. 
-Travis From jdhunter at ace.bsd.uchicago.edu Thu Jan 12 00:59:14 2006 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 11 Jan 2006 23:59:14 -0600 Subject: [SciPy-dev] [SciPy-user] numpy's math library? In-Reply-To: (Charles R Harris's message of "Wed, 11 Jan 2006 22:07:39 -0700") References: <6B2C5019-4EB3-4636-974C-74521749DDBE@physics.mcmaster.ca> <43C5CAAD.2010900@gmail.com> <43C5D629.7090401@ieee.org> Message-ID: <874q4aq959.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Charles" == Charles R Harris writes: Charles> unfortunately significant parts are in fortran. Maybe we Charles> could have a small project to bring a selected bunch of Charles> functions over to c and then into numpy? Do we need Charles> complex versions of them too? 'Fraid so. JDH From charlesr.harris at gmail.com Thu Jan 12 01:22:16 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 11 Jan 2006 23:22:16 -0700 Subject: [SciPy-dev] [SciPy-user] numpy's math library? In-Reply-To: References: <6B2C5019-4EB3-4636-974C-74521749DDBE@physics.mcmaster.ca> <43C5CAAD.2010900@gmail.com> <43C5D629.7090401@ieee.org> Message-ID: On 1/11/06, Travis E. Oliphant wrote: > Charles R Harris wrote: > > > > > > > > > > > > > Will we really have to test for all of these if we add them? It would > > be nice to have a single HAVE_C99 defined that we could use to test for > > the presence and/or absence of these functions. > > > > > > I use erf and erfc pretty often. If only every c compiler were c99 > > compliant we could bring in all the standard functions. > > I think what we are proposing is a small project to bring just the > functions defined in the C99 standard over to numpy. Yes, they would > have to be in C to do that. Complex-versions would be nice but not > necessary for all of them. Here is a list to start with. Some of these probably aren't in the C99 standard and some might not be suitable for ufuncs. /* Arc cosine of X. */ acos /* Arc sine of X. */ asin /* Arc tangent of X. */ atan /* Arc tangent of Y/X. */ atan2 /* Cosine of X. */ cos /* Sine of X. */ sin /* Tangent of X. */ tan /* Hyperbolic cosine of X. */ cosh /* Hyperbolic sine of X. */ sinh /* Hyperbolic tangent of X. */ tanh /* Hyperbolic arc cosine of X. */ acosh /* Hyperbolic arc sine of X. */ asinh /* Hyperbolic arc tangent of X. */ atanh /* Exponential function of X. */ exp /* Break VALUE into a normalized fraction and an integral power of 2. */ frexp /* X times (two to the EXP power). */ ldexp /* Natural logarithm of X. */ log /* Base-ten logarithm of X. */ log10 /* Break VALUE into integral and fractional parts. */ modf /* A function missing in all standards: compute exponent to base ten. */ exp10 /* Another name occasionally used. */ pow10 /* Return exp(X) - 1. */ expm1 /* Return log(1 + X). */ log1p /* Return the base 2 signed integral exponent of X. */ logb /* Compute base-2 exponential of X. */ exp2 /* Compute base-2 logarithm of X. */ log2 /* Return X to the Y power. */ pow /* Return the square root of X. */ sqrt /* Return `sqrt(X*X + Y*Y)'. */ hypot /* Return the cube root of X. */ cbrt /* Smallest integral value not less than X. */ ceil /* Absolute value of X. */ fabs /* Largest integer not greater than X. */ floor /* Floating-point modulo remainder of X/Y. */ fmod /* Return the remainder of X/Y. */ drem /* Return the fractional part of X after dividing out `ilogb (X)'. */ significand /* Return X with its signed changed to Y's. 
*/ copysign /* Bessel functions */ j0 j1 jn y0 y1 yn erf erfc lgamma tgamma /* Obsolete alias for `lgamma'. */ gamma /* Return the integer nearest X in the direction of the prevailing rounding mode. */ rint /* Return X + epsilon if X < Y, X - epsilon if X > Y. */ nextafter nexttoward /* Return the remainder of integer divison X / Y with infinite precision. */ remainder /* Return X times (2 to the Nth power). */ scalbn /* Return the binary exponent of X, which must be nonzero. */ ilogb /* Return X times (2 to the Nth power). */ scalbln /* Round X to integral value in floating-point format using current rounding direction, but do not raise inexact exception. */ nearbyint /* Round X to nearest integral value, rounding halfway cases away from zero. */ round /* Round X to the integral value in floating-point format nearest but not larger in magnitude. */ trunc /* Compute remainder of X and Y and put in *QUO a value with sign of x/y and magnitude congruent `mod 2^n' to the magnitude of the integral quotient x/y, with n >= 3. */ remquo /* Round X to nearest integral value according to current rounding direction. */ lrint llrint /* Round X to nearest integral value, rounding halfway cases away from zero. */ lround llround /* Return positive difference between X and Y. */ fdim /* Return maximum numeric value from X and Y. */ fmax /* Return minimum numeric value from X and Y. */ fmin /* Multiply-add function computed as a ternary operation. */ fma /* Return X times (2 to the Nth power). */ scalb Chuck From nwagner at mecha.uni-stuttgart.de Thu Jan 12 05:00:58 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 12 Jan 2006 11:00:58 +0100 Subject: [SciPy-dev] Detection of fft libraries on 64 bit machines Message-ID: <43C628DA.8070208@mecha.uni-stuttgart.de> Hi all, I have installed the fftw rpm's (fftw, fftw-devel, fftw3, fftw3-devel) on a 64 bit machine (SuSE 9.3). The libraries are located in /usr/lib64 /usr/lib64/libdfftw.a /usr/lib64/libdfftw.la /usr/lib64/libdfftw.so /usr/lib64/libdfftw.so.2 /usr/lib64/libdfftw.so.2.0.5 /usr/lib64/libdrfftw.a /usr/lib64/libdrfftw.la /usr/lib64/libdrfftw.so /usr/lib64/libdrfftw.so.2 /usr/lib64/libdrfftw.so.2.0.5 /usr/lib64/libfftw3.a /usr/lib64/libfftw3.la /usr/lib64/libfftw3.so /usr/lib64/libfftw3.so.3 /usr/lib64/libfftw3.so.3.0.1 /usr/lib64/libfftw3f.a /usr/lib64/libfftw3f.la /usr/lib64/libfftw3f.so /usr/lib64/libfftw3f.so.3 /usr/lib64/libfftw3f.so.3.0.1 /usr/lib64/libfftw3f_threads.so.3 /usr/lib64/libfftw3f_threads.so.3.0.1 /usr/lib64/libfftw3_threads.so.3 /usr/lib64/libfftw3_threads.so.3.0.1 /usr/lib64/libsfftw.a /usr/lib64/libsfftw.la /usr/lib64/libsfftw.so /usr/lib64/libsfftw.so.2 /usr/lib64/libsfftw.so.2.0.5 /usr/lib64/libsrfftw.a /usr/lib64/libsrfftw.la /usr/lib64/libsrfftw.so /usr/lib64/libsrfftw.so.2 /usr/lib64/libsrfftw.so.2.0.5 Numpy version 0.9.3.1885 Scipy version 0.4.4.1546 scipy.show_config() results in dfftw_info: NOT AVAILABLE fft_opt_info: NOT AVAILABLE fftw2_info: NOT AVAILABLE fftw3_info: NOT AVAILABLE Hence, the fft libraries are not not detected automatically. How can I fix this problem ? 
Nils From loredo at astro.cornell.edu Thu Jan 12 08:24:32 2006 From: loredo at astro.cornell.edu (Tom Loredo) Date: Thu, 12 Jan 2006 08:24:32 -0500 Subject: [SciPy-dev] OS X fft problem is serious In-Reply-To: References: Message-ID: <1137072272.43c6589017df1@astrosun2.astro.cornell.edu> Hi folks, I hate to whine, especially when I don't know how to go about solving the problem I'm whining about, but I don't think it's good "advertising" to base the first official release of new SciPy on a revision with a badly broken fft exposed in the top-level namespace on widely-used platform. Chris Fonnesbeck has verified that the problem exists also on his Tiger build. This is not a quirk of my Panther installation. Something is really broken. The ipython session logged below puts it starkly. There is a simple sine-like input array: [0 1 0 -1 0]. fft and then ifft with numpy and scipy. numpy gives the array back (line 4). scipy gives nonsense (line 7). There is a clue to what is going on. The ffts, though not exactly equal, are essentially equal before the ifft (line 6). But *after* taking the ifft, scipy changes the input fft in-place (compare line 10 and line 9). I believe the default behavior is no overwrite. Perhaps this points toward a solution. Perhaps the problem is not with SciPy but with FFTW2; either way, I think we need to figure it out, preferably before announcing the new release. I wish I could help more with this, but I'll be at a workshop for the next ~2 weeks. I'll be glad to help when I'm free again. -Tom In [1]: import numpy, scipy Overwriting fft= from scipy.fftpack.basic (was from numpy.dft.fftpack) Overwriting ifft= from scipy.fftpack.basic (was from numpy.dft.fftpack) In [2]: a=numpy.array([0.,1.,0.,-1.,0.]) In [3]: nfa=numpy.fft(a) In [4]: print numpy.ifft(nfa).real [ 0.00000000e+00 1.00000000e+00 9.76996262e-16 -1.00000000e+00 -9.76996262e-16] In [5]: sfa=scipy.fft(a) In [6]: numpy.allclose(nfa,sfa) Out[6]: True In [7]: print scipy.ifft(sfa).real [ 0. 0.5 -0.5 -0.5 0.5] In [8]: numpy.allclose(nfa,sfa) Out[8]: False In [9]: print nfa [ 0. +0.j 1.11803399-1.53884177j -1.11803399+0.36327126j -1.11803399-0.36327126j 1.11803399+1.53884177j] In [10]: print sfa [ 0. +0.j 0.5+0.j -0.5-0.j -0.5+0.j 0.5-0.j] In [11]: numpy.__version__ Out[11]: '0.9.3.1863' In [12]: scipy.__version__ Out[12]: '0.4.4.1544' ------------------------------------------------- This mail sent through IMP: http://horde.org/imp/ From ndbecker2 at gmail.com Thu Jan 12 08:34:22 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Thu, 12 Jan 2006 08:34:22 -0500 Subject: [SciPy-dev] Detection of fft libraries on 64 bit machines References: <43C628DA.8070208@mecha.uni-stuttgart.de> Message-ID: Nils Wagner wrote: > Hi all, > > I have installed the fftw rpm's (fftw, fftw-devel, fftw3, fftw3-devel) > on a 64 bit machine (SuSE 9.3). The libraries are located > in /usr/lib64 > It's not just fftw. ALL libraries are affected. I already posted a patch. Twice. From oliphant.travis at ieee.org Thu Jan 12 09:09:52 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 12 Jan 2006 07:09:52 -0700 Subject: [SciPy-dev] Detection of fft libraries on 64 bit machines In-Reply-To: References: <43C628DA.8070208@mecha.uni-stuttgart.de> Message-ID: <43C66330.3030308@ieee.org> Neal Becker wrote: >Nils Wagner wrote: > > > >>Hi all, >> >>I have installed the fftw rpm's (fftw, fftw-devel, fftw3, fftw3-devel) >>on a 64 bit machine (SuSE 9.3). The libraries are located >>in /usr/lib64 >> >> >> >It's not just fftw. 
ALL libraries are affected. I already posted a patch. Twice. > > Neal, The numpy and scipy rpms are generated automatically by python's distutils. I'm not sure how to apply the spec file patch you provide, and I guess nobody else is either. We appreciate your help, but are not sure how to use it. If somebody could teach me how to alter how the rpms are built that would be helpful... -Travis From cookedm at physics.mcmaster.ca Thu Jan 12 09:43:50 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Thu, 12 Jan 2006 09:43:50 -0500 Subject: [SciPy-dev] segfault with ndarray(1) Message-ID: <72D8EDD3-7473-40FB-8107-C6D0A88EE7C4@physics.mcmaster.ca> I don't have time to track it down at the moment, but passing invalid arguments to ndarray makes Python segfault: >>> ndarray(1) Segmentation fault (and similarly for non-type things) -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From oliphant.travis at ieee.org Thu Jan 12 10:19:24 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Thu, 12 Jan 2006 08:19:24 -0700 Subject: [SciPy-dev] segfault with ndarray(1) In-Reply-To: <72D8EDD3-7473-40FB-8107-C6D0A88EE7C4@physics.mcmaster.ca> References: <72D8EDD3-7473-40FB-8107-C6D0A88EE7C4@physics.mcmaster.ca> Message-ID: <43C6737C.3070902@ieee.org> David M. Cooke wrote: >I don't have time to track it down at the moment, but passing invalid >arguments to ndarray makes Python segfault: > > >>> ndarray(1) >Segmentation fault > > > This is fixed in SVN... -Travis From ndbecker2 at gmail.com Thu Jan 12 11:11:55 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Thu, 12 Jan 2006 11:11:55 -0500 Subject: [SciPy-dev] Detection of fft libraries on 64 bit machines References: <43C628DA.8070208@mecha.uni-stuttgart.de> <43C66330.3030308@ieee.org> Message-ID: Travis Oliphant wrote: > Neal Becker wrote: > >>Nils Wagner wrote: >> >> >> >>>Hi all, >>> >>>I have installed the fftw rpm's (fftw, fftw-devel, fftw3, fftw3-devel) >>>on a 64 bit machine (SuSE 9.3). The libraries are located >>>in /usr/lib64 >>> >>> >>> >>It's not just fftw. ALL libraries are affected. I already posted a >>patch. Twice. >> >> > > Neal, > > The numpy and scipy rpms are generated automatically by python's > distutils. I'm not sure how to apply the spec file patch you provide, > and I guess nobody else is either. > > We appreciate your help, but are not sure how to use it. If somebody > could teach me how to alter how the rpms are built that would be > helpful... > > -Travis Interesting. I wonder if my fedora copy of distutils was patched to fix this? grep finds 2 occurances of lib64 in ...distutils/*.py From nwagner at mecha.uni-stuttgart.de Thu Jan 12 11:49:39 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 12 Jan 2006 17:49:39 +0100 Subject: [SciPy-dev] Detection of fft libraries on 64 bit machines In-Reply-To: References: <43C628DA.8070208@mecha.uni-stuttgart.de> <43C66330.3030308@ieee.org> Message-ID: <43C688A3.9020301@mecha.uni-stuttgart.de> Neal Becker wrote: >Travis Oliphant wrote: > > >>Neal Becker wrote: >> >> >>>Nils Wagner wrote: >>> >>> >>> >>> >>>>Hi all, >>>> >>>>I have installed the fftw rpm's (fftw, fftw-devel, fftw3, fftw3-devel) >>>>on a 64 bit machine (SuSE 9.3). The libraries are located >>>>in /usr/lib64 >>>> >>>> >>>> >>>> >>>It's not just fftw. ALL libraries are affected. I already posted a >>>patch. Twice. 
>>> >>> >>> >>Neal, >> >>The numpy and scipy rpms are generated automatically by python's >>distutils. I'm not sure how to apply the spec file patch you provide, >>and I guess nobody else is either. >> >>We appreciate your help, but are not sure how to use it. If somebody >>could teach me how to alter how the rpms are built that would be >>helpful... >> >>-Travis >> > >Interesting. I wonder if my fedora copy of distutils was patched to fix >this? grep finds 2 occurances of lib64 in ...distutils/*.py > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > I guess one can add a line library_dirs = /usr/lib64 to the corresponding lines in system_info.py. Does this fix the problem of detecting the fftw libraries ? [fftw] fftw_libs = rfftw, fftw fftw_opt_libs = rfftw_threaded, fftw_threaded # if the above aren't found, look for {s,d}fftw_libs and {s,d}fftw_opt_libs Nils From strawman at astraw.com Thu Jan 12 12:49:44 2006 From: strawman at astraw.com (Andrew Straw) Date: Thu, 12 Jan 2006 09:49:44 -0800 Subject: [SciPy-dev] Detection of fft libraries on 64 bit machines In-Reply-To: <43C628DA.8070208@mecha.uni-stuttgart.de> References: <43C628DA.8070208@mecha.uni-stuttgart.de> Message-ID: <43C696B8.9080507@astraw.com> Just to let you know, not AMD64 based platforms have the 64 bit libraries in /usr/lib64 -- on my system (Debian sarge), they're in /usr/lib. Nils Wagner wrote: >Hi all, > >I have installed the fftw rpm's (fftw, fftw-devel, fftw3, fftw3-devel) >on a 64 bit machine (SuSE 9.3). The libraries are located >in /usr/lib64 > > From strawman at astraw.com Thu Jan 12 12:56:29 2006 From: strawman at astraw.com (Andrew Straw) Date: Thu, 12 Jan 2006 09:56:29 -0800 Subject: [SciPy-dev] Detection of fft libraries on 64 bit machines In-Reply-To: <43C696B8.9080507@astraw.com> References: <43C628DA.8070208@mecha.uni-stuttgart.de> <43C696B8.9080507@astraw.com> Message-ID: <43C6984D.9080804@astraw.com> Andrew Straw wrote: >Just to let you know, not AMD64 based platforms have the 64 bit >libraries in /usr/lib64 -- on my system (Debian sarge), they're in /usr/lib. > > That is, "not all AMD64 based platforms..." From stefan at sun.ac.za Thu Jan 12 16:01:05 2006 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 12 Jan 2006 23:01:05 +0200 Subject: [SciPy-dev] interpolate In-Reply-To: <20060110123149.GB1165@alpha> References: <20060110123149.GB1165@alpha> Message-ID: <20060112210105.GC28815@alpha> Robert recently published nearest neighbour interpolation routines, which, AFAIK, currently reside in the scipy sandbox. Should all the interpolation routines be consolidated in some way? What is the use of having all the spline routines in scipy if we don't use them? Regards St?fan On Tue, Jan 10, 2006 at 02:31:49PM +0200, Stefan van der Walt wrote: > In scipy, interpolate.interp2d is currently broken. The scipy > tutorial shows an interpolation example which uses bisplev en > bisplrep. > > Is there any reason why these functions can't be used in interp2d? > The attached patch calls those functions like in the tutorial. From a.h.jaffe at gmail.com Fri Jan 13 09:00:16 2006 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Fri, 13 Jan 2006 14:00:16 +0000 Subject: [SciPy-dev] numpy.dft.real_fft problems? Message-ID: Hi All, [My apologies if this appears twice -- originally sent from a non-subscribed account; moderator, please delete the original if possible!] 
There seems to be a problem with the real_fft routine; starting with a length n array, it should give a length n/2+1 array with real numbers in the first and last positions (for n even). For a 2d array starting with an (n,n) array, it should give a (n,n/2+1) array with real numbers in the first and last positions on the first row. However, note that the last entry on the first row has a real part, as opposed to the 2d complex routine fft2d, which has it correct. The weirdness of the real part seems to imply it's due to uninitialized memory. The problem persists in 2d, as well. In [53]:a = numpy.randn(16) In [54]:fa = numpy.dft.real_fft(a) In [55]:fa Out[55]: array([ 0.4453+0.0000e+000j, -3.5009-3.4618e+000j, -0.1276+3.5740e-001j, -1.043+2.3504e+000j, -1.2028+1.6121e-001j, -7.1041-1.1182e+000j, 0.09-1.6013e-001j, 4.2079-1.0106e-001j, -4.8295+2.3091e+251j]) In [56]:f2a = numpy.dft.fft(a) In [57]:f2a Out[57]: array([ 0.4453+0.j , -3.5009-3.4618j, -0.1276+0.3574j, -1.043+2.3504j, -1.2028+0.1612j, -7.1041-1.1182j, 0.09-0.1601j, 4.2079-0.1011j, -4.8295+0.j , 4.2079+0.1011j, 0.09+0.1601j, -7.1041+1.1182j, -1.2028-0.1612j, -1.043-2.3504j, -0.1276-0.3574j, -3.5009+3.4618j]) -Andrew From ndarray at mac.com Fri Jan 13 12:46:45 2006 From: ndarray at mac.com (Sasha) Date: Fri, 13 Jan 2006 12:46:45 -0500 Subject: [SciPy-dev] How to specify compile args? Message-ID: Can is specify extra compile flags in site.cfg? I tried to put extra_compile_args = ... in the DEFAULT section, but it did not work. More specifically, I need to specify appropriate flags to g77. Gcc gets correct flags form Python, but g77 does not. Thanks. -- sasha From ndarray at mac.com Fri Jan 13 21:44:21 2006 From: ndarray at mac.com (Sasha) Date: Fri, 13 Jan 2006 21:44:21 -0500 Subject: [SciPy-dev] Nested arrays In-Reply-To: <43C4CCE8.70700@ieee.org> References: <43C4B409.6070606@ieee.org> <43C4BB0E.8010609@colorado.edu> <43C4C124.7040300@ieee.org> <43C4C63A.9040507@colorado.edu> <43C4CCE8.70700@ieee.org> Message-ID: > It would be literally one-line of code to turn off the returning of the > hybrid object arrays, so I would not hesitate to do it and then see what > kind of complaints we get :-) Is this intended? >>> type(array(1.0).astype(object)) Breaks printing of ma zero rank arrays: >>> print ma.array(1.0,mask=1) Traceback (most recent call last): ... AttributeError: _MaskedPrintOption instance has no attribute '__float__' -- sasha From oliphant.travis at ieee.org Fri Jan 13 22:26:44 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 13 Jan 2006 20:26:44 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: References: <43C4B409.6070606@ieee.org> <43C4BB0E.8010609@colorado.edu> <43C4C124.7040300@ieee.org> <43C4C63A.9040507@colorado.edu> <43C4CCE8.70700@ieee.org> Message-ID: <43C86F74.80607@ieee.org> Sasha wrote: >>It would be literally one-line of code to turn off the returning of the >>hybrid object arrays, so I would not hesitate to do it and then see what >>kind of complaints we get :-) >> >> > >Is this intended? > > I did turn off returning hybrid object arrays as a trial run to see what would happen... > > >>>>type(array(1.0).astype(object)) >>>> >>>> > > >Breaks printing of ma zero rank arrays: > > There's our first complaint about returning object arrays. Can the masked array printing be fixed? 
-Travis From ndarray at mac.com Fri Jan 13 22:36:48 2006 From: ndarray at mac.com (Sasha) Date: Fri, 13 Jan 2006 22:36:48 -0500 Subject: [SciPy-dev] Nested arrays In-Reply-To: <43C86F74.80607@ieee.org> References: <43C4B409.6070606@ieee.org> <43C4BB0E.8010609@colorado.edu> <43C4C124.7040300@ieee.org> <43C4C63A.9040507@colorado.edu> <43C4CCE8.70700@ieee.org> <43C86F74.80607@ieee.org> Message-ID: > > > There's our first complaint about returning object arrays. Can the > masked array printing be fixed? > Sure, I'll fix it. Afrer all it was my post that prompted the change :-). -- sasha From oliphant.travis at ieee.org Fri Jan 13 22:39:28 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 13 Jan 2006 20:39:28 -0700 Subject: [SciPy-dev] Nested arrays In-Reply-To: References: <43C4B409.6070606@ieee.org> <43C4BB0E.8010609@colorado.edu> <43C4C124.7040300@ieee.org> <43C4C63A.9040507@colorado.edu> <43C4CCE8.70700@ieee.org> <43C86F74.80607@ieee.org> Message-ID: <43C87270.5030203@ieee.org> Sasha wrote: >>There's our first complaint about returning object arrays. Can the >>masked array printing be fixed? >> >> >> >Sure, I'll fix it. Afrer all it was my post that prompted the change :-). > > Well only partly. After your post, I changed it so that array objects did not get wrapped up, and then Fernando convinced me to have all objects not get wrapped with a proxy. The masked array tests have done the most testing of object arrays, I think. It's been very helpful in tracking down problems in the past... -Travis From oliphant.travis at ieee.org Fri Jan 13 23:15:54 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Fri, 13 Jan 2006 21:15:54 -0700 Subject: [SciPy-dev] Changeset 1895 In-Reply-To: <2CBE2444-D060-4A75-B2DF-ED235BA6FC0B@mac.com> References: <2CBE2444-D060-4A75-B2DF-ED235BA6FC0B@mac.com> Message-ID: <43C87AFA.7070704@ieee.org> ndarray at mac.com wrote: > Travis, > > It looks like your change breaks compilation (I am using MacOS X, but > the problem does not seem to be OS specific). > Maybe just 18 -> 16? Yes. Sorry about that. I was in too big of a rush... I broke the rule about leaving SVN unbuildable... Bad Travis... -Travis From jonathan.taylor at utoronto.ca Sat Jan 14 13:11:27 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Sat, 14 Jan 2006 13:11:27 -0500 Subject: [SciPy-dev] Can we make scipy.org redirect to new.scipy.org/Wiki In-Reply-To: <463e11f90601111343l28804b6jdc7168d6b499f1e6@mail.gmail.com> References: <463e11f90601111343l28804b6jdc7168d6b499f1e6@mail.gmail.com> Message-ID: <463e11f90601141011g5717ec55y31eaa9c8b0f3fdf1@mail.gmail.com> Hmm... anyone think this is a good idea? On 1/11/06, Jonathan Taylor wrote: > Can we make scipy.org redirect to new.scipy.org/Wiki? > > Or maybe better is to move the actual wiki to scipy.org? > > It's a shame to let so many people running into scipy.org go on > oblivious to all this work you gals and guys are doing. > > Cheers. > Jon. > From strawman at astraw.com Sat Jan 14 13:42:31 2006 From: strawman at astraw.com (Andrew Straw) Date: Sat, 14 Jan 2006 10:42:31 -0800 (PST) Subject: [SciPy-dev] Can we make scipy.org redirect to new.scipy.org/Wiki In-Reply-To: <463e11f90601141011g5717ec55y31eaa9c8b0f3fdf1@mail.gmail.com> References: <463e11f90601111343l28804b6jdc7168d6b499f1e6@mail.gmail.com> <463e11f90601141011g5717ec55y31eaa9c8b0f3fdf1@mail.gmail.com> Message-ID: <1703.4.240.87.39.1137264151.squirrel@webmail.astraw.com> Yes! 
I hope many of us want to see the wiki become the site you get when you go to http://scipy.org and http://www.scipy.org There are a couple things that have to be done, however. We can do item 1, but I think Enthought has to do items 2 and 3. We should get working on item 1 and when we think we're ready, bug Enthought about the rest. 1. We need to make sure we've got all useful content off the old site. Even if we keep the old site online somewhere for archival purposes (a good idea, IMO), I think we want the new site to have this information before making it the default site. I suggest that as part of the new site, we add a section about running 'old' scipy, too. Not everyone is going to upgrade immediately, and it would be good to keep them feeling happy with the community. (I personally haven't been running old scipy recently, so I can't even begin to write this section...) 2. Move from a Python-process-per-click CGI method to a persistent process method. This will decrease latencies and the load on the server. 3. Update the Apache configuration to point to the new wiki. But, yes, I think this is a great idea and why I've worked so hard on the new wiki. Cheers! Andrew > Hmm... anyone think this is a good idea? > > On 1/11/06, Jonathan Taylor wrote: >> Can we make scipy.org redirect to new.scipy.org/Wiki? >> >> Or maybe better is to move the actual wiki to scipy.org? >> >> It's a shame to let so many people running into scipy.org go on >> oblivious to all this work you gals and guys are doing. >> >> Cheers. >> Jon. >> > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From ndarray at mac.com Sun Jan 15 08:07:18 2006 From: ndarray at mac.com (Sasha) Date: Sun, 15 Jan 2006 08:07:18 -0500 Subject: [SciPy-dev] Cannot access Trac Message-ID: http://projects.scipy.org/scipy/numpy/timeline Not Found The requested URL /scipy/numpy/timeline was not found on this server. Apache/2.0.54 (Fedora) Server at projects.scipy.org Port 80 -- sasha From nwagner at mecha.uni-stuttgart.de Sun Jan 15 08:11:08 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sun, 15 Jan 2006 14:11:08 +0100 Subject: [SciPy-dev] Citing scipy Message-ID: <43CA49EC.5060905@mecha.uni-stuttgart.de> Hi all, The current Bibtex entry for citing scipy points at www.scipy.org. http://www.scipy.org/documentation/citingscipy.html Is it planned to replace it by a more recent one, e.g. http://new.scipy.org/Wiki ? Nils From travis at enthought.com Sun Jan 15 09:23:44 2006 From: travis at enthought.com (Travis N. Vaught) Date: Sun, 15 Jan 2006 08:23:44 -0600 Subject: [SciPy-dev] Citing scipy In-Reply-To: <43CA49EC.5060905@mecha.uni-stuttgart.de> References: <43CA49EC.5060905@mecha.uni-stuttgart.de> Message-ID: <43CA5AF0.9080802@enthought.com> The new.scipy.org/Wiki site will be accessible via the www.scipy.org address as soon as the folks helping with the docs feel comfortable with the change--hopefully in the next few days rather than weeks. Andrew, we can make the necessary changes at the server whenever you say. Travis Nils Wagner wrote: >Hi all, > >The current Bibtex entry for citing scipy points at www.scipy.org. > >http://www.scipy.org/documentation/citingscipy.html > >Is it planned to replace it by a more recent one, e.g. >http://new.scipy.org/Wiki ? 
> >Nils > > From schofield at ftw.at Mon Jan 16 05:20:34 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 16 Jan 2006 11:20:34 +0100 Subject: [SciPy-dev] Object array comparisons, Monte Carlo package Message-ID: <43CB7372.8040203@ftw.at> Hi Travis, I'm finding object arrays useful with the new syntax. +1 from me :) I have a feature request. Object arrays currently don't support rich (elementwise) comparisons: >>> d = {'a':10,'b':20,'c':30} >>> s = montecarlo.dictsampler(d) >>> s.sample(10000) array([c, c, b, ..., c, c, b], dtype=object) >>> s == 'b' False Could we enable this again? This would be useful. I seem to recall that you disabled this a couple of months ago in response to some problem or other, but I haven't been able to find the thread. If the problem was with arrays as truth values, is this resolved now? The first three commands above are an example of using the new Monte Carlo package to sample from a discrete distribution given a probability mass function (here unnormalized). It uses Marsaglia's compressed table lookup sampler, and generates about 50 million variates per second on my P4, independent of the size of the sample space. It should be useful as an efficient foundation for various discrete Monte Carlo algorithms. -- Ed From chris at trichech.us Mon Jan 16 12:38:05 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Mon, 16 Jan 2006 12:38:05 -0500 Subject: [SciPy-dev] eggs and f2py extensions In-Reply-To: <43BD9CB6.50103@gmail.com> References: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> <43BD9CB6.50103@gmail.com> Message-ID: <3D8A75DC-75C3-4C48-B250-40576E4917D8@trichech.us> On Jan 5, 2006, at 5:24 PM, Robert Kern wrote: > Christopher Fonnesbeck wrote: >> Are f2py extensions compatible with eggs? At the moment, if I try to >> import setup from setuptools rather than from numpy.distutils, it >> does >> not recognize fortran extensions. > > setuptools does interfere with the Fortran extensions to build_ext > a little bit > if one isn't careful. Everything works fine if you *build* without > setuptools, > and then do the bdist_egg command separately with setuptools. > > E.g. > > $ python setup.py build > $ python -c "import setuptools; execfile('setup.py')" bdist_egg I'm trying to take advantage of the automatic prerequisite installing features of setuptools. Do I then have 2 calls: one to distutils.setup (for building) and another to setuptools.setup (for prerequisite verification and installing)? Thanks for your help. C. -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From robert.kern at gmail.com Mon Jan 16 13:50:12 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 16 Jan 2006 12:50:12 -0600 Subject: [SciPy-dev] eggs and f2py extensions In-Reply-To: <3D8A75DC-75C3-4C48-B250-40576E4917D8@trichech.us> References: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> <43BD9CB6.50103@gmail.com> <3D8A75DC-75C3-4C48-B250-40576E4917D8@trichech.us> Message-ID: <43CBEAE4.5080304@gmail.com> Christopher Fonnesbeck wrote: > On Jan 5, 2006, at 5:24 PM, Robert Kern wrote: > >> Christopher Fonnesbeck wrote: >> >>> Are f2py extensions compatible with eggs? 
At the moment, if I try to >>> import setup from setuptools rather than from numpy.distutils, it does >>> not recognize fortran extensions. >> >> setuptools does interfere with the Fortran extensions to build_ext a >> little bit >> if one isn't careful. Everything works fine if you *build* without >> setuptools, >> and then do the bdist_egg command separately with setuptools. >> >> E.g. >> >> $ python setup.py build >> $ python -c "import setuptools; execfile('setup.py')" bdist_egg > > I'm trying to take advantage of the automatic prerequisite installing > features of setuptools. Do I then have 2 calls: one to distutils.setup > (for building) and another to setuptools.setup (for prerequisite > verification and installing)? setuptools.setup is distutils.core.setup. However, numpy.distutils.core.setup is not distutils.core.setup. Pearu started doing some work to move things out of numpy.distutils.core.setup such that numpy.distutils can play well with other distutils extensions. Until that happens, I don't see using setuptools' extensions to the setup() keywords with numpy.distutils. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Mon Jan 16 15:37:10 2006 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 16 Jan 2006 14:37:10 -0600 (CST) Subject: [SciPy-dev] eggs and f2py extensions In-Reply-To: <43CBEAE4.5080304@gmail.com> References: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> <43BD9CB6.50103@gmail.com> <3D8A75DC-75C3-4C48-B250-40576E4917D8@trichech.us> <43CBEAE4.5080304@gmail.com> Message-ID: On Mon, 16 Jan 2006, Robert Kern wrote: > Christopher Fonnesbeck wrote: >> On Jan 5, 2006, at 5:24 PM, Robert Kern wrote: >> >>> Christopher Fonnesbeck wrote: >>> >>>> Are f2py extensions compatible with eggs? At the moment, if I try to >>>> import setup from setuptools rather than from numpy.distutils, it does >>>> not recognize fortran extensions. >>> >>> setuptools does interfere with the Fortran extensions to build_ext a >>> little bit >>> if one isn't careful. Everything works fine if you *build* without >>> setuptools, >>> and then do the bdist_egg command separately with setuptools. >>> >>> E.g. >>> >>> $ python setup.py build >>> $ python -c "import setuptools; execfile('setup.py')" bdist_egg >> >> I'm trying to take advantage of the automatic prerequisite installing >> features of setuptools. Do I then have 2 calls: one to distutils.setup >> (for building) and another to setuptools.setup (for prerequisite >> verification and installing)? > > setuptools.setup is distutils.core.setup. However, numpy.distutils.core.setup is > not distutils.core.setup. Pearu started doing some work to move things out of > numpy.distutils.core.setup such that numpy.distutils can play well with other > distutils extensions. Until that happens, I don't see using setuptools' > extensions to the setup() keywords with numpy.distutils. numpy.distutils.core.setup should play nicely with all distutils extensions. It's an extension of distutils.core.setup, not a replacement. Could you try out the following code in the header of setup.py: from numpy.distutils.core import setup import setuptools setuptools.setup = setup ? If it works, numpy.distutils could overwrite setuptools.setup automatically in numpy/distutils/__init__.py, for instance. 
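For concreteness, a complete minimal setup.py built around that header might look like the sketch below; the package name, the .pyf source and the requirement are made up for illustration, and only the first three lines are the suggestion itself:

from numpy.distutils.core import setup
import setuptools
setuptools.setup = setup    # the proposed header: route setuptools' setup to numpy.distutils

from numpy.distutils.misc_util import Configuration

config = Configuration('example_pkg')                            # hypothetical package
config.add_extension('flib', sources=['example_pkg/flib.pyf'])   # hypothetical f2py-wrapped extension

kwds = config.todict()
kwds['install_requires'] = ['SomeDependency']    # hypothetical setuptools-only keyword
setup(**kwds)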
Pearu From oliphant.travis at ieee.org Mon Jan 16 17:06:49 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Mon, 16 Jan 2006 15:06:49 -0700 Subject: [SciPy-dev] Object array comparisons, Monte Carlo package In-Reply-To: <43CB7372.8040203@ftw.at> References: <43CB7372.8040203@ftw.at> Message-ID: <43CC18F9.1070906@ieee.org> Ed Schofield wrote: >Hi Travis, > >I'm finding object arrays useful with the new syntax. +1 from me :) > >I have a feature request. Object arrays currently don't support rich >(elementwise) comparisons: > > Well, they actually do. But, there is some code left-over from Numeric that is not working the same way now that there are string arrays available. > > >>>>d = {'a':10,'b':20,'c':30} >>>>s = montecarlo.dictsampler(d) >>>>s.sample(10000) >>>> >>>> >array([c, c, b, ..., c, c, b], dtype=object) > > >>>>s == 'b' >>>> >>>> >False > >Could we enable this again? > I never disabled it. Consider s == array('b', object) The problem is that 'b' is being interpreted as a string array and that is returning False (probably due to special code in the rich_compare function. I need to re-look at the code in the Py_EQ section of array_richcompare to make sure it is not special-casing inappropriately... -Travis From chris at trichech.us Mon Jan 16 22:55:57 2006 From: chris at trichech.us (Christopher Fonnesbeck) Date: Mon, 16 Jan 2006 22:55:57 -0500 Subject: [SciPy-dev] eggs and f2py extensions In-Reply-To: References: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> <43BD9CB6.50103@gmail.com> <3D8A75DC-75C3-4C48-B250-40576E4917D8@trichech.us> <43CBEAE4.5080304@gmail.com> Message-ID: <516443A3-D7FC-4C16-B770-33AD57AC435F@trichech.us> On Jan 16, 2006, at 3:37 PM, Pearu Peterson wrote: >> setuptools.setup is distutils.core.setup. However, >> numpy.distutils.core.setup is >> not distutils.core.setup. Pearu started doing some work to move >> things out of >> numpy.distutils.core.setup such that numpy.distutils can play well >> with other >> distutils extensions. Until that happens, I don't see using >> setuptools' >> extensions to the setup() keywords with numpy.distutils. > > numpy.distutils.core.setup should play nicely with all distutils > extensions. It's an extension of distutils.core.setup, not a > replacement. > > Could you try out the following code in the header of setup.py: > > from numpy.distutils.core import setup > import setuptools > setuptools.setup = setup > > ? If it works, numpy.distutils could overwrite setuptools.setup > automatically in numpy/distutils/__init__.py, for instance. Unfortunately, setuputils.setup does not recognize f2py modules. Adding the lines above yields: Oliver:/usr/local/src/PyMC chris$ python setup.py build running build running build_py creating build creating build/lib.darwin-8.4.0-Power_Macintosh-2.4 creating build/lib.darwin-8.4.0-Power_Macintosh-2.4/PyMC copying PyMC/__init__.py -> build/lib.darwin-8.4.0- Power_Macintosh-2.4/PyMC copying PyMC/Matplot.py -> build/lib.darwin-8.4.0-Power_Macintosh-2.4/ PyMC copying PyMC/MCMC.py -> build/lib.darwin-8.4.0-Power_Macintosh-2.4/PyMC running build_ext building 'PyMC.flib' extension gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp - mno-fused-madd -fPIC -fno-common -dynamic -DNDEBUG -g -O3 -Wall - Wstrict-prototypes' error: unknown file type '.f' (from 'PyMC/flib.f') Thanks, Chris -- Christopher J. Fonnesbeck Population Ecologist, Marine Mammal Section Fish & Wildlife Research Institute (FWC) St. 
Petersburg, FL Adjunct Assistant Professor Warnell School of Forest Resources University of Georgia Athens, GA T: 727.235.5570 E: chris at trichech.us -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2417 bytes Desc: not available URL: From robert.kern at gmail.com Mon Jan 16 23:53:17 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 16 Jan 2006 22:53:17 -0600 Subject: [SciPy-dev] eggs and f2py extensions In-Reply-To: <516443A3-D7FC-4C16-B770-33AD57AC435F@trichech.us> References: <40D0FCE6-F778-4656-A613-B6A63BA9EB39@trichech.us> <43BD9CB6.50103@gmail.com> <3D8A75DC-75C3-4C48-B250-40576E4917D8@trichech.us> <43CBEAE4.5080304@gmail.com> <516443A3-D7FC-4C16-B770-33AD57AC435F@trichech.us> Message-ID: <43CC783D.1080702@gmail.com> Christopher Fonnesbeck wrote: > On Jan 16, 2006, at 3:37 PM, Pearu Peterson wrote: >> Could you try out the following code in the header of setup.py: >> >> from numpy.distutils.core import setup >> import setuptools >> setuptools.setup = setup >> >> ? If it works, numpy.distutils could overwrite setuptools.setup >> automatically in numpy/distutils/__init__.py, for instance. > > Unfortunately, setuputils.setup does not recognize f2py modules. Adding > the lines above yields: Import setuptools first and use numpy.distutils.core.setup(). The install_requires checking doesn't happen until easy_install, so nothing should be stomped on by numpy.distutils. setup_requires also seems to work (not shown here). I don't know why the $ python -c "import setuptools; execfile('setup.py')" build form didn't work for you. [testf2py]$ ls setup.py subr.f subr.pyf [testf2py]$ cat setup.py import setuptools from numpy.distutils.core import setup from numpy.distutils.misc_util import Configuration config = Configuration(name='subr') config.add_extension('subr', sources=['subr.pyf'], ) kwds = {'install_requires': ['DoesNotExist'], } kwds.update(config.todict()) setup(**kwds) [testf2py]$ python setup.py build running build running config_fc running build_src building extension "subr.subr" sources creating build creating build/src creating build/src/testf2py f2py options: [] f2py: /Users/kern/src/testf2py/subr.pyf Reading fortran codes... Reading file '/Users/kern/src/testf2py/subr.pyf' (format:free) Post-processing... Block: subr Block: foo Post-processing (stage 2)... Building modules... Building module "subr"... Constructing wrapper function "foo"... foo() Wrote C/API module "subr" to file "build/src/testf2py/subrmodule.c" adding 'build/src/fortranobject.c' to sources. adding 'build/src' to include_dirs. [...] gcc: build/src/testf2py/subrmodule.c creating build/lib.darwin-8.3.0-Power_Macintosh-2.4 creating build/lib.darwin-8.3.0-Power_Macintosh-2.4/subr gcc -Wl,-x -bundle -undefined dynamic_lookup build/temp.darwin-8.3.0-Power_Macintosh-2.4/build/src/testf2py/subrmodule.o build/temp.darwin-8.3.0-Power_Macintosh-2.4/build/src/fortranobject.o -o build/lib.darwin-8.3.0-Power_Macintosh-2.4/subr/subr.so [testf2py]$ python setup.py bdist_egg running bdist_egg running egg_info creating subr.egg-info [...] build/bdist.darwin-8.3.0-Power_Macintosh/egg/EGG-INFO zip_safe flag not set; analyzing archive contents... 
creating dist creating 'dist/subr-0.0.0-py2.4-macosx-10.4-ppc.egg' and adding 'build/bdist.darwin-8.3.0-Power_Macintosh/egg' to it removing 'build/bdist.darwin-8.3.0-Power_Macintosh/egg' (and everything under it) [testf2py]$ easy_install dist/subr-0.0.0-py2.4-macosx-10.4-ppc.egg Processing subr-0.0.0-py2.4-macosx-10.4-ppc.egg Copying subr-0.0.0-py2.4-macosx-10.4-ppc.egg to /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages Adding subr 0.0.0 to easy-install.pth file Installed /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/subr-0.0.0-py2.4-macosx-10.4-ppc.egg Processing dependencies for subr==0.0.0 Searching for DoesNotExist Reading http://www.python.org/pypi/DoesNotExist/ Couldn't find index page for 'DoesNotExist' (maybe misspelled?) Scanning index of all packages (this may take a while) Reading http://www.python.org/pypi/ ^Cinterrupted -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Tue Jan 17 03:22:42 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 17 Jan 2006 09:22:42 +0100 Subject: [SciPy-dev] FAIL: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_signm) Message-ID: <43CCA952.5070801@mecha.uni-stuttgart.de> scipy.test(1,10) gives ====================================================================== FAIL: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_signm) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_matfuncs.py", line 42, in check_nils assert_array_almost_equal(r,cr) File "/usr/local/lib/python2.4/site-packages/numpy/testing/utils.py", line 183, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 19.2395622 -6.0552963j -31.4604173 -8.5979283j 76.1476323+12.4835226j 19.9582071 -9.7389952j 12.0353248+12.25712... Array 2: [[ 11.9493333 -2.2453333 15.3173333 21.6533333 -2.2453333] [ -3.8426667 0.4986667 -4.5906667 -7.1866667 0.498... ---------------------------------------------------------------------- Ran 1075 tests in 5.490s FAILED (failures=1) >>> scipy.__version__ '0.4.4.1558' >>> numpy.__version__ '0.9.4.1922' From jonathan.taylor at utoronto.ca Tue Jan 17 14:39:49 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Tue, 17 Jan 2006 14:39:49 -0500 Subject: [SciPy-dev] Can we make scipy.org redirect to new.scipy.org/Wiki In-Reply-To: <1703.4.240.87.39.1137264151.squirrel@webmail.astraw.com> References: <463e11f90601111343l28804b6jdc7168d6b499f1e6@mail.gmail.com> <463e11f90601141011g5717ec55y31eaa9c8b0f3fdf1@mail.gmail.com> <1703.4.240.87.39.1137264151.squirrel@webmail.astraw.com> Message-ID: <463e11f90601171139y1bb250b2q7a6537ae4cb7c965@mail.gmail.com> Andrew, The site looks really great. I will see what I can do to help. J. On 1/14/06, Andrew Straw wrote: > Yes! I hope many of us want to see the wiki become the site you get when > you go to http://scipy.org and http://www.scipy.org > > There are a couple things that have to be done, however. We can do item 1, > but I think Enthought has to do items 2 and 3. We should get working on > item 1 and when we think we're ready, bug Enthought about the rest. > > 1. We need to make sure we've got all useful content off the old site. 
> Even if we keep the old site online somewhere for archival purposes (a > good idea, IMO), I think we want the new site to have this information > before making it the default site. I suggest that as part of the new site, > we add a section about running 'old' scipy, too. Not everyone is going to > upgrade immediately, and it would be good to keep them feeling happy with > the community. (I personally haven't been running old scipy recently, so I > can't even begin to write this section...) > > 2. Move from a Python-process-per-click CGI method to a persistent process > method. This will decrease latencies and the load on the server. > > 3. Update the Apache configuration to point to the new wiki. > > But, yes, I think this is a great idea and why I've worked so hard on the > new wiki. > > Cheers! > Andrew > > > Hmm... anyone think this is a good idea? > > > > On 1/11/06, Jonathan Taylor wrote: > >> Can we make scipy.org redirect to new.scipy.org/Wiki? > >> > >> Or maybe better is to move the actual wiki to scipy.org? > >> > >> It's a shame to let so many people running into scipy.org go on > >> oblivious to all this work you gals and guys are doing. > >> > >> Cheers. > >> Jon. > >> > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-dev > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From ndbecker2 at gmail.com Tue Jan 17 20:38:18 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Tue, 17 Jan 2006 20:38:18 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs Message-ID: Here are .spec files I used to build numpy and scipy on Fedora FC5T2 x86_64. -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: numpy.spec URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: scipy.spec URL: From ivazquez at ivazquez.net Tue Jan 17 22:07:18 2006 From: ivazquez at ivazquez.net (Ignacio Vazquez-Abrams) Date: Tue, 17 Jan 2006 22:07:18 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs In-Reply-To: References: Message-ID: <1137553638.19832.0.camel@ignacio.lan> On Tue, 2006-01-17 at 20:38 -0500, Neal Becker wrote: > Here are .spec files I used to build numpy and scipy on Fedora FC5T2 > x86_64. Mind if I clean them up and submit them to Fedora Extras? -- Ignacio Vazquez-Abrams http://fedora.ivazquez.net/ gpg --keyserver hkp://subkeys.pgp.net --recv-key 38028b72 -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part URL: From oliphant.travis at ieee.org Tue Jan 17 19:19:43 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 17 Jan 2006 17:19:43 -0700 Subject: [SciPy-dev] [SciPy-user] Failed build w/ mingw In-Reply-To: References: Message-ID: <43CD899F.4010001@ieee.org> Travis Brady wrote: > With SVN updated 5 minutes ago I receive (with win2k, msys, mingw and > Python 2.4): > > building 'scipy.montecarlo.intsampler' extension > compiling C sources [snip...] 
> :\Python24\libs -LC:\Python24\PCBuild -Lbuild\temp.win32-2.4 > -lpython24 -lmsvcr7 > 1 -o build\lib.win32-2.4\scipy\montecarlo\intsampler.pyd" failed with > exit status 1 > removed Lib\__svn_version__.py > removed Lib\__svn_version__.pyc > This example shows the need for a more organized way of moving packages from the sandbox to the main SciPy distribution. While we don't sub-packages in the sandbox to stay forever hidden, we do need a fairly organized way to approve sub-packages being included in the main SciPy tree. I remember that Robert Kern had written a series of requirements at one point. Perhaps he would dig those up and revise them and post them as a starting point for discussion. In the mean time, at the very least, a move from the sandbox to SciPy, should be prefaced by a posting to the scipy-dev list (and perhaps even the scipy-user list) and some discussion. This will give everybody who may be thinking of similar packages a chance to have their ideas heard, before a specific interface is decided on de-facto. I don't want the process of moving to SciPy to be too overly formal and difficult, but it should represent at least some consensus. In addition, we need to make sure that packages moved over do not break builds on some platforms --- especially widely used platforms. -Travis From arnd.baecker at web.de Wed Jan 18 03:49:18 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 18 Jan 2006 09:49:18 +0100 (CET) Subject: [SciPy-dev] numpy/scipy some remarks Message-ID: Hi, I just successfully installed a recent numpy/scipy using MKL as math library on Itanium2. I have few minor comments, which might cause confusion to new-comers (and me as well;-) to the numpy/scipy combo: 1.) What does the overwriting stuff mean? In [1]: import scipy In [2]: scipy.pkgload() Overwriting lib= from /home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/scipy/lib/__init__.pyc (was from /home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/lib/__init__.pyc) 2.) Because I did not use ATLAS, I get the following warnings during testing (scipy.test(10): Ran 1038 tests in 21.509: **************************************************************** WARNING: clapack module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses flapack instead of clapack. **************************************************************** **************************************************************** WARNING: cblas module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses fblas instead of cblas. **************************************************************** I am not sure, if it is clear (even after reading scipy/INSTALL.txt), what one has to do with this warning. Should one in addition also install atlas for the clapack/cblas part? (Personally I think that for MKL one should just go for the fortran wrappers, and don't think about the clapack/cblas stuff. Isn't this only of importance, if the arrays have the wrong orderering in memory to avoid a copy?) 3.) 
There was one of those statistical failures again: ====================================================================== FAIL: check_normal (scipy.stats.morestats.test_morestats.test_anderson) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/scipy/stats/tests/test_morestats.py", line 47, in check_normal assert_array_less(A, crit[-2:]) File "/home/baecker/python2//scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/testing/utils.py", line 207, in assert_array_less assert cond,\ AssertionError: Arrays are not less-ordered (mismatch 50.0%): Array 1: 0.86583024233101469 Array 2: [ 0.858 1.0210000000000001] ((Of course, the more often one builds, the more events of this type will happen ;-)) Would it make sense to run these statistical tests eg. up to 5 times, and only if they fail all the time, to issue a FAILURE? 4.) help(scipy) Contents -------- numpy name space [[ looks quite empty here - is any more contents (eg. a short example?) planned ]] Available subpackages --------------------- stats --- Statistical Functions [*] sparse --- Sparse matrix [*] lib --- Python wrappers to external libraries [*] linalg --- Linear algebra routines [*] signal --- Signal Processing Tools [*] misc --- Various utilities that don't have another home. interpolate --- Interpolation Tools [*] optimize --- Optimization Tools [*] cluster --- Vector Quantization / Kmeans [*] fftpack --- Discrete Fourier Transform algorithms [*] io --- Data input and output [*] integrate --- Integration routines [*] lib.lapack --- Wrappers to LAPACK library [*] special --- Special Functions [*] lib.blas --- Wrappers to BLAS library [*] [*] - using a package requires explicit import (see pkgload) [[ couldn't one get rid of all these [*]? They don't look nice. Only `misc` is different ]] Global symbols from subpackages ------------------------------- stats --> find_repeats misc --> info, factorial, factorial2, factorialk, comb, who, lena, central_diff_weights, derivative, pade, source fftpack --> fft, fftn, fft2, ifft, ifft2, ifftn, fftshift, ifftshift, fftfreq Utility tools ------------- test --- Run scipy unittests pkgload --- Load scipy packages show_config --- Show scipy build configuration show_numpy_config --- Show numpy build configuration [[ this is directly mapped to `numpy`s show - is it really needed here as well? ]] __version__ --- Scipy version string __numpy_version__ --- Numpy version string [[ same with this one, why not just have it in numpy?]] Environment variables --------------------- SCIPY_IMPORT_VERBOSE --- pkgload verbose flag, default is 0. Best, Arnd From pearu at scipy.org Wed Jan 18 03:03:58 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 18 Jan 2006 02:03:58 -0600 (CST) Subject: [SciPy-dev] numpy/scipy some remarks In-Reply-To: References: Message-ID: On Wed, 18 Jan 2006, Arnd Baecker wrote: > Hi, > > I just successfully installed a recent numpy/scipy using MKL as math > library on Itanium2. > I have few minor comments, which might cause confusion > to new-comers (and me as well;-) to the numpy/scipy combo: > > 1.) What does the overwriting stuff mean? 
> > In [1]: import scipy > In [2]: scipy.pkgload() > Overwriting lib= '/home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/scipy/lib/__init__.pyc'> > from > /home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/scipy/lib/__init__.pyc > (was '/home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/lib/__init__.pyc'> > from > /home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/lib/__init__.pyc) It means exactly what is written. You can disable these messages by defining env variable: export SCIPY_IMPORT_VERBOSE=-1 Basically, it says which numpy symbols are overwritten by scipy symbols in scipy namespace. Only scipy developers should watch out these messages. > 2.) Because I did not use ATLAS, I get the following warnings > during testing (scipy.test(10): Ran 1038 tests in 21.509: > > **************************************************************** > WARNING: clapack module is empty > ----------- > See scipy/INSTALL.txt for troubleshooting. > Notes: > * If atlas library is not found by numpy/distutils/system_info.py, > then scipy uses flapack instead of clapack. > **************************************************************** > > **************************************************************** > WARNING: cblas module is empty > ----------- > See scipy/INSTALL.txt for troubleshooting. > Notes: > * If atlas library is not found by numpy/distutils/system_info.py, > then scipy uses fblas instead of cblas. > **************************************************************** > > > I am not sure, if it is clear (even after reading scipy/INSTALL.txt), > what one has to do with this warning. > Should one in addition also install atlas for the clapack/cblas > part? > (Personally I think that for MKL one should just go for the fortran > wrappers, and don't think about the clapack/cblas stuff. > Isn't this only of importance, if the arrays have the wrong > orderering in memory to avoid a copy?) These messages are old and should be fixes. I agree that they can be optimized when using MKL. > 3.) There was one of those statistical failures again: > > ====================================================================== > FAIL: check_normal (scipy.stats.morestats.test_morestats.test_anderson) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/scipy/stats/tests/test_morestats.py", > line 47, in check_normal > assert_array_less(A, crit[-2:]) > File > "/home/baecker/python2//scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/testing/utils.py", > line 207, in assert_array_less > assert cond,\ > AssertionError: > Arrays are not less-ordered (mismatch 50.0%): > Array 1: 0.86583024233101469 > Array 2: [ 0.858 1.0210000000000001] > > > ((Of course, the more often one builds, the more events of this > type will happen ;-)) > > Would it make sense to run these statistical tests eg. up to 5 times, > and only if they fail all the time, to issue a FAILURE? +1 > 4.) help(scipy) > > Contents > -------- > > numpy name space > > [[ looks quite empty here - > is any more contents (eg. a short example?) planned ]] Yes, documentation patches are welcome. 
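Coming back to the retry idea under 3.) above, a helper along those lines could be as small as the sketch below; this is only an illustration, not anything that exists in numpy.testing:

def retry_statistical_test(check, ntries=5):
    # Re-run a statistical check up to `ntries` times and re-raise the
    # AssertionError only if every attempt fails.
    for attempt in range(ntries):
        try:
            return check()
        except AssertionError:
            if attempt == ntries - 1:
                raise

A test like check_normal would then wrap its assert_array_less call in such a helper instead of asserting directly.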
> Available subpackages > --------------------- > stats --- Statistical Functions [*] > sparse --- Sparse matrix [*] > lib --- Python wrappers to external libraries [*] > linalg --- Linear algebra routines [*] > signal --- Signal Processing Tools [*] > misc --- Various utilities that don't have another home. > interpolate --- Interpolation Tools [*] > optimize --- Optimization Tools [*] > cluster --- Vector Quantization / Kmeans [*] > fftpack --- Discrete Fourier Transform algorithms [*] > io --- Data input and output [*] > integrate --- Integration routines [*] > lib.lapack --- Wrappers to LAPACK library [*] > special --- Special Functions [*] > lib.blas --- Wrappers to BLAS library [*] > [*] - using a package requires explicit import (see pkgload) > > [[ couldn't one get rid of all these [*]? They don't look nice. > Only `misc` is different ]] :) Suggestions how to denote packages that require explicit import more nicely are welcome. > Global symbols from subpackages > ------------------------------- > stats --> find_repeats > misc --> info, factorial, factorial2, factorialk, comb, who, lena, > central_diff_weights, derivative, pade, source > fftpack --> fft, fftn, fft2, ifft, ifft2, ifftn, fftshift, ifftshift, > fftfreq > > Utility tools > ------------- > > test --- Run scipy unittests > pkgload --- Load scipy packages > show_config --- Show scipy build configuration > show_numpy_config --- Show numpy build configuration > > [[ this is directly mapped to `numpy`s show - is it really needed here > as well? ]] > > __version__ --- Scipy version string > __numpy_version__ --- Numpy version string The tools are different though. Compare scipy.test() with numpy.test() for instance. The same holds for other symbols. So, the answer is yes, they are needed. > [[ same with this one, why not just have it in numpy?]] > Environment variables > --------------------- > > SCIPY_IMPORT_VERBOSE --- pkgload verbose flag, default is 0. See previous answer. Pearu From arnd.baecker at web.de Wed Jan 18 04:55:51 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 18 Jan 2006 10:55:51 +0100 (CET) Subject: [SciPy-dev] numpy/scipy some remarks In-Reply-To: References: Message-ID: Hi, thanks for your comments Pearu! On Wed, 18 Jan 2006, Pearu Peterson wrote: > On Wed, 18 Jan 2006, Arnd Baecker wrote: > > > Hi, > > > > I just successfully installed a recent numpy/scipy using MKL as math > > library on Itanium2. > > I have few minor comments, which might cause confusion > > to new-comers (and me as well;-) to the numpy/scipy combo: > > > > 1.) What does the overwriting stuff mean? > > > > In [1]: import scipy > > In [2]: scipy.pkgload() > > Overwriting lib= > '/home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/scipy/lib/__init__.pyc'> > > from > > /home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/scipy/lib/__init__.pyc > > (was > '/home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/lib/__init__.pyc'> > > from > > /home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/lib/__init__.pyc) > > It means exactly what is written. You can disable these messages by > defining env variable: > > export SCIPY_IMPORT_VERBOSE=-1 > > Basically, it says which numpy symbols are overwritten by scipy symbols in > scipy namespace. Only scipy developers should watch out these messages. Fair enough ;-) Shouldn't then the default be SCIPY_IMPORT_VERBOSE=-1 so that that normal users don't see this? [...] > > 4.) 
help(scipy) > > > > Contents > > -------- > > > > numpy name space > > > > [[ looks quite empty here - > > is any more contents (eg. a short example?) planned ]] > > > > Yes, documentation patches are welcome. OK. I will see if I can write a few lines. > > Available subpackages > > --------------------- > > stats --- Statistical Functions [*] > > sparse --- Sparse matrix [*] > > lib --- Python wrappers to external libraries [*] > > linalg --- Linear algebra routines [*] > > signal --- Signal Processing Tools [*] > > misc --- Various utilities that don't have another home. > > interpolate --- Interpolation Tools [*] > > optimize --- Optimization Tools [*] > > cluster --- Vector Quantization / Kmeans [*] > > fftpack --- Discrete Fourier Transform algorithms [*] > > io --- Data input and output [*] > > integrate --- Integration routines [*] > > lib.lapack --- Wrappers to LAPACK library [*] > > special --- Special Functions [*] > > lib.blas --- Wrappers to BLAS library [*] > > [*] - using a package requires explicit import (see pkgload) > > > > [[ couldn't one get rid of all these [*]? They don't look nice. > > Only `misc` is different ]] > > > > :) Suggestions how to denote packages that require explicit import more > > nicely are welcome. What about something like this at the end: """All the above packages require an explicit import. The only exception is the `misc` subpackage, whose symbols are made available in the `scipy` namespace. """ > > Global symbols from subpackages > > ------------------------------- > > stats --> find_repeats > > misc --> info, factorial, factorial2, factorialk, comb, who, lena, > > central_diff_weights, derivative, pade, source > > fftpack --> fft, fftn, fft2, ifft, ifft2, ifftn, fftshift, ifftshift, > > fftfreq > > > > Utility tools > > ------------- > > > > test --- Run scipy unittests > > pkgload --- Load scipy packages > > show_config --- Show scipy build configuration > > show_numpy_config --- Show numpy build configuration > > > > [[ this is directly mapped to `numpy`s show - is it really needed here > > as well? ]] > > > > __version__ --- Scipy version string > > __numpy_version__ --- Numpy version string > > > > The tools are different though. Compare scipy.test() with numpy.test() for > > instance. The same holds for other symbols. So, the answer is yes, they > > are needed. Still confused: clearly scipy.test() and numpy.test() are different, but: In [1]: import scipy In [2]: scipy.show_numpy_config? Type: function Base Class: String Form: Namespace: Interactive File: /home/baecker/python2/scipy_icc_lintst_n_mkl/lib/python2.4/site-packages/numpy/__config__.py Definition: scipy.show_numpy_config() So this is directly mapped to numpy. Concerning __numpy_version__: if this is the present version of numpy (and not the one used during build), why do we need scipy.__numpy_version__ and numpy.__version__ ? > > [[ same with this one, why not just have it in numpy?]] > > Environment variables > > --------------------- > > > > SCIPY_IMPORT_VERBOSE --- pkgload verbose flag, default is 0. > > See previous answer. ;-) well, this wasn't a question, it was just a straight copy from `help(scipy)` - sorry for making this not clear enough. Best, Arnd
From schofield at ftw.at Wed Jan 18 06:02:10 2006 From: schofield at ftw.at (Ed Schofield) Date: Wed, 18 Jan 2006 12:02:10 +0100 Subject: [SciPy-dev] New maximum entropy and Monte Carlo packages Message-ID: <43CE2032.8070607@ftw.at> Hi all, I recently moved two new packages, maxent and montecarlo, from the sandbox into the main SciPy tree.
I've now moved them back to the sandbox pending further discussion. I'll introduce them here and ask for feedback on whether they should be included in the main tree. The maxent package is for fitting maximum entropy models subject to expectation constraints. Maximum entropy models represent the 'least biased' models subject to given constraints. When the constraints are on the expectations of functionals -- the usual formulation -- maximum entropy models take the form of a generalized exponential family. A normal distribution, for example, is a maximum entropy distribution subject to mean and variance constraints. The maxent package contains one main module and one module with utility functions. Both are entirely in Python. (I have now removed the F2Py dependency.) The main module supports fitting models on either small or large sample spaces, where 'large' means continuous or otherwise too large to iterate over. Maxent models on 'small' sample spaces are common in natural language processing; models on 'large' sample spaces are useful for channel modelling in mobile communications, spectrum and chirp analysis, and (I believe) fluid turbulence. Some simple examples are in the examples/ directory. The simplest use is to define a list of functions f, an array of desired expectations K, and a sample space, and use the commands >>> model = maxent.model(f, samplespace) >>> model.fit(K) You can then retrieve the fitted parameters directly or analyze the model in other ways. I've been developing the maxent algorithms and code for about 4 years. The code is very well commented and should be straightforward to maintain. The montecarlo package currently does only one thing. It generates discrete variates from a given distribution. It does this FAST. On my P4 it generates over 10^7 variates per second, even for a sample space with 10^6 elements. The algorithm is the compact 5-table lookup sampler of Marsaglia. The main module, called 'intsampler', is written in C. There is also a simple Python wrapper class around this called 'dictsampler' that provides a nicer interface, allowing sampling from a distribution with arbitrary hashable objects (e.g. strings) as labels instead of {0,1,2,...}. dictsampler has slightly more overhead than intsampler, but is also very fast (around 10^6 per second for me with a sample space of 10^6 elements labelled with strings). An example of using it to sample from this discrete distribution: x 'a' 'b' 'c' p(x) 10/180 150/180 20/180 is: >>> table = {'a':10, 'b':150, 'c':20} >>> sampler = dictsampler(table) >>> sampler.sample(10**4) array([b, b, a, ..., b, b, c], dtype=object) The montecarlo package is very small (and not nearly as impressive as Christopher Fonnesbeck's PyMC package), but the functionality that is there would be an efficient foundation for many discrete Monte Carlo algorithms. I'm aware of the build issue Travis Brady reported with MinGW not defining lrand48(). I can't remember why I used this, but I'll adapt it to use lrand() instead and report back. Would these packages be useful? Are there any objections to including them? -- Ed From arnd.baecker at web.de Wed Jan 18 07:09:34 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 18 Jan 2006 13:09:34 +0100 (CET) Subject: [SciPy-dev] numpy with icc Message-ID: Hi, I tried to compile numpy using the intel compiler. 
Apart from the huge amount of messages (61 warnings, 2265 remarks) it compiles, but on import I get: In [1]: import numpy import core -> failed: /home/baecker/python2/scipy_icc2_lintst_n_mkl/lib/python2.4/site-packages/numpy/core/multiarray.so: undefined symbol: ?1__serial_memmove import random -> failed: 'module' object has no attribute 'dtype' import lib -> failed: /home/baecker/python2/scipy_icc2_lintst_n_mkl/lib/python2.4/site-packages/numpy/core/multiarray.so: undefined symbol: ?1__serial_memmove Fatal Python error: can't initialize module lapack_lite Aborted Googling for serial_memmove reveals this message http://softwareforums.intel.com/ids/board/message?board.id=16&message.id=1973 So this one seems needed: libirc: Intel-specific C runtime library. (note, there is also a libircmt Multithreaded Intel-specific C runtime library. ) Therefore I tried: export FC_VENDOR=Intel export F77=ifort export CC=icc export CXX=icc python setup.py config --compiler=intel --libraries="irc" install --prefix=$DESTnumpyDIR | tee ../build_log_numpy_${nr}.txt However, in the build log I only found -lirc for the config_tests but nowhere else. What should I do instead of the above? ((What I am also not sure about is, whether this -lirc is needed in general or specific to our situation. The reason is that python2.4 was compiled with gcc and numpy with icc9.1_beta.)) Best, Arnd From jonathan.taylor at utoronto.ca Wed Jan 18 07:29:32 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Wed, 18 Jan 2006 07:29:32 -0500 Subject: [SciPy-dev] New maximum entropy and Monte Carlo packages In-Reply-To: <43CE2032.8070607@ftw.at> References: <43CE2032.8070607@ftw.at> Message-ID: <463e11f90601180429lda6084di376d716efaed828@mail.gmail.com> Maybe there should be some place for "extra" packages. That is, packages that only a subset of people will want. This is like in R where by default you get some useful packages, and it is fairly easy to add on "extra" packages. Jon. On 1/18/06, Ed Schofield wrote: > Hi all, > > I recently moved two new packages, maxent and montecarlo, from the > sandbox into the main SciPy tree. I've now moved them back to the > sandbox pending further discussion. I'll introduce them here and ask > for feedback on whether they should be included in the main tree. > > The maxent package is for fitting maximum entropy models subject to > expectation constraints. Maximum entropy models represent the 'least > biased' models subject to given constraints. When the constraints are > on the expectations of functionals -- the usual formulation -- maximum > entropy models take the form of a generalized exponential family. A > normal distribution, for example, is a maximum entropy distribution > subject to mean and variance constraints. > > The maxent package contains one main module and one module with utility > functions. Both are entirely in Python. (I have now removed the F2Py > dependency.) The main module supports fitting models on either small or > large sample spaces, where 'large' means continuous or otherwise too > large to iterate over. Maxent models on 'small' sample spaces are > common in natural language processing; models on 'large' sample spaces > are useful for channel modelling in mobile communications, spectrum and > chirp analysis, and (I believe) fluid turbulence. Some simple examples > are in the examples/ directory. 
The simplest use is to define a list of > functions f, an array of desired expectations K, and a sample space, and > use the commands > > >>> model = maxent.model(f, samplespace) > >>> model.fit(K) > > You can then retrieve the fitted parameters directly or analyze the > model in other ways. > > I've been developing the maxent algorithms and code for about 4 years. > The code is very well commented and should be straightforward to maintain. > > > The montecarlo package currently does only one thing. It generates > discrete variates from a given distribution. It does this FAST. On my > P4 it generates over 10^7 variates per second, even for a sample space > with 10^6 elements. The algorithm is the compact 5-table lookup sampler > of Marsaglia. The main module, called 'intsampler', is written in C. > There is also a simple Python wrapper class around this called > 'dictsampler' that provides a nicer interface, allowing sampling from a > distribution with arbitrary hashable objects > (e.g. strings) as labels instead of {0,1,2,...}. dictsampler has > slightly more overhead than intsampler, but is also very fast (around > 10^6 per second for me with a sample space of 10^6 elements labelled > with strings). An example of using it to sample from this discrete > distribution: > > x 'a' 'b' 'c' > p(x) 10/180 150/180 20/180 > > is: > > >>> table = {'a':10, 'b':150, 'c':20} > >>> sampler = dictsampler(table) > >>> sampler.sample(10**4) > array([b, b, a, ..., b, b, c], dtype=object) > > The montecarlo package is very small (and not nearly as impressive as > Christopher Fonnesbeck's PyMC package), but the functionality that is > there would be an efficient foundation for many discrete Monte Carlo > algorithms. > > I'm aware of the build issue Travis Brady reported with MinGW not > defining lrand48(). I can't remember why I used this, but I'll adapt it > to use lrand() instead and report back. > > > Would these packages be useful? Are there any objections to including them? > > > -- Ed > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From ndbecker2 at gmail.com Wed Jan 18 07:30:19 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Wed, 18 Jan 2006 07:30:19 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs References: <1137553638.19832.0.camel@ignacio.lan> Message-ID: Ignacio Vazquez-Abrams wrote: > On Tue, 2006-01-17 at 20:38 -0500, Neal Becker wrote: >> Here are .spec files I used to build numpy and scipy on Fedora FC5T2 >> x86_64. > > Mind if I clean them up and submit them to Fedora Extras? > Please do! A couple of notes: 1) It builds- but I haven't done any other tests. Don't know how. 2) Needs some buildrequires. lapack, blas. scipy wants compat-gcc-32-g77-3.2.3-54.fc5 to provide f77. BTW, I assume it's OK to use this for f77 even though gcc is 4.1? There is no f77 for gcc4.1 - it has f95 instead. Don't know if that would work - and definitely don't know how to convince python setup.py to use it even if it did! From a.h.jaffe at gmail.com Wed Jan 18 08:09:59 2006 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Wed, 18 Jan 2006 13:09:59 +0000 Subject: [SciPy-dev] numpy/scipy some remarks In-Reply-To: References: Message-ID: <43CE3E27.5080207@gmail.com> I have some related numpy/scipy questions: - I recently upgraded to the most recent SVN of numpy, without doing likewise for scipy. 
I found that the numpy.test() calls failed in a couple of places -- because *scipy* hadn't been updated with the latest dtype updates! (I can't reproduce the errors since I've since updated scipy.) I thought the whole point of the numpy/scipy split was to avoid 'implicit' calls of scipy from numpy, wasn't it? - I echo the question about the 'cblas'/'fblas' messages. On OSX should the altivec framework be able to deal with all of this? But as usual progress is fantastic. Thanks to Travis and everyone else. Yours, Andrew From ravi at ati.com Wed Jan 18 09:54:19 2006 From: ravi at ati.com (Ravikiran Rajagopal) Date: Wed, 18 Jan 2006 09:54:19 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs In-Reply-To: References: <1137553638.19832.0.camel@ignacio.lan> Message-ID: <200601180954.19840.ravi@ati.com> On Wednesday 18 January 2006 07:30, Neal Becker wrote: > 2) Needs some buildrequires. lapack, blas. scipy wants > compat-gcc-32-g77-3.2.3-54.fc5 to provide f77. BTW, I assume it's OK to > use this for f77 even though gcc is 4.1? There is no f77 for gcc4.1 - it > has f95 instead. Don't know if that would work - and definitely don't know > how to convince python setup.py to use it even if it did! You do not need g77/f77. With gfortran and gcc from the gcc 4.1 tool suite, numpy and scipy compile just fine, and work very well - I am even using them on a daily basis for my production work! The following minimal changes should get everything working fine with gfortran (lines split to accomodate my mailer): %build env CFLAGS="$RPM_OPT_FLAGS" BLAS=%{_libdir} LAPACK=%{_libdir} \ python setup.py config_fc --fcompiler=gnu95 build %install python setup.py config_fc --fcompiler=gnu95 \ install --root=$RPM_BUILD_ROOT #--record=INSTALLED_FILES In general, for FC4 and above, consider using gfortran rather than g77 since g77 is a dead branch. Virtually everything handled by g77 is also handled by gfortran; see http://gcc.gnu.org/bugzilla/show_bug.cgi?id=19292 for more information (and ignore the flaming at the end of the comments). Regards, Ravi From ndbecker2 at gmail.com Wed Jan 18 10:50:22 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Wed, 18 Jan 2006 10:50:22 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs References: <1137553638.19832.0.camel@ignacio.lan> <200601180954.19840.ravi@ati.com> Message-ID: I tried your suggestion, but got this cryptic result: compile options: '-Ibuild/src -I/usr/lib64/python2.4/site-packages/numpy/core/include -I/usr/include/python2.4 -c' gcc: Lib/fftpack/src/zrfft.c gcc: Lib/fftpack/src/zfft.c gcc: build/src/Lib/fftpack/_fftpackmodule.c gcc: Lib/fftpack/src/drfft.c gcc: build/src/fortranobject.c gcc: Lib/fftpack/src/zfftnd.c Traceback (most recent call last): File "setup.py", line 42, in ? 
setup_package() File "setup.py", line 35, in setup_package setup( **config.todict() ) File "/usr/lib64/python2.4/site-packages/numpy/distutils/core.py", line 93, in setup return old_setup(**new_attr) File "/usr/lib64/python2.4/distutils/core.py", line 149, in setup dist.run_commands() File "/usr/lib64/python2.4/distutils/dist.py", line 946, in run_commands self.run_command(cmd) File "/usr/lib64/python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/usr/lib64/python2.4/distutils/command/build.py", line 112, in run self.run_command(cmd_name) File "/usr/lib64/python2.4/distutils/cmd.py", line 333, in run_command self.distribution.run_command(command) File "/usr/lib64/python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/usr/lib64/python2.4/site-packages/numpy/distutils/command/build_ext.py", line 107, in run self.build_extensions() File "/usr/lib64/python2.4/distutils/command/build_ext.py", line 405, in build_extensions self.build_extension(ext) File "/usr/lib64/python2.4/site-packages/numpy/distutils/command/build_ext.py", line 299, in build_extension link = self.fcompiler.link_shared_object AttributeError: 'NoneType' object has no attribute 'link_shared_object' error: Bad exit status from /var/tmp/rpm/rpm-tmp.32873 (%build) RPM build errors: Bad exit status from /var/tmp/rpm/rpm-tmp.32873 (%build) From robert.kern at gmail.com Wed Jan 18 11:10:36 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 18 Jan 2006 10:10:36 -0600 Subject: [SciPy-dev] New maximum entropy and Monte Carlo packages In-Reply-To: <463e11f90601180429lda6084di376d716efaed828@mail.gmail.com> References: <43CE2032.8070607@ftw.at> <463e11f90601180429lda6084di376d716efaed828@mail.gmail.com> Message-ID: <43CE687C.1040901@gmail.com> Jonathan Taylor wrote: > Maybe there should be some place for "extra" packages. That is, > packages that only a subset of people will want. This is like in R > where by default you get some useful packages, and it is fairly easy > to add on "extra" packages. I hope that we can make the installation process for scipy such that nearly every subpackage is an "extra" package. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From ndbecker2 at gmail.com Wed Jan 18 11:26:07 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Wed, 18 Jan 2006 11:26:07 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs References: <1137553638.19832.0.camel@ignacio.lan> <200601180954.19840.ravi@ati.com> Message-ID: Ravikiran Rajagopal wrote: > On Wednesday 18 January 2006 07:30, Neal Becker wrote: >> 2) Needs some buildrequires. lapack, blas. scipy wants >> compat-gcc-32-g77-3.2.3-54.fc5 to provide f77. BTW, I assume it's OK to >> use this for f77 even though gcc is 4.1? There is no f77 for gcc4.1 - it >> has f95 instead. Don't know if that would work - and definitely don't >> know how to convince python setup.py to use it even if it did! > > You do not need g77/f77. With gfortran and gcc from the gcc 4.1 tool > suite, numpy and scipy compile just fine, and work very well - I am even > using them on a daily basis for my production work! 
The following minimal > changes should get everything working fine with gfortran (lines split to > accomodate my mailer): > > %build > env CFLAGS="$RPM_OPT_FLAGS" BLAS=%{_libdir} LAPACK=%{_libdir} \ > python setup.py config_fc --fcompiler=gnu95 build > > %install > python setup.py config_fc --fcompiler=gnu95 \ > install --root=$RPM_BUILD_ROOT #--record=INSTALLED_FILES > > In general, for FC4 and above, consider using gfortran rather than g77 > since g77 is a dead branch. Virtually everything handled by g77 is also > handled by gfortran; see http://gcc.gnu.org/bugzilla/show_bug.cgi?id=19292 > for more information (and ignore the flaming at the end of the comments). > Also, f2py seems to be a separate package that is already in fedora extras. I guess it needs to be removed from scipy package (or renamed). From charlesr.harris at gmail.com Wed Jan 18 11:32:45 2006 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 18 Jan 2006 09:32:45 -0700 Subject: [SciPy-dev] New maximum entropy and Monte Carlo packages In-Reply-To: <43CE2032.8070607@ftw.at> References: <43CE2032.8070607@ftw.at> Message-ID: Ed, I'm going to be picky and suggest using maxentropy instead of maxent. The latter name doesn't give much of a clue as to the contents of the package. Chuck On 1/18/06, Ed Schofield wrote: > > Hi all, > > I recently moved two new packages, maxent and montecarlo, from the > sandbox into the main SciPy tree. I've now moved them back to the > sandbox pending further discussion. I'll introduce them here and ask > for feedback on whether they should be included in the main tree. > > The maxent package is for fitting maximum entropy models subject to > expectation constraints. Maximum entropy models represent the 'least > biased' models subject to given constraints. When the constraints are > on the expectations of functionals -- the usual formulation -- maximum > entropy models take the form of a generalized exponential family. A > normal distribution, for example, is a maximum entropy distribution > subject to mean and variance constraints. > > The maxent package contains one main module and one module with utility > functions. Both are entirely in Python. (I have now removed the F2Py > dependency.) The main module supports fitting models on either small or > large sample spaces, where 'large' means continuous or otherwise too > large to iterate over. Maxent models on 'small' sample spaces are > common in natural language processing; models on 'large' sample spaces > are useful for channel modelling in mobile communications, spectrum and > chirp analysis, and (I believe) fluid turbulence. Some simple examples > are in the examples/ directory. The simplest use is to define a list of > functions f, an array of desired expectations K, and a sample space, and > use the commands > > >>> model = maxent.model(f, samplespace) > >>> model.fit(K) > > You can then retrieve the fitted parameters directly or analyze the > model in other ways. > > I've been developing the maxent algorithms and code for about 4 years. > The code is very well commented and should be straightforward to maintain. > > > The montecarlo package currently does only one thing. It generates > discrete variates from a given distribution. It does this FAST. On my > P4 it generates over 10^7 variates per second, even for a sample space > with 10^6 elements. The algorithm is the compact 5-table lookup sampler > of Marsaglia. The main module, called 'intsampler', is written in C. 
> There is also a simple Python wrapper class around this called > 'dictsampler' that provides a nicer interface, allowing sampling from a > distribution with arbitrary hashable objects > (e.g. strings) as labels instead of {0,1,2,...}. dictsampler has > slightly more overhead than intsampler, but is also very fast (around > 10^6 per second for me with a sample space of 10^6 elements labelled > with strings). An example of using it to sample from this discrete > distribution: > > x 'a' 'b' 'c' > p(x) 10/180 150/180 20/180 > > is: > > >>> table = {'a':10, 'b':150, 'c':20} > >>> sampler = dictsampler(table) > >>> sampler.sample(10**4) > array([b, b, a, ..., b, b, c], dtype=object) > > The montecarlo package is very small (and not nearly as impressive as > Christopher Fonnesbeck's PyMC package), but the functionality that is > there would be an efficient foundation for many discrete Monte Carlo > algorithms. > > I'm aware of the build issue Travis Brady reported with MinGW not > defining lrand48(). I can't remember why I used this, but I'll adapt it > to use lrand() instead and report back. > > > Would these packages be useful? Are there any objections to including > them? > > > -- Ed > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ravi at ati.com Wed Jan 18 11:35:22 2006 From: ravi at ati.com (Ravikiran Rajagopal) Date: Wed, 18 Jan 2006 11:35:22 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs In-Reply-To: References: <200601180954.19840.ravi@ati.com> Message-ID: <200601181135.22146.ravi@ati.com> On Wednesday 18 January 2006 10:50, Neal Becker wrote: > I tried your suggestion, but got this cryptic result: [snip] I do not have a 64-bit machine, but on my 32-bit machine, the attached spec files produce the right RPMs. The only suggestion I have for you is to check whether you have the following RPMs installed on your machine: gcc-gfortran libgfortran It looks as if numpy.distutils does not see gfortran on your machine. Regards, Ravi -------------- next part -------------- %define name scipy %define version 0.4.4 %define release 1 %define python_sitearch %(%{__python} -c 'from distutils import sysconfig; print sysconfig.get_python_lib(1)') Summary: Scipy: array processing for numbers, strings, records, and objects. Name: %{name} Version: %{version} Release: %{release} Source0: %{name}-%{version}.tar.gz License: BSD Group: Development/Libraries BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-buildroot Prefix: %{_prefix} Vendor: SciPy Developers Url: http://numeric.scipy.org %description Scipy is a general-purpose array-processing package designed to efficiently manipulate large multi-dimensional arrays of arbitrary records without sacrificing too much speed for small multi-dimensional arrays. Scipy is built on the Numeric code base and adds features introduced by numarray as well as an extended C-API and the ability to create arrays of arbitrary type. There are also basic facilities for discrete fourier transform, basic linear algebra and random number generation. 
%prep %setup %build env CFLAGS="$RPM_OPT_FLAGS" BLAS=%{_libdir} LAPACK=%{_libdir} python setup.py config_fc --fcompiler=gnu95 build %install env CFLAGS="$RPM_OPT_FLAGS" BLAS=%{_libdir} LAPACK=%{_libdir} python setup.py config_fc --fcompiler=gnu95 install --root=$RPM_BUILD_ROOT #--record=INSTALLED_FILES %clean rm -rf $RPM_BUILD_ROOT %files %{python_sitearch}/scipy %defattr(-,root,root) -------------- next part -------------- %define name numpy %define version 0.9.2 %define release 1 %define python_sitearch %(%{__python} -c 'from distutils import sysconfig; print sysconfig.get_python_lib(1)') Summary: Numpy: array processing for numbers, strings, records, and objects. Name: %{name} Version: %{version} Release: %{release} Source0: %{name}-%{version}.tar.gz License: BSD Group: Development/Libraries BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-buildroot Prefix: %{_prefix} Vendor: SciPy Developers Url: http://numeric.scipy.org %description Numpy is a general-purpose array-processing package designed to efficiently manipulate large multi-dimensional arrays of arbitrary records without sacrificing too much speed for small multi-dimensional arrays. Numpy is built on the Numeric code base and adds features introduced by numarray as well as an extended C-API and the ability to create arrays of arbitrary type. There are also basic facilities for discrete fourier transform, basic linear algebra and random number generation. %prep %setup %build env CFLAGS="$RPM_OPT_FLAGS" BLAS=%{_libdir} LAPACK=%{_libdir} python setup.py config_fc --fcompiler=gnu95 build %install python setup.py config_fc --fcompiler=gnu95 install --root=$RPM_BUILD_ROOT #--record=INSTALLED_FILES %clean rm -rf $RPM_BUILD_ROOT %files %defattr(-,root,root) %{python_sitearch}/numpy %{_bindir}/f2py From ndbecker2 at gmail.com Wed Jan 18 11:41:35 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Wed, 18 Jan 2006 11:41:35 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs References: <200601180954.19840.ravi@ati.com> <200601181135.22146.ravi@ati.com> Message-ID: Ravikiran Rajagopal wrote: > On Wednesday 18 January 2006 10:50, Neal Becker wrote: >> I tried your suggestion, but got this cryptic result: > > [snip] > > I do not have a 64-bit machine, but on my 32-bit machine, the attached > spec files produce the right RPMs. The only suggestion I have for you is > to check whether you have the following RPMs installed on your machine: > gcc-gfortran > libgfortran > It looks as if numpy.distutils does not see gfortran on your machine. > > Regards, > Ravi rpm -q gcc-gfortran gcc-gfortran-4.1.0-0.15 rpm -q libgfortran libgfortran-4.1.0-0.15 Maybe because I'm running FC5T2? This is a PIA to debug. Not very transparent IMHO. From hgamboa at gmail.com Wed Jan 18 11:51:27 2006 From: hgamboa at gmail.com (Hugo Gamboa) Date: Wed, 18 Jan 2006 16:51:27 +0000 Subject: [SciPy-dev] New maximum entropy and Monte Carlo packages In-Reply-To: <43CE2032.8070607@ftw.at> References: <43CE2032.8070607@ftw.at> Message-ID: <86522b1a0601180851h6911050eja8e19708dae48646@mail.gmail.com> I can see application of these packages to my work. Thank you, Ed. Hugo Gamboa On 1/18/06, Ed Schofield wrote: > > Hi all, > > I recently moved two new packages, maxent and montecarlo, from the > sandbox into the main SciPy tree. I've now moved them back to the > sandbox pending further discussion. I'll introduce them here and ask > for feedback on whether they should be included in the main tree. 
> > The maxent package is for fitting maximum entropy models subject to > expectation constraints. Maximum entropy models represent the 'least > biased' models subject to given constraints. When the constraints are > on the expectations of functionals -- the usual formulation -- maximum > entropy models take the form of a generalized exponential family. A > normal distribution, for example, is a maximum entropy distribution > subject to mean and variance constraints. > > The maxent package contains one main module and one module with utility > functions. Both are entirely in Python. (I have now removed the F2Py > dependency.) The main module supports fitting models on either small or > large sample spaces, where 'large' means continuous or otherwise too > large to iterate over. Maxent models on 'small' sample spaces are > common in natural language processing; models on 'large' sample spaces > are useful for channel modelling in mobile communications, spectrum and > chirp analysis, and (I believe) fluid turbulence. Some simple examples > are in the examples/ directory. The simplest use is to define a list of > functions f, an array of desired expectations K, and a sample space, and > use the commands > > >>> model = maxent.model(f, samplespace) > >>> model.fit(K) > > You can then retrieve the fitted parameters directly or analyze the > model in other ways. > > I've been developing the maxent algorithms and code for about 4 years. > The code is very well commented and should be straightforward to maintain. > > > The montecarlo package currently does only one thing. It generates > discrete variates from a given distribution. It does this FAST. On my > P4 it generates over 10^7 variates per second, even for a sample space > with 10^6 elements. The algorithm is the compact 5-table lookup sampler > of Marsaglia. The main module, called 'intsampler', is written in C. > There is also a simple Python wrapper class around this called > 'dictsampler' that provides a nicer interface, allowing sampling from a > distribution with arbitrary hashable objects > (e.g. strings) as labels instead of {0,1,2,...}. dictsampler has > slightly more overhead than intsampler, but is also very fast (around > 10^6 per second for me with a sample space of 10^6 elements labelled > with strings). An example of using it to sample from this discrete > distribution: > > x 'a' 'b' 'c' > p(x) 10/180 150/180 20/180 > > is: > > >>> table = {'a':10, 'b':150, 'c':20} > >>> sampler = dictsampler(table) > >>> sampler.sample(10**4) > array([b, b, a, ..., b, b, c], dtype=object) > > The montecarlo package is very small (and not nearly as impressive as > Christopher Fonnesbeck's PyMC package), but the functionality that is > there would be an efficient foundation for many discrete Monte Carlo > algorithms. > > I'm aware of the build issue Travis Brady reported with MinGW not > defining lrand48(). I can't remember why I used this, but I'll adapt it > to use lrand() instead and report back. > > > Would these packages be useful? Are there any objections to including > them? > > > -- Ed > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From robert.kern at gmail.com Wed Jan 18 11:54:53 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 18 Jan 2006 10:54:53 -0600 Subject: [SciPy-dev] New maximum entropy and Monte Carlo packages In-Reply-To: References: <43CE2032.8070607@ftw.at> Message-ID: <43CE72DD.6010905@gmail.com> Charles R Harris wrote: > Ed, > > I'm going to be picky and suggest using maxentropy instead of maxent. > The latter name doesn't give much of a clue as to the contents of the > package. It does to people who do maxent, e.g., attendees of the annual MAXENT conference. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Wed Jan 18 12:09:05 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 18 Jan 2006 10:09:05 -0700 Subject: [SciPy-dev] New maximum entropy and Monte Carlo packages In-Reply-To: <43CE72DD.6010905@gmail.com> References: <43CE2032.8070607@ftw.at> <43CE72DD.6010905@gmail.com> Message-ID: <43CE7631.6020500@colorado.edu> Robert Kern wrote: > Charles R Harris wrote: > >>Ed, >> >>I'm going to be picky and suggest using maxentropy instead of maxent. >>The latter name doesn't give much of a clue as to the contents of the >>package. > > > It does to people who do maxent, e.g., attendees of the annual MAXENT conference. There's also something to be said for slightly more explicit names, so that newcomers to a field, or those who may need a tool on occasion but are not specialists in a given subtopic, find things more easily. Given how easy python makes it to shorten package names upon import import maxentropy as maxent I'd be +1 on a more descriptive name. Cheers, f From oliphant.travis at ieee.org Wed Jan 18 12:50:23 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 18 Jan 2006 10:50:23 -0700 Subject: [SciPy-dev] numpy/scipy some remarks In-Reply-To: <43CE3E27.5080207@gmail.com> References: <43CE3E27.5080207@gmail.com> Message-ID: <43CE7FDF.5030506@ieee.org> Andrew Jaffe wrote: >I have some related numpy/scipy questions: > > - I recently upgraded to the most recent SVN of numpy, without doing >likewise for scipy. I found that the numpy.test() calls failed in a >couple of places -- because *scipy* hadn't been updated with the latest >dtype updates! (I can't reproduce the errors since I've since updated >scipy.) I thought the whole point of the numpy/scipy split was to avoid >'implicit' calls of scipy from numpy, wasn't it? > > Not entirely. The issue was namespace clashes between packages. This issue has come up before and is localized to the numpy.dual module. If somebody would like to re-write the module to contain a register function that scipy hooks into to register new functions for the ones listed there, then that would be a better solution. But, I haven't had time to do it. -Travis From Fernando.Perez at colorado.edu Wed Jan 18 12:52:55 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 18 Jan 2006 10:52:55 -0700 Subject: [SciPy-dev] numpy/scipy some remarks In-Reply-To: <43CE3E27.5080207@gmail.com> References: <43CE3E27.5080207@gmail.com> Message-ID: <43CE8077.8020706@colorado.edu> Andrew Jaffe wrote: > I have some related numpy/scipy questions: > > - I recently upgraded to the most recent SVN of numpy, without doing > likewise for scipy. I found that the numpy.test() calls failed in a > couple of places -- because *scipy* hadn't been updated with the latest > dtype updates! 
(I can't reproduce the errors since I've since updated > scipy.) I thought the whole point of the numpy/scipy split was to avoid > 'implicit' calls of scipy from numpy, wasn't it? Now that things are slowing down a little in the frantic internal reorganization, I'd really like to have a clear pronouncement on this issue. I personally find that the overwriting of the numpy namespace is just not a good idea. To the Python zen's explicit is better than implicit I'd add the corollary and predictability beats convenience I think pkgload() is a step in the right direction, but I remain convinced that it should NOT touch the numpy namespace, _at all_. Consider the following scenario: 1. You write some code which uses numpy.foo, benchmark it and write up the results for a talk/paper/whatever. 2. Sometime later, in your code you start using a third-party library for some ancillary functionality. Unbeknownst to you, that library calls pkg_load() (you had the env var set to -1 so you could use pkg_load interactively, so you don't see any warnings). 3. Now, your benchmarks have changed dramatically, but you can't figure out why. Things are probably better, but it's still very disconcerting. If you're really unlucky, instead you now have some bizarre bug (we've seen some of those already). And you start pulling your hair out: tests done in isolation with numpy work fine, but your big code (with the _exact same calls_) fails. At this point, python debugging begins to feel like the kind of muddy-water-diving that debugging a stray pointer in C is: something is going wrong, and yet there seems to be no causal path you can possibly follow. Anyway, I won't belabor this point any longer. I'd just like to hear from others their opinion on this matter, and if a decision is made to go ahead with the overwriting, at least I think the rationale for it should be well justified (and be more than "it's convenient"). The fact that over the last few weeks we've had several surprised questions on this is, to me, an indicator that I'm not the one uncomfortable with this decision. Best to all, f From ndbecker2 at gmail.com Wed Jan 18 12:58:50 2006 From: ndbecker2 at gmail.com (Neal Becker) Date: Wed, 18 Jan 2006 12:58:50 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs References: <200601180954.19840.ravi@ati.com> <200601181135.22146.ravi@ati.com> Message-ID: Ravikiran Rajagopal wrote: > On Wednesday 18 January 2006 10:50, Neal Becker wrote: >> I tried your suggestion, but got this cryptic result: > > [snip] > > I do not have a 64-bit machine, but on my 32-bit machine, the attached > spec files produce the right RPMs. The only suggestion I have for you is > to check whether you have the following RPMs installed on your machine: > gcc-gfortran > libgfortran > It looks as if numpy.distutils does not see gfortran on your machine. > Actually, it does build lots of fortran code before it fails - so something strange. 
It finally quits here: compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=nocona -D_GNU_SOURCE -fPIC -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=nocona -fPIC' creating build/temp.linux-x86_64-2.4/build creating build/temp.linux-x86_64-2.4/build/src creating build/temp.linux-x86_64-2.4/build/src/Lib creating build/temp.linux-x86_64-2.4/build/src/Lib/fftpack creating build/temp.linux-x86_64-2.4/Lib/fftpack/src compile options: '-Ibuild/src -I/usr/lib64/python2.4/site-packages/numpy/core/include -I/usr/include/python2.4 -c' gcc: Lib/fftpack/src/zrfft.c gcc: Lib/fftpack/src/zfft.c gcc: build/src/Lib/fftpack/_fftpackmodule.c gcc: Lib/fftpack/src/drfft.c gcc: build/src/fortranobject.c gcc: Lib/fftpack/src/zfftnd.c Traceback (most recent call last): File "setup.py", line 42, in ? setup_package() File "setup.py", line 35, in setup_package setup( **config.todict() ) File "/usr/lib64/python2.4/site-packages/numpy/distutils/core.py", line 93, in setup return old_setup(**new_attr) File "/usr/lib64/python2.4/distutils/core.py", line 149, in setup dist.run_commands() File "/usr/lib64/python2.4/distutils/dist.py", line 946, in run_commands self.run_command(cmd) File "/usr/lib64/python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/usr/lib64/python2.4/distutils/command/build.py", line 112, in run self.run_command(cmd_name) File "/usr/lib64/python2.4/distutils/cmd.py", line 333, in run_command self.distribution.run_command(command) File "/usr/lib64/python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/usr/lib64/python2.4/site-packages/numpy/distutils/command/build_ext.py", line 107, in run self.build_extensions() File "/usr/lib64/python2.4/distutils/command/build_ext.py", line 405, in build_extensions self.build_extension(ext) File "/usr/lib64/python2.4/site-packages/numpy/distutils/command/build_ext.py", line 299, in build_extension link = self.fcompiler.link_shared_object AttributeError: 'NoneType' object has no attribute 'link_shared_object' error: Bad exit status from /var/tmp/rpm/rpm-tmp.44882 (%build) From robert.kern at gmail.com Wed Jan 18 13:07:17 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 18 Jan 2006 12:07:17 -0600 Subject: [SciPy-dev] numpy/scipy some remarks In-Reply-To: <43CE8077.8020706@colorado.edu> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> Message-ID: <43CE83D5.4060105@gmail.com> Fernando Perez wrote: > Anyway, I won't belabor this point any longer. I'd just like to hear from > others their opinion on this matter, and if a decision is made to go ahead > with the overwriting, at least I think the rationale for it should be well > justified (and be more than "it's convenient"). The fact that over the last > few weeks we've had several surprised questions on this is, to me, an > indicator that I'm not the one uncomfortable with this decision. I haven't followed this discussion in great detail, but I believe the current situation is this: 1) If you use numpy.dft and numpy.linalg directly, you will always get the numpy versions no matter what else is installed. 2) If you want to optionally use optimized scipy versions if they are available and regular numpy versions otherwise, then you use the functions exposed in numpy.dual. You do so at your own risk. 
3) pkgload() exists to support the loading of subpackages. It does not reach into numpy.dft or numpy.linalg at all. It is not relevant to this issue. 4) There are some places in numpy that use numpy.dual. I think we can address all of your concerns by changing #4. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant.travis at ieee.org Wed Jan 18 13:19:49 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Wed, 18 Jan 2006 11:19:49 -0700 Subject: [SciPy-dev] [Numpy-discussion] Re: numpy/scipy some remarks In-Reply-To: <43CE83D5.4060105@gmail.com> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> Message-ID: <43CE86C5.1050702@ieee.org> Robert Kern wrote: >Fernando Perez wrote: > > > >>Anyway, I won't belabor this point any longer. I'd just like to hear from >>others their opinion on this matter, and if a decision is made to go ahead >>with the overwriting, at least I think the rationale for it should be well >>justified (and be more than "it's convenient"). The fact that over the last >>few weeks we've had several surprised questions on this is, to me, an >>indicator that I'm not the one uncomfortable with this decision. >> >> > >I haven't followed this discussion in great detail, but I believe the current >situation is this: > >1) If you use numpy.dft and numpy.linalg directly, you will always get the numpy >versions no matter what else is installed. > >2) If you want to optionally use optimized scipy versions if they are available >and regular numpy versions otherwise, then you use the functions exposed in >numpy.dual. You do so at your own risk. > >3) pkgload() exists to support the loading of subpackages. It does not reach >into numpy.dft or numpy.linalg at all. It is not relevant to this issue. > >4) There are some places in numpy that use numpy.dual. > >I think we can address all of your concerns by changing #4. > > > This is an accurate assessment. However, I do not want to eliminate number 4 as I've mentioned before. I think there is a place for having functions that can be over-written with better versions. I agree that it could be implemented better, however, with some kind of register function instead of automatically looking in scipy... -Travis From robert.kern at gmail.com Wed Jan 18 13:31:31 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 18 Jan 2006 12:31:31 -0600 Subject: [SciPy-dev] [Numpy-discussion] Re: numpy/scipy some remarks In-Reply-To: <43CE86C5.1050702@ieee.org> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE86C5.1050702@ieee.org> Message-ID: <43CE8983.2050804@gmail.com> Travis Oliphant wrote: > Robert Kern wrote: >>4) There are some places in numpy that use numpy.dual. >> >>I think we can address all of your concerns by changing #4. And actually, I think we can eat our cake and have it, too, by providing a way to restrict numpy.dual to only use numpy versions. We won't provide a way to force numpy.dual to only use some_other_version. I think Fernando's examples won't be problematic, then. > This is an accurate assessment. However, I do not want to eliminate > number 4 as I've mentioned before. I think there is a place for having > functions that can be over-written with better versions. I agree that > it could be implemented better, however, with some kind of register > function instead of automatically looking in scipy... 
Like egg entry_points. Please, let's not reinvent this wheel again. http://peak.telecommunity.com/DevCenter/PkgResources#entry-points -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From schofield at ftw.at Wed Jan 18 14:05:28 2006 From: schofield at ftw.at (Ed Schofield) Date: Wed, 18 Jan 2006 20:05:28 +0100 Subject: [SciPy-dev] [Numpy-discussion] importing madness In-Reply-To: <43CE83D5.4060105@gmail.com> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> Message-ID: <43CE9178.3050602@ftw.at> Robert Kern wrote: >Fernando Perez wrote: > > > >>Anyway, I won't belabor this point any longer. I'd just like to hear from >>others their opinion on this matter, and if a decision is made to go ahead >>with the overwriting, at least I think the rationale for it should be well >>justified (and be more than "it's convenient"). The fact that over the last >>few weeks we've had several surprised questions on this is, to me, an >>indicator that I'm not the one uncomfortable with this decision. >> >> > >I haven't followed this discussion in great detail, but I believe the current >situation is this: > >1) If you use numpy.dft and numpy.linalg directly, you will always get the numpy >versions no matter what else is installed. > >2) If you want to optionally use optimized scipy versions if they are available >and regular numpy versions otherwise, then you use the functions exposed in >numpy.dual. You do so at your own risk. > >3) pkgload() exists to support the loading of subpackages. It does not reach >into numpy.dft or numpy.linalg at all. It is not relevant to this issue. > >4) There are some places in numpy that use numpy.dual. > >I think we can address all of your concerns by changing #4. > > I've been battling for half an hour trying to import scipy.linalg. Is this one of Fernando's scary predictions coming true? I get: >>> from scipy import linalg >>> dir(linalg) ['Heigenvalues', 'Heigenvectors', 'LinAlgError', 'ScipyTest', '__builtins__', '__doc__', '__file__', '__name__', '__path__', 'cholesky', 'cholesky_decomposition', 'det', 'determinant', 'eig', 'eigenvalues', 'eigenvectors', 'eigh', 'eigvals', 'eigvalsh', 'generalized_inverse', 'inv', 'inverse', 'lapack_lite', 'linalg', 'linear_least_squares', 'lstsq', 'pinv', 'singular_value_decomposition', 'solve', 'solve_linear_equations', 'svd', 'test'] >>> linalg.__file__ '/home/schofield/Tools/lib/python2.4/site-packages/numpy/linalg/__init__.pyc' This is the linalg package from numpy, not scipy. It's missing my favourite pinv2 function. What is going on?! I've just cleaned everything out and built from the latest SVN revisions. Using pkgload('linalg') doesn't seem to help: >>> import scipy >>> scipy.pkgload('linalg') >>> linalg.__file__ Traceback (most recent call last): File "", line 1, in ? NameError: name 'linalg' is not defined >>> scipy.linalg.__file__ '/home/schofield/Tools/lib/python2.4/site-packages/numpy/linalg/__init__.pyc' The only thing that helps is calling pkgload() with no arguments: >>> import scipy >>> scipy.pkgload() Overwriting lib= from /home/schofield/Tools/lib/python2.4/site-packages/scipy/lib/__init__.pyc (was from /home/schofield/Tools/lib/python2.4/site-packages/numpy/lib/__init__.pyc) >>> scipy.linalg.__file__ '/home/schofield/Tools/lib/python2.4/site-packages/scipy/linalg/__init__.pyc' Unless I'm doing something very stupid, there seem to be multiple sources of evil here. 
First, numpy's linalg package is available from the scipy namespace, which seems like a recipe for Ed's madness, since he can't find his pinv2() function. Second, scipy.pkgload('linalg') silently fails to make any visible difference. This is probably a simple bug. Third, ..., there is no third. But bad things usually come in threes. -- Ed From Fernando.Perez at colorado.edu Wed Jan 18 14:02:02 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 18 Jan 2006 12:02:02 -0700 Subject: [SciPy-dev] [Numpy-discussion] Re: numpy/scipy some remarks In-Reply-To: <43CE86C5.1050702@ieee.org> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE86C5.1050702@ieee.org> Message-ID: <43CE90AA.2020104@colorado.edu> Travis Oliphant wrote: >>3) pkgload() exists to support the loading of subpackages. It does not reach >>into numpy.dft or numpy.linalg at all. It is not relevant to this issue. >> >>4) There are some places in numpy that use numpy.dual. >> >>I think we can address all of your concerns by changing #4. >> >> >> > > > This is an accurate assessment. However, I do not want to eliminate > number 4 as I've mentioned before. I think there is a place for having > functions that can be over-written with better versions. I agree that > it could be implemented better, however, with some kind of register > function instead of automatically looking in scipy... Mmh, I think I'm confused then: it seemed to me that pkgload() WAS overwriting numpy names, from the messages which the environment variable controls. Is that not true? Here's a recent thread: http://aspn.activestate.com/ASPN/Mail/Message/scipy-dev/2974044 where this was shown: In [3]: import scipy Overwriting fft= from scipy.fftpack.basic (was from numpy.dft.fftpack) Overwriting ifft= from scipy.fftpack.basic (was from numpy.dft.fftpack) I understood this as 'scipy.fftpack.basic.fft overwrote numpy.dft.fftpack.fft'. Does this then not affect the numpy namespace at all? I also would like to propose that, rather than using an environment variable, pkgload() takes a 'verbose=' keyword (or 'quiet='). I think it's much cleaner to say pkgload(quiet=1) or pkgload(verbose=0) than relying on users configuring env. variables for something like this. Cheers, f From ivazquez at ivazquez.net Wed Jan 18 15:00:24 2006 From: ivazquez at ivazquez.net (Ignacio Vazquez-Abrams) Date: Wed, 18 Jan 2006 15:00:24 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs In-Reply-To: References: <1137553638.19832.0.camel@ignacio.lan> <200601180954.19840.ravi@ati.com> Message-ID: <1137614424.32536.1.camel@ignacio.lan> On Wed, 2006-01-18 at 11:26 -0500, Neal Becker wrote: > Also, f2py seems to be a separate package that is already in fedora extras. > I guess it needs to be removed from scipy package (or renamed). I intend on splitting the numpy package into numpy and f2py subpackages. -- Ignacio Vazquez-Abrams http://fedora.ivazquez.net/ gpg --keyserver hkp://subkeys.pgp.net --recv-key 38028b72 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part URL: From robert.kern at gmail.com Wed Jan 18 15:05:54 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 18 Jan 2006 14:05:54 -0600 Subject: [SciPy-dev] [Numpy-discussion] Re: numpy/scipy some remarks In-Reply-To: <43CE90AA.2020104@colorado.edu> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE86C5.1050702@ieee.org> <43CE90AA.2020104@colorado.edu> Message-ID: <43CE9FA2.3000705@gmail.com> Fernando Perez wrote: > Travis Oliphant wrote: > >>>3) pkgload() exists to support the loading of subpackages. It does not reach >>>into numpy.dft or numpy.linalg at all. It is not relevant to this issue. >>> >>>4) There are some places in numpy that use numpy.dual. >>> >>>I think we can address all of your concerns by changing #4. >> >>This is an accurate assessment. However, I do not want to eliminate >>number 4 as I've mentioned before. I think there is a place for having >>functions that can be over-written with better versions. I agree that >>it could be implemented better, however, with some kind of register >>function instead of automatically looking in scipy... > > Mmh, I think I'm confused then: So am I, now! > it seemed to me that pkgload() WAS overwriting > numpy names, from the messages which the environment variable controls. Is > that not true? Here's a recent thread: > > http://aspn.activestate.com/ASPN/Mail/Message/scipy-dev/2974044 > > where this was shown: > > In [3]: import scipy > Overwriting fft= from > scipy.fftpack.basic (was from > numpy.dft.fftpack) > Overwriting ifft= from > scipy.fftpack.basic (was from > numpy.dft.fftpack) [~]$ python Python 2.4.1 (#2, Mar 31 2005, 00:05:10) [GCC 3.3 20030304 (Apple Computer, Inc. build 1666)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import scipy scipy.>>> scipy.pkgload(verbose=2) Imports to 'scipy' namespace ---------------------------- __all__.append('io') import lib -> success Overwriting lib= from /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy-0.4.4.1307-py2.4-macosx-10.4-ppc.egg/scipy/lib/__init__.pyc (was from /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/numpy-0.9.4.1849-py2.4-macosx-10.4-ppc.egg/numpy/lib/__init__.pyc) __all__.append('signal') __all__.append('interpolate') __all__.append('lib.lapack') import cluster -> success __all__.append('montecarlo') __all__.append('fftpack') __all__.append('sparse') __all__.append('integrate') __all__.append('optimize') __all__.append('special') import lib.blas -> success __all__.append('linalg') __all__.append('stats') >>> import numpy >>> numpy.lib >>> scipy.lib [Ignore the SVN version numbers. They are faked. I was using checkouts from last night.] > I understood this as 'scipy.fftpack.basic.fft overwrote > numpy.dft.fftpack.fft'. Does this then not affect the numpy namespace at all? If it does, then I agree with you that this should change. > I also would like to propose that, rather than using an environment variable, > pkgload() takes a 'verbose=' keyword (or 'quiet='). I think it's much cleaner > to say > > pkgload(quiet=1) or pkgload(verbose=0) > > than relying on users configuring env. variables for something like this. It does take a verbose keyword argument. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From robert.kern at gmail.com Wed Jan 18 15:08:34 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 18 Jan 2006 14:08:34 -0600 Subject: [SciPy-dev] [Numpy-discussion] importing madness In-Reply-To: <43CE9178.3050602@ftw.at> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE9178.3050602@ftw.at> Message-ID: <43CEA042.20108@gmail.com> Ed Schofield wrote: > Unless I'm doing something very stupid, there seem to be multiple > sources of evil here. First, numpy's linalg package is available from > the scipy namespace, which seems like a recipe for Ed's madness, since > he can't find his pinv2() function. Second, scipy.pkgload('linalg') > silently fails to make any visible difference. This is probably a > simple bug. It's certainly not intended behavior. The process by which pkgload() determines where it is being called from is messing up. pkgload lives in numpy._import_tools, but it is also exposed in the scipy namespace, too. Please file a bug report and assign it to Pearu. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Wed Jan 18 16:21:49 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 18 Jan 2006 14:21:49 -0700 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs In-Reply-To: <1137614424.32536.1.camel@ignacio.lan> References: <1137553638.19832.0.camel@ignacio.lan> <200601180954.19840.ravi@ati.com> <1137614424.32536.1.camel@ignacio.lan> Message-ID: <43CEB16D.5000905@ee.byu.edu> Ignacio Vazquez-Abrams wrote: >On Wed, 2006-01-18 at 11:26 -0500, Neal Becker wrote: > > >>Also, f2py seems to be a separate package that is already in fedora extras. >>I guess it needs to be removed from scipy package (or renamed). >> >> > >I intend on splitting the numpy package into numpy and f2py subpackages. > > > Be careful, here. I'm pretty sure the numpy.f2py only works with numpy while the other f2py out there works with numarray and Numeric. -Travis From oliphant at ee.byu.edu Wed Jan 18 16:24:21 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 18 Jan 2006 14:24:21 -0700 Subject: [SciPy-dev] [Numpy-discussion] importing madness In-Reply-To: <43CEA042.20108@gmail.com> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE9178.3050602@ftw.at> <43CEA042.20108@gmail.com> Message-ID: <43CEB205.2030302@ee.byu.edu> Robert Kern wrote: >Ed Schofield wrote: > > > >>Unless I'm doing something very stupid, there seem to be multiple >>sources of evil here. First, numpy's linalg package is available from >>the scipy namespace, which seems like a recipe for Ed's madness, since >>he can't find his pinv2() function. Second, scipy.pkgload('linalg') >>silently fails to make any visible difference. This is probably a >>simple bug. >> >> > >It's certainly not intended behavior. The process by which pkgload() determines >where it is being called from is messing up. pkgload lives in >numpy._import_tools, but it is also exposed in the scipy namespace, too. > > > I think the problem is that I recently added numpy.linalg to the linalg namespace (and loaded it by default) and scipy imports the full numpy namespace for backwards compatibility. 
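A minimal sketch (not from the original messages) of the shadowing being described here, relying only on standard Python import semantics and assuming both packages are installed:

    # scipy/__init__.py pulls in numpy's top-level names for backwards
    # compatibility (roughly "from numpy import *"), and numpy now exports
    # a 'linalg' name, so before scipy's own subpackage has been imported:
    from scipy import linalg
    print linalg.__file__     # .../site-packages/numpy/linalg/__init__.pyc

    # importing the submodule explicitly rebinds 'linalg' on the scipy
    # package (scipy.pkgload(), as in Ed's session above, has the same
    # effect for all subpackages):
    import scipy.linalg
    from scipy import linalg
    print linalg.__file__     # .../site-packages/scipy/linalg/__init__.pyc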
-Travis From oliphant at ee.byu.edu Wed Jan 18 17:04:12 2006 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 18 Jan 2006 15:04:12 -0700 Subject: [SciPy-dev] [Numpy-discussion] importing madness In-Reply-To: <43CEB205.2030302@ee.byu.edu> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE9178.3050602@ftw.at> <43CEA042.20108@gmail.com> <43CEB205.2030302@ee.byu.edu> Message-ID: <43CEBB5C.5030502@ee.byu.edu> Travis Oliphant wrote: >Robert Kern wrote: > > > >>It's certainly not intended behavior. The process by which pkgload() determines >>where it is being called from is messing up. pkgload lives in >>numpy._import_tools, but it is also exposed in the scipy namespace, too. >> >> >> >> >> > >I think the problem is that I recently added numpy.linalg to the linalg > > >namespace (and loaded it by default) and scipy imports the full numpy >namespace for backwards compatibility. > > Read that ... added numpy.linalg to the *numpy* namespace .... -Travis From pearu at scipy.org Wed Jan 18 16:14:01 2006 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 18 Jan 2006 15:14:01 -0600 (CST) Subject: [SciPy-dev] [Numpy-discussion] Re: numpy/scipy some remarks In-Reply-To: <43CE90AA.2020104@colorado.edu> References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE86C5.1050702@ieee.org> <43CE90AA.2020104@colorado.edu> Message-ID: On Wed, 18 Jan 2006, Fernando Perez wrote: > Mmh, I think I'm confused then: it seemed to me that pkgload() WAS > overwriting numpy names, from the messages which the environment variable > controls. Is that not true? Here's a recent thread: > > http://aspn.activestate.com/ASPN/Mail/Message/scipy-dev/2974044 pkgload() is overwriting numpy names in *scipy namespace*. To be explicit, the following is going on in scipy/__init__.py when pkgload is called, pseudocode follows: from numpy import * # imports fft old_fft = fft from scipy.fftpack import fft print 'Overwriting',old_fft,'with',fft del old_fft And nothing else! So, scipy.fft, that was numpy.fft, is set to scipy.fftpack.fft. numpy.fft remains the same. ... > I understood this as 'scipy.fftpack.basic.fft overwrote > numpy.dft.fftpack.fft'. Does this then not affect the numpy namespace at > all? No! See above. > I also would like to propose that, rather than using an environment variable, > pkgload() takes a 'verbose=' keyword (or 'quiet='). I think it's much > cleaner to say > > pkgload(quiet=1) or pkgload(verbose=0) > > than relying on users configuring env. variables for something like this. pkgload has already verbose kwd argumend. See ?pkgload for more information. Pearu From Fernando.Perez at colorado.edu Wed Jan 18 17:28:56 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 18 Jan 2006 15:28:56 -0700 Subject: [SciPy-dev] [Numpy-discussion] Re: numpy/scipy some remarks In-Reply-To: References: <43CE3E27.5080207@gmail.com> <43CE8077.8020706@colorado.edu> <43CE83D5.4060105@gmail.com> <43CE86C5.1050702@ieee.org> <43CE90AA.2020104@colorado.edu> Message-ID: <43CEC128.30601@colorado.edu> Pearu Peterson wrote: >>I understood this as 'scipy.fftpack.basic.fft overwrote >>numpy.dft.fftpack.fft'. Does this then not affect the numpy namespace at >>all? > > > No! See above. Ah, OK. Thanks for the clarification, I misunderstood the message. >>I also would like to propose that, rather than using an environment variable, >>pkgload() takes a 'verbose=' keyword (or 'quiet='). 
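To make the point concrete, here is a small check (not in the original mail, assuming a full scipy install alongside numpy) that the rebinding happens only in scipy's namespace:

    import numpy, scipy
    scipy.pkgload()                    # may print 'Overwriting fft=...' messages
    print scipy.fft.__module__         # scipy.fftpack.basic -- rebound inside scipy
    print numpy.dft.fft.__module__     # numpy.dft.fftpack   -- numpy itself is untouched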
I think it's much >>cleaner to say >> >>pkgload(quiet=1) or pkgload(verbose=0) >> >>than relying on users configuring env. variables for something like this. > > > > pkgload has already verbose kwd argumend. See ?pkgload for more > information. What is this '?foo' syntax you speak of? Certainly not python... ;-) Sorry for not checking, I just don't have new-scipy on my work machine, only at home. Cheers, f From ivazquez at ivazquez.net Thu Jan 19 22:44:27 2006 From: ivazquez at ivazquez.net (Ignacio Vazquez-Abrams) Date: Thu, 19 Jan 2006 22:44:27 -0500 Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs In-Reply-To: <43CEB16D.5000905@ee.byu.edu> References: <1137553638.19832.0.camel@ignacio.lan> <200601180954.19840.ravi@ati.com> <1137614424.32536.1.camel@ignacio.lan> <43CEB16D.5000905@ee.byu.edu> Message-ID: <1137728667.3410.45.camel@ignacio.lan> On Wed, 2006-01-18 at 14:21 -0700, Travis Oliphant wrote: > Ignacio Vazquez-Abrams wrote: > >On Wed, 2006-01-18 at 11:26 -0500, Neal Becker wrote: > > > >>Also, f2py seems to be a separate package that is already in fedora extras. > >>I guess it needs to be removed from scipy package (or renamed). > > > >I intend on splitting the numpy package into numpy and f2py subpackages. > > Be careful, here. I'm pretty sure the numpy.f2py only works with numpy > while the other f2py out there works with numarray and Numeric. I'm afraid the logic of this has escaped me. Why is it not possible to make the new f2py backwards-compatible? Anyways, the man page (that gets placed in the wrong location on install, BTW) still contains references to the old packages and website, and the script also has a reference there. -- Ignacio Vazquez-Abrams http://fedora.ivazquez.net/ gpg --keyserver hkp://subkeys.pgp.net --recv-key 38028b72 -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part URL: From ivazquez at ivazquez.net Fri Jan 20 01:23:44 2006 From: ivazquez at ivazquez.net (Ignacio Vazquez-Abrams) Date: Fri, 20 Jan 2006 01:23:44 -0500 Subject: [SciPy-dev] Running tests outside site-packages (was: Re: numpy-0.9.2, scipy-0.4.4 RPMs) In-Reply-To: References: Message-ID: <1137738225.3410.49.camel@ignacio.lan> I'm trying to write the %check section of the numpy spec file and I keep running into the following: + /usr/bin/python -c 'import numpy ; numpy.test(1, 1)' numpy/__init__.py:47: UserWarning: Module numpy was already imported from numpy/__init__.pyc, but /home/ignacio/work/rpmbuild/BUILD/numpy-0.9.2 is being added to sys.path import pkg_resources as _pk # activate namespace packages (manipulates __path__) Running from numpy source directory. Traceback (most recent call last): File "", line 1, in ? AttributeError: 'module' object has no attribute 'test' How can I get the tests to run? -- Ignacio Vazquez-Abrams http://fedora.ivazquez.net/ gpg --keyserver hkp://subkeys.pgp.net --recv-key 38028b72 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc
Type: application/pgp-signature
Size: 189 bytes
Desc: This is a digitally signed message part
URL:

From pearu at scipy.org Fri Jan 20 02:37:39 2006
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 20 Jan 2006 01:37:39 -0600 (CST)
Subject: [SciPy-dev] numpy-0.9.2, scipy-0.4.4 RPMs
In-Reply-To: <1137728667.3410.45.camel@ignacio.lan>
References: <1137553638.19832.0.camel@ignacio.lan> <200601180954.19840.ravi@ati.com> <1137614424.32536.1.camel@ignacio.lan> <43CEB16D.5000905@ee.byu.edu> <1137728667.3410.45.camel@ignacio.lan>
Message-ID:

On Thu, 19 Jan 2006, Ignacio Vazquez-Abrams wrote:
> On Wed, 2006-01-18 at 14:21 -0700, Travis Oliphant wrote:
>> Ignacio Vazquez-Abrams wrote:
>>> On Wed, 2006-01-18 at 11:26 -0500, Neal Becker wrote:
>>>
>>>> Also, f2py seems to be a separate package that is already in fedora extras.
>>>> I guess it needs to be removed from scipy package (or renamed).
>>>
>>> I intend on splitting the numpy package into numpy and f2py subpackages.
>>
>> Be careful, here. I'm pretty sure the numpy.f2py only works with numpy
>> while the other f2py out there works with numarray and Numeric.
>
> I'm afraid the logic of this has escaped me. Why is it not possible to
> make the new f2py backwards-compatible?

It's possible, but the effort is not worth it, IMO. The old f2py, installed as f2py2e, works with Numeric and numarray. I plan to keep it available and fix bugs in CVS, but no new features will be added. The new f2py, installed as numpy.f2py, works only with numpy. f2py2e and numpy.f2py both install the f2py script and f2py.1. This naming conflict must be fixed.

> Anyways, the man page (that gets placed in the wrong location on
> install, BTW) still contains references to the old packages and website,
> and the script also has a reference there.

I haven't had time to update the man pages or the home pages of f2py. It's on my todo list.

Pearu

From pearu at scipy.org Fri Jan 20 02:43:55 2006
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 20 Jan 2006 01:43:55 -0600 (CST)
Subject: [SciPy-dev] Running tests outside site-packages (was: Re: numpy-0.9.2, scipy-0.4.4 RPMs)
In-Reply-To: <1137738225.3410.49.camel@ignacio.lan>
References: <1137738225.3410.49.camel@ignacio.lan>
Message-ID:

On Fri, 20 Jan 2006, Ignacio Vazquez-Abrams wrote:
> I'm trying to write the %check section of the numpy spec file and I keep
> running into the following:
>
> + /usr/bin/python -c 'import numpy ; numpy.test(1, 1)'
> numpy/__init__.py:47: UserWarning: Module numpy was already imported
> from numpy/__init__.pyc,
> but /home/ignacio/work/rpmbuild/BUILD/numpy-0.9.2 is being added to
> sys.path
> import pkg_resources as _pk # activate namespace packages (manipulates
> __path__)
> Running from numpy source directory.
  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> Traceback (most recent call last):
> File "", line 1, in ?
> AttributeError: 'module' object has no attribute 'test'
>
> How can I get the tests to run?

Make sure that numpy is not imported from the source directory, see ^^^^ above. Check sys.path; maybe you need to remove the source directory from sys.path before importing numpy. Hmm, maybe something like

/usr/bin/python -c 'import pkg_resources,numpy; ...'

will work?

Pearu

From hgamboa at gmail.com Fri Jan 20 05:51:24 2006
From: hgamboa at gmail.com (Hugo Gamboa)
Date: Fri, 20 Jan 2006 10:51:24 +0000
Subject: [SciPy-dev] Smooth filtering method
Message-ID: <86522b1a0601200251p781f13b3te84c227800b1c695@mail.gmail.com>

I came across the need of a smoothing function for my processing tasks.
Attached you can find a smooth.py that I think can be useful and that could be introduced in the scipy signal package.

In the process of preparing the code in order to make it presentable to others, I collected some questions about scipy development guidelines:

1. Docstrings. Is there (or should there be) a docstring guideline for numpy/scipy? In this code I take an approach using items in the docstring like: input, output, example, see also and TODO. Is this too much? In my opinion a guideline could help newcomers (like me) contribute standardized documentation to scipy.

2. Demo methods. I find it interesting to have some of the functions accompanied by demo code for demonstration purposes. Has anyone thought about the structure for such code? Could it be included in a demo directory like the tests directory? And is it acceptable to use mpl in those demos?

3. Argument verification. Should a guideline for numpy argument verification be defined? What I have in mind is a wiki page that describes the steps for standard verification of the arguments, pointing to some helper functions that do some of those tasks (like atleast_Xd).

Best regards,

Hugo Gamboa
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: smooth.py
Type: text/x-python
Size: 2491 bytes
Desc: not available
URL:

From dd55 at cornell.edu Fri Jan 20 09:25:29 2006
From: dd55 at cornell.edu (Darren Dale)
Date: Fri, 20 Jan 2006 09:25:29 -0500
Subject: [SciPy-dev] newcore atlas info on Gentoo
In-Reply-To: <200511200949.06502.dd55@cornell.edu>
References: <200510181341.12351.dd55@cornell.edu> <200511111608.49972.dd55@cornell.edu> <200511200949.06502.dd55@cornell.edu>
Message-ID: <200601200925.29213.dd55@cornell.edu>

On Sunday 20 November 2005 09:49, Darren Dale wrote:
> On Friday 11 November 2005 4:08 pm, Darren Dale wrote:
> > On Friday 11 November 2005 03:16 pm, Stephen Walton wrote:
> > > Darren Dale wrote:
> > > >Well, in my case, editing site.cfg alone does not work, because my
> > > > fortran blas libraries are named "blas" instead of "f77blas".
> > > > system_info.py has "f77blas" hardcoded in several places...

I just reviewed the gentoo ebuild for numpy. This naming issue is solved in the gentoo ebuild, which is appropriate since they're using a non-standard naming convention.

Darren

From ivazquez at ivazquez.net Sun Jan 22 10:37:17 2006
From: ivazquez at ivazquez.net (Ignacio Vazquez-Abrams)
Date: Sun, 22 Jan 2006 10:37:17 -0500
Subject: [SciPy-dev] Running tests outside site-packages (was: Re: numpy-0.9.2, scipy-0.4.4 RPMs)
In-Reply-To:
References: <1137738225.3410.49.camel@ignacio.lan>
Message-ID: <1137944237.12513.27.camel@ignacio.lan>

On Fri, 2006-01-20 at 01:43 -0600, Pearu Peterson wrote:
> On Fri, 20 Jan 2006, Ignacio Vazquez-Abrams wrote:
> > I'm trying to write the %check section of the numpy spec file and I keep
> > running into the following:
> >
> > + /usr/bin/python -c 'import numpy ; numpy.test(1, 1)'
> > numpy/__init__.py:47: UserWarning: Module numpy was already imported
> > from numpy/__init__.pyc,
> > but /home/ignacio/work/rpmbuild/BUILD/numpy-0.9.2 is being added to
> > sys.path
> > import pkg_resources as _pk # activate namespace packages (manipulates
> > __path__)
> > Running from numpy source directory.
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>
> > Traceback (most recent call last):
> > File "", line 1, in ?
> > AttributeError: 'module' object has no attribute 'test'
> >
> > How can I get the tests to run?
> > Make sure that numpy is not imported from source directory, see ^^^^ > above. Check sys.path and may be you need to remove source directory from > sys.path before importing numpy. I'm not. I'm running it from the transient install directory the rpmbuild uses. I don't know why Numpy thinks it's the source directory, but it certainly isn't. > Hmm, maybe something like > > /usr/bin/python -c 'import pkg_resources,numpy; ...' > > will work? No dice. It keeps numpy from complaining, but it doesn't cause it to pull in the definition for test. -- Ignacio Vazquez-Abrams http://fedora.ivazquez.net/ gpg --keyserver hkp://subkeys.pgp.net --recv-key 38028b72 -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part URL: From strawman at astraw.com Sun Jan 22 11:02:05 2006 From: strawman at astraw.com (Andrew Straw) Date: Sun, 22 Jan 2006 08:02:05 -0800 Subject: [SciPy-dev] Smooth filtering method In-Reply-To: <86522b1a0601200251p781f13b3te84c227800b1c695@mail.gmail.com> References: <86522b1a0601200251p781f13b3te84c227800b1c695@mail.gmail.com> Message-ID: <43D3AC7D.6080400@astraw.com> Hi Hugo, As for as the demos, you could certainly make pages in the new wiki's Cookbook section. I think using matplotlib is perfectly fine. A docstring guideline - perhaps you could write one? I'm sure it would get corrected and included, but it's the initial effort that probably most of the work. Argument verification - I'm not sure what you mean. Could you post a pseudo-code example? And again, this is probably a case where you might have to do (most of) the work if you want it included. Cheers! Andrew Hugo Gamboa wrote: > I came across the need of a smoothing function for my processing > tasks. In attach you can find a smooth.py that I think can useful, and > could be introduced in the scipy signal package. > > In the process of preparing the code in order to be presentable for > others I collected some scipy development guidelines questions: > > 1. Docstrings. Is there (or should exist) a docstring guideline for > numpy/scipy? In this code I make an approach using items in the > docstring like: input, output, example, see also and TODO. Is this too > much. I my opinion a guideline could help newcomers (like me) to > contribute with standardized documentation to scipy. > > 2. Demo methods. I find it interesting to have some of the functions > accompanied by demo code for demonstration purposes. Have any one > thought about the structure for such code. Could it be included in a > demo directory like the tests directory? And is it pacific to use mpl > in those demos? > > 3. Argument verification. Should a guideline for numpy argument > verification be defined? What I me think is in a wiki page that > describes the steps for standard verification of the arguments, > pointing to some helper functions that do some of those tasks (like > atleast_Xd). > > Best regards, > > Hugo Gamboa > > >------------------------------------------------------------------------ > >import numpy > >def smooth(x,window_len=10,window='hanning'): > """smooth the data using a window with requested size. > > This method is based on the convolution of a scaled window with the signal. > The signal is prepared by introducing reflected copies of the signal > (with the window size) in both ends so that transient parts are minimized > in the begining and end part of the output signal. 
> > inputs: > x: the input signal > window_len: the dimension of the smoothing window > window: the type of window from 'flat', 'hanning', 'hamming', 'bartlett', 'blackman' > flat window will produce a moving average smoothing. > > output: > the smoothed signal > > example: > > t=linspace(-2,2,0.1) > x=sin(t)+randn(len(t)*0.1 > y=smooth(x) > > see also: > > numpy.hanning, numpy.hamming, numpy.bartlett, numpy.blackman, numpy.convolve > scipy.signal.lfilter > > TODO: the window parameter could be the window itself if a array instead of a string > """ > > if x.ndim != 1: > raise ValueError, "smooth only accepts 1 dimension arrays." > > if x.size < window_len: > raise ValueError, "Input vector needs to be bigger than window size." > > > > if not window in ['flat', 'hanning', 'hamming', 'bartlett', 'blackman']: > raise ValueError, "Window is on of 'flat', 'hanning', 'hamming', 'bartlett', 'blackman'" > > > s=numpy.r_[2*x[0]-x[window_len:1:-1],x,2*x[-1]-x[-1:-window_len:-1]] > > if window == 'flat': #moving average > w=ones(window_len,'d') > else: > w=eval('numpy.'+window+'(window_len)') > > y=numpy.convolve(w/w.sum(),s,mode='same') > return y[window_len-1:-window_len+1] > > > > >from numpy import * >from pylab import * > >def smooth_demo(): > > t=linspace(-4,4,100) > x=sin(t) > xn=x+randn(len(t))*0.1 > y=smooth(x) > > ws=31 > > subplot(211) > plot(ones(ws)) > > windows=['flat', 'hanning', 'hamming', 'bartlett', 'blackman'] > > hold(True) > for w in windows[1:]: > eval('plot('+w+'(ws) )') > > axis([0,30,0,1.1]) > > legend(windows) > title("The smoothing windows") > subplot(212) > plot(x) > plot(xn) > for w in windows: > plot(smooth(xn,10,w)) > l=['original signal', 'signal with noise'] > l.extend(windows) > > legend(l) > title("Smoothing a noisy signal") > show() > > >if __name__=='__main__': > smooth_demo() > > > >------------------------------------------------------------------------ > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From ivazquez at ivazquez.net Sun Jan 22 12:08:09 2006 From: ivazquez at ivazquez.net (Ignacio Vazquez-Abrams) Date: Sun, 22 Jan 2006 12:08:09 -0500 Subject: [SciPy-dev] Running tests outside site-packages (was: Re: numpy-0.9.2, scipy-0.4.4 RPMs) In-Reply-To: <1137944237.12513.27.camel@ignacio.lan> References: <1137738225.3410.49.camel@ignacio.lan> <1137944237.12513.27.camel@ignacio.lan> Message-ID: <1137949689.12513.29.camel@ignacio.lan> On Sun, 2006-01-22 at 10:37 -0500, Ignacio Vazquez-Abrams wrote: > I'm not. I'm running it from the transient install directory the > rpmbuild uses. I don't know why Numpy thinks it's the source directory, > but it certainly isn't. Never mind, I figured out the problem. -- Ignacio Vazquez-Abrams http://fedora.ivazquez.net/ gpg --keyserver hkp://subkeys.pgp.net --recv-key 38028b72 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part URL: From strawman at astraw.com Sun Jan 22 18:58:18 2006 From: strawman at astraw.com (Andrew Straw) Date: Sun, 22 Jan 2006 15:58:18 -0800 Subject: [SciPy-dev] patch to fix numpy PyArray_GetNDArrayCVersion() implementation Message-ID: <43D41C1A.7040907@astraw.com> With current NumPy, the following file will no longer compile as C++: #include "Python.h" #include #define PY_ARRAY_TYPES_PREFIX SomeUnlikelyString #include "numpy/arrayobject.h" Here's an extract from the build log: building 'numpy_bool_redefinition.test' extension gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' compile options: '-I/home/astraw/py24-amd64/lib/python2.4/site-packages/numpy-0.9.5.1980-py2.4-linux-x86_64.egg/numpy/core/include -I/usr/include/python2.4 -I/home/astraw/py24-amd64/include/python2.4 -I/usr/include/python2.4 -c' gcc: src/test.cpp In file included from src/test.cpp:3: /home/astraw/py24-amd64/lib/python2.4/site-packages/numpy-0.9.5.1980-py2.4-linux-x86_64.egg/numpy/core/include/numpy/arrayobject.h:110: error: redeclaration of C++ built-in type `int' In file included from src/test.cpp:3: /home/astraw/py24-amd64/lib/python2.4/site-packages/numpy-0.9.5.1980-py2.4-linux-x86_64.egg/numpy/core/include/numpy/arrayobject.h:110: error: redeclaration of C++ built-in type `int' I don't pretend to completely understand what's going on, but it seems to involve the numpy/C++ naming magic of "uint". If I change the function definition to "unsigned int" in numpy/core/src/multiarraymodule.c, the issue is gone. Let me know if more info/testing is needed... -------------- next part -------------- A non-text attachment was scrubbed... Name: NDArrayCVersion.patch Type: text/x-patch Size: 457 bytes Desc: not available URL: From ashuang at gmail.com Sun Jan 22 23:39:14 2006 From: ashuang at gmail.com (Albert Huang) Date: Sun, 22 Jan 2006 23:39:14 -0500 Subject: [SciPy-dev] should Int8 be UnsignedInt8 in misc/pilutil.py? Message-ID: Hello, pilutil.imread currently converts RGB images to arrays of signed integers. I think it should be unsigned integers instead. If you do an imshow(imread( ... ) ) on any brightly colored image, then all the RGB values > 127 get whacked. Also, it's probably not very intuitive to say that -1 is brighter than 127. The attached patch fixes it for me. I just changed all occurences of 'b' to 'B', but am not too sure what the effect is in toimage(). Regards, Albert -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: pilutil-type-b2B.ptach Type: application/octet-stream Size: 1203 bytes Desc: not available URL: From jonathan.taylor at utoronto.ca Mon Jan 23 15:51:43 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Mon, 23 Jan 2006 15:51:43 -0500 Subject: [SciPy-dev] Status of 1.0 release and website? Message-ID: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> I am wondering what needs to happen before a 1.0 release of both numpy and scipy? Also... what is the current status of the new web site... it looks to me like it has basically all of the old content. Well at least the stuff that is not out of date. J. 
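Coming back to Hugo's third question (and Andrew's request for a pseudo-code example of argument verification): one possible convention is a small helper that every public function calls on its array arguments. A minimal sketch, with an illustrative name, using only numpy.asarray and numpy.atleast_1d:

----------------------------------------------------------
import numpy

def _check_signal(x, name='x'):
    """Standard argument verification: coerce to a non-empty 1-d array."""
    x = numpy.atleast_1d(numpy.asarray(x))
    if x.ndim != 1:
        raise ValueError, "%s must be one-dimensional" % name
    if x.size == 0:
        raise ValueError, "%s must not be empty" % name
    return x
----------------------------------------------------------

A guideline page could then simply say: call the helper first, document the accepted shapes in the docstring, and raise ValueError with the argument name in the message.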
From Fernando.Perez at colorado.edu Mon Jan 23 16:02:05 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 23 Jan 2006 14:02:05 -0700 Subject: [SciPy-dev] Status of 1.0 release and website? In-Reply-To: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> References: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> Message-ID: <43D5444D.2090509@colorado.edu> Jonathan Taylor wrote: > I am wondering what needs to happen before a 1.0 release of both numpy > and scipy? > > Also... what is the current status of the new web site... it looks to > me like it has basically all of the old content. Well at least the > stuff that is not out of date. Minor note: yesterday I noticed that the 'Artificial intelligence & machine learning' link at http://new.scipy.org/Wiki/Topical_Software is busted. That page seems to have vanished. I dug it out of the wayback machine: http://web.archive.org/web/20041014062104/http://yaroslav.hopto.org/pubwiki/index.php/ai-python It would probably be best to copy the (valid) content from this archived page directly into our wiki. Cheers, f From robert.kern at gmail.com Mon Jan 23 16:03:03 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 23 Jan 2006 15:03:03 -0600 Subject: [SciPy-dev] Status of 1.0 release and website? In-Reply-To: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> References: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> Message-ID: <43D54487.8060506@gmail.com> Jonathan Taylor wrote: > Also... what is the current status of the new web site... it looks to > me like it has basically all of the old content. Well at least the > stuff that is not out of date. The decision to pull the trigger is Andrew's. We at Enthought can make the switch whenever he thinks it is ready. I believe we have resolved the "one process per click" issue. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From strawman at astraw.com Mon Jan 23 22:08:59 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 23 Jan 2006 19:08:59 -0800 Subject: [SciPy-dev] patch to fix numpy PyArray_GetNDArrayCVersion() implementation In-Reply-To: <43D41C1A.7040907@astraw.com> References: <43D41C1A.7040907@astraw.com> Message-ID: <43D59A4B.9060304@astraw.com> OK, since no one seems to be biting, let me include the full source and setup.py so that anyone can see the problem by doing a simple "python setup.py build". 
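For anyone who wants to reproduce this without the attachment, a setup.py along the following lines, next to a src/test.cpp containing just the include/define lines quoted below, should be enough. The module name is the one from the build log; the file layout and version are guesses rather than the contents of the actual tarball:

----------------------------------------------------------
# setup.py -- build a single C++ extension against the numpy headers
from numpy.distutils.core import setup, Extension

ext = Extension('numpy_bool_redefinition.test',
                sources=['src/test.cpp'])

setup(name='numpy_bool_redefinition',
      version='0.1',
      ext_modules=[ext])
----------------------------------------------------------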
Andrew Straw wrote: >With current NumPy, the following file will no longer compile as C++: > >#include "Python.h" >#include >#define PY_ARRAY_TYPES_PREFIX SomeUnlikelyString >#include "numpy/arrayobject.h" > >Here's an extract from the build log: > >building 'numpy_bool_redefinition.test' extension >gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall >-Wstrict-prototypes -fPIC' >compile options: >'-I/home/astraw/py24-amd64/lib/python2.4/site-packages/numpy-0.9.5.1980-py2.4-linux-x86_64.egg/numpy/core/include >-I/usr/include/python2.4 -I/home/astraw/py24-amd64/include/python2.4 >-I/usr/include/python2.4 -c' >gcc: src/test.cpp >In file included from src/test.cpp:3: >/home/astraw/py24-amd64/lib/python2.4/site-packages/numpy-0.9.5.1980-py2.4-linux-x86_64.egg/numpy/core/include/numpy/arrayobject.h:110: >error: redeclaration > of C++ built-in type `int' >In file included from src/test.cpp:3: >/home/astraw/py24-amd64/lib/python2.4/site-packages/numpy-0.9.5.1980-py2.4-linux-x86_64.egg/numpy/core/include/numpy/arrayobject.h:110: >error: redeclaration > of C++ built-in type `int' > >I don't pretend to completely understand what's going on, but it seems >to involve the numpy/C++ naming magic of "uint". If I change the >function definition to "unsigned int" in >numpy/core/src/multiarraymodule.c, the issue is gone. > >Let me know if more info/testing is needed... > > >------------------------------------------------------------------------ > >Index: numpy/core/src/multiarraymodule.c >=================================================================== >--- numpy/core/src/multiarraymodule.c (revision 1980) >+++ numpy/core/src/multiarraymodule.c (working copy) >@@ -5112,8 +5112,8 @@ > /*MULTIARRAY_API > GetNDArrayCVersion > */ >-static uint >+static unsigned int > PyArray_GetNDArrayCVersion(void) > { >- return (uint)NDARRAY_VERSION; >+ return (unsigned int)NDARRAY_VERSION; > } > > static char > > >------------------------------------------------------------------------ > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > -------------- next part -------------- A non-text attachment was scrubbed... Name: numpy_bool_redefinition.tar.gz Type: application/x-tar Size: 495 bytes Desc: not available URL: From oliphant.travis at ieee.org Tue Jan 24 00:45:51 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Mon, 23 Jan 2006 22:45:51 -0700 Subject: [SciPy-dev] patch to fix numpy PyArray_GetNDArrayCVersion() implementation In-Reply-To: <43D59A4B.9060304@astraw.com> References: <43D41C1A.7040907@astraw.com> <43D59A4B.9060304@astraw.com> Message-ID: <43D5BF0F.9060306@ieee.org> Andrew Straw wrote: > OK, since no one seems to be biting, let me include the full source > and setup.py so that anyone can see the problem by doing a simple > "python setup.py build". > > Andrew Straw wrote: > >> With current NumPy, the following file will no longer compile as C++: >> >> #include "Python.h" >> #include >> #define PY_ARRAY_TYPES_PREFIX SomeUnlikelyString >> #include "numpy/arrayobject.h" >> >> Here's an extract from the build log: > I think this has to do with the use of the C-API auto generator but I'm not sure. So, now I'm concerned about the use of the Bool variable in a few of the API calls. I applied your patch in SVN. 
-Travis From schofield at ftw.at Tue Jan 24 11:28:51 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 24 Jan 2006 17:28:51 +0100 Subject: [SciPy-dev] Monte Carlo package Message-ID: <43D655C3.8030109@ftw.at> Hi all, The Monte Carlo package now builds and runs fine for me on MinGW. I'd now like to ask for some help with testing on other platforms. (Are there any that don't define the rand() function? ;) If there are no cries of anguish I'll move it to the main tree on Friday. -- Ed From schofield at ftw.at Tue Jan 24 12:35:45 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 24 Jan 2006 18:35:45 +0100 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D65625.4070707@gmail.com> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> Message-ID: <43D66571.9030309@ftw.at> Robert Kern wrote: >Ed Schofield wrote: > > >>Hi all, >> >>The Monte Carlo package now builds and runs fine for me on MinGW. I'd >>now like to ask for some help with testing on other platforms. (Are >>there any that don't define the rand() function? ;) >> >> >I really would prefer that it not use rand() but use numpy.random instead. What >do you need to make this happen? > > Hmmm ... do you mean rk_random() defined in numpy/random/mtrand/randomkit.c? I guess I could link to this instead of the system default rand(), but this is probably overkill for this application. I just need a single 30-bit random number per instance, to use as a seed. I could use the system time directly, but at the moment I use rand() to scramble it up a little first: /* First seed our faster XORshift RNG. */ /* We need to do this in a way that supports non-SVID C platforms. * We use rand() twice, constructing a 30-bit integer out of the * two results. */ srand( (unsigned int) time( NULL )); seedhigh = rand(); seedlow = rand(); /* Construct a 30-bit random integer out of these. */ jxr = ((seedhigh & 0x3fff) << 16) + (seedlow & 0xffff); I fear that depending on the randomkit.h header would just add an unnecessary build dependency. Do you still see a need for it? -- Ed From robert.kern at gmail.com Tue Jan 24 14:49:15 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 24 Jan 2006 13:49:15 -0600 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D66571.9030309@ftw.at> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> Message-ID: <43D684BB.6030009@gmail.com> Ed Schofield wrote: > Robert Kern wrote: >>I really would prefer that it not use rand() but use numpy.random instead. What >>do you need to make this happen? > > Hmmm ... do you mean rk_random() defined in > numpy/random/mtrand/randomkit.c? Yes. > I guess I could link to this instead of > the system default rand(), but this is probably overkill for this > application. I just need a single 30-bit random number per instance, to > use as a seed. I could use the system time directly, but at the moment > I use rand() to scramble it up a little first: > > /* First seed our faster XORshift RNG. */ > /* We need to do this in a way that supports non-SVID C platforms. > * We use rand() twice, constructing a 30-bit integer out of the > * two results. */ > srand( (unsigned int) time( NULL )); > seedhigh = rand(); > seedlow = rand(); > > /* Construct a 30-bit random integer out of these. */ > jxr = ((seedhigh & 0x3fff) << 16) + (seedlow & 0xffff); So there's no way for the user to specify a seed? That's not acceptable for scientific use. 
Preferably, I should be able to pass in a numpy.random.RandomState object, and use the Mersenne Twister generator instead of the XORshift algorithm. If I can do that, then I don't mind also exposing the XORshift algorithm, too, *if* I can set the seed for that algorithm. > I fear that depending on the randomkit.h header would just add an > unnecessary build dependency. Do you still see a need for it? Yes. We can add the appropriate headers to numpy and expose the RandomState extension object to be used in other extension modules. I should do that anyways, so this is as good a time as any. So what do you need from me to make this happen? -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From cookedm at physics.mcmaster.ca Tue Jan 24 15:44:13 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Tue, 24 Jan 2006 15:44:13 -0500 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D66571.9030309@ftw.at> (Ed Schofield's message of "Tue, 24 Jan 2006 18:35:45 +0100") References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> Message-ID: Ed Schofield writes: > Robert Kern wrote: > >>Ed Schofield wrote: >> >> >>>Hi all, >>> >>>The Monte Carlo package now builds and runs fine for me on MinGW. I'd >>>now like to ask for some help with testing on other platforms. (Are >>>there any that don't define the rand() function? ;) >>> >>> >>I really would prefer that it not use rand() but use numpy.random instead. What >>do you need to make this happen? >> >> > Hmmm ... do you mean rk_random() defined in > numpy/random/mtrand/randomkit.c? I guess I could link to this instead of > the system default rand(), but this is probably overkill for this > application. I just need a single 30-bit random number per instance, to > use as a seed. I could use the system time directly, but at the moment > I use rand() to scramble it up a little first: If you just need a random seed, you don't need a C API to get it: you could call Python (random.random(), for instance): unsigned int random_seed(void) { PyObject *randomModule = NULL, *orandom = NULL, *oseed = NULL; unsigned int seed = 0; randomModule = PyImport_ImportModule("random"); if (!randomModule) { goto fail; } orandom = PyObject_GetAttrString(randomModule, "random"); if (!orandom) { goto fail; } oseed = PyObject_CallObject(orandom, NULL); if (!oseed) { goto fail; } seed = (unsigned int)PyInt_AsLong(oseed); fail: Py_XDECREF(oseed); Py_XDECREF(orandom); Py_XDECREF(randomModule); return seed; } -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cookedm at physics.mcmaster.ca Tue Jan 24 15:49:58 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Tue, 24 Jan 2006 15:49:58 -0500 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: (David M. Cooke's message of "Tue, 24 Jan 2006 15:44:13 -0500") References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> Message-ID: cookedm at physics.mcmaster.ca (David M. 
Cooke) writes: > > If you just need a random seed, you don't need a C API to get it: you > could call Python (random.random(), for instance): > > unsigned int > random_seed(void) > { > PyObject *randomModule = NULL, *orandom = NULL, *oseed = NULL; > unsigned int seed = 0; > > randomModule = PyImport_ImportModule("random"); > if (!randomModule) { goto fail; } > > orandom = PyObject_GetAttrString(randomModule, "random"); > if (!orandom) { goto fail; } > > oseed = PyObject_CallObject(orandom, NULL); > if (!oseed) { goto fail; } > > seed = (unsigned int)PyInt_AsLong(oseed); > > fail: > Py_XDECREF(oseed); > Py_XDECREF(orandom); > Py_XDECREF(randomModule); > return seed; > } Or something like that (random.random() returns a Python float, not a long :) -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From robert.kern at gmail.com Tue Jan 24 15:50:02 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 24 Jan 2006 14:50:02 -0600 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D66571.9030309@ftw.at> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> Message-ID: <43D692FA.7090702@gmail.com> Ed Schofield wrote: > /* First seed our faster XORshift RNG. */ Actually, I won't accept an XORshift PRNG either unless if it addresses the issues discussed in this paper: http://www.iro.umontreal.ca/~lecuyer/myftp/papers/xorshift.pdf -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Tue Jan 24 15:03:10 2006 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 24 Jan 2006 14:03:10 -0600 (CST) Subject: [SciPy-dev] numpy.typeinfo missing In-Reply-To: <43D680DE.80407@ieee.org> References: <41717FD0-92EA-4537-A8E0-1E73273C0461@nrl.navy.mil> <43D680DE.80407@ieee.org> Message-ID: Hi Travis, Earlier versions of numpy had typeinfo dictionary that contained information about dtype, elsize, dtypechar, bits of an arrays type for given C contant, say BOOL, BYTE, etc. Now typeinfo dictionary is missing and I wonder what is the current way to resolve the above information from Python? Pearu From icy.flame.gm at gmail.com Tue Jan 24 16:24:44 2006 From: icy.flame.gm at gmail.com (iCy-fLaME) Date: Tue, 24 Jan 2006 21:24:44 +0000 Subject: [SciPy-dev] typo in signal.kaiserord() Message-ID: kaiserord(ripple, width) in filter_design.py, ---------------------------------------------------------- beta = select([A>50, A>21], [0.1102*(A-8.7), 0.5842*(A-21)**(0.4) + 0.07866*(A-21)], 0.0) ---------------------------------------------------------- => 0.7866 should be 0.7886 instead ===================================== Also in my system at lest it complain about pi, ceil and select are not defined. from scipy import select from scipy import ceil from scipy import pi will fix that. 
but then: ---------------------------------------------------------- >>> A = 20 >>> beta = select([A>50, A>21], [0.1102*(A-8.7), 0.5842*(A-21)**(0.4) + 0.07886*(A-21)], 0.0) ValueError: negative number cannot be raised to a fractional power ---------------------------------------------------------- Looks like the else part is not performed correctly I fixed it with: ---------------------------------------------------------- if A> 50: beta = 0.1102 * (A-8.7) elif A>= 21: beta = (0.5842 * (A-21))**0.4 + 0.07886 * (A-21) else: beta = 0 ---------------------------------------------------------- iCy From nwagner at mecha.uni-stuttgart.de Tue Jan 24 16:24:42 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 24 Jan 2006 22:24:42 +0100 Subject: [SciPy-dev] check_blas Message-ID: scipy.test(1,10) results in ====================================================================== ERROR: check_blas (scipy.lib.blas.test_blas.test_blas) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/lib/blas/tests/test_blas.py", line 193, in check_blas assert_array_almost_equal(gemm(1,a,b),[[3]],15) error: (trans_b?kb==k:ldb==k) failed for hidden n >>> scipy.__version__ '0.4.5.1571' ---------------------------------------------------------------------- Ran 1034 tests in 1.806s From oliphant.travis at ieee.org Tue Jan 24 16:41:29 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 24 Jan 2006 14:41:29 -0700 Subject: [SciPy-dev] numpy.typeinfo missing In-Reply-To: References: <41717FD0-92EA-4537-A8E0-1E73273C0461@nrl.navy.mil> <43D680DE.80407@ieee.org> Message-ID: <43D69F09.9020800@ieee.org> Pearu Peterson wrote: >Hi Travis, > >Earlier versions of numpy had typeinfo dictionary that contained >information about dtype, elsize, dtypechar, bits of an arrays type for >given C contant, say BOOL, BYTE, etc. > >Now typeinfo dictionary is missing and I wonder what is the current way to >resolve the above information from Python? > > > It's still there. But, the dtype objects themselves are much more informative. So just do d = numpy.dtype(byte) dir(d) '__class__', '__cmp__', '__delattr__', '__doc__', '__getattribute__', '__hash__', '__init__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__setstate__', '__str__', 'alignment', 'arrdescr', 'byteorder', 'char', 'fields', 'isbuiltin', 'isnative', 'itemsize', 'kind', 'name', 'newbyteorder', 'num', 'str', 'subdescr', 'type'] The original dictionary is in numeric.core.multiarray as typeinfo (should you need it). -Travis From oliphant.travis at ieee.org Tue Jan 24 17:00:51 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Tue, 24 Jan 2006 15:00:51 -0700 Subject: [SciPy-dev] typo in signal.kaiserord() In-Reply-To: References: Message-ID: <43D6A393.3050909@ieee.org> iCy-fLaME wrote: >kaiserord(ripple, width) in filter_design.py, > >---------------------------------------------------------- >beta = select([A>50, A>21], > [0.1102*(A-8.7), 0.5842*(A-21)**(0.4) + 0.07866*(A-21)], > 0.0) >---------------------------------------------------------- > >=> 0.7866 should be 0.7886 instead > > > Good catch... >===================================== > >Also in my system at lest it complain about pi, ceil and select are not defined. > from scipy import select > from scipy import ceil > from scipy import pi >will fix that. 
> > > select was indeed missing in the transition to numpy but that should be all (pi is imported and ceil should be obtained from numpy.core.umath). >but then: >---------------------------------------------------------- > > >>>>A = 20 >>>>beta = select([A>50, A>21], [0.1102*(A-8.7), 0.5842*(A-21)**(0.4) >>>> >>>> >+ 0.07886*(A-21)], 0.0) > >ValueError: negative number cannot be raised to a fractional power >---------------------------------------------------------- > >Looks like the else part is not performed correctly > >I fixed it with: >---------------------------------------------------------- > if A> 50: > beta = 0.1102 * (A-8.7) > elif A>= 21: > beta = (0.5842 * (A-21))**0.4 + 0.07886 * (A-21) > else: > beta = 0 >---------------------------------------------------------- > > > Thanks again. The select usage was intended to support array inputs and array outputs, but it doesn't work as advertised anyway and is documented only for scalar inputs, and so I used your simpler code. -Travis From schofield at ftw.at Wed Jan 25 10:40:33 2006 From: schofield at ftw.at (Ed Schofield) Date: Wed, 25 Jan 2006 16:40:33 +0100 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D684BB.6030009@gmail.com> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> <43D684BB.6030009@gmail.com> Message-ID: <43D79BF1.2030009@ftw.at> Robert Kern wrote: >So there's no way for the user to specify a seed? That's not acceptable for >scientific use. Preferably, I should be able to pass in a >numpy.random.RandomState object ... > If by 'scientific use' you mean cryptography, I'd agree with you. But for most Monte Carlo applications (yes, scientific ones) a user-definable seed is a luxury, not a necessity. > Actually, I won't accept an XORshift PRNG either unless if it > addresses the issues discussed in this paper Robert, your tone is arrogant. By saying that *you* won't accept it, you are implying that you are in charge. Remember that SciPy is a community project. This means we take joint decisions, and are polite to each other. Despite your rudeness, you have a valid point. Stronger RNGs exist, such as the Mersenne Twister and L'Ecuyer's own 7-xorshift generator. We should probably change the default to one of these, reusing existing code from numpy where possible. -- Ed From robert.kern at gmail.com Wed Jan 25 11:10:02 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 25 Jan 2006 10:10:02 -0600 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D79BF1.2030009@ftw.at> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> <43D684BB.6030009@gmail.com> <43D79BF1.2030009@ftw.at> Message-ID: <43D7A2DA.9060003@gmail.com> Ed Schofield wrote: > Robert Kern wrote: > >>So there's no way for the user to specify a seed? That's not acceptable for >>scientific use. Preferably, I should be able to pass in a >>numpy.random.RandomState object ... > > If by 'scientific use' you mean cryptography, I'd agree with you. But > for most Monte Carlo applications (yes, scientific ones) a > user-definable seed is a luxury, not a necessity. No, it really is a necessity. I need to be able to do Monte Carlo runs repeatably. If nothing else, we need to be able to write unit tests that test the same thing every time. >>Actually, I won't accept an XORshift PRNG either unless if it >>addresses the issues discussed in this paper > > Robert, your tone is arrogant. By saying that *you* won't accept it, > you are implying that you are in charge. 
Remember that SciPy is a > community project. This means we take joint decisions, and are polite > to each other. No, what I am trying to imply is that I can't accept such code for my own use just like I can't accept POSIX rand() or RANLIB ranf(). However, as I am also a Scipy developer with experience in this field, I will also vote against including bad XORshift PRNGs. I don't pretend that my acceptance is determinative, though. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From robert.kern at gmail.com Wed Jan 25 11:41:23 2006 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 25 Jan 2006 10:41:23 -0600 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D79BF1.2030009@ftw.at> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> <43D684BB.6030009@gmail.com> <43D79BF1.2030009@ftw.at> Message-ID: <43D7AA33.9030906@gmail.com> Ed Schofield wrote: > If by 'scientific use' you mean cryptography, I'd agree with you. But > for most Monte Carlo applications (yes, scientific ones) a > user-definable seed is a luxury, not a necessity. Let me give an example. At this year's Scipy conference, Fernando P?rez gave a demonstration of the parallel features of the development branch of ipython. He sent code to a small cluster of machines to generate random numbers, do some computation and return the results to his computer at the conference. He had to send different code fragments to each machine with different time.sleep(x) calls such that the old RANLIB PRNG would get different seeds. Otherwise, all 10 machines would have generated the same data and the same results. Now, giving each stream a different seed is not the ideal way to implement parallel streams of pseudorandom numbers, but it's not terribly bad, either. In the absence of a PRNG designed for parallel use, it's a reasonable compromise. Another would be to leapfrog the streams (for N streams, stream i will generate N numbers and take the i'th number and discard the rest each time it needs a sample), but that also requires each stream to have been initialized with the same seed. This is why I think controlling the seed is a fundamental capability of pseudorandom samplers, and that we shouldn't be including samplers without that capability. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From jonathan.taylor at utoronto.ca Wed Jan 25 12:19:44 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Wed, 25 Jan 2006 12:19:44 -0500 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D7A2DA.9060003@gmail.com> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> <43D684BB.6030009@gmail.com> <43D79BF1.2030009@ftw.at> <43D7A2DA.9060003@gmail.com> Message-ID: <463e11f90601250919l102a630aiaa7296de9663998e@mail.gmail.com> I agree... it is hard enough to write tests for scientific routines already. Being able to make your routines deterministic is a very desirable feature. On the other hand, I am sure it must be possible, albeit with a little work, to make the routine accept a seed... no? J. On 1/25/06, Robert Kern wrote: > Ed Schofield wrote: > > Robert Kern wrote: > > > >>So there's no way for the user to specify a seed? That's not acceptable for > >>scientific use. Preferably, I should be able to pass in a > >>numpy.random.RandomState object ... 
> > > > If by 'scientific use' you mean cryptography, I'd agree with you. But > > for most Monte Carlo applications (yes, scientific ones) a > > user-definable seed is a luxury, not a necessity. > > No, it really is a necessity. I need to be able to do Monte Carlo runs > repeatably. If nothing else, we need to be able to write unit tests that test > the same thing every time. > > >>Actually, I won't accept an XORshift PRNG either unless if it > >>addresses the issues discussed in this paper > > > > Robert, your tone is arrogant. By saying that *you* won't accept it, > > you are implying that you are in charge. Remember that SciPy is a > > community project. This means we take joint decisions, and are polite > > to each other. > > No, what I am trying to imply is that I can't accept such code for my own use > just like I can't accept POSIX rand() or RANLIB ranf(). However, as I am also a > Scipy developer with experience in this field, I will also vote against > including bad XORshift PRNGs. I don't pretend that my acceptance is > determinative, though. > > -- > Robert Kern > robert.kern at gmail.com > > "In the fields of hell where the grass grows high > Are the graves of dreams allowed to die." > -- Richard Harter > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From schofield at ftw.at Wed Jan 25 12:32:50 2006 From: schofield at ftw.at (Ed Schofield) Date: Wed, 25 Jan 2006 18:32:50 +0100 Subject: [SciPy-dev] Monte Carlo package In-Reply-To: <43D7AA33.9030906@gmail.com> References: <43D655C3.8030109@ftw.at> <43D65625.4070707@gmail.com> <43D66571.9030309@ftw.at> <43D684BB.6030009@gmail.com> <43D79BF1.2030009@ftw.at> <43D7AA33.9030906@gmail.com> Message-ID: <43D7B642.1050105@ftw.at> Robert Kern wrote: >This is why I think controlling the seed is a fundamental capability of >pseudorandom samplers, and that we shouldn't be including samplers without that >capability. > > Okay, I'm willing to add this capability. I'll work on it over the weekend. In order to use RandomState objects, I guess we just need to move the fn prototypes from mtrand.c to a separate header file, then make sure distutils can find it... -- Ed From nwagner at mecha.uni-stuttgart.de Thu Jan 26 03:01:24 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 26 Jan 2006 09:01:24 +0100 Subject: [SciPy-dev] Sparse eigensolver ?? Message-ID: <43D881D4.10906@mecha.uni-stuttgart.de> Hi all, All tests pass with the current svn versions :-) Numpy version 0.9.5.1999 Scipy version 0.4.5.1573 Many thanks for this wonderful package ! I want to ask a favour of you. Is it planned to add a sparse eigensolver to scipy in the near future ? Anyway, I look forward to this feature whenever it will be available in scipy. Nils From jonathan.taylor at utoronto.ca Thu Jan 26 17:09:07 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Thu, 26 Jan 2006 17:09:07 -0500 Subject: [SciPy-dev] Any advice on testing scientific algorithms? Message-ID: <463e11f90601261409n2edfcf60sc0d2431e001e7d0f@mail.gmail.com> I am wondering if any of you have thoughts on how to test scientific algorithms. For example, an algorithm that tries to learn some sort of probalistic model. The only thing that really comes to mind is regression testing. Any suggestions here? Would I just print the output to output.txt once, and then as a test, rerun and make sure I get the same output.txt? 
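A minimal sketch of the kind of test a fixed seed allows, assuming numpy's RandomState generator and the array-comparison helpers that ship in numpy.testing; the sizes and tolerances are illustrative:

----------------------------------------------------------
import numpy
from numpy.testing import assert_array_almost_equal

def test_reproducible():
    # Same seed, same stream: the run is deterministic, so a stored
    # reference result can be compared with a tolerance instead of
    # diffing an output.txt bit for bit.
    a = numpy.random.RandomState(42).standard_normal(1000)
    b = numpy.random.RandomState(42).standard_normal(1000)
    assert_array_almost_equal(a, b, decimal=12)

def test_sane_statistics():
    # Loose range checks on known properties of the model catch gross
    # algorithmic errors without depending on exact floating point
    # values, which can differ slightly between architectures.
    x = numpy.random.RandomState(0).standard_normal(10000)
    assert abs(x.mean()) < 0.05
    assert abs(x.std() - 1.0) < 0.05
----------------------------------------------------------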
I suppose I would have to use a fixed seed, to make sure that I get the same results. Will floating point numbers show up differently on different architectures? Are there any other options I am not thinking of? Thanks, Jon. From Fernando.Perez at colorado.edu Thu Jan 26 17:13:08 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 26 Jan 2006 15:13:08 -0700 Subject: [SciPy-dev] Any advice on testing scientific algorithms? In-Reply-To: <463e11f90601261409n2edfcf60sc0d2431e001e7d0f@mail.gmail.com> References: <463e11f90601261409n2edfcf60sc0d2431e001e7d0f@mail.gmail.com> Message-ID: <43D94974.7050609@colorado.edu> Jonathan Taylor wrote: > I am wondering if any of you have thoughts on how to test scientific > algorithms. For example, an algorithm that tries to learn some sort > of probalistic model. > > The only thing that really comes to mind is regression testing. Any > suggestions here? > Would I just print the output to output.txt once, and then as a test, > rerun and make sure I get the same output.txt? > I suppose I would have to use a fixed seed, to make sure that I get > the same results. Will floating point numbers show up differently on > different architectures? Sorry that I can't give a longer answer right now, but the scipy test suite is a good place to look at for pointers. It also contains a bunch of utilities for testing, including comparisons for arrays with controlled floating point accuracy. I use those a lot, they're quite handy. Cheers, f From seefeld at sympatico.ca Thu Jan 26 17:17:33 2006 From: seefeld at sympatico.ca (Stefan Seefeld) Date: Thu, 26 Jan 2006 17:17:33 -0500 Subject: [SciPy-dev] Any advice on testing scientific algorithms? In-Reply-To: <463e11f90601261409n2edfcf60sc0d2431e001e7d0f@mail.gmail.com> References: <463e11f90601261409n2edfcf60sc0d2431e001e7d0f@mail.gmail.com> Message-ID: <43D94A7D.4060206@sympatico.ca> Jonathan Taylor wrote: > I am wondering if any of you have thoughts on how to test scientific > algorithms. For example, an algorithm that tries to learn some sort > of probalistic model. > > The only thing that really comes to mind is regression testing. Any > suggestions here? > Would I just print the output to output.txt once, and then as a test, > rerun and make sure I get the same output.txt? > I suppose I would have to use a fixed seed, to make sure that I get > the same results. Will floating point numbers show up differently on > different architectures? Probably, yes. In fact, if you are trying to validate statistical / stochastic algorithms involving you probably don't want to compare numbers bitwise, but rather make sure the numbers fall into a valid range. But using a fixed seed is valuable for reproducibility nonetheless. :-) Regards, Stefan From nwagner at mecha.uni-stuttgart.de Fri Jan 27 11:35:10 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 27 Jan 2006 17:35:10 +0100 Subject: [SciPy-dev] Incorporating SUNDIALS in SciPy Message-ID: <43DA4BBE.7070203@mecha.uni-stuttgart.de> Hi all, Some time ago there was a discussion on "Incorporating SUNDIALS in SciPy" http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-December/003894.html I am wonderning if this topic is still on the wishlist of the developers/users. 
Best regards, Nils From robert.kern at gmail.com Fri Jan 27 11:38:21 2006 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 27 Jan 2006 10:38:21 -0600 Subject: [SciPy-dev] Incorporating SUNDIALS in SciPy In-Reply-To: <43DA4BBE.7070203@mecha.uni-stuttgart.de> References: <43DA4BBE.7070203@mecha.uni-stuttgart.de> Message-ID: <43DA4C7D.9000106@gmail.com> Nils Wagner wrote: > Hi all, > > Some time ago there was a discussion on > > "Incorporating SUNDIALS in SciPy" > > http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-December/003894.html > > I am wonderning if this topic is still on the wishlist of the > developers/users. Certainly. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From erickt at dslextreme.com Sun Jan 29 15:13:49 2006 From: erickt at dslextreme.com (Erick Tryzelaar) Date: Sun, 29 Jan 2006 12:13:49 -0800 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> Message-ID: <43DD21FD.7040005@dslextreme.com> Hello all, Any chance that 0.4.4 is going to end up on sourceforge.net? I maintain the scipy release for darwinports, and I'd hate to have to hit your wiki in order to download the release. -e Ed Schofield wrote: > On 09/01/2006, at 11:32 PM, Travis Oliphant wrote: > > >> Ed Schofield wrote: >> >> >>> Hi Travis, >>> >>> I've now built the tarball and Windows binaries for the 0.4.4 SciPy >>> release. They seem to pass all the unit tests under Ubuntu 5.10 and >>> Windows 2000. I could release them on Sourceforge and/or >>> new.scipy.org >>> tonight, but I still need access. >>> >>> >>> >> Great. What do you need access to? I'm not sure how to grant it. >> Aren't you already a user on the new.scipy.org site (if you have SVN >> access you are)? Can you edit the wiki? >> > > Well, to upload to the SourceForge project site I need admin access, > which one of scipy's current SF administrators has to grant. My SF > login name is 'edschofield'. I can edit the wiki fine, but I have no > idea how to upload file attachments with MoinMoin. In theory it > should be possible, right? ;) But I may as well release scipy onto > sourceforge like you did for numpy ... > > >> Thanks for doing the release. Very, very helpful... >> > > You're welcome. It was a bit frustrating, mainly with the Windows > compilations, because it's so slooow, especially over a remote > filesystem ... ;) > > > -- Ed > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > > > From schofield at ftw.at Mon Jan 30 06:05:04 2006 From: schofield at ftw.at (Ed Schofield) Date: Mon, 30 Jan 2006 12:05:04 +0100 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43DD21FD.7040005@dslextreme.com> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> <43DD21FD.7040005@dslextreme.com> Message-ID: <43DDF2E0.2010200@ftw.at> Erick Tryzelaar wrote: >Hello all, > >Any chance that 0.4.4 is going to end up on sourceforge.net? I maintain >the scipy release for darwinports, and I'd hate to have to hit your wiki >in order to download the release. > > Yes, good point. 
Eric, Travis (Vaught): if you'd like me to release the files to SF.net, I need the appropriate permissions. My SF.net user name is 'edschofield'. The project also needs some simple admin work done on it -- for instance, the description states that SciPy "supplements the popular Numeric module". I'm willing to do this too. -- Ed From nwagner at mecha.uni-stuttgart.de Mon Jan 30 08:29:26 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 30 Jan 2006 14:29:26 +0100 Subject: [SciPy-dev] Scipy and event handling in differential equations Message-ID: <43DE14B6.2050807@mecha.uni-stuttgart.de> Hi all, Has scipy.integrate.odeint a framework for event detection and handling ? I would be happy about a small python script illustrating this task and its solution. Thanks in advance. Nils Reference: http://www.mathworks.com/moler/odes.pdf pp.23-- From robert.kern at gmail.com Mon Jan 30 10:18:27 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 30 Jan 2006 09:18:27 -0600 Subject: [SciPy-dev] Scipy and event handling in differential equations In-Reply-To: <43DE14B6.2050807@mecha.uni-stuttgart.de> References: <43DE14B6.2050807@mecha.uni-stuttgart.de> Message-ID: <43DE2E43.6070709@gmail.com> Nils Wagner wrote: > Hi all, > > Has scipy.integrate.odeint a framework for event detection and handling ? What kind of events? How do you want to handle them? -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Mon Jan 30 10:36:23 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 30 Jan 2006 16:36:23 +0100 Subject: [SciPy-dev] Scipy and event handling in differential equations In-Reply-To: <43DE2E43.6070709@gmail.com> References: <43DE14B6.2050807@mecha.uni-stuttgart.de> <43DE2E43.6070709@gmail.com> Message-ID: <43DE3277.8070405@mecha.uni-stuttgart.de> Robert Kern wrote: >Nils Wagner wrote: > >>Hi all, >> >>Has scipy.integrate.odeint a framework for event detection and handling ? >> > >What kind of events? How do you want to handle them? > > I have to handle a change of equation set when a certain variable e is negative or zero. Afaik this can be done inside a normal integrator. Many of them allow you to integrate until a function changes sign. When e>0 use \dot{y}_1 = f(y_1,t) Switch to \dot{y}_2 = g(y_2,t) if e <=0 Note that e is a state variable. Applications are widespread, e.g. unilateral contact problems (bouncing ball, ...) Nils From strawman at astraw.com Mon Jan 30 12:43:48 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 30 Jan 2006 09:43:48 -0800 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43DDF2E0.2010200@ftw.at> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> <43DD21FD.7040005@dslextreme.com> <43DDF2E0.2010200@ftw.at> Message-ID: <43DE5054.4000502@astraw.com> Ed Schofield wrote: >Erick Tryzelaar wrote: > > > >>Hello all, >> >>Any chance that 0.4.4 is going to end up on sourceforge.net? I maintain >>the scipy release for darwinports, and I'd hate to have to hit your wiki >>in order to download the release. >> >> >> >> >Yes, good point. Eric, Travis (Vaught): if you'd like me to release the >files to SF.net, I need the appropriate permissions. My SF.net user >name is 'edschofield'. 
The project also needs some simple admin work >done on it -- for instance, the description states that SciPy >"supplements the popular Numeric module". I'm willing to do this too. > > Alternatively, we could host the files on scipy.org but outside of the wiki. (Maybe the process of making the new wiki "live" would be a good time to additionally set up a drop zone for developers to put releases?) I agree it would be good to update the text at SF. Cheers! Andrew From robert.kern at gmail.com Mon Jan 30 13:12:21 2006 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 30 Jan 2006 12:12:21 -0600 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43DE5054.4000502@astraw.com> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> <43DD21FD.7040005@dslextreme.com> <43DDF2E0.2010200@ftw.at> <43DE5054.4000502@astraw.com> Message-ID: <43DE5705.2070005@gmail.com> Andrew Straw wrote: > Alternatively, we could host the files on scipy.org but outside of the > wiki. (Maybe the process of making the new wiki "live" would be a good > time to additionally set up a drop zone for developers to put releases?) > I agree it would be good to update the text at SF. We should also put tarballs and eggs, if built, on PyPI since it actually hosts packages, now. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From jh at oobleck.astro.cornell.edu Mon Jan 30 14:48:29 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Mon, 30 Jan 2006 14:48:29 -0500 Subject: [SciPy-dev] ASP, wiki, renaming Topical Software Message-ID: <200601301948.k0UJmTtH019380@oobleck.astro.cornell.edu> Kudos to Andrew Straw and all web contributors! I'm looking forward to having a "no apologies" web site at scipy.org again. The new site is definitely ready. I've moved Web from the first to the last topic on the ASP page, leaving Documentation as the top focus going forward. I've linked the ASP page under Developer Zone. The page has two goals: to give thoughts on what needs to be done (the roadmap link), and to serve as a place for defining, organizing, and doing projects (the rest of the ASP page). The latter is a sort of topical "Who's Who", with topics being added, people listing themselves under the projects they're working on, and users/new helpers finding project leads. If nothing else, this should help reduce mailing list traffic of the "who is taking care of xxxx" sort. However, this page has only a fraction of active workers and projects listed. Is this because the old Plone site was annoying, or because people don't want to bother? Would it be better to have a page named "Who's Who" under Developer Zone, and link the ASP roadmap separately? The page isn't useful if people don't list themselves, so please do. There are several links that are now broken, or soon will be, from pages that don't appear to be migrated (Numeric Compatibility, Doc Tools, Tutorial). Is the first of these now irrelevant, or more properly a NumPy vs. Numeric rather than SciPy vs. Numeric issue? I hope the authors of these pages will migrate them. Some wiki questions: Why are h3 and h4 headings no different from h2? In several places, I had to say "http://new.scipy.org/Wiki/..." in order to get a link address to be hidden behind its text. 
Probably others had to do this as well, unless they figured out something I didn't. So, when the site goes live, these will all have to change. Can that be automated? Finally, I'd like to consider renaming "Topical Software". This page has a link to every Python package that is vaguely of use to scientists and engineers. It's also a top-level heading on the main page, and the heading isn't speaking to me (at least). Some ideas: Add-on Software Add-on Software Index Additional Software Additional Software Index Application Software but it isn't all really application software Application Software Index but it isn't all really application software Software Index not clear it's separate from SciPy Topical Software if it ain't broke... If there's interest in renaming, I'll put a renaming vote grid at the top of the Topical Software page. Let me know by private email if you agree it should be renamed. If I get 10 positive emails I'll hang a vote grid off the Topical page and send another email to the list. Again, many thanks to those who have contributed to new.scipy.org! --jh-- From joe at enthought.com Mon Jan 30 16:39:02 2006 From: joe at enthought.com (Joe Cooper) Date: Mon, 30 Jan 2006 15:39:02 -0600 Subject: [SciPy-dev] SciPy migration underway Message-ID: <43DE8776.5050808@enthought.com> Hi all, As many of you know, there's been a lot of work going on on various aspects of the SciPy website--Andrew Straw and Travis Oliphant and many others have been working hard on the new Moin-based website, and it's looking really great. It has already proven to be a success in all of the ways that the Plone site has always failed (i.e. making it easy for people to get involved and contribute), and everyone concerned would now like it to be the official First Contact for anyone coming to SciPy. To that end tonight I will be making the final changes that moves the website, mailing lists and old archives and user lists, and everything else to the new server and the front page of scipy.org will be the new Moin wiki (http://new.scipy.org/Wiki for those who haven't seen it, but really want to see it even before the move takes place this afternoon). I'm putting forth an effort not to break anything in the process, but that's hoping for a lot when so many services and complicated redirects, proxy rules, and assorted stuff is involved. So, if you find any SciPy.org service is broken between now and tomorrow morning, don't be too surprised. If anything is still broken in the morning, please let me know about it, as it probably means I didn't notice. Anyway, just a heads up about the impending intermittent downtime of some or all of our services on SciPy.org during the next few hours. From oliphant.travis at ieee.org Mon Jan 30 16:52:27 2006 From: oliphant.travis at ieee.org (Travis Oliphant) Date: Mon, 30 Jan 2006 14:52:27 -0700 Subject: [SciPy-dev] Status of 1.0 release and website? In-Reply-To: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> References: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> Message-ID: <43DE8A9B.2000101@ieee.org> Jonathan Taylor wrote: > I am wondering what needs to happen before a 1.0 release of both numpy > and scipy? For a 1.0 release of numpy: In terms of features. The only thing I see as necessary is that array scalars have their own math defined so that they don't go through the full ufunc machinery. This is probably about 20-40 hours of work. 
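Coming back to Nils's question about event handling in odeint: in the absence of built-in event support, the usual workaround is to advance the integrator in short steps, watch the sign of the event function, and bisect on the step where it changes before switching to the second set of equations. A rough sketch under those assumptions (step size, tolerance and the free-fall right-hand side are only illustrative):

----------------------------------------------------------
import numpy
from scipy.integrate import odeint

def rhs_fall(y, t):
    # free fall: y = [height, velocity]
    return [y[1], -9.81]

def event(y):
    # the event fires when the height crosses zero
    return y[0]

def integrate_until_event(y0, t0, t1, dt=1e-3, tol=1e-8):
    """Step odeint in small increments; bisect on a sign change of event()."""
    t = t0
    y = numpy.asarray(y0, dtype=float)
    while t < t1:
        ynew = odeint(rhs_fall, y, [t, t + dt])[-1]
        if event(ynew) <= 0.0:
            # the crossing lies between t and t + dt: locate it by bisection
            lo, hi = t, t + dt
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if event(odeint(rhs_fall, y, [t, mid])[-1]) <= 0.0:
                    hi = mid
                else:
                    lo = mid
            return lo, odeint(rhs_fall, y, [t, lo])[-1]
        t, y = t + dt, ynew
    return t, y

# Dropping from 1 m at rest, the crossing time should be about sqrt(2/9.81):
print integrate_until_event([1.0, 0.0], 0.0, 2.0)
# ...at which point the caller would switch to the second right-hand side
# (or apply the impact map, for the bouncing ball).
----------------------------------------------------------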
I would also like to see a numcompat module that allows numarray users to compile their C-extensions with a simple include file change. Also, a convert_numarraycode.py module would be useful for converting Python code. Mostly, though, I would like to see more people using NumPy in different circumstances so that we get the bugs worked out and needed C-API changes before going to 1.0. This is the process that may take several months as I see it. -Travis From strawman at astraw.com Mon Jan 30 17:31:57 2006 From: strawman at astraw.com (Andrew Straw) Date: Mon, 30 Jan 2006 14:31:57 -0800 Subject: [SciPy-dev] ASP, wiki, renaming Topical Software In-Reply-To: <200601301948.k0UJmTtH019380@oobleck.astro.cornell.edu> References: <200601301948.k0UJmTtH019380@oobleck.astro.cornell.edu> Message-ID: <43DE93DD.9060401@astraw.com> Hi Joe, A quick response to one question. I hope to respond in greater detail to the rest of your email later. Joe Harrington wrote: >In several places, I had to say "http://new.scipy.org/Wiki/..." in >order to get a link address to be hidden behind its text. Probably >others had to do this as well, unless they figured out something I >didn't. > There is a way. See, for example, the Cookbook page. That page has the following link (in raw wiki), which I think is an example of what you're trying to do: [:Cookbook/Pyrex_and_NumPy: Pyrex and NumPy] That should let you link anything within the wiki without reverting to absolute URLs. Let's keep our links that way so we don't have headaches when the URL changes. >So, when the site goes live, these will all have to change. >Can that be automated? > > Not easily, so let's do it like above. Cheers! Andrew From jh at oobleck.astro.cornell.edu Mon Jan 30 18:05:00 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Mon, 30 Jan 2006 18:05:00 -0500 Subject: [SciPy-dev] ASP, wiki, renaming Topical Software In-Reply-To: <43DE93DD.9060401@astraw.com> (message from Andrew Straw on Mon, 30 Jan 2006 14:31:57 -0800) References: <200601301948.k0UJmTtH019380@oobleck.astro.cornell.edu> <43DE93DD.9060401@astraw.com> Message-ID: <200601302305.k0UN502r006790@oobleck.astro.cornell.edu> > [:Cookbook/Pyrex_and_NumPy: Pyrex and NumPy] Fixed, thanks. I hope it didn't mess with JoeC's work, or miss the boat. It would be good to get that syntax into the quick-ref at the bottom of the editing box. --jh-- From jonathan.taylor at utoronto.ca Mon Jan 30 20:11:48 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Mon, 30 Jan 2006 20:11:48 -0500 Subject: [SciPy-dev] Status of 1.0 release and website? In-Reply-To: <43DE8A9B.2000101@ieee.org> References: <463e11f90601231251n14d49c8dq4b919fb45fcf4150@mail.gmail.com> <43DE8A9B.2000101@ieee.org> Message-ID: <463e11f90601301711i40d94e76o258f4be0b5a68e7a@mail.gmail.com> Interesting... I get the feeling of feature completeness. Maybe we should think about having the next release being 0.9 or something indicating that we are in the final stages to make a 1.0 release. Just a thought. J. On 1/30/06, Travis Oliphant wrote: > Jonathan Taylor wrote: > > I am wondering what needs to happen before a 1.0 release of both numpy > > and scipy? > > For a 1.0 release of numpy: > > In terms of features. The only thing I see as necessary is that array > scalars have their own math defined so that they don't go through the > full ufunc machinery. This is probably about 20-40 hours of work. 
> > I would also like to see a numcompat module that allows numarray users > to compile their C-extensions with a simple include file change. Also, > a convert_numarraycode.py module would be useful for converting Python > code. > > Mostly, though, I would like to see more people using NumPy in different > circumstances so that we get the bugs worked out and needed C-API > changes before going to 1.0. This is the process that may take several > months as I see it. > > -Travis > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From nwagner at mecha.uni-stuttgart.de Tue Jan 31 04:04:50 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 31 Jan 2006 10:04:50 +0100 Subject: [SciPy-dev] Scipy and event handling in differential equations In-Reply-To: <43DE2E43.6070709@gmail.com> References: <43DE14B6.2050807@mecha.uni-stuttgart.de> <43DE2E43.6070709@gmail.com> Message-ID: <43DF2832.90608@mecha.uni-stuttgart.de> Robert Kern wrote: >Nils Wagner wrote: > >>Hi all, >> >>Has scipy.integrate.odeint a framework for event detection and handling ? >> > >What kind of events? How do you want to handle them? > > Robert, Just now I found some slides by Fangohr www.python-in-business.org/ep2005/download.cpy?document=8855 scipy.integrate.odeint can?t deal with a stopping criterion (i.e. integrate up to the point where some function f()==0) Is it planned to add this feature to odeint ? Nils From arnd.baecker at web.de Tue Jan 31 05:36:25 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 31 Jan 2006 11:36:25 +0100 (CET) Subject: [SciPy-dev] ASP, wiki, renaming Topical Software In-Reply-To: <200601301948.k0UJmTtH019380@oobleck.astro.cornell.edu> References: <200601301948.k0UJmTtH019380@oobleck.astro.cornell.edu> Message-ID: Hi, On Mon, 30 Jan 2006, Joe Harrington wrote: > Some wiki questions: Why are h3 and h4 headings no different from h2? For personal use I have modified `common.css` (which is in share/moin/htdocs/sinorca4moin/css), see the diff below (full file attached). Would that be a solution? Best, Arnd ===================================== --- common.css_orig 2006-01-20 10:52:22.000000000 +0100 +++ common.css 2006-01-31 11:25:41.000000000 +0100 @@ -38,6 +38,12 @@ h4 {font-size: 1.15em;} h5, h6 {font-size: 1em;} +/* mod. by AB: + make differences between different sections more visible. */ +h4,h5,h6 {border-left: 0px solid rgb(0,0,0)} +h3,h4 {border-bottom: 1px solid rgb(100,135,220)} +h5,h6 {border-bottom: 0px solid rgb(100,135,220)} + li p { margin: .25em 0; } ======================================== -------------- next part -------------- A non-text attachment was scrubbed... Name: common.css Type: text/css Size: 4507 bytes Desc: URL: From joe at enthought.com Tue Jan 31 06:18:27 2006 From: joe at enthought.com (Joe Cooper) Date: Tue, 31 Jan 2006 05:18:27 -0600 Subject: [SciPy-dev] The new SciPy.org Message-ID: <43DF4783.50007@enthought.com> Hi all, As threatened, the new Moin wiki has taken over the SciPy.org site. The migration has proven even more complicated than anticipated (and I anticipated a lot of complexity), so some services remain on the old server as of this morning. This will be wrapped up later today, after I get some sleep. Some of the things that have gone well: - The wiki is great. Andrew and Oliphant and lots of others did a fantastic job on it. - Mailing list archive links have an accurate Moved Temporarily redirect (i.e. 
google links to scipy archives will continue to work). I need to hash out with Travis how he'd like to handle the mailing list archives going forward, as he wrote the scripts that integrated them into Plone, and obviously those aren't going to work in Moin. I wrote something in Perl in the meantime, but I imagine that'll go away as soon as Travis becomes aware of it. - It will be possible to redirect most "sections" of the old plone site to a similar section on the Moin site...I just need to figure out which sections from Plone apply to what in the Wiki. This will be a time-consuming thing, but is probably worth the trouble in order to keep some google links working reasonably well some of the time. Some things that remain outstanding: - Mail and list migration. I don't expect trouble with this, it just takes a long time, and requires attention to a lot of boring details. - Takeover of the old IP, so that DNS service for the "scipy.org" domain is taken over by the new server. This one is moderately tricky, since we have a half-dozen delegated domains for the other projects hosted on the server (ipython, neuroimaging, etc.). Those have to be merged back into the scipy zone. I'm sure there are problems somewhere...But it's running relatively well, with at least working content everywhere I thought to poke. Holler if you see anything glaringly stupid caused by the change. Thanks! From nwagner at mecha.uni-stuttgart.de Tue Jan 31 06:58:38 2006 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 31 Jan 2006 12:58:38 +0100 Subject: [SciPy-dev] MemoryError on 64bit machine Message-ID: <43DF50EE.6060102@mecha.uni-stuttgart.de> scipy.test(1,10) yields ====================================================================== ERROR: check_sanity (scipy.montecarlo.intsampler.test_intsampler.test_intsampler) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/montecarlo/tests/test_intsampler.py", line 60, in check_sanity x = sampler.sample(numsamples) MemoryError ====================================================================== ERROR: check_simple (scipy.montecarlo.intsampler.test_intsampler.test_intsampler) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib64/python2.4/site-packages/scipy/montecarlo/tests/test_intsampler.py", line 41, in check_simple x = sampler.sample(numsamples) MemoryError ---------------------------------------------------------------------- Ran 1068 tests in 1.877s FAILED (errors=2) From jh at oobleck.astro.cornell.edu Tue Jan 31 09:38:53 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Tue, 31 Jan 2006 09:38:53 -0500 Subject: [SciPy-dev] The new SciPy.org In-Reply-To: (scipy-dev-request@scipy.net) References: Message-ID: <200601311438.k0VEcrhp008472@oobleck.astro.cornell.edu> Joe, What is the URL for the original site now? It should be linked somewhere on the new site so we can go back and get stuff. The bottom of the ASP page (in Developer Zone) would be a good and obscure place, under Web Site. 
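On the earlier question about a stopping criterion for scipy.integrate.odeint: odeint itself has no event handling, but "integrate until g(y) == 0" can be approximated by stepping the integrator in small increments and bisecting once the event function changes sign. The right-hand side, event function, and step size below are illustrative assumptions, not existing scipy API:

# Sketch only: emulate an event-based stop with plain odeint by taking
# small outer steps and root-finding (brentq) once g changes sign.
from scipy.integrate import odeint
from scipy.optimize import brentq

def rhs(y, t):                     # free fall: y = [height, velocity]
    return [y[1], -9.81]

def g(y):                          # event function: stop when height hits 0
    return y[0]

def integrate_until_event(y0, t0=0.0, dt=0.05, tmax=10.0):
    t, y = t0, list(y0)
    while t < tmax:
        y_next = odeint(rhs, y, [t, t + dt])[-1]
        if g(y) * g(y_next) < 0:   # event bracketed inside [t, t + dt]
            def f(s):
                return g(y) if s == t else g(odeint(rhs, y, [t, s])[-1])
            t_ev = brentq(f, t, t + dt)
            return t_ev, odeint(rhs, y, [t, t_ev])[-1]
        t, y = t + dt, y_next
    return t, y

print(integrate_until_event([10.0, 0.0]))

A built-in event interface would still be preferable, since a fixed outer step can miss an event that enters and leaves the zero crossing within a single step.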
--jh-- From schofield at ftw.at Tue Jan 31 10:01:21 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 31 Jan 2006 16:01:21 +0100 Subject: [SciPy-dev] MemoryError on 64bit machine In-Reply-To: <43DF50EE.6060102@mecha.uni-stuttgart.de> References: <43DF50EE.6060102@mecha.uni-stuttgart.de> Message-ID: <43DF7BC1.10209@ftw.at> Nils Wagner wrote: > scipy.test(1,10) yields > ====================================================================== > ERROR: check_sanity > (scipy.montecarlo.intsampler.test_intsampler.test_intsampler) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/lib64/python2.4/site-packages/scipy/montecarlo/tests/test_intsampler.py", > line 60, in check_sanity > x = sampler.sample(numsamples) > MemoryError That's good to know -- thanks. But ... the 'montecarlo' package is back in the sandbox, waiting for some tender loving care. You must have an old SVN version... -- Ed From schofield at ftw.at Tue Jan 31 10:13:20 2006 From: schofield at ftw.at (Ed Schofield) Date: Tue, 31 Jan 2006 16:13:20 +0100 Subject: [SciPy-dev] SciPy 0.4.4 release progress In-Reply-To: <43DD21FD.7040005@dslextreme.com> References: <43BEEC62.9060205@ieee.org> <43C090D6.9010108@ieee.org> <984FF548-CD41-484A-9D31-9EE9ABB2418C@ftw.at> <43C204DC.2030902@ieee.org> <43C2D493.8060108@ftw.at> <43C2F294.2090706@ee.byu.edu> <43DD21FD.7040005@dslextreme.com> Message-ID: <43DF7E90.8070205@ftw.at> Erick Tryzelaar wrote: >Hello all, > >Any chance that 0.4.4 is going to end up on sourceforge.net? I maintain >the scipy release for darwinports, and I'd hate to have to hit your wiki >in order to download the release. > > Okay, I've now put the files on SourceForge, and Travis Vaught has put links to them on the wiki download page. Let us know if you have any trouble ... -- Ed From robert.kern at gmail.com Tue Jan 31 11:23:02 2006 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 31 Jan 2006 10:23:02 -0600 Subject: [SciPy-dev] [SciPy-user] The new SciPy.org In-Reply-To: <43DF4783.50007@enthought.com> References: <43DF4783.50007@enthought.com> Message-ID: <43DF8EE6.2000004@gmail.com> Joe Cooper wrote: > Holler if you see anything glaringly stupid caused by the change. All svn.scipy.org links redirect back to http://www.scipy.org/Wiki . My SVN client is not happy with this. -- Robert Kern robert.kern at gmail.com "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From ashuang at gmail.com Tue Jan 31 11:29:18 2006 From: ashuang at gmail.com (Albert Huang) Date: Tue, 31 Jan 2006 11:29:18 -0500 Subject: [SciPy-dev] Fwd: should Int8 be UnsignedInt8 in misc/pilutil.py? In-Reply-To: References: Message-ID: sending again ---------- Forwarded message ---------- From: Albert Huang Date: Jan 22, 2006 11:39 PM Subject: should Int8 be UnsignedInt8 in misc/pilutil.py? To: scipy-dev at scipy.net Hello, pilutil.imread currently converts RGB images to arrays of signed integers. I think it should be unsigned integers instead. If you do an imshow(imread( ... ) ) on any brightly colored image, then all the RGB values > 127 get whacked. Also, it's probably not very intuitive to say that -1 is brighter than 127. The attached patch fixes it for me. I just changed all occurences of 'b' to 'B', but am not too sure what the effect is in toimage(). Regards, Albert -------------- next part -------------- An HTML attachment was scrubbed... 
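Albert's point about the typecode is easy to reproduce: with the signed typecode 'b' every pixel value above 127 wraps to a negative number, while the unsigned 'B' keeps the full 0-255 range. A minimal check with illustrative values:

# Sketch: why 'b' (signed int8) corrupts bright pixels while 'B'
# (unsigned uint8) preserves them.
import numpy

pixels = numpy.array([0, 64, 127, 128, 200, 255])
print(pixels.astype('b'))   # 128 -> -128, 200 -> -56, 255 -> -1
print(pixels.astype('B'))   # values kept as 0..255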
URL: -------------- next part -------------- Index: Lib/misc/pilutil.py =================================================================== --- Lib/misc/pilutil.py (revision 1581) +++ Lib/misc/pilutil.py (working copy) @@ -68,7 +68,7 @@ mode = 'L' adjust = 1 str = im.tostring() - type = 'b' + type = 'B' if mode == 'F': type = 'f' if mode == 'I': @@ -132,12 +132,12 @@ image.putpalette(asarray(pal,dtype=_UInt8).tostring()) # Becomes a mode='P' automagically. elif mode == 'P': # default gray-scale - pal = arange(0,256,1,dtype='b')[:,NewAxis] * \ - ones((3,),dtype='b')[NewAxis,:] + pal = arange(0,256,1,dtype='B')[:,NewAxis] * \ + ones((3,),dtype='B')[NewAxis,:] image.putpalette(asarray(pal,dtype=_UInt8).tostring()) return image if mode == '1': # high input gives threshold for 1 - bytedata = ((data > high)*255).astype('b') + bytedata = ((data > high)*255).astype('B') image = Image.fromstring('L',shape,bytedata.tostring()) image = image.convert(mode='1') return image From travis at enthought.com Tue Jan 31 12:22:57 2006 From: travis at enthought.com (Travis N. Vaught) Date: Tue, 31 Jan 2006 11:22:57 -0600 Subject: [SciPy-dev] [SciPy-user] where to get version 0.3.2? In-Reply-To: <43DF89C5.5080801@enthought.com> References: <200601311417.29222.meesters@uni-mainz.de> <43DF7872.3050009@ftw.at> <200601311644.31853.meesters@uni-mainz.de> <43DF89C5.5080801@enthought.com> Message-ID: <43DF9CF1.5050003@enthought.com> Travis N. Vaught wrote: > Christian Meesters wrote: > >> ... >> >> Perhaps somebody has the installer saved and could send it to me? >> >> >> > I'm uploading some of the scipy 0.3.2 binaries to the sourceforge site > now--I'll repost when they're available (and update the wiki page). > > Travis > [Apologies for the cross-post] I've finished uploading the "legacy" binaries to the sourceforge site. Let me know if there are any errors or omissions. I've also disabled (from non-developer view) the forums and trackers at the sourceforge site--there are folks who have been posting there, but no one is monitoring these tools. Please use the mailing lists for questions: scipy-user at scipy.org -- subscribe at http://www.scipy.net/mailman/listinfo/scipy-user scipy-dev at scipy.org -- subscribe at http://www.scipy.net/mailman/listinfo/scipy-dev and use the Tracker for issue/bugs/feature requests: http://projects.scipy.org/scipy/scipy Thanks, Travis p.s. Developers, please help move any relevant issues from sourceforge over to the developer tracker.) From jh at oobleck.astro.cornell.edu Tue Jan 31 12:28:51 2006 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Tue, 31 Jan 2006 12:28:51 -0500 Subject: [SciPy-dev] The new SciPy.org References: <200601311438.k0VEcrhp008472@oobleck.astro.cornell.edu> Message-ID: <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> More random comments on the web site... I edited the front-page text, so if you care about that, please take a look. In particular, I moved the web-editing stuff to Developer Zone and removed the reference to the non-existent demos. Mailing lists are hidden under Documentation. I don't think anyone will find them there (I found them by accident, after assuming they were not yet migrated). They should be under a nav tab of their own. I propose adding a heading for them. Possible names (in my preference order): Community Mailing Lists Forums Discussion I also merged and updated the content of the old ASP page into Developer Zone, moving the ASP blurb itself to the bottom. 
My hope is that project leads will identify themselves in the areas they work in, and that people who are looking to help in areas without a lead will likewise sign up there. It's also important to have a complete listing (with links to project pages elsewhere) of all the doc, tutorial, cookbook, etc. efforts that are underway. The Documentation tab just has the finished products, as is appropriate. Does it make sense to put docs under Trac? The plotting tutorial looks like a lot of the recipes. Should it be moved? The link to the old version is broken. Demos are specifically to attract non-users to try out the packages. Do we need separate Demos and Cookbook pages? Or perhaps a Demos page that just links to a selection of good recipes? Or an indicator on the Cookbook page that a given demo is safe for beginners? I'm concerned about sending new users into the Cookbook page to explore, and the first thing they find is a NumPy demo that shows how to use the C-API. Some of these people know that C is a programming language, and what API stands for...some... The rest will run screaming. As time goes on, I'll have many more comments like this. I could just go make the changes, but I don't want to step on others' toes. We should agree on what kinds of changes are ok just to go and make, and what needs discussion. --jh-- From travis at enthought.com Tue Jan 31 12:43:51 2006 From: travis at enthought.com (Travis N. Vaught) Date: Tue, 31 Jan 2006 11:43:51 -0600 Subject: [SciPy-dev] The new SciPy.org In-Reply-To: <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> References: <200601311438.k0VEcrhp008472@oobleck.astro.cornell.edu> <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> Message-ID: <43DFA1D7.3040802@enthought.com> Joe Harrington wrote: > More random comments on the web site... > > I edited the front-page text, so if you care about that, please take a > look. In particular, I moved the web-editing stuff to Developer Zone > and removed the reference to the non-existent demos. > > Mailing lists are hidden under Documentation. I don't think anyone > will find them there (I found them by accident, after assuming they > were not yet migrated). They should be under a nav tab of their own. > I propose adding a heading for them. +1 > Possible names (in my preference > order): > my votes: > Community 0 > Mailing Lists +1 > Forums -1 > Discussion 0 > > I also merged and updated the content of the old ASP page into > Developer Zone, moving the ASP blurb itself to the bottom. My hope is > that project leads will identify themselves in the areas they work in, > and that people who are looking to help in areas without a lead will > likewise sign up there. It's also important to have a complete > listing (with links to project pages elsewhere) of all the doc, > tutorial, cookbook, etc. efforts that are underway. The Documentation > tab just has the finished products, as is appropriate. Does it make > sense to put docs under Trac? > Which docs? The API docs (as soon as we have some) or something else? > The plotting tutorial looks like a lot of the recipes. Should it be > moved? +1 > The link to the old version is broken. > > Demos are specifically to attract non-users to try out the packages. > Do we need separate Demos and Cookbook pages? +1 > Or perhaps a Demos page > that just links to a selection of good recipes? Or an indicator on > the Cookbook page that a given demo is safe for beginners? 
I'm > concerned about sending new users into the Cookbook page to explore, > and the first thing they find is a NumPy demo that shows how to use > the C-API. Some of these people know that C is a programming > language, and what API stands for...some... The rest will run > screaming. > > As time goes on, I'll have many more comments like this. I could just > go make the changes, but I don't want to step on others' toes. We > should agree on what kinds of changes are ok just to go and make, and > what needs discussion. > > --jh- At this point, I think we should embrace the wiki way--if anyone objects to content you've added, your work will be "refactored." If things start to deteriorate, we can start yelling at folks to stop changing things. At this point, however, we're "unencumbered" by a cacophony of too many contributers. That said, the organization of the "contents" should be set (and soon), if not by community consensus, at least by a small few contributers with the mandate to do it (that would be you and Andrew and anyone else so motivated). Travis From arnd.baecker at web.de Tue Jan 31 12:58:01 2006 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 31 Jan 2006 18:58:01 +0100 (CET) Subject: [SciPy-dev] The new SciPy.org In-Reply-To: <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> References: <200601311438.k0VEcrhp008472@oobleck.astro.cornell.edu> <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> Message-ID: On Tue, 31 Jan 2006, Joe Harrington wrote: > More random comments on the web site... > > I edited the front-page text, so if you care about that, please take a > look. In particular, I moved the web-editing stuff to Developer Zone > and removed the reference to the non-existent demos. > > Mailing lists are hidden under Documentation. I don't think anyone > will find them there (I found them by accident, after assuming they > were not yet migrated). They should be under a nav tab of their own. > I propose adding a heading for them. Possible names (in my preference > order): > > Community > Mailing Lists > Forums > Discussion Any of these looks fine - I like "Mailing lists" best (FWIW). Concerning the navigation tabs on the left hand side: Maybe "Topical Software" should be moved further down (maybe just before "Developer zone" ?) Are all those * Login * MoinMoin Powered * Python Powered * Valid HTML 4.01 under "User" necessary? The new site is a big step - many thanks to all who have made this possible! Arnd From Fernando.Perez at colorado.edu Tue Jan 31 15:14:19 2006 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 31 Jan 2006 13:14:19 -0700 Subject: [SciPy-dev] The new SciPy.org In-Reply-To: <43DF4783.50007@enthought.com> References: <43DF4783.50007@enthought.com> Message-ID: <43DFC51B.4070108@colorado.edu> Joe Cooper wrote: > Hi all, > > As threatened, the new Moin wiki has taken over the SciPy.org site. I join the chorus of Thank You! (to A. Straw also, and the others putting in serious elbow grease to make this happen). This is looking to be a good year for python and scientific computing... 
Cheers, f From joe at enthought.com Tue Jan 31 16:37:56 2006 From: joe at enthought.com (Joe Cooper) Date: Tue, 31 Jan 2006 15:37:56 -0600 Subject: [SciPy-dev] [SciPy-user] The new SciPy.org In-Reply-To: <43DF8EE6.2000004@gmail.com> References: <43DF4783.50007@enthought.com> <43DF8EE6.2000004@gmail.com> Message-ID: <43DFD8B4.7080604@enthought.com> Robert Kern wrote: > Joe Cooper wrote: > > >>Holler if you see anything glaringly stupid caused by the change. > > > All svn.scipy.org links redirect back to http://www.scipy.org/Wiki . My SVN > client is not happy with this. > D'oh! Will be fixed in a few minutes. From joe at enthought.com Tue Jan 31 17:03:28 2006 From: joe at enthought.com (Joe Cooper) Date: Tue, 31 Jan 2006 16:03:28 -0600 Subject: [SciPy-dev] [SciPy-user] The new SciPy.org In-Reply-To: <43DF8EE6.2000004@gmail.com> References: <43DF4783.50007@enthought.com> <43DF8EE6.2000004@gmail.com> Message-ID: <43DFDEB0.30707@enthought.com> Robert Kern wrote: > Joe Cooper wrote: > > >>Holler if you see anything glaringly stupid caused by the change. > > > All svn.scipy.org links redirect back to http://www.scipy.org/Wiki . My SVN > client is not happy with this. Fixed. From strawman at astraw.com Tue Jan 31 20:24:04 2006 From: strawman at astraw.com (Andrew Straw) Date: Tue, 31 Jan 2006 17:24:04 -0800 Subject: [SciPy-dev] The new SciPy.org In-Reply-To: <43DFA1D7.3040802@enthought.com> References: <200601311438.k0VEcrhp008472@oobleck.astro.cornell.edu> <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> <43DFA1D7.3040802@enthought.com> Message-ID: <43E00DB4.2080209@astraw.com> Travis N. Vaught wrote: >At this point, I think we should embrace the wiki way--if anyone objects >to content you've added, your work will be "refactored." If things >start to deteriorate, we can start yelling at folks to stop changing >things. At this point, however, we're "unencumbered" by a cacophony of >too many contributers. > +1 I'm incredibly busy with work for the next couple of weeks, so I'm not sure how much I'll be able to help. But I'm very pleased with the way things are turning out. I'm glad so many people are working on the wiki. The one thing I _can_ do is direct someone how to change what gets displayed in the left toolbar of the wiki. If you are in the "scipy" group in your login to new.scipy.org, you can edit /home/scipy/wiki/wikiconfig.py, which has a variable called "navi_bar". That controls what is in the navigation bar. I agree with jh that the "Topical Software" name is a little non-intuitive. I'm +1 for changing it. Moin supports page redirects. #redirect New_Page_Name A page with the above as its contents will redirect to New_Page_Name. Thus, pages can be moved around without too much harm done. From jonathan.taylor at utoronto.ca Tue Jan 31 20:47:07 2006 From: jonathan.taylor at utoronto.ca (Jonathan Taylor) Date: Tue, 31 Jan 2006 20:47:07 -0500 Subject: [SciPy-dev] The new SciPy.org In-Reply-To: <43E00DB4.2080209@astraw.com> References: <200601311438.k0VEcrhp008472@oobleck.astro.cornell.edu> <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> <43DFA1D7.3040802@enthought.com> <43E00DB4.2080209@astraw.com> Message-ID: <463e11f90601311747h35d963d3q8a13459ed2ed74ed@mail.gmail.com> Thanks for all the great work guys... It looks great. In the spirit of looking great, it would be good to change the top blue color slightly to match the scipy logo, or vice versa. Cheers. J. On 1/31/06, Andrew Straw wrote: > Travis N. 
Vaught wrote: > > >At this point, I think we should embrace the wiki way--if anyone objects > >to content you've added, your work will be "refactored." If things > >start to deteriorate, we can start yelling at folks to stop changing > >things. At this point, however, we're "unencumbered" by a cacophony of > >too many contributers. > > > +1 > > I'm incredibly busy with work for the next couple of weeks, so I'm not > sure how much I'll be able to help. But I'm very pleased with the way > things are turning out. I'm glad so many people are working on the wiki. > The one thing I _can_ do is direct someone how to change what gets > displayed in the left toolbar of the wiki. If you are in the "scipy" > group in your login to new.scipy.org, you can edit > /home/scipy/wiki/wikiconfig.py, which has a variable called "navi_bar". > That controls what is in the navigation bar. > > I agree with jh that the "Topical Software" name is a little > non-intuitive. I'm +1 for changing it. Moin supports page redirects. > > #redirect New_Page_Name > > A page with the above as its contents will redirect to New_Page_Name. Thus, pages can be moved around without too much harm done. > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From cookedm at physics.mcmaster.ca Tue Jan 31 21:53:32 2006 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Tue, 31 Jan 2006 21:53:32 -0500 Subject: [SciPy-dev] Changing return types in C API Message-ID: One thing I like about numarray's "native" C API is that for functions that return arrays, the return type is PyArrayObject *, as opposed to the Numeric-compatible API, where the return types are PyObject *. Returning PyObject * in this case basically means you're *always* doing a cast like this: PyObject *o; PyArrayObject *a; a = (PyArrayObject *)PyArray_ContiguousFromObject(o, PyArray_DOUBLE, 2, 2); Could we change the API so that PyArrayObject * is used as the return type? On a similar note: the 'data' member of PyArrayObject's is a char *, where it really should be a void *. Being a void pointer would have the advantage of not needing the cast in cases like this: double *adata = (double)(a->data); It would also mean that accidentally dereferencing the pointer would be a compiler error: currently, a->data[0] is valid, but if a->data was a void *, it wouldn't be. Both of these changes wouldn't require recompiling extensions (since we're just relabelling what the pointer types are), and would be source-compatible with old code (since the casts used would now be superfluous). Newer code will also be easier to write. I'll work on the changes in a svn branch if people think this is a good idea. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ndarray at mac.com Tue Jan 31 22:26:46 2006 From: ndarray at mac.com (Sasha) Date: Tue, 31 Jan 2006 22:26:46 -0500 Subject: [SciPy-dev] Changing return types in C API In-Reply-To: References: Message-ID: This probably belongs to numpy-discussion. More below. On 1/31/06, David M. Cooke wrote: > ... > Could we change the API so that PyArrayObject * is used as the return > type? > -1 Exposing the definition of the object structs is discouraged in python code. 
See for example a comment in intobject.h /*The type PyIntObject is (unfortunately) exposed here so we can declare _Py_TrueStruct and _Py_ZeroStruct in boolobject.h; don't use this. */ In numpy this is necessary for performance reasons so that accessors can be implemented as macros. Making return types PyArrayObject * will likely lead people to use direct access to struct members instead of macros. > On a similar note: the 'data' member of PyArrayObject's is a char *, > where it really should be a void *. Being a void pointer would have > the advantage of not needing the cast in cases like this: > > double *adata = (double)(a->data); I think you meant (double*) > > It would also mean that accidentally dereferencing the pointer would > be a compiler error: currently, a->data[0] is valid, but if a->data > was a void *, it wouldn't be. > +1 Note, however that implicit cast is illegal in C++ and may generate warnings on some compilers (I think gcc -pedantic would, but not sure). Another problem with void* is that a->data + n will become invalid, which is used in many places if I recall correctly. (a->data + n code is actually suboptimal when n is not constant because most CPUs have special opcodes for pointer arithmetics that compilers can generate if n is compile time constant) From travis at enthought.com Tue Jan 31 23:12:03 2006 From: travis at enthought.com (Travis N. Vaught) Date: Tue, 31 Jan 2006 22:12:03 -0600 Subject: [SciPy-dev] The new SciPy.org In-Reply-To: <463e11f90601311747h35d963d3q8a13459ed2ed74ed@mail.gmail.com> References: <200601311438.k0VEcrhp008472@oobleck.astro.cornell.edu> <200601311728.k0VHSpCU009122@oobleck.astro.cornell.edu> <43DFA1D7.3040802@enthought.com> <43E00DB4.2080209@astraw.com> <463e11f90601311747h35d963d3q8a13459ed2ed74ed@mail.gmail.com> Message-ID: <43E03513.8050104@enthought.com> Jonathan Taylor wrote: >Thanks for all the great work guys... It looks great. In the spirit >of looking great, it would be good to change the top blue color >slightly to match the scipy logo, or vice versa. > >Cheers. >J. > > Good catch...should be fixed now. Travis