From matthieu.brucher at gmail.com Mon Oct 1 01:20:32 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Mon, 1 Oct 2007 07:20:32 +0200 Subject: [SciPy-dev] Nifti support In-Reply-To: References: Message-ID: Excellent news !! Matthieu 2007/10/1, Jarrod Millman : > > On 9/30/07, Matthieu Brucher wrote: > > I saw that nifti file support was currently in the pipeline. If the > license > > issue is solved, I can provide you a nifticlib.i file that is compatible > > with C89 compilers (the converted file needs a C99 compiler, so > compiling it > > with MSVC is not possible), as I've done this for my hobbies some weeks > ago. > > Hey Matthieu, > > The pynifti license issue has been resolved. Michael Hanke, the > author, has released his software under an MIT license: > > http://sourceforge.net/project/shownotes.php?group_id=126549&release_id=543252 > > Early this week, Chris Burns and I will check this code into the SciPy > repository. Once we do, it would be great if you could either > integrate your nifticlib.i into the codebase or just send us your > code. > > Thanks, > > -- > Jarrod Millman > Computational Infrastructure for Research Labs > 10 Giannini Hall, UC Berkeley > phone: 510.643.4014 > http://cirl.berkeley.edu/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From millman at berkeley.edu Tue Oct 2 13:24:56 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Tue, 2 Oct 2007 10:24:56 -0700 Subject: [SciPy-dev] adopting Python Style Guide for classes Message-ID: Hello, For those of you not on the Numpy developer's list, we have been talking about adopting the Python class naming convention to conform with Guido's style guide as closely as possible: http://www.python.org/dev/peps/pep-0008/ According to Guido, class names should use the CapWords convention. Most Python projects (eg, ETS, matploltlib) adhere to the Python naming conventions and it is confusing that NumPy and SciPy don't. Currently, both NumPy and SciPy use either lower_underscore_separated or CapWords for class names. NumPy ====== $ grep -r '^class [A-Z]' --include "*.py" * | wc -l 1014 $ grep -r '^class' --include "*.py" * | grep -v 'class [A-Z]' | wc -l 207 SciPy ===== $ grep -r '^class [A-Z]' --include "*.py" * | wc -l 587 $ grep -r '^class' --include "*.py" * | grep -v 'class [A-Z]' | wc -l 565 So far, there is Universal support for the basic idea of conforming to the Python Style Guide. Since it should effect the user API, we have already updated the TestCase classes. First, we checked in a change to allow TestCase classes to be prefixed with either 'test' or 'Test': http://projects.scipy.org/scipy/numpy/changeset/4144 Then, we updated both SciPy and NumPy to use CapWords for TestCase classes: http://projects.scipy.org/scipy/numpy/changeset/4151 http://projects.scipy.org/scipy/scipy/changeset/3388 In order to run the SciPy tests on the trunk you will need to install NumPy from svn revision 4144 or greater. Before SciPy 0.7.0 is released, we will release NumPy 1.0.4. Also, if you are adding a new class to NumPy or SciPy, please use CapWords. Now we need to decide what to do about the remaining lower_underscore_separated class names. Obviously, it is important that we are careful to not break a lot of code just to bring our class names up to standards. 
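One low-risk way to handle the remaining names (a sketch only -- the class names below are invented for illustration and are not actual NumPy or SciPy classes) is to rename a class to CapWords and keep the old lower_underscore name importable as a thin deprecated alias:

    import warnings

    class SparseSolver(object):
        """New CapWords name for the class."""
        def solve(self, system):
            return system

    class sparse_solver(SparseSolver):
        """Old lower_underscore name, kept as a deprecated alias."""
        def __init__(self, *args, **kwargs):
            warnings.warn("'sparse_solver' is deprecated; use 'SparseSolver'",
                          DeprecationWarning, stacklevel=2)
            SparseSolver.__init__(self, *args, **kwargs)

Existing code that instantiates the old name keeps working, and instances of the alias are still instances of SparseSolver; the warning can be promoted to an error, or the alias dropped, on whatever release schedule is agreed on.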
Cheers, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From perry at stsci.edu Tue Oct 2 14:40:56 2007 From: perry at stsci.edu (Perry Greenfield) Date: Tue, 2 Oct 2007 14:40:56 -0400 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: References: Message-ID: <58415922-2830-48FE-A7EC-089A736EEA03@stsci.edu> On Oct 2, 2007, at 1:24 PM, Jarrod Millman wrote: > So far, there is Universal support for the basic idea of conforming to > the Python Style Guide. Since it should effect the user API, we have I'm somewhat startled to see this. Perhaps my memory is very bad, but I seem to recall that there was an explicit decision in the past not to use the Python conventions for class names. The reason I recall was that scientists and engineers don't like them (or something like that). I also recall that this was a strongly held opinion by some very influential in the scipy effort ;-) Mind you, I wasn't one of those that felt that strongly about it, but do realize that the people that are voting here are not your typical users. So before this is considered a "universal" opinion, I'd like to see some of those that made the decision the other way bless it (and some rationale for the switch that addresses the original decision). Eric, Travis, what say you? Perry From rkern at enthought.com Tue Oct 2 14:49:18 2007 From: rkern at enthought.com (Robert Kern) Date: Tue, 02 Oct 2007 13:49:18 -0500 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: <58415922-2830-48FE-A7EC-089A736EEA03@stsci.edu> References: <58415922-2830-48FE-A7EC-089A736EEA03@stsci.edu> Message-ID: <470292AE.10100@enthought.com> Perry Greenfield wrote: > On Oct 2, 2007, at 1:24 PM, Jarrod Millman wrote: > >> So far, there is Universal support for the basic idea of conforming to >> the Python Style Guide. Since it should effect the user API, we have > > I'm somewhat startled to see this. Perhaps my memory is very bad, but > I seem to recall that there was an explicit decision in the past not > to use the Python conventions for class names. The reason I recall > was that scientists and engineers don't like them (or something like > that). I also recall that this was a strongly held opinion by some > very influential in the scipy effort ;-) The reason that was recorded was to give everything the same interface, regardless of what it was, for the sake of non-programmers who are learning to program ("For those that object, you are skilled enough to deal with the limitation."). However, we are beginning to deviate strongly from the rest of the community, and (at least in my opinion) any benefit you get from internal consistency is extremely weak, but the annoyance of being inconsistent with the rest of the Python world is fairly strong. No one uses just numpy and scipy. > Mind you, I wasn't one of those that felt that strongly about it, but > do realize that the people that are voting here are not your typical > users. So before this is considered a "universal" opinion, I'd like > to see some of those that made the decision the other way bless it > (and some rationale for the switch that addresses the original > decision). > > Eric, Travis, what say you? They're in on it. However, they're also busy at a customer site for the week, so I get to speak for them. 
:-) -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From millman at berkeley.edu Tue Oct 2 19:06:45 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Tue, 2 Oct 2007 16:06:45 -0700 Subject: [SciPy-dev] Fwd: Scientific Programmer Position at UC Berkeley In-Reply-To: <1986C867-66BF-41AC-B1AC-AB7EB88C5563@berkeley.edu> References: <8098257D-E870-40FF-8981-DFD7F7295C67@berkeley.edu> <1986C867-66BF-41AC-B1AC-AB7EB88C5563@berkeley.edu> Message-ID: Hello everyone, This position will involve a fairly heavy amount of Python programming. Feel free to contact either Fritz or myself if you have any questions. Thanks, Jarrod ---------- Forwarded message ---------- From: Fritz Sommer Date: Oct 2, 2007 4:04 PM Subject: Fwd: Scientific Programmer Position at UC Berkeley To: Jarrod Millman SCIENTIFIC PROGRAMMER POSITION The Sommer lab at UC Berkeley seeks a scientific programmer to assume an integral role in the design and maintenance of the Core Services of the new NSF-Initiative Data-Sharing in Neuroscience, hosted at the Redwood Center for Theoretical Neuroscience and the Helen Wills Neuroscience Institute. Specifically, work involves the design and administration of the data repository and website used for this initiative. This includes the integration and design of these resources, the development and documentation of tools for data sharing, maintaining a data server, design and maintenance of a website, as well as support and interaction with data contributors. Qualifications: *Interest in Neuroscience *Expertise in Python, Matlab, XML and HTML *Experience with content management systems (PLONE) and version control (CVS or SVN) *Experience with binary data formats and hierarchical data formats (HDF5) *Familiarity with different unix-like platforms (Linux and Mac OS X). *Strong problem-solving abilities Salary and start date The monthly salary range is $3987 - $7318, although most offers will not exceed midpoint of the salary range. Open: immediately, with preferred start date 15 Oct 2007 or earlier. Minimum one year commitment preferred. How to apply U.C. Berkeley has an online applicant website. Visit: http://jobs.berkeley.edu/ and search for job #007217. If you have any questions, please contact Fritz Sommer: fsommer at berkeley.edu ------------------------------------------------------------------------ ----- Friedrich T. Sommer, Ph.D., Associate Adjunct Professor University of California, Berkeley Redwood Center for Theoretical Neuroscience & HWNI 3210F Tolman Hall MC 3192 Berkeley, CA 94720 phone (510) 642-7251 fax (510) 642-7206 http://redwood.berkeley.edu/wiki/Fritz_Sommer From stefan at sun.ac.za Wed Oct 3 08:44:21 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 3 Oct 2007 14:44:21 +0200 Subject: [SciPy-dev] Setting up your editor for NumPy/SciPy Message-ID: <20071003124421.GE7241@mentat.za.net> Hi all, Since we are busy cleaning up the NumPy and SciPy sources, I'd like to draw your attention to the guidelines regarding whitespace. We use 4 spaces per indentation level, in Python and C code alike (see PEP 7: http://www.python.org/dev/peps/pep-0007/ under the heading Python 3000). Lines should be a maximum of 79 characters long (to facilitate reading in text terminals), and must not have trailing whitespace. 
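As a concrete illustration (the function below is invented purely to show the layout), a computation that would otherwise run past 79 characters can be wrapped with the implied continuation that the PEP 8 excerpt below describes, using 4-space indents and no trailing whitespace:

    def normalise_intensity(value, lower_threshold, upper_threshold):
        # Implied line continuation inside the parentheses keeps every
        # line well under 79 characters without resorting to a backslash.
        normalised = ((value - lower_threshold) /
                      (upper_threshold - lower_threshold))
        return min(max(normalised, 0.0), 1.0)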
PEP 8 (http://www.python.org/dev/peps/pep-0008/) states: """ The preferred way of wrapping long lines is by using Python's implied line continuation inside parentheses, brackets and braces. If necessary, you can add an extra pair of parentheses around an expression, but sometimes using a backslash looks better. Make sure to indent the continued line appropriately. """ I attach a file, containing some common errors, which you can use to setup your editor. I also attach the settings I use under Emacs to highlight the problems. Regards St?fan -------------- next part -------------- A non-text attachment was scrubbed... Name: bad_whitespace.py Type: text/x-python Size: 440 bytes Desc: not available URL: -------------- next part -------------- ========================================= Configuring Emacs for NumPy/SciPy editing ========================================= .. contents :: .. note :: Downloaded lisp (``.el``) files should be placed in a directory on the Emacs path. I typically use ``~/elisp`` and add it to the search path using :: (add-to-list 'load-path "~/elisp") Essential to producing well-formed code ======================================= Never use tabs -------------- :: (setq-default indent-tabs-mode nil) Clean up tabs and trailing whitespace ------------------------------------- ``M-x untabify`` and ``M-x whitespace-cleanup`` Highlight unnecessary whitespace -------------------------------- Download `show-wspace.el `__ :: ; Show whitespace (require 'show-wspace) (add-hook 'python-mode-hook 'highlight-tabs) (add-hook 'font-lock-mode-hook 'highlight-trailing-whitespace) Wrap lines longer than 79 characters ------------------------------------ :: (setq fill-column 79) The ``fill-paragraph`` command (``M-q`` or ``ESC-q``) also comes in handy. Other useful tools ================== Highlight column 79 ------------------- Prevent lines from exceeding 79 characters in length. Download `column-marker.el `__ :: (require 'column-marker) (add-hook 'font-lock-mode-hook (lambda () (interactive) (column-marker-1 80))) Show a ruler with the current column position --------------------------------------------- :: (require 'ruler-mode) (add-hook 'font-lock-mode-hook 'ruler-mode) Enable restructured text (ReST) editing --------------------------------------- :: (require 'rst) (add-hook 'text-mode-hook 'rst-text-mode-bindings) Fix outline-mode to work with Python ------------------------------------ :: (add-hook 'python-mode-hook 'my-python-hook) (defun py-outline-level () "This is so that `current-column` DTRT in otherwise-hidden text" ;; from ada-mode.el (let (buffer-invisibility-spec) (save-excursion (skip-chars-forward "\t ") (current-column)))) :: ; this fragment originally came from the web somewhere, but the outline-regexp ; was horribly broken and is broken in all instances of this code floating ; around. Finally fixed by Charl P. Botha ; <http://cpbotha.net/> (defun my-python-hook () (setq outline-regexp "[^ \t\n]\\|[ \t]*\\(def[ \t]+\\|class[ \t]+\\)") ; enable our level computation (setq outline-level 'py-outline-level) ; do not use their \C-c@ prefix, too hard to type. 
Note this overides ;some python mode bindings ;(setq outline-minor-mode-prefix "\C-c") ; turn on outline mode (outline-minor-mode t) ; initially hide all but the headers ; (hide-body) (show-paren-mode 1) ) From perry at stsci.edu Wed Oct 3 11:43:04 2007 From: perry at stsci.edu (Perry Greenfield) Date: Wed, 3 Oct 2007 11:43:04 -0400 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: References: Message-ID: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> To follow on to my previous posting on this topic given Robert's response. As I said previously, I was never strongly committed to one approach or the other. But since the v1 release has been made, I think more care needs to be given to consideration of proposals like this before actually charging off to make the changes. 1) Even though Robert is speaking for Travis, I think given Travis's role in numpy, it is important for Travis to speak directly to this when he gets the chance. 2) API changes should only be made in major releases, not minor releases (IMHO). 3) Greater time should be provided to accommodate the transition. For example, there should not be deprecation warnings in the first version that this API appears in. The first release of this should not lead to nuisance messages for those that have other software that depends on this. (A tool that allows conditional messages would be good, but the default should be no message). The next release, sure. As a result, it means that the old API can't be removed until at least two releases after that. 4) More information should be provided as to what actually will change in the public interface. I suspect that it isn't obvious to many what will change. From the mailing list discussions there doesn't even seem to be consensus on the factory functions or type objects (more on these later). Many of the remaining objects are probably used internally (essentially private) and will affect few outside of the numpy developers. Since users typically deal mostly with factory functions, and other functions, they may not deal with classes much (outside of types). So listing the public classes so affected will help people understand what changes typical users will see, and what changes advanced users will see. While this is annoying, I think someone needs to write up an explicit list of those public classes that will be changed (and those that won't) so we all know what we will face. It may be a very small list and thus alleviate concern about the process. It may show some surprises that people hadn't thought about. Not doing this before making the changes seems very unwise. 5) In my humble opinion, we would be nuts--absolutely nuts--to change either the type classes or the factory functions. This would be foolish consistency at it's worst. We *just* went through the exercise of changing Int32 to int32 and so forth and we would have to change back again? This cannot be seriously considered. Perry Greenfield On Oct 2, 2007, at 1:24 PM, Jarrod Millman wrote: > Hello, > > For those of you not on the Numpy developer's list, we have been > talking about adopting the Python class naming convention to conform > with Guido's style guide as closely as possible: > http://www.python.org/dev/peps/pep-0008/ > According to Guido, class names should use the CapWords convention. > Most Python projects (eg, ETS, matploltlib) adhere to the Python > naming conventions and it is confusing that NumPy and SciPy don't. 
> > Currently, both NumPy and SciPy use either lower_underscore_separated > or CapWords for class names. > > NumPy > ====== > $ grep -r '^class [A-Z]' --include "*.py" * | wc -l > 1014 > $ grep -r '^class' --include "*.py" * | grep -v 'class [A-Z]' | wc -l > 207 > > SciPy > ===== > $ grep -r '^class [A-Z]' --include "*.py" * | wc -l > 587 > $ grep -r '^class' --include "*.py" * | grep -v 'class [A-Z]' | wc -l > 565 > > So far, there is Universal support for the basic idea of conforming to > the Python Style Guide. Since it should effect the user API, we have > already updated the TestCase classes. First, we checked in a change > to allow TestCase classes to be prefixed with > either 'test' or 'Test': > http://projects.scipy.org/scipy/numpy/changeset/4144 > Then, we updated both SciPy and NumPy to use CapWords for TestCase > classes: > http://projects.scipy.org/scipy/numpy/changeset/4151 > http://projects.scipy.org/scipy/scipy/changeset/3388 > In order to run the SciPy tests on the trunk you will need to install > NumPy from svn revision 4144 or greater. Before SciPy 0.7.0 is > released, we will release NumPy 1.0.4. > > Also, if you are adding a new class to NumPy or SciPy, please use > CapWords. Now we need to decide what to do about the remaining > lower_underscore_separated class names. Obviously, it is important > that we are careful to not break a lot of code just to bring our class > names up to standards. > > Cheers, > > -- > Jarrod Millman > Computational Infrastructure for Research Labs > 10 Giannini Hall, UC Berkeley > phone: 510.643.4014 > http://cirl.berkeley.edu/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev From ondrej at certik.cz Wed Oct 3 12:04:56 2007 From: ondrej at certik.cz (Ondrej Certik) Date: Wed, 3 Oct 2007 18:04:56 +0200 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> References: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> Message-ID: <85b5c3130710030904ye7028e2r90c890cf1c823732@mail.gmail.com> > As I said previously, I was never strongly committed to one approach > or the other. But since the v1 release has been made, I think more > care needs to be given to consideration of proposals like this before > actually charging off to make the changes. > > 1) Even though Robert is speaking for Travis, I think given Travis's > role in numpy, it is important for Travis to speak directly to this > when he gets the chance. > > 2) API changes should only be made in major releases, not minor > releases (IMHO). > > 3) Greater time should be provided to accommodate the transition. For > example, there should not be deprecation warnings in the first > version that this API appears in. The first release of this should > not lead to nuisance messages for those that have other software that > depends on this. (A tool that allows conditional messages would be > good, but the default should be no message). The next release, sure. > As a result, it means that the old API can't be removed until at > least two releases after that. > > 4) More information should be provided as to what actually will > change in the public interface. I suspect that it isn't obvious to > many what will change. From the mailing list discussions there > doesn't even seem to be consensus on the factory functions or type > objects (more on these later). 
Many of the remaining objects are > probably used internally (essentially private) and will affect few > outside of the numpy developers. Since users typically deal mostly > with factory functions, and other functions, they may not deal with > classes much (outside of types). So listing the public classes so > affected will help people understand what changes typical users will > see, and what changes advanced users will see. While this is > annoying, I think someone needs to write up an explicit list of those > public classes that will be changed (and those that won't) so we all > know what we will face. It may be a very small list and thus > alleviate concern about the process. It may show some surprises that > people hadn't thought about. Not doing this before making the changes > seems very unwise. > > 5) In my humble opinion, we would be nuts--absolutely nuts--to change > either the type classes or the factory functions. This would be > foolish consistency at it's worst. We *just* went through the > exercise of changing Int32 to int32 and so forth and we would have to > change back again? This cannot be seriously considered. I strongly agree with all 1) to 4) above. It's really important not to break things for the end user at the end and 1) through 4) is the way to do it. (I don't know much background about 5) to make a judgement). Ondrej Certik From millman at berkeley.edu Wed Oct 3 14:26:29 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 3 Oct 2007 11:26:29 -0700 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> References: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> Message-ID: On 10/3/07, Perry Greenfield wrote: > 2) API changes should only be made in major releases, not minor > releases (IMHO). +1 > 3) Greater time should be provided to accommodate the transition. For > example, there should not be deprecation warnings in the first > version that this API appears in. The first release of this should > not lead to nuisance messages for those that have other software that > depends on this. (A tool that allows conditional messages would be > good, but the default should be no message). The next release, sure. > As a result, it means that the old API can't be removed until at > least two releases after that. I am not sure I agree with this. For example, I think it would be acceptable for NumPy 1.1.0 to have deprecation warning about changed APIs. Perhaps you were saying that NumPy 1.0.4 could use the new class names in addition to the old names without complaint. That sounds reasonable to me. Then when NumPy 1.1.0 comes out the old style names would raise deprecation warnings. > 4) More information should be provided as to what actually will > change in the public interface. I suspect that it isn't obvious to > many what will change. From the mailing list discussions there > doesn't even seem to be consensus on the factory functions or type > objects (more on these later). Many of the remaining objects are > probably used internally (essentially private) and will affect few > outside of the numpy developers. Since users typically deal mostly > with factory functions, and other functions, they may not deal with > classes much (outside of types). So listing the public classes so > affected will help people understand what changes typical users will > see, and what changes advanced users will see. 
While this is > annoying, I think someone needs to write up an explicit list of those > public classes that will be changed (and those that won't) so we all > know what we will face. It may be a very small list and thus > alleviate concern about the process. It may show some surprises that > people hadn't thought about. Not doing this before making the changes > seems very unwise. As long as we agree in principle how we want classes named, I think there is little need to rush to change existing class names. > 5) In my humble opinion, we would be nuts--absolutely nuts--to change > either the type classes or the factory functions. This would be > foolish consistency at it's worst. We *just* went through the > exercise of changing Int32 to int32 and so forth and we would have to > change back again? This cannot be seriously considered. I think that the general consensus is that we should keep int32, rather than switch to Int32. -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From perry at stsci.edu Wed Oct 3 15:08:01 2007 From: perry at stsci.edu (Perry Greenfield) Date: Wed, 3 Oct 2007 15:08:01 -0400 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: References: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> Message-ID: <41A7993F-ED30-45CD-987B-9C488452A59B@stsci.edu> On Oct 3, 2007, at 2:26 PM, Jarrod Millman wrote: > >> 3) Greater time should be provided to accommodate the transition. For >> example, there should not be deprecation warnings in the first >> version that this API appears in. The first release of this should >> not lead to nuisance messages for those that have other software that >> depends on this. (A tool that allows conditional messages would be >> good, but the default should be no message). The next release, sure. >> As a result, it means that the old API can't be removed until at >> least two releases after that. > > I am not sure I agree with this. For example, I think it would be > acceptable for NumPy 1.1.0 to have deprecation warning about changed > APIs. Perhaps you were saying that NumPy 1.0.4 could use the new > class names in addition to the old names without complaint. That > sounds reasonable to me. Then when NumPy 1.1.0 comes out the old > style names would raise deprecation warnings. > The situation I'm trying to avoid is a too tight coupling between numpy changes and client applications that use numpy. Suppose we distribute an application that uses numpy. We could make the changes to our application before the api-change release comes out (from svn) and then when we release our new version (very soon after the api- changed numpy comes out) we effectively force all of our users to update immediately. The problem is that they may not want to update on our schedule. They become annoyed at us. So we take the other tack, we wait for a while before changing our code to require the new numpy. This give the user community time to switch their stuff too. But, now our code generates annoying deprecation warnings that are useless to the people we distribute applications to if they update to the new numpy before we do. Here's where I display some ignorance. 
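To make the question concrete, the application-side suppression being discussed would presumably be spelled with the standard warnings filters, along these lines (a rough sketch only; whether numpy would raise plain DeprecationWarning or a dedicated subclass is still an open question in this thread):

    import warnings

    # Ignore DeprecationWarnings whose originating module matches 'numpy',
    # while leaving every other warning visible to the end user.
    warnings.filterwarnings('ignore',
                            category=DeprecationWarning,
                            module='numpy')

    import numpy   # deprecation warnings raised inside numpy are now filtered

The obvious drawback, raised below, is that a module-based filter silences those warnings for every caller, not just for the application's own code.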
If the warnings use the standard lib warning module, I'm guessing that we can add warnings filters to suppress any warnings that arise from our code (but not having much experience with it, it isn't clear to me whether the filter suppresses all warnings arising from numpy or whether one can suppress only those associated with ones that are from the applications use). But it's good to clarify this point. If they are present by default, an application needs to be able to suppress them. >> 5) In my humble opinion, we would be nuts--absolutely nuts--to change >> either the type classes or the factory functions. This would be >> foolish consistency at it's worst. We *just* went through the >> exercise of changing Int32 to int32 and so forth and we would have to >> change back again? This cannot be seriously considered. > > I think that the general consensus is that we should keep int32, > rather than switch to Int32. That's good. But what about array(), zeros(), ones(), arange(), etc.? Perry From robert.kern at gmail.com Wed Oct 3 15:16:35 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 03 Oct 2007 14:16:35 -0500 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: <41A7993F-ED30-45CD-987B-9C488452A59B@stsci.edu> References: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> <41A7993F-ED30-45CD-987B-9C488452A59B@stsci.edu> Message-ID: <4703EA93.2010004@gmail.com> Perry Greenfield wrote: > On Oct 3, 2007, at 2:26 PM, Jarrod Millman wrote: > >>> 3) Greater time should be provided to accommodate the transition. For >>> example, there should not be deprecation warnings in the first >>> version that this API appears in. The first release of this should >>> not lead to nuisance messages for those that have other software that >>> depends on this. (A tool that allows conditional messages would be >>> good, but the default should be no message). The next release, sure. >>> As a result, it means that the old API can't be removed until at >>> least two releases after that. >> I am not sure I agree with this. For example, I think it would be >> acceptable for NumPy 1.1.0 to have deprecation warning about changed >> APIs. Perhaps you were saying that NumPy 1.0.4 could use the new >> class names in addition to the old names without complaint. That >> sounds reasonable to me. Then when NumPy 1.1.0 comes out the old >> style names would raise deprecation warnings. >> > The situation I'm trying to avoid is a too tight coupling between > numpy changes and client applications that use numpy. Suppose we > distribute an application that uses numpy. We could make the changes > to our application before the api-change release comes out (from svn) > and then when we release our new version (very soon after the api- > changed numpy comes out) we effectively force all of our users to > update immediately. The problem is that they may not want to update > on our schedule. They become annoyed at us. > > So we take the other tack, we wait for a while before changing our > code to require the new numpy. This give the user community time to > switch their stuff too. But, now our code generates annoying > deprecation warnings that are useless to the people we distribute > applications to if they update to the new numpy before we do. Here's > where I display some ignorance. 
If the warnings use the standard lib > warning module, I'm guessing that we can add warnings filters to > suppress any warnings that arise from our code (but not having much > experience with it, it isn't clear to me whether the filter > suppresses all warnings arising from numpy or whether one can > suppress only those associated with ones that are from the > applications use). But it's good to clarify this point. If they are > present by default, an application needs to be able to suppress them. The warnings module allows one to filter warnings very specifically, down to the line number in the module where the warning is issued. If you can catalog the warnings that are raised by your application (say by running your test suite), then you can filter out just those. >>> 5) In my humble opinion, we would be nuts--absolutely nuts--to change >>> either the type classes or the factory functions. This would be >>> foolish consistency at it's worst. We *just* went through the >>> exercise of changing Int32 to int32 and so forth and we would have to >>> change back again? This cannot be seriously considered. >> I think that the general consensus is that we should keep int32, >> rather than switch to Int32. > > That's good. But what about array(), zeros(), ones(), arange(), etc.? They're all functions and not affected. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From perry at stsci.edu Wed Oct 3 15:51:08 2007 From: perry at stsci.edu (Perry Greenfield) Date: Wed, 3 Oct 2007 15:51:08 -0400 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: <4703EA93.2010004@gmail.com> References: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> <41A7993F-ED30-45CD-987B-9C488452A59B@stsci.edu> <4703EA93.2010004@gmail.com> Message-ID: <18E62A87-727A-478A-A484-DBF7EA272260@stsci.edu> On Oct 3, 2007, at 3:16 PM, Robert Kern wrote: > > The warnings module allows one to filter warnings very > specifically, down to the > line number in the module where the warning is issued. If you can > catalog the > warnings that are raised by your application (say by running your > test suite), > then you can filter out just those. > Yeah, I know, but the thing that wasn't obvious is which module is listed when the error is raised. If the warning is raised in numpy, isn't that the module it is associated with? After all, how does it know where in the call stack you really are interested in associating that error with? Or does the warnings module see if that module you have listed is in the call tree rather than the warning originating in that module? From the PEP and online docs, it seems that module is associated with the place the warning is raised. That would work ok, but it means you turn it off for everything, not just your code, right? Another point is, that there has to be time for the code with the error suppression specific stuff to be released. It's risky to put that in our stuff and release it and find that some changes in numpy make it incorrect if numpy is released after it is. Just this aspect drives release dependencies more than I would like. There isn't a simple solution to this, but if the first version has optional warnings, it makes it easier to test against it for the applications release. >>>> 5) In my humble opinion, we would be nuts--absolutely nuts--to >>>> change >>>> either the type classes or the factory functions. 
This would be >>>> foolish consistency at it's worst. We *just* went through the >>>> exercise of changing Int32 to int32 and so forth and we would >>>> have to >>>> change back again? This cannot be seriously considered. >>> I think that the general consensus is that we should keep int32, >>> rather than switch to Int32. >> >> That's good. But what about array(), zeros(), ones(), arange(), etc.? > > They're all functions and not affected. Good. I know they were functions, but I think some were arguing that even though they were functions, that they should adopt class-name conventions because they were object factories so it is good to be explicit about this as well. Perry From robert.kern at gmail.com Wed Oct 3 16:07:54 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 03 Oct 2007 15:07:54 -0500 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: <18E62A87-727A-478A-A484-DBF7EA272260@stsci.edu> References: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> <41A7993F-ED30-45CD-987B-9C488452A59B@stsci.edu> <4703EA93.2010004@gmail.com> <18E62A87-727A-478A-A484-DBF7EA272260@stsci.edu> Message-ID: <4703F69A.4020209@gmail.com> Perry Greenfield wrote: > On Oct 3, 2007, at 3:16 PM, Robert Kern wrote: > >> The warnings module allows one to filter warnings very >> specifically, down to the >> line number in the module where the warning is issued. If you can >> catalog the >> warnings that are raised by your application (say by running your >> test suite), >> then you can filter out just those. >> > Yeah, I know, but the thing that wasn't obvious is which module is > listed when the error is raised. It's the line that actually has warnings.warn() (or the location of the call to the wrapper function that calls warnings.warn() if the wrapper function uses the stacklevel argument; unfortunately, we can't use that to solve your problem exactly). > If the warning is raised in numpy, > isn't that the module it is associated with? Yes. > After all, how does it > know where in the call stack you really are interested in associating > that error with? Or does the warnings module see if that module you > have listed is in the call tree rather than the warning originating > in that module? From the PEP and online docs, it seems that module is > associated with the place the warning is raised. That would work ok, > but it means you turn it off for everything, not just your code, right? Yes. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From stefan at sun.ac.za Wed Oct 3 18:18:26 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 4 Oct 2007 00:18:26 +0200 Subject: [SciPy-dev] adopting Python Style Guide for classes In-Reply-To: <18E62A87-727A-478A-A484-DBF7EA272260@stsci.edu> References: <92C0EE9C-0998-4A82-9A63-A7A3AE92D5C9@stsci.edu> <41A7993F-ED30-45CD-987B-9C488452A59B@stsci.edu> <4703EA93.2010004@gmail.com> <18E62A87-727A-478A-A484-DBF7EA272260@stsci.edu> Message-ID: <20071003221826.GN7241@mentat.za.net> On Wed, Oct 03, 2007 at 03:51:08PM -0400, Perry Greenfield wrote: > Yeah, I know, but the thing that wasn't obvious is which module is > listed when the error is raised. If the warning is raised in numpy, > isn't that the module it is associated with? After all, how does it > know where in the call stack you really are interested in associating > that error with? 
Or does the warnings module see if that module you > have listed is in the call tree rather than the warning originating > in that module? From the PEP and online docs, it seems that module is > associated with the place the warning is raised. That would work ok, > but it means you turn it off for everything, not just your code, > right? A person can also filter out messages by category, for example suppress all warnings that are subclasses of numpy.APIDeprecationWarning. That way, you can still see all "normal" warnings raised. Regards St?fan From chanley at stsci.edu Thu Oct 4 09:43:14 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Thu, 04 Oct 2007 09:43:14 -0400 Subject: [SciPy-dev] swig dependency Message-ID: <4704EDF2.8020807@stsci.edu> Hi, I see that the latest revision of scipy (r3410) now has a swig dependency. What version of swig is now required? I am unable to build scipy with swig version 1.1 (Patch 5). Also, is there a place on the scipy site that I can find a listing of all the dependencies and associated version numbers? Thanks, Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From robince at gmail.com Thu Oct 4 14:56:08 2007 From: robince at gmail.com (Robin) Date: Thu, 4 Oct 2007 19:56:08 +0100 Subject: [SciPy-dev] latest SVN (3412) build fails on Windows Message-ID: Hi, I found that having updated scipy to svn 3412 the new io.nifti module fails: creating build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\io creating build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\io\nifti compile options: '-Iscipy\io\nifti -Iscipy\io\nifti\nifticlib\fsliolib -Iscipy\i o\nifti\nifticlib\niftilib -Iscipy\io\nifti\nifticlib\znzlib -Ic:\Python25\lib\s ite-packages\numpy\core\include -Ic:\Python25\include -Ic:\Python25\PC -c' gcc -mno-cygwin -O2 -Wall -Wstrict-prototypes -Iscipy\io\nifti -Iscipy\io\nifti\ nifticlib\fsliolib -Iscipy\io\nifti\nifticlib\niftilib -Iscipy\io\nifti\nifticli b\znzlib -Ic:\Python25\lib\site-packages\numpy\core\include -Ic:\Python25\includ e -Ic:\Python25\PC -c build\src.win32-2.5\scipy\io\nifti\nifticlib_wrap.c -o bui ld\temp.win32- 2.5\Release\build\src.win32-2.5\scipy\io\nifti\nifticlib_wrap.o g++ -mno-cygwin -shared build\temp.win32- 2.5\Release\build\src.win32-2.5\scipy\i o\nifti\nifticlib_wrap.o -Lc:\Python25\libs -Lc:\Python25\PCBuild -Lbuild\temp.w in32-2.5 -lniftiio -lfslio -lznz -lpython25 -lmsvcr71 -o build\lib.win32- 2.5\sci py\io\nifti\_nifticlib.pyd Found executable C:\cygwin\bin\g++.exe build\temp.win32- 2.5\Release\build\src.win32-2.5\scipy\io\nifti\nifticlib_wrap.o :nifticlib_wrap.c:(.text+0x368c): undefined reference to `_znzprintf' collect2: ld returned 1 exit status error: Command "g++ -mno-cygwin -shared build\temp.win32- 2.5\Release\build\src.w in32-2.5\scipy\io\nifti\nifticlib_wrap.o -Lc:\Python25\libs -Lc:\Python25\PCBuil d -Lbuild\temp.win32-2.5 -lniftiio -lfslio -lznz -lpython25 -lmsvcr71 -o build\l ib.win32-2.5\scipy\io\nifti\_nifticlib.pyd" failed with exit status 1 I am using Cygwin gcc 3.4.4 on Windows XP. Should I submit a ticket or is more information required (or is there a simple fix for something in my setup)? Robin -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From millman at berkeley.edu Thu Oct 4 15:23:36 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 4 Oct 2007 12:23:36 -0700 Subject: [SciPy-dev] swig dependency In-Reply-To: <4704EDF2.8020807@stsci.edu> References: <4704EDF2.8020807@stsci.edu> Message-ID: On 10/4/07, Christopher Hanley wrote: > I see that the latest revision of scipy (r3410) now has a swig > dependency. What version of swig is now required? I am unable to build > scipy with swig version 1.1 (Patch 5). SciPy has required SWIG for at least a year. I am just starting to work with SWIG, but I am using SWIG 1.3.31. It looks like SWIG 1.3 was released in 2000. > Also, is there a place on the scipy site that I can find a listing of > all the dependencies and associated version numbers? I took a quick look and couldn't find a list of prereqs. If noone else gets to it first, I will try and update the site. Thanks, Jarrod From ollinger at wisc.edu Thu Oct 4 15:35:39 2007 From: ollinger at wisc.edu (John Ollinger) Date: Thu, 4 Oct 2007 14:35:39 -0500 Subject: [SciPy-dev] Nifti support In-Reply-To: References: Message-ID: <5b9ba9310710041235s3fa26818i405b875e80047afc@mail.gmail.com> I would like to put in my two-cents worth regarding the nifti format. I wrote a library in python that handles dicom, afni, and nifti format with the appropriate conversions between coordinate systems. The goal was to be able to script FSL, AFNI, SPM and our in-house python code without doing a file conversion at every step. Unfortunately, AFNI defines the quaternion transformation relative to LPI coordinates, while FSL defines it as going to RAI. I don't know how SPM does it - we are still in the process of scripting that. The result is that either the fans of fslview or the fans of AFNI are upset about any given solution. I don't know if this is covered elsewhere or not, but it should receive some attention. John p.s. I would contribute the code if anyone is interested. I completely rewrote the mess that I offered last spring, and it is probably respectable python code now although it could stand being refactored again. On 10/1/07, Matthieu Brucher wrote: > > Excellent news !! > > Matthieu > > 2007/10/1, Jarrod Millman : > > > > On 9/30/07, Matthieu Brucher wrote: > > > I saw that nifti file support was currently in the pipeline. If the > > license > > > issue is solved, I can provide you a nifticlib.i file that is > > compatible > > > with C89 compilers (the converted file needs a C99 compiler, so > > compiling it > > > with MSVC is not possible), as I've done this for my hobbies some > > weeks ago. > > > > Hey Matthieu, > > > > The pynifti license issue has been resolved. Michael Hanke, the > > author, has released his software under an MIT license: > > > > http://sourceforge.net/project/shownotes.php?group_id=126549&release_id=543252 > > > > Early this week, Chris Burns and I will check this code into the SciPy > > repository. Once we do, it would be great if you could either > > integrate your nifticlib.i into the codebase or just send us your > > code. 
> > > > Thanks, > > > > -- > > Jarrod Millman > > Computational Infrastructure for Research Labs > > 10 Giannini Hall, UC Berkeley > > phone: 510.643.4014 > > http://cirl.berkeley.edu/ > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > -- John Ollinger University of Wisconsin Waisman Center, T233 1500 Highland Ave Madison, WI 53711 http://brainimaging.waisman.wisc.edu/~jjo/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Thu Oct 4 15:57:24 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 4 Oct 2007 21:57:24 +0200 Subject: [SciPy-dev] Nifti support In-Reply-To: <5b9ba9310710041235s3fa26818i405b875e80047afc@mail.gmail.com> References: <5b9ba9310710041235s3fa26818i405b875e80047afc@mail.gmail.com> Message-ID: Hi, Here is the nifticlib.i file, not a lot of differences. I would like to put in my two-cents worth regarding the nifti format. I > wrote a library in python that handles dicom, afni, and nifti format with > the appropriate conversions between coordinate systems. Are you able to read DTI image files in DICOM ? That could interest me a lot. The goal was to be able to script FSL, AFNI, SPM and our in-house python > code without doing a file conversion at every step. Do you have a wrapper over FSL and SPM in Python ? Unfortunately, AFNI defines the quaternion transformation relative to LPI > coordinates, while FSL defines it as going to RAI. I don't know how SPM > does it - we are still in the process of scripting that. The result is > that either the fans of fslview or the fans of AFNI are upset about any > given solution. We have the same problem when converting DICOM to analyze, the two pieces of software we use (MRIConvert and MRICRO or something like that) don't use the same coordinate system... I don't know if this is covered elsewhere or not, but it should receive some > attention. > You have mine :) Matthieu -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: nifticlib.i URL: From millman at berkeley.edu Thu Oct 4 15:43:42 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 4 Oct 2007 12:43:42 -0700 Subject: [SciPy-dev] Nifti support In-Reply-To: <5b9ba9310710041235s3fa26818i405b875e80047afc@mail.gmail.com> References: <5b9ba9310710041235s3fa26818i405b875e80047afc@mail.gmail.com> Message-ID: Hey John, Please send me the new version of your code. The nifti code that has been checked in uses niftilib, the official reference implementation of nifti that all the major packages are using. The python wrapper is the official implementation as well. Once we get the build issues sorted out, it would be great if you would be willing to test out the code. If there are any issues that it doesn't get right, we should get them fixed upstream. 
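For anyone who wants to kick the tyres once the build is re-enabled, a round-trip check along these lines is roughly what adapted unit tests could look like (a sketch only: the import path, file names, and the PyNIfTI-style NiftiImage interface are assumptions here, not a settled scipy API):

    import numpy as np
    # Standalone PyNIfTI spells this "from nifti import NiftiImage";
    # the scipy.io location is assumed and may change.
    from scipy.io.nifti import NiftiImage

    nim = NiftiImage('example4d.nii.gz')    # any NIfTI file on disk
    data = nim.data                         # image data as a numpy array
    voxel_size = nim.header['pixdim'][1:4]  # raw header fields as a dict

    # Write the image back out and make sure nothing was mangled.
    nim.save('roundtrip.nii.gz')
    assert np.allclose(NiftiImage('roundtrip.nii.gz').data, data)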
Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From ollinger at wisc.edu Thu Oct 4 16:56:18 2007 From: ollinger at wisc.edu (John Ollinger) Date: Thu, 4 Oct 2007 15:56:18 -0500 Subject: [SciPy-dev] Nifti support In-Reply-To: References: <5b9ba9310710041235s3fa26818i405b875e80047afc@mail.gmail.com> Message-ID: <5b9ba9310710041356n29703013pa23d7c96813dc782@mail.gmail.com> Jarrod, I'll send you the code (probably tomorrow - I'm writing a chunk of grant proposal today). I would be happy to test the new code. The unit tests I wrote for my own code can be easily adapted. John On 10/4/07, Jarrod Millman wrote: > > Hey John, > > Please send me the new version of your code. The nifti code that has > been checked in uses niftilib, the official reference implementation > of nifti that all the major packages are using. The python wrapper is > the official implementation as well. Once we get the build issues > sorted out, it would be great if you would be willing to test out the > code. If there are any issues that it doesn't get right, we should > get them fixed upstream. > > Thanks, > > -- > Jarrod Millman > Computational Infrastructure for Research Labs > 10 Giannini Hall, UC Berkeley > phone: 510.643.4014 > http://cirl.berkeley.edu/ > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -- John Ollinger University of Wisconsin Waisman Center, T233 1500 Highland Ave Madison, WI 53711 http://brainimaging.waisman.wisc.edu/~jjo/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Thu Oct 4 17:04:11 2007 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 04 Oct 2007 16:04:11 -0500 Subject: [SciPy-dev] swig dependency In-Reply-To: References: <4704EDF2.8020807@stsci.edu> Message-ID: <4705554B.2030707@gmail.com> Jarrod Millman wrote: > On 10/4/07, Christopher Hanley wrote: >> I see that the latest revision of scipy (r3410) now has a swig >> dependency. What version of swig is now required? I am unable to build >> scipy with swig version 1.1 (Patch 5). > > SciPy has required SWIG for at least a year. That's not entirely true. The UMFPACK bindings are optional, and sparsetools has the generated files checked in so that no one except those modifying those wrappers needs to have SWIG installed. Checking generated files into source control is somewhat icky, but so is having to answer this question over and over. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From ollinger at wisc.edu Thu Oct 4 17:28:54 2007 From: ollinger at wisc.edu (John Ollinger) Date: Thu, 4 Oct 2007 16:28:54 -0500 Subject: [SciPy-dev] Nifti support In-Reply-To: References: <5b9ba9310710041235s3fa26818i405b875e80047afc@mail.gmail.com> Message-ID: <5b9ba9310710041428h20097bal696add89ddf82770@mail.gmail.com> Hi, It should read DTI images, but I have only tested it on GE scanners. The code does support Siemens scanners, so it wouldn't take much to get it to work. I did not wrap FSL or SPM. I am not sure how to wrap programs written in matlab. Other than to use matlab itself. The code return a 4x4 transformation matrix that takes the raw data into RAI coordinates. It doesn't do anything to the data itself. 
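To make the coordinate question concrete, here is a small numpy sketch (all matrix values are invented) of how such a 4x4 matrix maps voxel indices to scanner coordinates, and why the RAI-versus-LPI disagreement lives entirely in the transform rather than in the voxel data:

    import numpy as np

    # Hypothetical voxel-to-RAI transform: 2 mm isotropic voxels with an
    # origin offset of (-90, -126, -72) mm.
    M = np.array([[2.0, 0.0, 0.0,  -90.0],
                  [0.0, 2.0, 0.0, -126.0],
                  [0.0, 0.0, 2.0,  -72.0],
                  [0.0, 0.0, 0.0,    1.0]])

    ijk = np.array([45.0, 63.0, 36.0, 1.0])   # voxel index, homogeneous form
    xyz = np.dot(M, ijk)[:3]                  # position in mm, RAI convention

    # Negating the first two output axes is all it takes to express the
    # same transform in the opposite x/y convention (RAI vs LPI); the
    # voxel array itself is never touched.
    M_lpi = np.dot(np.diag([-1.0, -1.0, 1.0, 1.0]), M)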
I will put the code and some documentation (yet to be written) on my home page, probably tomorrow but maybe Saturday. John On 10/4/07, Matthieu Brucher wrote: > > Hi, > > Here is the nifticlib.i file, not a lot of differences. > > > I would like to put in my two-cents worth regarding the nifti format. I > > wrote a library in python that handles dicom, afni, and nifti format with > > the appropriate conversions between coordinate systems. > > > > Are you able to read DTI image files in DICOM ? That could interest me a > lot. > > > The goal was to be able to script FSL, AFNI, SPM and our in-house python > > code without doing a file conversion at every step. > > > > Do you have a wrapper over FSL and SPM in Python ? > > > Unfortunately, AFNI defines the quaternion transformation relative to LPI > > coordinates, while FSL defines it as going to RAI. I don't know how SPM > > does it - we are still in the process of scripting that. The result is > > that either the fans of fslview or the fans of AFNI are upset about any > > given solution. > > > We have the same problem when converting DICOM to analyze, the two pieces > of software we use (MRIConvert and MRICRO or something like that) don't use > the same coordinate system... > > > I don't know if this is covered elsewhere or not, but it should receive > > some attention. > > > > You have mine :) > > Matthieu > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > -- John Ollinger University of Wisconsin Waisman Center, T233 1500 Highland Ave Madison, WI 53711 http://brainimaging.waisman.wisc.edu/~jjo/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brian.lee.hawthorne at gmail.com Thu Oct 4 15:55:21 2007 From: brian.lee.hawthorne at gmail.com (Brian Hawthorne) Date: Thu, 4 Oct 2007 12:55:21 -0700 Subject: [SciPy-dev] swig dependency In-Reply-To: References: <4704EDF2.8020807@stsci.edu> Message-ID: <796269930710041255r7ad136d8u7dd19cd6db4c0da9@mail.gmail.com> Is scipy eggable? if so, the egg setup should have a list of dependencies somewhere. If scipy were to be properly eggified, then you could just post the egg dependency spec to the site for those installing the "old fashioned" way. > Also, is there a place on the scipy site that I can find a listing of > > all the dependencies and associated version numbers? > > I took a quick look and couldn't find a list of prereqs. If noone > else gets to it first, I will try and update the site. > > Thanks, > Jarrod > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cburns at berkeley.edu Thu Oct 4 18:07:34 2007 From: cburns at berkeley.edu (Christopher Burns) Date: Thu, 4 Oct 2007 15:07:34 -0700 Subject: [SciPy-dev] latest SVN (3412) build fails on Windows In-Reply-To: References: Message-ID: <764e38540710041507v431abb61u3e7f9ceca416ca1f@mail.gmail.com> Robin, Sorry for the build failure. We've temporarily removed inclusion of the nifti module until we resolve the Windows build. The znzprintf function is specifically not included in the Windows build. I'll contact the nifticlib maintainers regarding a fix. Thanks for letting us know about the bug. 
On 10/4/07, Robin wrote: > > Hi, > > I found that having updated scipy to svn 3412 the new io.nifti module > fails: > > creating build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\io > creating build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\io\nifti > compile options: '-Iscipy\io\nifti -Iscipy\io\nifti\nifticlib\fsliolib > -Iscipy\i > o\nifti\nifticlib\niftilib -Iscipy\io\nifti\nifticlib\znzlib > -Ic:\Python25\lib\s > ite-packages\numpy\core\include -Ic:\Python25\include -Ic:\Python25\PC -c' > > gcc -mno-cygwin -O2 -Wall -Wstrict-prototypes -Iscipy\io\nifti > -Iscipy\io\nifti\ > nifticlib\fsliolib -Iscipy\io\nifti\nifticlib\niftilib > -Iscipy\io\nifti\nifticli > b\znzlib -Ic:\Python25\lib\site-packages\numpy\core\include > -Ic:\Python25\includ > e -Ic:\Python25\PC -c build\src.win32-2.5\scipy\io\nifti\nifticlib_wrap.c-o bui > ld\temp.win32- > 2.5\Release\build\src.win32-2.5\scipy\io\nifti\nifticlib_wrap.o > g++ -mno-cygwin -shared build\temp.win32- > 2.5\Release\build\src.win32-2.5\scipy\i > o\nifti\nifticlib_wrap.o -Lc:\Python25\libs -Lc:\Python25\PCBuild > -Lbuild\temp.w > in32-2.5 -lniftiio -lfslio -lznz -lpython25 -lmsvcr71 -o build\lib.win32- > 2.5\sci > py\io\nifti\_nifticlib.pyd > Found executable C:\cygwin\bin\g++.exe > build\temp.win32- > 2.5\Release\build\src.win32-2.5\scipy\io\nifti\nifticlib_wrap.o > :nifticlib_wrap.c:(.text+0x368c): undefined reference to `_znzprintf' > collect2: ld returned 1 exit status > error: Command "g++ -mno-cygwin -shared build\temp.win32- > 2.5\Release\build\src.w > in32-2.5\scipy\io\nifti\nifticlib_wrap.o -Lc:\Python25\libs > -Lc:\Python25\PCBuil > d -Lbuild\temp.win32-2.5 -lniftiio -lfslio -lznz -lpython25 -lmsvcr71 -o > build\l > ib.win32-2.5\scipy\io\nifti\_nifticlib.pyd " failed with exit status 1 > > I am using Cygwin gcc 3.4.4 on Windows XP. > > Should I submit a ticket or is more information required (or is there a > simple fix for something in my setup)? > > Robin > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > -- Christopher Burns, Software Engineer Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Thu Oct 4 18:19:02 2007 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 04 Oct 2007 17:19:02 -0500 Subject: [SciPy-dev] swig dependency In-Reply-To: <796269930710041255r7ad136d8u7dd19cd6db4c0da9@mail.gmail.com> References: <4704EDF2.8020807@stsci.edu> <796269930710041255r7ad136d8u7dd19cd6db4c0da9@mail.gmail.com> Message-ID: <470566D6.7070102@gmail.com> Brian Hawthorne wrote: > Is scipy eggable? Yes. > if so, the egg setup should have a list of > dependencies somewhere. If scipy were to be properly eggified, then you > could just post the egg dependency spec to the site for those installing > the "old fashioned" way. That dependency information is for Python packages only. There is no place there to specify SWIG or C/FORTRAN libraries. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From millman at berkeley.edu Thu Oct 4 18:49:29 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 4 Oct 2007 15:49:29 -0700 Subject: [SciPy-dev] swig dependency In-Reply-To: <4705554B.2030707@gmail.com> References: <4704EDF2.8020807@stsci.edu> <4705554B.2030707@gmail.com> Message-ID: On 10/4/07, Robert Kern wrote: > Jarrod Millman wrote: > > SciPy has required SWIG for at least a year. > > That's not entirely true. The UMFPACK bindings are optional, and sparsetools has > the generated files checked in so that no one except those modifying those > wrappers needs to have SWIG installed. Checking generated files into source > control is somewhat icky, but so is having to answer this question over and over. Sorry, I misunderstood how SWIG was being used (this is my first time using it). So just to be clear, it sounds like we will need to include the generated files just like sparsetools does. Until we resolve this and a few other issues, Chris and I have disabled nifti builds. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From wnbell at gmail.com Thu Oct 4 20:59:02 2007 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 4 Oct 2007 19:59:02 -0500 Subject: [SciPy-dev] swig dependency In-Reply-To: <4705554B.2030707@gmail.com> References: <4704EDF2.8020807@stsci.edu> <4705554B.2030707@gmail.com> Message-ID: On 10/4/07, Robert Kern wrote: > > That's not entirely true. The UMFPACK bindings are optional, and sparsetools has > the generated files checked in so that no one except those modifying those > wrappers needs to have SWIG installed. Checking generated files into source > control is somewhat icky, but so is having to answer this question over and over. > An added complication is that sparsetools requires the SVN version of SWIG, rather than the current stable release 1.3.31. While I do feel silly checking in 900K worth of SWIG generated C++ code periodically, it's really the only way to ensure sparsetools works. For UMFPACK bindings, it's more reasonable to use the client's SWIG since they're also responsible for installing UMFPACK. Icky is right :) -- Nathan Bell wnbell at gmail.com From david at ar.media.kyoto-u.ac.jp Thu Oct 4 22:16:23 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 05 Oct 2007 11:16:23 +0900 Subject: [SciPy-dev] swig dependency In-Reply-To: <4705554B.2030707@gmail.com> References: <4704EDF2.8020807@stsci.edu> <4705554B.2030707@gmail.com> Message-ID: <47059E77.5090004@ar.media.kyoto-u.ac.jp> Robert Kern wrote: > Jarrod Millman wrote: > >> On 10/4/07, Christopher Hanley wrote: >> >>> I see that the latest revision of scipy (r3410) now has a swig >>> dependency. What version of swig is now required? I am unable to build >>> scipy with swig version 1.1 (Patch 5). >>> >> SciPy has required SWIG for at least a year. >> > > That's not entirely true. The UMFPACK bindings are optional, and sparsetools has > the generated files checked in so that no one except those modifying those > wrappers needs to have SWIG installed. Checking generated files into source > control is somewhat icky, but so is having to answer this question over and over. > Something which may be nice, but I don't know if it is feasable, would be to regularly test whether the files can be regenerated. 
For example, when I wanted to improve scipy.cluster, I could not make swig work at all, and I ended up rewriting the whole thing in plain C (the module was really small). I am afraid that if generated code stays for a long time without being regularly updated, nobody will be able to work on it anymore. David From matthieu.brucher at gmail.com Fri Oct 5 01:45:44 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Fri, 5 Oct 2007 07:45:44 +0200 Subject: [SciPy-dev] latest SVN (3412) build fails on Windows In-Reply-To: <764e38540710041507v431abb61u3e7f9ceca416ca1f@mail.gmail.com> References: <764e38540710041507v431abb61u3e7f9ceca416ca1f@mail.gmail.com> Message-ID: 2007/10/5, Christopher Burns : > > Robin, > > Sorry for the build failure. We've temporarily removed inclusion of the > nifti module until we resolve the Windows build. The znzprintf function is > specifically not included in the Windows build. I'll contact the nifticlib > maintainers regarding a fix. > > Thanks for letting us know about the bug. Strange, I never ran into this. If WIN32 is defined, znzprintf is never called. Matthieu -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Mon Oct 8 15:49:55 2007 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 08 Oct 2007 14:49:55 -0500 Subject: [SciPy-dev] Should NIfTI be in scipy? Message-ID: <470A89E3.6030403@gmail.com> Before we re-enable building scipy.io.nifti, can we discuss whether or not it should be in scipy at all? I don't believe it was ever discussed on the list. In my personal opinion, I don't think we should be including code which is as domain-specific as NIfTI is. It also seems to me that PyNIfTI had a fairly successful life on its own, to the point of being included in some Linux distributions; I don't see what either scipy of PyNIfTI gains by absorbing it into scipy. However, I do see that the build process of scipy getting worse (hopefully, this is temporary) and also that PyNIfTI users would now depend on a much larger library that is mostly irrelevant to them. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From millman at berkeley.edu Mon Oct 8 16:15:46 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 8 Oct 2007 13:15:46 -0700 Subject: [SciPy-dev] Should NIfTI be in scipy? In-Reply-To: <470A89E3.6030403@gmail.com> References: <470A89E3.6030403@gmail.com> Message-ID: On 10/8/07, Robert Kern wrote: > Before we re-enable building scipy.io.nifti, can we discuss whether or not it > should be in scipy at all? I don't believe it was ever discussed on the list. In > my personal opinion, I don't think we should be including code which is as > domain-specific as NIfTI is. It also seems to me that PyNIfTI had a fairly > successful life on its own, to the point of being included in some Linux > distributions; I don't see what either scipy of PyNIfTI gains by absorbing it > into scipy. However, I do see that the build process of scipy getting worse > (hopefully, this is temporary) and also that PyNIfTI users would now depend on a > much larger library that is mostly irrelevant to them. I apologize for the build issues, we have certainly been too eager to get the PyNIfTI code into SciPy. The main reason we were trying to get this code in is that we are in the process of adding some image processing functionality to SciPy. 
Since most of the images we use are NIfTI format, it makes it easier for us if we can read our images using SciPy as well. If the consensus is that NIfTI format shouldn't be supported by scipy.io then we can just keep this functionality inside NIPY. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From aisaac at american.edu Mon Oct 8 16:41:42 2007 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 8 Oct 2007 16:41:42 -0400 Subject: [SciPy-dev] Should NIfTI be in scipy? In-Reply-To: References: <470A89E3.6030403@gmail.com> Message-ID: On Mon, 8 Oct 2007, Jarrod Millman apparently wrote: > The main reason we were trying to get this code in is that > we are in the process of adding some image processing > functionality to SciPy. Since most of the images we use > are NIfTI format, it makes it easier for us if we can read > our images using SciPy as well. Is this too narrowly focused to be a SciKit? Cheers, Alan Isaac From robert.kern at gmail.com Mon Oct 8 16:42:54 2007 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 08 Oct 2007 15:42:54 -0500 Subject: [SciPy-dev] Should NIfTI be in scipy? In-Reply-To: References: <470A89E3.6030403@gmail.com> Message-ID: <470A964E.3030000@gmail.com> Alan G Isaac wrote: > On Mon, 8 Oct 2007, Jarrod Millman apparently wrote: >> The main reason we were trying to get this code in is that >> we are in the process of adding some image processing >> functionality to SciPy. Since most of the images we use >> are NIfTI format, it makes it easier for us if we can read >> our images using SciPy as well. > > Is this too narrowly focused to be a SciKit? scikits can be as focused as they like. However, PyNIfTI already exists as a project and seems to be kept up (the last release being on 2007-09-30). I'm still not clear on why a fork is being made, wherever it happens to go. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From millman at berkeley.edu Mon Oct 8 18:57:38 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 8 Oct 2007 15:57:38 -0700 Subject: [SciPy-dev] Should NIfTI be in scipy? In-Reply-To: <470A964E.3030000@gmail.com> References: <470A89E3.6030403@gmail.com> <470A964E.3030000@gmail.com> Message-ID: Hello, I have removed the NIfTI code from SciPy. For now, PyNIfTI will continue to be developed by Michael Hanke here: http://niftilib.sourceforge.net/pynifti/ Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From bart.vandereycken at cs.kuleuven.be Tue Oct 9 06:15:19 2007 From: bart.vandereycken at cs.kuleuven.be (Bart Vandereycken) Date: Tue, 09 Oct 2007 12:15:19 +0200 Subject: [SciPy-dev] Sparse matvec inconsistency Message-ID: Hi all, the matvec (A*x) routine of sparse shows an inconsistency: the shape of the returned vector is not the same as the vector x when shape==(n,1). A simple change in the _matvec routines in sparse.py should do the trick... BTW I would have posted a ticket for this, but I lost my password for my account... Maybe someone can make a lost password link on the trac? 
-- bart import scipy as SY import scipy.sparse as SP A = SP.spidentity(10) x = SY.ones((10,1)) y = A*x # returns (10,) print y.shape x = SY.ones((10,2)) y = A*x # returns (10,2) print y.shape From stefan at sun.ac.za Tue Oct 9 20:07:51 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Wed, 10 Oct 2007 02:07:51 +0200 Subject: [SciPy-dev] ndimage problems In-Reply-To: <468A7893.1050600@stsci.edu> References: <468A7893.1050600@stsci.edu> Message-ID: <20071010000751.GA9365@mentat.za.net> Hi Chris On Tue, Jul 03, 2007 at 12:25:55PM -0400, Christopher Hanley wrote: > We have found two problems with ndimage. I have filed a ticket #455 on > the scipy trac page. The first problem can be seen with this example: > > > import numpy as n > > from scipy import ndimage as nd > > a = n.ones((10,5),dtype=n.float32) * 12.3 > > x = nd.rotate(a,90.0) > > x > Out[17]: > array([[ 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 0. , 0. ], > [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019], > [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019], > [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019], > [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019, 12.30000019, 12.30000019, > 12.30000019, 12.30000019]], dtype=float32) > }}} > > Notice that the last two entries of the first row are now 0. This is due to a slight round-off error when calculating the affine transformation matrix. cos(pi/180 * angle) isn't exactly zero, so you find that elements that fall *just* outside the boundary gets converted to the cval (0). You can work around the problem in two ways. The first is to setup your own transformation matrix: nd.affine_transform(a,[[0,1],[-1,0]],offset=[0,4],output_shape=(5,10)) The second is to use 'mirror' boundary mode, instead of 'cval', thereby nudging those outliers back into place. > The second problem has to do with the reversing of byte order if you > have big-endian data on a little endian machine. Please see the example > below: Thanks, I'll have a look at that. Regards St?fan From chanley at stsci.edu Tue Oct 9 20:50:46 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Tue, 09 Oct 2007 20:50:46 -0400 Subject: [SciPy-dev] ndimage problems In-Reply-To: <20071010000751.GA9365@mentat.za.net> References: <468A7893.1050600@stsci.edu> <20071010000751.GA9365@mentat.za.net> Message-ID: <470C21E6.10403@stsci.edu> Stefan van der Walt wrote: > Hi Chris > > On Tue, Jul 03, 2007 at 12:25:55PM -0400, Christopher Hanley wrote: > >> We have found two problems with ndimage. I have filed a ticket #455 on >> the scipy trac page. The first problem can be seen with this example: >> >> > import numpy as n >> > from scipy import ndimage as nd >> > a = n.ones((10,5),dtype=n.float32) * 12.3 >> > x = nd.rotate(a,90.0) >> > x >> Out[17]: >> array([[ 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 0. , 0. 
], >> [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019], >> [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019], >> [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019], >> [ 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019, 12.30000019, 12.30000019, >> 12.30000019, 12.30000019]], dtype=float32) >> }}} >> >> Notice that the last two entries of the first row are now 0. >> > > This is due to a slight round-off error when calculating the affine > transformation matrix. cos(pi/180 * angle) isn't exactly zero, so you > find that elements that fall *just* outside the boundary gets > converted to the cval (0). You can work around the problem in two > ways. > > The first is to setup your own transformation matrix: > > nd.affine_transform(a,[[0,1],[-1,0]],offset=[0,4],output_shape=(5,10)) > > The second is to use 'mirror' boundary mode, instead of 'cval', > thereby nudging those outliers back into place. > > >> The second problem has to do with the reversing of byte order if you >> have big-endian data on a little endian machine. Please see the example >> below: >> > > Thanks, I'll have a look at that. > > Regards > St?fan > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > Stefan. I have patched the second issue. The fix went out with the last release. Thank you for the solution to the first problem. Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From chanley at stsci.edu Tue Oct 9 20:55:03 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Tue, 09 Oct 2007 20:55:03 -0400 Subject: [SciPy-dev] ndimage problems In-Reply-To: <20071010000751.GA9365@mentat.za.net> References: <468A7893.1050600@stsci.edu> <20071010000751.GA9365@mentat.za.net> Message-ID: <470C22E7.4090602@stsci.edu> The byteswapping issue was resolved in r3157. Chris From lars at voxdahl.com Wed Oct 10 11:32:51 2007 From: lars at voxdahl.com (Lars Voxen Hansen) Date: Wed, 10 Oct 2007 17:32:51 +0200 Subject: [SciPy-dev] Saving in MATLAB Level 5 MAT-File Format Message-ID: <00196FDD-43B1-48CB-B762-BD8AAD645449@voxdahl.com> I need to save in Matlab Level 5 MAT-File Format. The reson is that level 4 only support matrices (2d arrays), I also need to save some 3d arrays and load them into matlab. I have modified mio5.py in ./scipy/io such that this is posible now (see example below). The modified file mio5.py is attached to this mail. I have tested this on windows XP and mac OS X, where it works for me. Is it posible that this modification can be included in scipy? Best Regards and thanks for developing scipy Lars Voxen Hansen example: ---------------- import numpy as N from scipy import io A = N.ones((10,20,30)) f = open('test.mat','wb') MW = io.mio5.MatFile5Writer(f,do_compression=True,unicode_strings=True) MW.put_variables({'A1':A,'A2':A+1j*A,'s1':'string1','s2':u'string2'}) f.close() d = io.load('test.mat','rb') ------------------ -------------- next part -------------- A non-text attachment was scrubbed... 
Name: mio5.py Type: text/x-python-script Size: 28114 bytes Desc: not available URL: From matthew.brett at gmail.com Wed Oct 10 11:42:58 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 10 Oct 2007 16:42:58 +0100 Subject: [SciPy-dev] Saving in MATLAB Level 5 MAT-File Format In-Reply-To: <00196FDD-43B1-48CB-B762-BD8AAD645449@voxdahl.com> References: <00196FDD-43B1-48CB-B762-BD8AAD645449@voxdahl.com> Message-ID: <1e2af89e0710100842k716f07eey50938eff88d21cb0@mail.gmail.com> Excellent - thank you. I will have a look and put it into mio soon, Matthew On 10/10/07, Lars Voxen Hansen wrote: > I need to save in Matlab Level 5 MAT-File Format. The reson is that > level 4 only support matrices (2d arrays), I also need to save some > 3d arrays and load them into matlab. > > I have modified mio5.py in ./scipy/io such that this is posible now > (see example below). The modified file mio5.py is attached to this > mail. I have tested this on windows XP and mac OS X, where it works > for me. > > Is it posible that this modification can be included in scipy? > > Best Regards and thanks for developing scipy > Lars Voxen Hansen > > > example: > ---------------- > > import numpy as N > from scipy import io > > A = N.ones((10,20,30)) > f = open('test.mat','wb') > MW = io.mio5.MatFile5Writer(f,do_compression=True,unicode_strings=True) > MW.put_variables({'A1':A,'A2':A+1j*A,'s1':'string1','s2':u'string2'}) > f.close() > > d = io.load('test.mat','rb') > > ------------------ > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > From matthieu.brucher at gmail.com Wed Oct 10 15:54:08 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 10 Oct 2007 21:54:08 +0200 Subject: [SciPy-dev] weave not supporting other compilers than gnu ? Message-ID: Hi, I'm trying to get a simple code to work with weave, the blitz converter and ICC, but it seems that only the gnu headers are included in the repository. This means that weave cannot be used with something else than GCC, which is not what is said in the tutorial (BTW, it is outdated, and the examples are as well :()) Is there a reason why those headers are not available ? Matthieu -------------- next part -------------- An HTML attachment was scrubbed... URL: From lars at voxdahl.com Wed Oct 10 16:05:09 2007 From: lars at voxdahl.com (Lars Voxen Hansen) Date: Wed, 10 Oct 2007 22:05:09 +0200 Subject: [SciPy-dev] Saving in MATLAB Level 5 MAT-File Format In-Reply-To: <1e2af89e0710100842k716f07eey50938eff88d21cb0@mail.gmail.com> References: <00196FDD-43B1-48CB-B762-BD8AAD645449@voxdahl.com> <1e2af89e0710100842k716f07eey50938eff88d21cb0@mail.gmail.com> Message-ID: Sorry Matthew, I made the mistake to do some last corrections to the file just be posting. I changed all tabs to spaces, it resulted in some indentation errors. I hope I have not wasted you time dealing with this. A new file is attached. Lars -------------- next part -------------- A non-text attachment was scrubbed... Name: mio5.py Type: text/x-python-script Size: 28133 bytes Desc: not available URL: -------------- next part -------------- > d = io.load('test.mat','rb') Den 10/10/2007 kl. 17.42 skrev Matthew Brett: > Excellent - thank you. I will have a look and put it into mio soon, > > Matthew > > On 10/10/07, Lars Voxen Hansen wrote: >> I need to save in Matlab Level 5 MAT-File Format. 
The reson is that >> level 4 only support matrices (2d arrays), I also need to save some >> 3d arrays and load them into matlab. >> >> I have modified mio5.py in ./scipy/io such that this is posible now >> (see example below). The modified file mio5.py is attached to this >> mail. I have tested this on windows XP and mac OS X, where it works >> for me. >> >> Is it posible that this modification can be included in scipy? >> >> Best Regards and thanks for developing scipy >> Lars Voxen Hansen >> >> >> example: >> ---------------- >> >> import numpy as N >> from scipy import io >> >> A = N.ones((10,20,30)) >> f = open('test.mat','wb') >> MW = io.mio5.MatFile5Writer >> (f,do_compression=True,unicode_strings=True) >> MW.put_variables({'A1':A,'A2':A+1j*A,'s1':'string1','s2':u'string2'}) >> f.close() >> >> d = io.load('test.mat','rb') >> >> ------------------ >> >> >> >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev >> >> >> > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev From robert.kern at gmail.com Wed Oct 10 16:11:00 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 10 Oct 2007 15:11:00 -0500 Subject: [SciPy-dev] Saving in MATLAB Level 5 MAT-File Format In-Reply-To: References: <00196FDD-43B1-48CB-B762-BD8AAD645449@voxdahl.com> <1e2af89e0710100842k716f07eey50938eff88d21cb0@mail.gmail.com> Message-ID: <470D31D4.8030206@gmail.com> Lars Voxen Hansen wrote: > Sorry Matthew, > > I made the mistake to do some last corrections to the file just be > posting. I changed all tabs to spaces, it resulted in some indentation > errors. > > I hope I have not wasted you time dealing with this. A new file is > attached. In the future, please submit patches to the files that are currently in SVN rather than entire replacement files themselves. Thanks. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From matthieu.brucher at gmail.com Wed Oct 10 17:07:58 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 10 Oct 2007 23:07:58 +0200 Subject: [SciPy-dev] How to use weave.ext_function() ? Message-ID: Hi again, I'm trying to put together a simple sample for the use of weave.ext_function. I was first amazed by the lack of the support_code argument, one has to go through customize. Here is the code I want to use : #!/usr/bin/python import scipy.weave support_code=""" #include template float somme(const Array& array) { return std::accumulate(array.begin(), array.end(), ++array); } """ code=""" return_val=somme(a) """ module = scipy.weave.ext_module("weave_ext_module") fonction = scipy.weave.ext_function("somme", code, ['a'], type_converters= scipy.weave.converters.blitz) fonction.customize.add_support_code(support_code) module.add_function(fonction) module.compile() import weave_ext_module print dir(weave_ext_module) import numpy a = numpy.array(((1., 2., 3.), (4., 5., 6.))) print weave_ext_module(a) print weave_ext_module(a=a) When it tries to compile the code, I have a KeyError in ext_tools line 421 for the variable a. I didn't have a problem with inline or blitz, so why this error occurs there ? Matthieu -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david at ar.media.kyoto-u.ac.jp Wed Oct 10 23:55:46 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Thu, 11 Oct 2007 12:55:46 +0900 Subject: [SciPy-dev] For review: first milestone of scons support in numpy Message-ID: <470D9EC2.5000209@ar.media.kyoto-u.ac.jp> Hi, (if you are not interested in numpy developement, you can stop now :) ). Following the discussion a few days ago on using scons to build extensions in numpy, I have reached a somewhat usable milestone, in the numpy.scons branch of numpy, and would like to hear some comments, remarks, critics, etc...: Where to get/see: ----------------- svn repository : http://svn.scipy.org/svn/numpy/branches/numpy.scons looking at the code: http://projects.scipy.org/scipy/numpy/browser/branches/numpy.scons Examples: --------- To see how it feels from the package developer point of view, I have put three really simple examples: - Building a python extension: http://projects.scipy.org/scipy/numpy/browser/branches/numpy.scons/numpy/scons_fake/pyext - Building a ctypes-based package: http://projects.scipy.org/scipy/numpy/browser/branches/numpy.scons/numpy/scons_fake/ctypesext - An example on how to check for libraries and symbols in them: http://projects.scipy.org/scipy/numpy/browser/branches/numpy.scons/numpy/scons_fake/pyext For the numpy user, this should be totally transparent (no difference when building/installing). What: ----- This first milestone implements the following: - adding a scons command to numpy.distutils - adding an add_sconscript function to numpy.distutils setup, for packages willing to use scons - two builders: one for ctypes extension, and one for python extension - a basic implementation to check for libraries (the paths can be overwritten exactly like with distutils, using site.cfg; I have not yet implemented overwriting with environment variables). I have been testing this on the following platforms: - linux with gcc - linux with icc - linux with suncc - windows with MS toolikit 2003 - solaris studio express with suncc - mac os X (tiger, x86) And now ? --------- As discussed previously, I think numpy would benefit from exclusively using scons to build compiled extensions. I have started working on fortran support for scons (separate project, since this may be useful to all scons users, not just numpy): https://launchpad.net/numpy.scons.support and I can already do some non trivial things, not possible with numpy.distutils (automatically figuring out fortran mangling, flags for linking with C, blas/lapack flags). As expected, this is much more robust than distutils approach of hardcoding everything: although I used g77 for development, it worked without any change with ifort, gfortran and sun fortran compiler (on linux). There are still some issues for sure, but I don't see big problems. I don't want to do the work for nothing, though, so I would like to know the feeling of numpy developers first on this direction, in particular which platforms should work before merging consideration, etc... cheers, David From matthieu.brucher at gmail.com Thu Oct 11 02:53:40 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 11 Oct 2007 08:53:40 +0200 Subject: [SciPy-dev] How to use weave.ext_function() ? In-Reply-To: References: Message-ID: Sorry for the noise, I've found my mistake, the variable 'a' should have been declared first. Matthieu 2007/10/10, Matthieu Brucher : > > Hi again, > > I'm trying to put together a simple sample for the use of > weave.ext_function. 
I was first amazed by the lack of the support_code > argument, one has to go through customize. > > Here is the code I want to use : > > #!/usr/bin/python > > import scipy.weave > > support_code=""" > #include > > template > float somme(const Array& array) > { > return std::accumulate( array.begin(), array.end(), ++array); > } > """ > > code=""" > return_val=somme(a) > """ > > module = scipy.weave.ext_module("weave_ext_module") > fonction = scipy.weave.ext_function("somme", code, ['a'], type_converters= > scipy.weave.converters.blitz) > fonction.customize.add_support_code(support_code) > module.add_function(fonction) > module.compile() > > import weave_ext_module > print dir(weave_ext_module) > > import numpy > a = numpy.array(((1., 2., 3.), (4., 5., 6.))) > print weave_ext_module(a) > print weave_ext_module(a=a) > > When it tries to compile the code, I have a KeyError in ext_tools line 421 > for the variable a. > > I didn't have a problem with inline or blitz, so why this error occurs > there ? > > Matthieu > -------------- next part -------------- An HTML attachment was scrubbed... URL: From crwe at post.cz Thu Oct 11 08:23:17 2007 From: crwe at post.cz (=?us-ascii?Q?crwe=20crwe?=) Date: Thu, 11 Oct 2007 14:23:17 +0200 (CEST) Subject: [SciPy-dev] sparse eigensolvers Message-ID: <614.1040-3497-1706098585-1192105397@post.cz> Hello all, what is the current status of scipy support for algorithms such as eigenvalue decomposition on (very large) sparse matrices? I remember looking at scipy some time ago and it seemed to be in the state of 'if you do it, we will include it'. By 'it', I mean wrapping some existing libraries such as PROPACK, ARPACK etc. to work transparently with scipy (and scipy.sparse in particular). The reason I'm asking is that I have a student who would be interested in working on this topic (as his bachelor thesis). Please let me know how things stand from a broader perspective/planning point of view and what could still be done. I found some earlier replies in this mailing list which concerned the topic, but they seemed to deal more with details&particulars. Still, I'll be thankful for any links, cheers, Radim From ondrej at certik.cz Thu Oct 11 08:41:01 2007 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 11 Oct 2007 14:41:01 +0200 Subject: [SciPy-dev] sparse eigensolvers In-Reply-To: <614.1040-3497-1706098585-1192105397@post.cz> References: <614.1040-3497-1706098585-1192105397@post.cz> Message-ID: <85b5c3130710110541t71500453mc8025a07382fd438@mail.gmail.com> > what is the current status of scipy support for algorithms such as eigenvalue decomposition on (very large) sparse matrices? I remember looking at scipy some time ago and it seemed to be in the state of 'if you do it, we will include it'. By 'it', I mean wrapping some existing libraries such as PROPACK, ARPACK etc. to work transparently with scipy (and scipy.sparse in particular). > > The reason I'm asking is that I have a student who would be interested in working on this topic (as his bachelor thesis). Please let me know how things stand from a broader perspective/planning point of view and what could still be done. I found some earlier replies in this mailing list which concerned the topic, but they seemed to deal more with details&particulars. Still, I'll be thankful for any links, I'd be also interested of opinions on this issue. 
Since I think the best library for sparse matrices in Python is petsc4py at the moment: http://code.google.com/p/petsc4py/ (python-petsc4py package in Debian) So I think it's better to support just one way of doing sparse matrices in Python and do it well, not two or three. So I find it nice to support all the solvers from petsc and just provide python bindings to petsc. What are the disadvantages of this approach? Ondrej From hagberg at lanl.gov Thu Oct 11 08:58:23 2007 From: hagberg at lanl.gov (Aric Hagberg) Date: Thu, 11 Oct 2007 06:58:23 -0600 Subject: [SciPy-dev] sparse eigensolvers In-Reply-To: <614.1040-3497-1706098585-1192105397@post.cz> References: <614.1040-3497-1706098585-1192105397@post.cz> Message-ID: <20071011125823.GM16795@bigjim1.lanl.gov> On Thu, Oct 11, 2007 at 02:23:17PM +0200, crwe crwe wrote: > what is the current status of scipy support for algorithms such as eigenvalue decomposition on (very large) sparse matrices? I remember looking at scipy some time ago and it seemed to be in the state of 'if you do it, we will include it'. By 'it', I mean wrapping some existing libraries such as PROPACK, ARPACK etc. to work transparently with scipy (and scipy.sparse in particular). There is an ARPACK wrapper in the scipy sandbox. Right now the Python interface only handles the standard eigenvalue problem Ax=wx (symmetric and nonsymmetric) but the f2py wrapper should be able to handle the generalized eigenproblem and all of the various shifted modes. There are some design decisions to be made regarding the interface and some more code to be written to accomplish that. There is a description of the current state and some ideas for an interface at http://projects.scipy.org/scipy/scipy/ticket/231 http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper Aric From ondrej at certik.cz Thu Oct 11 09:55:17 2007 From: ondrej at certik.cz (Ondrej Certik) Date: Thu, 11 Oct 2007 15:55:17 +0200 Subject: [SciPy-dev] sparse eigensolvers In-Reply-To: <20071011125823.GM16795@bigjim1.lanl.gov> References: <614.1040-3497-1706098585-1192105397@post.cz> <20071011125823.GM16795@bigjim1.lanl.gov> Message-ID: <85b5c3130710110655u632f4e6au2ed8ad848d937484@mail.gmail.com> > There is an ARPACK wrapper in the scipy sandbox. Right now the Python > interface only handles the standard eigenvalue problem Ax=wx > (symmetric and nonsymmetric) but the f2py wrapper should be able to > handle the generalized eigenproblem and all of the various shifted modes. > There are some design decisions to be made regarding the interface > and some more code to be written to accomplish that. > > There is a description of the current state and some ideas for > an interface at > http://projects.scipy.org/scipy/scipy/ticket/231 > http://projects.scipy.org/scipy/scipy/wiki/ArpackWrapper Oh, I misread you are interested in eigensolvers, sorry about that. :) Yes, I don't know about any opensource systematic approach, in the same spirit as petsc4py. And I need this as well, for my master thesis. There is slepc4py: http://t2.unl.edu/documentation/slepc4py which is free, but slepc itself is not free. Arpack is unfortunately too archaic (doesn't perform well on my problems), but I have a very good experience with blzpack[1] and pysparse[2], also primme[3] is good. However, I'd be interested in some kind of unified interface in Python, the same way like one can use petsc4py for Ax=b problems. There are many libraries for eigensolvers and linear solvers though, so I am not sure if they all should be in SciPy? 
Maybe it's better to create a separater project for a unified python interface to eigensolvers, in a similar way like petsc4py is a unified python interface for almost all opensource linear solvers available? Ondrej [1] http://crd.lbl.gov/~osni/ [2] http://people.web.psi.ch/geus/pyfemax/pysparse.html, http://pysparse.sourceforge.net/ [3] http://www.cs.wm.edu/~andreas/software/ From wnbell at gmail.com Thu Oct 11 16:15:13 2007 From: wnbell at gmail.com (Nathan Bell) Date: Thu, 11 Oct 2007 15:15:13 -0500 Subject: [SciPy-dev] Sparse matvec inconsistency In-Reply-To: References: Message-ID: On 10/9/07, Bart Vandereycken wrote: > Hi all, > > the matvec (A*x) routine of sparse shows an inconsistency: the shape of > the returned vector is not the same as the vector x when shape==(n,1). A > simple change in the _matvec routines in sparse.py should do the trick... > > BTW I would have posted a ticket for this, but I lost my password for my > account... Maybe someone can make a lost password link on the trac? I don't have time to look at this right now, but I submitted a bug report: http://scipy.org/scipy/scipy/ticket/514 -- Nathan Bell wnbell at gmail.com From novin01 at gmail.com Fri Oct 12 12:06:57 2007 From: novin01 at gmail.com (Dave Hirschfeld) Date: Fri, 12 Oct 2007 16:06:57 +0000 (UTC) Subject: [SciPy-dev] Bug in scipy.stats.kde? Message-ID: Unless I'm mistaken it appears there's a bug in scipy.stats.kde as demonstrated in the example below. It appears that line 205 is referencing np (numpy) which has not been imported. At the top of the file ravel has been imported from numpy so the solution appears to be to simply remove the reference to np in the integrate_box_1d function. Regards, Dave Python 2.5.1 (r251:54863, Apr 18 2007, 08:51:08) [MSC v.1310 32 bit (Intel)] Type "copyright", "credits" or "license" for more information. IPython 0.8.1 -- An enhanced Interactive Python. In [1]: import scipy In [2]: scipy.__version__ Out[2]: '0.7.0.dev3433' In [3]: from scipy import randn, stats In [4]: pdf = stats.kde.gaussian_kde(randn(1000)) In [5]: pdf.integrate_box_1d(-3,3) --------------------------------------------------------------------------- Traceback (most recent call last) C:\development\sandbox\ in () C:\Python25\Lib\site-packages\scipy\stats\kde.py in integrate_box_1d(self, low, high) 203 raise ValueError("integrate_box_1d() only handles 1D pdfs") 204 --> 205 stdev = np.ravel(sqrt(self.covariance))[0] 206 207 normalized_low = ravel((low - self.dataset)/stdev) : global name 'np' is not defined In [6]: From robert.kern at gmail.com Fri Oct 12 13:17:03 2007 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 12 Oct 2007 12:17:03 -0500 Subject: [SciPy-dev] Bug in scipy.stats.kde? In-Reply-To: References: Message-ID: <470FAC0F.5000202@gmail.com> Dave Hirschfeld wrote: > Unless I'm mistaken it appears there's a bug in scipy.stats.kde as demonstrated > in the example below. > > It appears that line 205 is referencing np (numpy) which has not been imported. > At the top of the file ravel has been imported from numpy so the solution > appears to be to simply remove the reference to np in the integrate_box_1d > function. Fixed, thank you. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From jdh2358 at gmail.com Sat Oct 13 15:44:51 2007 From: jdh2358 at gmail.com (John Hunter) Date: Sat, 13 Oct 2007 14:44:51 -0500 Subject: [SciPy-dev] build error on OSX 10.3 Message-ID: <88e473830710131244h2eb0a450n7f1cf11b4cb9bd44@mail.gmail.com> I have OS X 10.3 installation w/ python2.5, and am trying to build scipy r3435 and am getting the following build error. Any ideas? scipy/io/numpyiomodule.c: In function `numpyio_fromfile': scipy/io/numpyiomodule.c:85: warning: `castfunc' might be used uninitialized in this function gcc -bundle -undefined dynamic_lookup build/temp.macosx-10.3-ppc-2.5/scipy/io/numpyiomodule.o -Lbuild/temp.macosx-10.3-ppc-2.5 -o build/lib.macosx-10.3-ppc-2.5/scipy/io/numpyio.so building 'scipy.lib.blas.fblas' extension compiling C sources C compiler: gcc -fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -DNDEBUG -g -O3 -Wall -Wstrict-prototypes creating build/temp.macosx-10.3-ppc-2.5/build/src.macosx-10.3-ppc-2.5/scipy/lib/blas compile options: '-DNO_ATLAS_INFO=3 -Ibuild/src.macosx-10.3-ppc-2.5 -I/usr/local/lib/python2.5/site-packages/numpy/core/include -I/usr/local/include/python2.5 -c' extra options: '-faltivec -I/System/Library/Frameworks/vecLib.framework/Headers' gcc: build/src.macosx-10.3-ppc-2.5/build/src.macosx-10.3-ppc-2.5/scipy/lib/blas/fblasmodule.c gcc: build/src.macosx-10.3-ppc-2.5/scipy/lib/blas/fblaswrap_veclib_c.c scipy/lib/blas/fblaswrap_veclib_c.c.src:12: error: parse error before '*' token scipy/lib/blas/fblaswrap_veclib_c.c.src:13: warning: function declaration isn't a prototype scipy/lib/blas/fblaswrap_veclib_c.c.src: In function `wcdotu_': scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `N' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: (Each undeclared identifier is reported only once scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: for each function it appears in.) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `X' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `incX' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `Y' undeclared (first use in this function) ...snipsnip.... 
scipy/lib/blas/fblaswrap_veclib_c.c.src:13: warning: function declaration isn't a prototype scipy/lib/blas/fblaswrap_veclib_c.c.src: In function `wzdotc_': scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `N' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `X' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `incX' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `Y' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `incY' undeclared (first use in this function) scipy/lib/blas/fblaswrap_veclib_c.c.src:14: error: `dotc' undeclared (first use in this function) error: Command "gcc -fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -DNO_ATLAS_INFO=3 -Ibuild/src.macosx-10.3-ppc-2.5 -I/usr/local/lib/python2.5/site-packages/numpy/core/include -I/usr/local/include/python2.5 -c build/src.macosx-10.3-ppc-2.5/scipy/lib/blas/fblaswrap_veclib_c.c -o build/temp.macosx-10.3-ppc-2.5/build/src.macosx-10.3-ppc-2.5/scipy/lib/blas/fblaswrap_veclib_c.o -faltivec -I/System/Library/Frameworks/vecLib.framework/Headers" failed with exit status 1 From david at ar.media.kyoto-u.ac.jp Sun Oct 14 04:32:20 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sun, 14 Oct 2007 17:32:20 +0900 Subject: [SciPy-dev] build error on OSX 10.3 In-Reply-To: <88e473830710131244h2eb0a450n7f1cf11b4cb9bd44@mail.gmail.com> References: <88e473830710131244h2eb0a450n7f1cf11b4cb9bd44@mail.gmail.com> Message-ID: <4711D414.2090405@ar.media.kyoto-u.ac.jp> John Hunter wrote: > I have OS X 10.3 installation w/ python2.5, and am trying to build > scipy r3435 and am getting the following build error. Any ideas? > > > Someone else had the same problem: http://projects.scipy.org/scipy/scipy/ticket/510 This seems to be 10.3 specific (it does not appear on 10.4). The problem seems to be that on panther, veclib does not define the complex type. If you know where to get information on the veclib, I would be grateful, because I didn't find a lot of information (in general, I find the documentation on apple frameworks/libs really sparse, or maybe I just do not know where to look). cheers, David From jdh2358 at gmail.com Sun Oct 14 08:18:38 2007 From: jdh2358 at gmail.com (John Hunter) Date: Sun, 14 Oct 2007 07:18:38 -0500 Subject: [SciPy-dev] build error on OSX 10.3 In-Reply-To: <4711D414.2090405@ar.media.kyoto-u.ac.jp> References: <88e473830710131244h2eb0a450n7f1cf11b4cb9bd44@mail.gmail.com> <4711D414.2090405@ar.media.kyoto-u.ac.jp> Message-ID: <88e473830710140518l5a9454a5yff6ec1b774ebbbff@mail.gmail.com> On 10/14/07, David Cournapeau wrote: > This seems to be 10.3 specific (it does not appear on 10.4). The problem > seems to be that on panther, veclib does not define the complex type. If > you know where to get information on the veclib, I would be grateful, > because I didn't find a lot of information (in general, I find the > documentation on apple frameworks/libs really sparse, or maybe I just do > not know where to look). OK, I added the complex.h header as described there to scipy/lib/blas/fblaswrap_veclib_c.c.src and scipy/linalg/src/fblaswrap_veclib_c.c and was able to get scipy compiled. 
I read a few of the links in the page you cited, and the only thing I could find about the brokenness of complex in veclib was: > Complex numbers support may be broken (some problems with passing/returning > complex values on 64-bit targets, and not checked against the requirements of the > C99 standard). However, the _Complex keyword now works. Are any 10.3 platforms 64 bit? If not, there may be little danger in simply adding it, or conditionally adding it for non-64 bit platforms. The other question is: what has recently changed vis-a-vis veclib that is now causing the compile on 10.3 to break. I compiled scipy from svn several months ago, though I don't have the revision number off the top of my head. I had 9 errors on my test, but all were related to loadmat. My guess is that endianness is not being handled properly in the test. Everything else appears to pass. I'm attaching a patch against r3435 if you decide this should be added. JDH -------------- next part -------------- A non-text attachment was scrubbed... Name: veclib.patch Type: text/x-patch Size: 672 bytes Desc: not available URL: From david at ar.media.kyoto-u.ac.jp Mon Oct 15 00:17:34 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 15 Oct 2007 13:17:34 +0900 Subject: Re: [SciPy-dev] build error on OSX 10.3 In-Reply-To: <88e473830710140518l5a9454a5yff6ec1b774ebbbff@mail.gmail.com> References: <88e473830710131244h2eb0a450n7f1cf11b4cb9bd44@mail.gmail.com> <4711D414.2090405@ar.media.kyoto-u.ac.jp> <88e473830710140518l5a9454a5yff6ec1b774ebbbff@mail.gmail.com> Message-ID: <4712E9DE.3000207@ar.media.kyoto-u.ac.jp> John Hunter wrote: > On 10/14/07, David Cournapeau wrote: > > > Complex numbers support may be broken (some problems with > passing/returning > > complex values on 64-bit targets, and not checked against the > requirements of the > > C99 standard). However, the _Complex keyword now works. > > Are any 10.3 platforms 64 bit? If not, there may be little danger in > simply adding it, or conditionally adding it for non-64 bit platforms. > CPU-wise, I think the G5 is 64-bit capable? "64-bit platform" is a bit ambiguous, or at least I am not sure what it really means. Does it mean that the platform can have more than 4 GB of RAM? Does it mean that each process can have more than a 32-bit address space? (Panther supports the former but not the latter.) In our case, the problem seems more related to an API problem (a difference between 32 bits and 64 bits). > The other question is: what has recently changed vis-a-vis veclib that > is now causing the compile on 10.3 to break. I compiled scipy from > svn several months ago, though I don't have the revision number off > the top of my head. > I submitted the above fix because otherwise, you have a problem when using complex values returned from fortran (which was causing the check_dot errors mentioned here sometimes: the veclib fortran interface places complex return values in memory, but gfortran expects them to be in a register, or something like this). But because the information on veclib is so scarce, I did not find a lot of information: veclib gives only ~25 000 hits on google. In particular, I did not find any mention of differences between panther and tiger on this issue. I am a bit reluctant to add the complex.h header before I can understand the issue (especially since panther uses gcc 3.3, which has some problems with complex support).
cheers, David From dmitrey.kroshko at scipy.org Mon Oct 15 11:23:26 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Mon, 15 Oct 2007 18:23:26 +0300 Subject: [SciPy-dev] scikits svn doesn't work? Message-ID: <471385EE.9050908@scipy.org> Hi all, I have some problems with svn, when I try to commit or update, it yields (for example, for ralg.py): svn: Unrecognized URL scheme for 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers/UkrOpt/ralg.py' or (if I call from upper level) svn: Unrecognized URL scheme for 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers' Any idea? When I try to enter http page all works ok: http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers/ Regards, Dmitrey From robert.kern at gmail.com Mon Oct 15 12:48:30 2007 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 15 Oct 2007 11:48:30 -0500 Subject: [SciPy-dev] scikits svn doesn't work? In-Reply-To: <471385EE.9050908@scipy.org> References: <471385EE.9050908@scipy.org> Message-ID: <471399DE.2020700@gmail.com> dmitrey wrote: > Hi all, > I have some problems with svn, when I try to commit or update, it yields > (for example, for ralg.py): > > svn: Unrecognized URL scheme for > 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers/UkrOpt/ralg.py' > or (if I call from upper level) > svn: Unrecognized URL scheme for > 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers' > > Any idea? Are you using Ubuntu? Do you have feisty-proposed as one of your apt repositories? A broken subversion dpkg was uploaded to feisty-proposed in the past couple of days. It's missing the plugin that lets you access http:// SVN repositories. Downgrade to the version in feisty: 1.4.3dfsg1-1ubuntu1. This convinced me not to use the *-proposed repositories anymore. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From dmitrey.kroshko at scipy.org Mon Oct 15 13:20:25 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Mon, 15 Oct 2007 20:20:25 +0300 Subject: [SciPy-dev] scikits svn doesn't work? In-Reply-To: <471399DE.2020700@gmail.com> References: <471385EE.9050908@scipy.org> <471399DE.2020700@gmail.com> Message-ID: <4713A159.4060100@scipy.org> Robert Kern wrote: > dmitrey wrote: > >> Hi all, >> I have some problems with svn, when I try to commit or update, it yields >> (for example, for ralg.py): >> >> svn: Unrecognized URL scheme for >> 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers/UkrOpt/ralg.py' >> or (if I call from upper level) >> svn: Unrecognized URL scheme for >> 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers' >> >> Any idea? >> > > Are you using Ubuntu? Do you have feisty-proposed as one of your apt > repositories? A broken subversion dpkg was uploaded to feisty-proposed in the > past couple of days. It's missing the plugin that lets you access http:// SVN > repositories. Downgrade to the version in feisty: 1.4.3dfsg1-1ubuntu1. > > This convinced me not to use the *-proposed repositories anymore. > > Yes, that's the matter. How it can be done? I have tried to install the version 1.4.3dfsg1-1ubuntu1 but it yields "error - more later version is already installed". D. 
From robert.kern at gmail.com Mon Oct 15 14:43:32 2007 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 15 Oct 2007 13:43:32 -0500 Subject: [SciPy-dev] scikits svn doesn't work? In-Reply-To: <4713A159.4060100@scipy.org> References: <471385EE.9050908@scipy.org> <471399DE.2020700@gmail.com> <4713A159.4060100@scipy.org> Message-ID: <4713B4D4.2020503@gmail.com> dmitrey wrote: > Robert Kern wrote: >> dmitrey wrote: >> >>> Hi all, >>> I have some problems with svn, when I try to commit or update, it yields >>> (for example, for ralg.py): >>> >>> svn: Unrecognized URL scheme for >>> 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers/UkrOpt/ralg.py' >>> or (if I call from upper level) >>> svn: Unrecognized URL scheme for >>> 'http://svn.scipy.org/svn/scikits/trunk/openopt/scikits/openopt/solvers' >>> >>> Any idea? >>> >> Are you using Ubuntu? Do you have feisty-proposed as one of your apt >> repositories? A broken subversion dpkg was uploaded to feisty-proposed in the >> past couple of days. It's missing the plugin that lets you access http:// SVN >> repositories. Downgrade to the version in feisty: 1.4.3dfsg1-1ubuntu1. >> >> This convinced me not to use the *-proposed repositories anymore. >> > Yes, that's the matter. How it can be done? I have tried to install the > version 1.4.3dfsg1-1ubuntu1 but it yields "error - more later version is > already installed". I use aptitude. I searched for the "subversion" package and pressed Enter to view its details. At the bottom, it shows you the versions that are available, and I selected 1.4.3dfsg1-1ubuntu1. It may help to remove feisty-proposed from your list of repositories and update the package list. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From wnbell at gmail.com Mon Oct 15 15:31:32 2007 From: wnbell at gmail.com (Nathan Bell) Date: Mon, 15 Oct 2007 14:31:32 -0500 Subject: [SciPy-dev] Sparse matvec inconsistency In-Reply-To: References: Message-ID: On 10/11/07, Nathan Bell wrote: > On 10/9/07, Bart Vandereycken wrote: > > Hi all, > > > > the matvec (A*x) routine of sparse shows an inconsistency: the shape of > > the returned vector is not the same as the vector x when shape==(n,1). A > > simple change in the _matvec routines in sparse.py should do the trick... > > > > BTW I would have posted a ticket for this, but I lost my password for my > > account... Maybe someone can make a lost password link on the trac? > > I don't have time to look at this right now, but I submitted a bug report: > http://scipy.org/scipy/scipy/ticket/514 This problem should be fixed in r3436 http://scipy.org/scipy/scipy/changeset/3436 I've also added some dimension checking to the sparse matrix matvec(). There was some confusion about why the previous checks were disabled, so let me know if the current ones need to be relaxed. -- Nathan Bell wnbell at gmail.com From matthieu.brucher at gmail.com Wed Oct 17 03:44:21 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 17 Oct 2007 09:44:21 +0200 Subject: [SciPy-dev] swig dependency In-Reply-To: <47059E77.5090004@ar.media.kyoto-u.ac.jp> References: <4704EDF2.8020807@stsci.edu> <4705554B.2030707@gmail.com> <47059E77.5090004@ar.media.kyoto-u.ac.jp> Message-ID: > > > > That's not entirely true. 
The UMFPACK bindings are optional, and > sparsetools has > > the generated files checked in so that no one except those modifying > those > > wrappers needs to have SWIG installed. Checking generated files into > source > > control is somewhat icky, but so is having to answer this question over > and over. > > > Something which may be nice, but I don't know if it is feasable, would > be to regularly test whether the files can be regenerated. For example, > when I wanted to improve scipy.cluster, I could not make swig work at > all, and I ended up rewriting the whole thing in plain C (the module was > really small). I am afraid that if generated code stays for a long time > without being regularly updated, nobody will be able to work on it > anymore. > > David I'm thinking about the same thing. With Pyrex files, it is simple to fall back to the generated files when the Pyrex module cannot be imported (Pyrex has an example with this), and it would be great to have the same thing for distutils (and this would solve the problem of the dependency wouldn't it ?). Matthieu -------------- next part -------------- An HTML attachment was scrubbed... URL: From nwagner at iam.uni-stuttgart.de Wed Oct 17 13:27:10 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 17 Oct 2007 19:27:10 +0200 Subject: [SciPy-dev] Recent changes in scipy.io Message-ID: Hi all, scipy.test(1) results in (0.7.0.dev3440) ====================================================================== ERROR: check loadmat case cell_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case cellnest_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case emptycell_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File 
"/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case object_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case stringarray_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case struct_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case structarr_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' 
TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case structnest_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 327, in matrix_writer_factory raise TypeError, 'Cannot save object arrays in Mat4' TypeError: Cannot save object arrays in Mat4 ====================================================================== ERROR: check loadmat case unicode_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 92, in cc savemat(mat_stream, expected, format) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 131, in savemat MW.put_variables(mdict) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 340, in put_variables matrix_writer_factory(self.file_stream, var, name).write() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 286, in write st = st_arr.item().encode('ascii') UnicodeEncodeError: 'ascii' codec can't encode characters in position 11-33: ordinal not in range(128) ====================================================================== FAIL: check loadmat case 3dmatrix_round_trip ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 94, in cc self._check_case(name, [mat_stream], expected) File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 79, in _check_case self._check_level(k_label, expected, matdict[k]) File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 66, in _check_level assert_array_almost_equal(actual, expected, err_msg=label, decimal=5) File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 232, in assert_array_almost_equal header='Arrays are not almost equal') File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 201, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal test 3dmatrix_round_trip; file , variable test3dmatrix (shapes (6, 4), (2, 3, 4) mismatch) x: array([[ 1., 7., 13., 19.], [ 3., 9., 15., 21.], [ 5., 11., 17., 23.],... y: array([[[ 1, 7, 13, 19], [ 3, 9, 15, 21], [ 5, 11, 17, 23]],... 
====================================================================== FAIL: test_explicit (scipy.tests.test_odr.TestODR) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/odr/tests/test_odr.py", line 50, in test_explicit -8.7849712165253724e-02]), File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 232, in assert_array_almost_equal header='Arrays are not almost equal') File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 217, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 1.26462971e+03, -5.42545890e+01, -8.64250389e-02]) y: array([ 1.26465481e+03, -5.40184100e+01, -8.78497122e-02]) ====================================================================== FAIL: test_multi (scipy.tests.test_odr.TestODR) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/odr/tests/test_odr.py", line 191, in test_multi 0.5101147161764654, 0.5173902330489161]), File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 232, in assert_array_almost_equal header='Arrays are not almost equal') File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 217, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 4.31272063, 2.44289312, 7.76215871, 0.55995622, 0.46423343]) y: array([ 4.37998803, 2.43330576, 8.00288459, 0.51011472, 0.51739023]) Nils ---------------------------------------------------------------------- Ran 1793 tests in 4.942s FAILED (failures=3, errors=9) From dmitrey.kroshko at scipy.org Wed Oct 17 14:02:34 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 17 Oct 2007 21:02:34 +0300 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? Message-ID: <47164E3A.9080806@scipy.org> Hi all, I have been asked about scipy online documentation and all I had to say was this URL http://www.scipy.org -> Documentation -> SciPy API documentation http://www.scipy.org/doc/api_docs/ However, don't you think the documentation from 2006.08.14 is obsolete? I had already mentioned it some weeks ago, but seems like noone had interested. Regards, D. From wnbell at gmail.com Wed Oct 17 16:06:08 2007 From: wnbell at gmail.com (Nathan Bell) Date: Wed, 17 Oct 2007 15:06:08 -0500 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: <47164E3A.9080806@scipy.org> References: <47164E3A.9080806@scipy.org> Message-ID: On 10/17/07, dmitrey wrote: > However, don't you think the documentation from 2006.08.14 is obsolete? Absolutely, it describes my beloved scipy.sparse as "rudimentary" :) -- Nathan Bell wnbell at gmail.com From aisaac at american.edu Wed Oct 17 17:06:32 2007 From: aisaac at american.edu (Alan G Isaac) Date: Wed, 17 Oct 2007 17:06:32 -0400 Subject: [SciPy-dev] =?utf-8?q?don=27t_you_think_the_documentation_from_20?= =?utf-8?b?MDYuMDguMTQgaXMJb2Jzb2xldGU/?= In-Reply-To: <47164E3A.9080806@scipy.org> References: <47164E3A.9080806@scipy.org> Message-ID: Well there is more: http://www.scipy.org/Documentation But I guess your point is that it seems time to regenerate the SciPy API documentation. I do not know who would be in charge of this. Is there a specific "gap" in the documentation that concerns you? 
Cheers, Alan Isaac From travis at enthought.com Wed Oct 17 17:03:40 2007 From: travis at enthought.com (Travis Vaught) Date: Wed, 17 Oct 2007 16:03:40 -0500 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: References: <47164E3A.9080806@scipy.org> Message-ID: Greetings, I've just refreshed the api_docs using endo again. Perhaps Charles Harris would like to chime in about the status of an epydoc-generated set, or some other approach. This is for SciPy 0.6.0 (forgot to note that in the header--I'll regenerate and upload again). Nathan, scipy.sparse is still "rudimentary" according to the doc string in 0.6.0 (and in svn). Feel free to update the doc string in svn and we'll have it listed with the proper respect in the next version of the docs. Best, Travis V. On Oct 17, 2007, at 3:06 PM, Nathan Bell wrote: > On 10/17/07, dmitrey wrote: >> However, don't you think the documentation from 2006.08.14 is >> obsolete? > > Absolutely, it describes my beloved scipy.sparse as "rudimentary" :) > > -- > Nathan Bell wnbell at gmail.com > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From bgoli at sun.ac.za Thu Oct 18 06:35:41 2007 From: bgoli at sun.ac.za (Brett Olivier) Date: Thu, 18 Oct 2007 12:35:41 +0200 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <20070926221107.GH32704@mentat.za.net> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> Message-ID: <200710181235.41279.bgoli@sun.ac.za> On Thursday 27 September 2007 00:11:07 Stefan van der Walt wrote: > > generation of a binary structure 4 ... ok > > generic filter 1Illegal instruction > > > > Hunting around, I found the offending test in ndimage/tests. If > > I rename this directory (so the tests are skipped), the tests > > proceed without crashing. > > This looks like > > http://projects.scipy.org/scipy/scipy/ticket/404 > > I'd be grateful if you can help me narrow it down. 
Hi Stefan I've just run into this again with the new Mandriva (32bit) that uses: gcc version 4.2.2 20070909 (prerelease) (4.2.2-0.RC.1mdv2008.0) GCC 4.2 dumped some build errors that may be useful: (Python 2.5.1, ATLAS 3.8.0, NumPy 1.0.3.1, SciPy 0.6.0) ++++ creating build/temp.linux-i686-2.5/scipy/ndimage creating build/temp.linux-i686-2.5/scipy/ndimage/src compile options: '-Iscipy/ndimage/src -I/usr/lib/python2.5/site-packages/numpy/core/include -I/usr/lib/python2.5/site-packages/numpy/core/include -I/usr/include/python2.5 -c' gcc: scipy/ndimage/src/ni_filters.c gcc: scipy/ndimage/src/nd_image.c scipy/ndimage/src/nd_image.c: In function ?Py_Filter1DFunc?: scipy/ndimage/src/nd_image.c:273: warning: function called through a non-compatible type scipy/ndimage/src/nd_image.c:273: note: if this code is reached, the program will abort scipy/ndimage/src/nd_image.c:274: warning: function called through a non-compatible type scipy/ndimage/src/nd_image.c:274: note: if this code is reached, the program will abort scipy/ndimage/src/nd_image.c: In function ?Py_FilterFunc?: scipy/ndimage/src/nd_image.c:351: warning: function called through a non-compatible type scipy/ndimage/src/nd_image.c:351: note: if this code is reached, the program will abort scipy/ndimage/src/nd_image.c: In function ?Py_Histogram?: scipy/ndimage/src/nd_image.c:1100: warning: function called through a non-compatible type scipy/ndimage/src/nd_image.c:1100: note: if this code is reached, the program will abort ++++ Cheers Brett -- Brett G. Olivier PhD http://www.jjj.sun.ac.za/members#brett Stellenbosch University, South African National Bioinformatics Network From matthew.brett at gmail.com Thu Oct 18 06:38:06 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 18 Oct 2007 11:38:06 +0100 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <200710181235.41279.bgoli@sun.ac.za> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> Message-ID: <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> Excellent - thanks for sending this. I will have a look. Matthew On 10/18/07, Brett Olivier wrote: > On Thursday 27 September 2007 00:11:07 Stefan van der Walt wrote: > > > generation of a binary structure 4 ... ok > > > generic filter 1Illegal instruction > > > > > > Hunting around, I found the offending test in ndimage/tests. If > > > I rename this directory (so the tests are skipped), the tests > > > proceed without crashing. > > > > This looks like > > > > http://projects.scipy.org/scipy/scipy/ticket/404 > > > > I'd be grateful if you can help me narrow it down. 
> > Hi Stefan I've just run into this again with the new Mandriva (32bit) that > uses: gcc version 4.2.2 20070909 (prerelease) (4.2.2-0.RC.1mdv2008.0) > > GCC 4.2 dumped some build errors that may be useful: > (Python 2.5.1, ATLAS 3.8.0, NumPy 1.0.3.1, SciPy 0.6.0) > > ++++ > creating build/temp.linux-i686-2.5/scipy/ndimage > creating build/temp.linux-i686-2.5/scipy/ndimage/src > compile > options: '-Iscipy/ndimage/src -I/usr/lib/python2.5/site-packages/numpy/core/include -I/usr/lib/python2.5/site-packages/numpy/core/include -I/usr/include/python2.5 -c' > gcc: scipy/ndimage/src/ni_filters.c > gcc: scipy/ndimage/src/nd_image.c > scipy/ndimage/src/nd_image.c: In function 'Py_Filter1DFunc': > scipy/ndimage/src/nd_image.c:273: warning: function called through a > non-compatible type > scipy/ndimage/src/nd_image.c:273: note: if this code is reached, the program > will abort > scipy/ndimage/src/nd_image.c:274: warning: function called through a > non-compatible type > scipy/ndimage/src/nd_image.c:274: note: if this code is reached, the program > will abort > scipy/ndimage/src/nd_image.c: In function 'Py_FilterFunc': > scipy/ndimage/src/nd_image.c:351: warning: function called through a > non-compatible type > scipy/ndimage/src/nd_image.c:351: note: if this code is reached, the program > will abort > scipy/ndimage/src/nd_image.c: In function 'Py_Histogram': > scipy/ndimage/src/nd_image.c:1100: warning: function called through a > non-compatible type > scipy/ndimage/src/nd_image.c:1100: note: if this code is reached, the program > will abort > ++++ > > Cheers > Brett > > -- > Brett G. Olivier PhD > http://www.jjj.sun.ac.za/members#brett > Stellenbosch University, South African National Bioinformatics Network > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From matthew.brett at gmail.com Thu Oct 18 09:28:55 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 18 Oct 2007 14:28:55 +0100 Subject: [SciPy-dev] Recent changes in scipy.io In-Reply-To: References: Message-ID: <1e2af89e0710180628k87afcc7o267a945786706320@mail.gmail.com> Thanks for the report... 
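For reference, all of the Mat4 errors in that report reduce to the same thing: a cell-, struct- or object-valued entry reaches the MAT 4 writer, and the MAT 4 format has no way to represent object arrays. A minimal sketch of the failure, with hypothetical variable names, assuming only that savemat accepts a file-like target and a format argument the way the round-trip tests call it:

from StringIO import StringIO
from numpy import empty, object_
from scipy.io import savemat

cell_like = empty((1, 1), dtype=object_)   # an object array, like the cell/struct cases
cell_like[0, 0] = [1.0, 'a string']
# The MAT 4 writer cannot encode object arrays, so this is expected to raise
# the "Cannot save object arrays in Mat4" TypeError shown in the report.
savemat(StringIO(), {'testcell': cell_like}, format='4')

The unicode_round_trip case is the one different failure: there the MAT 4 path calls st_arr.item().encode('ascii') and trips over non-ASCII characters rather than over the array dtype.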
On 10/17/07, Nils Wagner wrote: > [full scipy.test(1) failure report quoted verbatim; trimmed here, see Nils Wagner's "Recent changes in scipy.io" message above] From travis at enthought.com Thu Oct 18 10:50:30 2007 From: travis at enthought.com (Travis Vaught) Date: Thu, 18 Oct 2007 09:50:30 -0500 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: References: <47164E3A.9080806@scipy.org> Message-ID: <022929FD-5BB2-4F57-B1D8-D7BD9A4878CC@enthought.com> So that a motivated volunteer can know how to pitch in, I've documented the steps I used for generating the docs here: http://projects.scipy.org/scipy/scipy/wiki/APIDocumentation When another preferred method comes along, feel free to update the page. Best, Travis V. On Oct 17, 2007, at 4:06 PM, Alan G Isaac wrote: > Well there is more: > http://www.scipy.org/Documentation > > But I guess your point is that it seems > time to regenerate the SciPy API documentation. > I do not know who would be in charge of this. > > Is there a specific "gap" in the documentation > that concerns you? > > Cheers, > Alan Isaac > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From dmitrey.kroshko at scipy.org Thu Oct 18 13:06:56 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Thu, 18 Oct 2007 20:06:56 +0300 Subject: [SciPy-dev] isn't it a bug in scipy<->LAPACK connection? Message-ID: <471792B0.3040905@scipy.org> Hi all, isn't it a bug in scipy<->LAPACK connection?
from scipy.linalg.flapack import dgelss #print dgelss.__doc__ from scipy import rand N = 100 M = 10 #!!with M >= N all works ok!! A = rand(M,N) b = 15+rand(M) v,x,s,rank,info = dgelss(A,b) So it yields "failed in converting 2nd argument `b' of flapack.dgelss to C/Fortran array". dgelss routine allows M to be less, equal or greater to N, you can easily see it from documentation, for example: https://web.kudpc.kyoto-u.ac.jp/doc/HPC-WG/Manual/lapack/dgelss.html Regards, Dmitrey From matthew.brett at gmail.com Thu Oct 18 20:24:37 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 19 Oct 2007 01:24:37 +0100 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> Message-ID: <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> Hi, > > gcc: scipy/ndimage/src/ni_filters.c > > gcc: scipy/ndimage/src/nd_image.c > > scipy/ndimage/src/nd_image.c: In function 'Py_Filter1DFunc': > > scipy/ndimage/src/nd_image.c:273: warning: function called through a > > non-compatible type > > scipy/ndimage/src/nd_image.c:273: note: if this code is reached, the program > > will abort Hmm - well - this is beyond my rudimentary C, and I'd be very grateful for any help from a C person here, but: The problem seems to be this set of lines in nd_image.h (starting line 276): #define NA_OutputArray (*(PyArrayObject* (*) (PyObject*,NumarrayType,int) ) (void *) NA_OutputArray) #define NA_IoArray (*(PyArrayObject* (*) (PyObject*,NumarrayType,int) ) (void *) NA_IoArray) #define NA_NewArray (*(PyArrayObject* (*) (void* buffer, NumarrayType, int, ...) ) (void *) NA_NewArray ) #define NA_elements (*(unsigned long (*) (PyArrayObject*) ) (void *) NA_elements) #define NA_InputArray (*(PyArrayObject* (*) (PyObject*,NumarrayType,int) ) (void *) NA_InputArray) Commenting out these lines allows the code to compile and the ndimage tests to pass. For the offending lines in nd_image.c, (line 273 for example above), the C preprocessor generates this kind of thing: py_ibuffer = (*(PyArrayObject* (*) (void* buffer, NumarrayType, int, ...) ) (void *) NA_NewArray )(iline, NPY_DOUBLE, 1, &ilen); Now, I've no idea what the #define lines are for (it is late, but that's not the reason). Can anyone enlighten me? Matthew. From david at ar.media.kyoto-u.ac.jp Thu Oct 18 20:43:51 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 19 Oct 2007 09:43:51 +0900 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> Message-ID: <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> Matthew Brett wrote: > > > The problem seems to be this set of lines in nd_image.h (starting line > 276): > > #define NA_OutputArray (*(PyArrayObject* (*) > (PyObject*,NumarrayType,int) ) (void *) NA_OutputArray) > #define NA_IoArray (*(PyArrayObject* (*) (PyObject*,NumarrayType,int) > ) (void *) NA_IoArray) > #define NA_NewArray (*(PyArrayObject* (*) (void* buffer, > NumarrayType, int, ...) 
) (void *) NA_NewArray ) > #define NA_elements (*(unsigned long (*) (PyArrayObject*) ) (void *) > NA_elements) > #define NA_InputArray (*(PyArrayObject* (*) > (PyObject*,NumarrayType,int) ) (void *) NA_InputArray) > Wow, nice function pointers. I did not even know this syntax was valid. Redefines the functions with the define is a good obsfucation, too. As far as I understand, the line #define NA_NewArray (*(PyArrayObject* (*) (void* buffer, NumarrayType, int, ...) ) (void *) NA_NewArray ) Means that the previously defined NA_NewArray functions has its output casted, output which is a function pointer. So basically, this is equivalent to #define NA_NewArray (*(func_ptr) (void *) NA_NewArray ) with func_ptr a function pointer, function taking (void* buffer, NumarrayType, int, ...) as arguments, and returning PyArrayObject*. The first * is strange, though, I don't understand what than means (you cannot dereference a function pointer, normally ?). But anyway, the fact that the compiler says that it emits non supported opcodes may suggest that this is relying on undefined behaviour ? cheers, David From dlenski at gmail.com Thu Oct 18 21:26:40 2007 From: dlenski at gmail.com (Dan Lenski) Date: Fri, 19 Oct 2007 01:26:40 +0000 (UTC) Subject: [SciPy-dev] scipy.signal documentation suggestions Message-ID: Hi all, I have been using SciPy, NumPy, and Matplotlib heavily for my nanoelectronics PhD research the last few months, and am a huge fan. Recently, I've started using the scipy.signal module and found the documentation somewhat unclear. The lfilter() function has very nice documentation, but I was trying to convert some basic analog filters to digital filters, and got hung up by the bilinear() function... which doesn't document its inputs or outputs. Having now figured out how it works, I thought I would suggest an expanded documentation string modeled after that of lfilter(), below. Sorry if this isn't the right place to send documentation suggestions. I will be happy to add more doc strings as I learn how to use this package, if anyone wants them. Thanks! Dan Lenski --- bilinear.__doc__ = '''Return a digital filter from an analog filter using the bilinear transform. Description The bilinear transform converts a filter in the continuous-time domain (an analog filter) to a filter in the discrete-time domain (a digital filter). Inputs: b -- The numerator coefficient vector of the analog transform in a 1-D sequence. a -- The denominator coefficient vector of the analog transform in a 1-D sequence. If a[0] is not 1, then both a and b are normalized by a[0]. fs -- The desired sampling frequency of the digital transform. (*Default* = 1.0) Outputs: (bd, ad) bd -- The numerator coefficient vector of the digital transform. ad -- The denominator coefficient vector of the digital transform. Both a and b are normalized such that a[0]=1. Algorithm: Given an analog filter, with rational transfer function in the s-domain: -1 -nb b[0] + b[1]s + ... + b[nb] s H(z) = ---------------------------------- -1 -na a[0] + a[1]s + ... + a[na] s The bilinear transform maps from the s-plane to the z-plane by substituting s = (2*fs)(z-1)/(z+1), where fs is the sampling frequency of the digital filter. This gives the rational transfer function in the z-domain: -1 -nbd bd[0] + bd[1]z + ... + b[nbd] z Y(z) = -------------------------------------- X(z) -1 -nad ad[0] + ad[1]z + ... + a[nad] z Example: Consider a simple first-order low-pass analog filter, with corner frequency w. 
Its transfer function is: -1 -1 -1 1 0 + w s b[0] = 0 b[1] = w H(z) = --------- = ------------- => -1 1 + s w -1 -1 a[0] = 1 a[1] = w 1 + w s A bilinear transform on this filter will produce a digital filter, with (non-normalized) transfer function: -1 w + w z b[0] = w b[1] = w Y(z) = --------------------- => -1 a[0] = w+2fs a[1] = w-2fs w+2fs + (w-2fs) z ''' From millman at berkeley.edu Thu Oct 18 21:39:20 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 18 Oct 2007 18:39:20 -0700 Subject: [SciPy-dev] scipy.signal documentation suggestions In-Reply-To: References: Message-ID: Hey, Thanks a lot for the expanded docstring. We are very interested in improving the documentation and appreciate any help you can give us. I will take a look at the new docstring you submitted. In the meantime, please read over our current coding/documentation guidelines: http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines Here is an example: http://svn.scipy.org/svn/numpy/trunk/numpy/doc/example.py And here is what it looks like when it is rendered: http://www.scipy.org/doc/example/ Thanks again, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From dlenski at gmail.com Thu Oct 18 23:53:24 2007 From: dlenski at gmail.com (Dan Lenski) Date: Fri, 19 Oct 2007 03:53:24 +0000 (UTC) Subject: [SciPy-dev] scipy.signal documentation suggestions References: Message-ID: Jarrod Millman berkeley.edu> writes: > Thanks a lot for the expanded docstring. We are very interested in > improving the documentation and appreciate any help you can give us. > I will take a look at the new docstring you submitted. Sounds good. Thanks for making all this stuff work :-) > In the meantime, please read over our current coding/documentation guidelines: > http://projects.scipy.org/scipy/numpy/wiki/CodingStyleGuidelines Didn't realize that docstrings were supposed to be in reST format, oops. Here's an attempt at getting it right, includes an example in doctest format I hope this version is more useful. Dan --- bilinear.__doc__ = ''''''Return a digital filter from an analog filter using the bilinear transform. The bilinear transform converts a filter in the continuous-time domain (an analog filter) to a filter in the discrete-time domain (a digital filter). *Parameters*: b : {array-like} The numerator coefficient vector of the analog transform. a : {array-like} The denominator coefficient vector of the analog transform. If ``a[0]`` is not 1, then both ``a`` and ``b`` are normalized by ``a[0]``. fs : {float} The desired sampling frequency of the digital transform. (*Default* = 1.0) *Returns*: bd : {array} The numerator coefficient vector of the digital transform. ad : {array} The denominator coefficient vector of the digital transform. Both ``a`` and ``b`` are normalized such that ``a[0]=1``. *Algorithm*: Given an analog filter, with rational transfer function in the s-domain:: -1 -nb b[0] + b[1]s + ... + b[nb] s H(z) = ---------------------------------- -1 -na a[0] + a[1]s + ... + a[na] s The bilinear transform maps from the s-plane to the z-plane by substituting ``s = (2*fs)(z-1)/(z+1)``, where ``fs`` is the sampling frequency of the digital filter. This gives the rational transfer function in the z-domain:: -1 -nbd bd[0] + bd[1]z + ... + b[nbd] z Y(z) = -------------------------------------- X(z) -1 -nad ad[0] + ad[1]z + ... + a[nad] z *Example*: Consider a simple first-order low-pass analog filter, with corner frequency ``w``. 
Its transfer function is:: -1 1 0 + w s b[0] = 0 b[1] = w H(z) = --------- = ------------- => 1 + s/w -1 a[0] = 1 a[1] = w 1 + w s A bilinear transform on this filter will produce a digital filter which can be used as input to `lfilter`. For example: >>> from scipy.signal import * >>> from numpy import * >>> from pylab import * >>> >>> w = 10.0 # corner frequency >>> fs = 1000.0 # sampling rate >>> >>> t = arange(0, 2*pi, 1/fs) >>> x = sin(1*t) + sin(100*t) # test signal >>> >>> b, a = bilinear([0,w], [1,w], fs) >>> y = lfilter(b, a, x) >>> >>> plot(t, x, label="unfiltered") #doctest: +ELLIPSIS [] >>> plot(t, y, label="low-pass filtered") #doctest: +ELLIPSIS [] >>> legend() #doctest: +ELLIPSIS >>> show() >>> ''' From matthieu.brucher at gmail.com Fri Oct 19 01:42:13 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Fri, 19 Oct 2007 07:42:13 +0200 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> Message-ID: This is not very different from the Numpy API interface. You have an array of void* that contains your functions, and then you define macros that use this array with a cast to get the real function. Here, it seems that there is no array, but the result is identical (but it shouldn't redefine an already defined function :|) Matthieu 2007/10/19, David Cournapeau : > > Matthew Brett wrote: > > > > > > > The problem seems to be this set of lines in nd_image.h (starting line > > 276): > > > > #define NA_OutputArray (*(PyArrayObject* (*) > > (PyObject*,NumarrayType,int) ) (void *) NA_OutputArray) > > #define NA_IoArray (*(PyArrayObject* (*) (PyObject*,NumarrayType,int) > > ) (void *) NA_IoArray) > > #define NA_NewArray (*(PyArrayObject* (*) (void* buffer, > > NumarrayType, int, ...) ) (void *) NA_NewArray ) > > #define NA_elements (*(unsigned long (*) (PyArrayObject*) ) (void *) > > NA_elements) > > #define NA_InputArray (*(PyArrayObject* (*) > > (PyObject*,NumarrayType,int) ) (void *) NA_InputArray) > > > Wow, nice function pointers. I did not even know this syntax was valid. > Redefines the functions with the define is a good obsfucation, too. As > far as I understand, the line > > #define NA_NewArray (*(PyArrayObject* (*) (void* buffer, NumarrayType, > int, ...) ) (void *) NA_NewArray ) > > Means that the previously defined NA_NewArray functions has its output > casted, output which is a function pointer. So basically, this is > equivalent to > > #define NA_NewArray (*(func_ptr) (void *) NA_NewArray ) > > with func_ptr a function pointer, function taking (void* buffer, > NumarrayType, int, ...) as arguments, and returning PyArrayObject*. The > first * is strange, though, I don't understand what than means (you > cannot dereference a function pointer, normally ?). > > But anyway, the fact that the compiler says that it emits non supported > opcodes may suggest that this is relying on undefined behaviour ? > > cheers, > > David > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dmitrey.kroshko at scipy.org Fri Oct 19 03:07:59 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Fri, 19 Oct 2007 10:07:59 +0300 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: References: <47164E3A.9080806@scipy.org> Message-ID: <471857CF.9050600@scipy.org> Travis Vaught wrote: > So that a motivated volunteer can know how to pitch in, I've > documented the steps I used for generating the docs here: > http://projects.scipy.org/scipy/scipy/wiki/APIDocumentation > When another preferred method comes along, feel free to update the page. > Best, > Travis V. I suppose you'd better inform scipy-user list as well. IIRC there are mirror websites to scipy API docs like for example http://docs.neuroinf.de/api/scipy/, so they need to be updated as well. Alan G Isaac wrote: > Is there a specific "gap" in the documentation > that concerns you? > No, just some minor changes due to my summer tickets. From david at ar.media.kyoto-u.ac.jp Fri Oct 19 04:24:28 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 19 Oct 2007 17:24:28 +0900 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> Message-ID: <471869BC.1040607@ar.media.kyoto-u.ac.jp> Matthieu Brucher wrote: > This is not very different from the Numpy API interface. You have an > array of void* that contains your functions, and then you define > macros that use this array with a cast to get the real function. Here, > it seems that there is no array, but the result is identical (but it > shouldn't redefine an already defined function :|) Yes, sure, arrays of function pointers are nothing unusual. But for some time, I was really wondering how could a preprocessor symbol refer to itself (it did not, it was just a function). I still do not understand how a function pointer can generate invalid opcode (the error message seems to appear only with gcc 4.2, not with 4.1, which is still really common), though. cheers, David From matthew.brett at gmail.com Fri Oct 19 10:03:16 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 19 Oct 2007 15:03:16 +0100 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <471869BC.1040607@ar.media.kyoto-u.ac.jp> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> Message-ID: <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> Hi, > Yes, sure, arrays of function pointers are nothing unusual. But for some > time, I was really wondering how could a preprocessor symbol refer to > itself (it did not, it was just a function). I still do not understand > how a function pointer can generate invalid opcode (the error message > seems to appear only with gcc 4.2, not with 4.1, which is still really > common), though. Well, simply deleting those #define lines allow the code to compile without warnings and all the tests pass. 
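For what it's worth, a quick smoke test of the code path that used to crash is easy to run after a rebuild; the "generic filter 1" test goes through Py_Filter1DFunc, which is where the warnings point. This is only a sketch using the documented generic_filter1d callback convention, not the actual test body:

import numpy
from scipy import ndimage

def fnc(iline, oline):
    # iline arrives padded to len(oline) + filter_size - 1 elements
    oline[...] = iline[:oline.size] + iline[1:oline.size + 1]

a = numpy.arange(12, dtype=numpy.float64).reshape(3, 4)
print ndimage.generic_filter1d(a, fnc, 2)   # calls the Python callback once per row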
Can anyone who understands this better than me have a guess as to whether deleting these lines will have some adverse effect I am not anticipating? Matthew From nwagner at iam.uni-stuttgart.de Fri Oct 19 10:33:02 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 19 Oct 2007 16:33:02 +0200 Subject: [SciPy-dev] Typo in r3439 Message-ID: Hi all, This is to let you know that r3439 has a bug from scipy.sandbox import lobpcg Traceback (most recent call last): File "", line 1, in ? File "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sandbox/lobpcg/__init__.py", line 4, in ? import lobpcg File "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sandbox/lobpcg/lobpcg.py", line 18, in ? import scipy.sparse as sp File "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sparse/__init__.py", line 5, in ? from sparse import * File "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sparse/sparse.py", line 179 for ind in xrange(start,stop)) + '\n' ^ SyntaxError: invalid syntax Nils From nwagner at iam.uni-stuttgart.de Fri Oct 19 13:57:44 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 19 Oct 2007 19:57:44 +0200 Subject: [SciPy-dev] Typo in r3439 In-Reply-To: References: Message-ID: On Fri, 19 Oct 2007 16:33:02 +0200 "Nils Wagner" wrote: > Hi all, > > This is to let you know that r3439 has a bug > > from scipy.sandbox import lobpcg > Traceback (most recent call last): > File "", line 1, in ? > File > "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sandbox/lobpcg/__init__.py", > line 4, in ? > import lobpcg > File > "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sandbox/lobpcg/lobpcg.py", > line 18, in ? > import scipy.sparse as sp > File > "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sparse/__init__.py", > line 5, in ? > from sparse import * > File > "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/sparse/sparse.py", > line 179 > for ind in xrange(start,stop)) + '\n' > ^ > SyntaxError: invalid syntax > > > Nils > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev I cannot reproduce the SyntaxError with python2.5. Is this problem restricted to python2.3 ? 
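The caret in that traceback sits inside a generator expression, and generator expressions only entered the language in Python 2.4 (PEP 289), which is why the 2.3 parser gives up at exactly that point while 2.5 is happy. A sketch of the difference, with illustrative values rather than the actual sparse.py code:

start, stop = 0, 3   # illustrative values only

# Requires Python 2.4 or later; under 2.3 the next line is a SyntaxError:
print ', '.join(str(ind) for ind in xrange(start, stop)) + '\n'

# A 2.3-compatible spelling wraps the same expression in a list comprehension:
print ', '.join([str(ind) for ind in xrange(start, stop)]) + '\n'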
Nils From nwagner at iam.uni-stuttgart.de Fri Oct 19 14:03:55 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 19 Oct 2007 20:03:55 +0200 Subject: [SciPy-dev] Changeset 3443 Message-ID: ====================================================================== ERROR: test_Robust (scipy.stats.models.tests.test_rlm.TestRegression) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/tests/test_rlm.py", line 18, in test_Robust results = model.fit(Y) File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/rlm.py", line 75, in fit self.scale = self.results.scale = self.estimate_scale(self.results) File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/rlm.py", line 65, in estimate_scale return scale.MAD(resid)**2 File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 35, in MAD unsqueeze(d, axis, a.shape) File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 22, in unsqueeze data.shape = newshape AttributeError: attribute 'shape' of 'numpy.generic' objects is not writable ====================================================================== ERROR: test_Robustdegenerate (scipy.stats.models.tests.test_rlm.TestRegression) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/tests/test_rlm.py", line 26, in test_Robustdegenerate results = model.fit(Y) File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/rlm.py", line 75, in fit self.scale = self.results.scale = self.estimate_scale(self.results) File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/rlm.py", line 65, in estimate_scale return scale.MAD(resid)**2 File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 35, in MAD unsqueeze(d, axis, a.shape) File "/usr/local/lib64/python2.5/site-packages/scipy/stats/models/robust/scale.py", line 22, in unsqueeze data.shape = newshape AttributeError: attribute 'shape' of 'numpy.generic' objects is not writable From robert.kern at gmail.com Fri Oct 19 14:06:02 2007 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 19 Oct 2007 13:06:02 -0500 Subject: [SciPy-dev] Typo in r3439 In-Reply-To: References: Message-ID: <4718F20A.3040604@gmail.com> Nils Wagner wrote: > I cannot reproduce the SyntaxError with python2.5. > Is this problem restricted to python2.3 ? Fixed in SVN. It was a generator expression, which is not available in 2.3. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From matthew.brett at gmail.com Sat Oct 20 05:37:38 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 20 Oct 2007 05:37:38 -0400 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> Message-ID: <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> Hi, On 10/19/07, Matthew Brett wrote: > Hi, > > > Yes, sure, arrays of function pointers are nothing unusual. But for some > > time, I was really wondering how could a preprocessor symbol refer to > > itself (it did not, it was just a function). I still do not understand > > how a function pointer can generate invalid opcode (the error message > > seems to appear only with gcc 4.2, not with 4.1, which is still really > > common), though. > > Well, simply deleting those #define lines allow the code to compile > without warnings and all the tests pass. Can anyone who understands > this better than me have a guess as to whether deleting these lines > will have some adverse effect I am not anticipating? Well - I will assume that the lines were trying to solve some obscure compiler incompatibility, and delete them in SVN this evening, in the hope that either a) no-one has any problems or b) we find the previous problem and fix it in a better way. Please someone let me know if they think that's a bad idea. Matthew > > Matthew > From dmitrey.kroshko at scipy.org Sat Oct 20 07:05:06 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Sat, 20 Oct 2007 14:05:06 +0300 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: <022929FD-5BB2-4F57-B1D8-D7BD9A4878CC@enthought.com> References: <47164E3A.9080806@scipy.org> <022929FD-5BB2-4F57-B1D8-D7BD9A4878CC@enthought.com> Message-ID: <4719E0E2.9030400@scipy.org> In the page http://scipy.org/Documentation both numpy and scipy API docs are still marked "updated 2006.08.14". I would fix it by myself but *numpy API doc is not regenerated* (according to the page http://www.scipy.org/doc/numpy_api_docs/ ) Regards, D. Travis Vaught wrote: > So that a motivated volunteer can know how to pitch in, I've > documented the steps I used for generating the docs here: > > http://projects.scipy.org/scipy/scipy/wiki/APIDocumentation > > When another preferred method comes along, feel free to update the page. > > Best, > > Travis V. > > > On Oct 17, 2007, at 4:06 PM, Alan G Isaac wrote: > > >> Well there is more: >> http://www.scipy.org/Documentation >> >> But I guess your point is that it seems >> time to regenerate the SciPy API documentation. >> I do not know who would be in charge of this. >> >> Is there a specific "gap" in the documentation >> that concerns you? 
>> >> Cheers, >> Alan Isaac >> >> >> >> _______________________________________________ >> Scipy-dev mailing list >> Scipy-dev at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > > > > From dmitrey.kroshko at scipy.org Sat Oct 20 07:15:48 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Sat, 20 Oct 2007 14:15:48 +0300 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: <4719E0E2.9030400@scipy.org> References: <47164E3A.9080806@scipy.org> <022929FD-5BB2-4F57-B1D8-D7BD9A4878CC@enthought.com> <4719E0E2.9030400@scipy.org> Message-ID: <4719E364.6090305@scipy.org> dmitrey wrote: > In the page > http://scipy.org/Documentation > both numpy and scipy API docs are still marked "updated 2006.08.14". > I would fix it by myself but *numpy API doc is not regenerated* > (according to the page > http://www.scipy.org/doc/numpy_api_docs/ > ) > Regards, D. > Excuse my English, I meant the page http://www.scipy.org/doc/numpy_api_docs/ still remains last changed 2006.08.14. D. From millman at berkeley.edu Sat Oct 20 09:57:02 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Sat, 20 Oct 2007 06:57:02 -0700 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: <4719E0E2.9030400@scipy.org> References: <47164E3A.9080806@scipy.org> <022929FD-5BB2-4F57-B1D8-D7BD9A4878CC@enthought.com> <4719E0E2.9030400@scipy.org> Message-ID: On 10/20/07, dmitrey wrote: > In the page > http://scipy.org/Documentation > both numpy and scipy API docs are still marked "updated 2006.08.14". > I would fix it by myself but *numpy API doc is not regenerated* > (according to the page > http://www.scipy.org/doc/numpy_api_docs/ I have removed the dates from the page: http://scipy.org/Documentation It is a bad idea to duplicate the date on that page since it will inevitably get out of date. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From millman at berkeley.edu Sat Oct 20 09:59:10 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Sat, 20 Oct 2007 06:59:10 -0700 Subject: [SciPy-dev] don't you think the documentation from 2006.08.14 is obsolete? In-Reply-To: <4719E364.6090305@scipy.org> References: <47164E3A.9080806@scipy.org> <022929FD-5BB2-4F57-B1D8-D7BD9A4878CC@enthought.com> <4719E0E2.9030400@scipy.org> <4719E364.6090305@scipy.org> Message-ID: On 10/20/07, dmitrey wrote: > Excuse my English, I meant the page > http://www.scipy.org/doc/numpy_api_docs/ > still remains last changed 2006.08.14. If no one else gets to it, I will update the NumPy API documentation later this weekend. 
Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From robince at gmail.com Sun Oct 21 18:45:10 2007 From: robince at gmail.com (Robin) Date: Sun, 21 Oct 2007 23:45:10 +0100 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> Message-ID: On 10/20/07, Matthew Brett wrote: > Well - I will assume that the lines were trying to solve some obscure > compiler incompatibility, and delete them in SVN this evening, in the > hope that either a) no-one has any problems or b) we find the previous > problem and fix it in a better way. Please someone let me know if > they think that's a bad idea. > The most recent build, which I assume includes this change (I saw the header file was updated) also seems to fixed the crashes I was getting on Windows in the test suite on generic 1d filter. Thanks, Robin -------------- next part -------------- An HTML attachment was scrubbed... URL: From loredo at astro.cornell.edu Mon Oct 22 00:19:06 2007 From: loredo at astro.cornell.edu (Tom Loredo) Date: Mon, 22 Oct 2007 00:19:06 -0400 Subject: [SciPy-dev] Possible PPC float bug with special.ndtr Message-ID: <1193026746.471c24ba2936b@astrosun2.astro.cornell.edu> Hi folks- A bit of code misbehaved with my first test case in a bewildering manner. I was coding on a PPC; it seems the test failure may be due to a bug on PPC, as the code works fine on OS X Intel and Linux Intel. It boils down to this simple example. Here is the (expected) behavior on both Intel platforms: In [1]: from scipy.special import ndtr In [2]: ndtr(1.) Out[2]: 0.841344746069 In [3]: arg=(1.2-1.)/.2 In [4]: arg Out[4]: 0.99999999999999978 In [5]: ndtr(arg) Out[5]: 0.841344746069 Here is the (unexpected) behavior on PPC (OS X, Python 2.4.4, numpy 1.0.3.1, scipy 0.5.2.1): In [1]: from scipy.special import ndtr In [2]: ndtr(1.) Out[2]: 0.841344746069 In [3]: arg = (1.2-1.)/.2 In [4]: arg Out[4]: 0.99999999999999978 In [5]: ndtr(arg) Out[5]: nan In [6]: ndtr(arg+1.e-16) Out[6]: nan In [7]: ndtr(arg+2.e-16) Out[7]: 0.841344746069 In [8]: ndtr(arg-1.e-10) Out[8]: nan In [9]: ndtr(arg-1.e-9) Out[9]: 0.841344745827 I.e, there is a sliver of arguments near 1.0 where ndtr (or perhaps erf or erfc, which it relies on) misbehaves (giving nan), but only on PPC. Anyone else see this on PPC? Should I submit this to Trac, or is it something already dealt with? I took a peek at ndtr.c, but it's not obvious what might be going on. It looks like various constants get defined in an architecture-dependent manner, and perhaps an inaccurate definition is underlying this problem. 
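For anyone who wants to map out the affected sliver on their own machine, here is a
small diagnostic sketch. It only uses scipy.special.ndtr and erf, the argument values
are the ones from the session above, and it leans on the identity
ndtr(x) = 0.5*(1 + erf(x/sqrt(2))) to check whether erf is the culprit:

import numpy as np
from scipy.special import ndtr, erf

# Arguments straddling 1.0, taken from the interactive session above.
arg = (1.2 - 1.0) / 0.2
args = [arg - 1e-9, arg - 1e-10, arg, arg + 1e-16, arg + 2e-16, 1.0]

for x in args:
    n = ndtr(x)
    e = 0.5 * (1.0 + erf(x / np.sqrt(2.0)))   # what ndtr should reduce to
    if np.isnan(n) or np.isnan(e):
        flag = 'NAN'
    else:
        flag = 'ok'
    print('x=%.17g  ndtr=%.17g  via_erf=%.17g  %s' % (x, n, e, flag))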
Thanks, Tom ------------------------------------------------- This mail sent through IMP: http://horde.org/imp/ From guyer at nist.gov Mon Oct 22 09:00:34 2007 From: guyer at nist.gov (Jonathan Guyer) Date: Mon, 22 Oct 2007 09:00:34 -0400 Subject: [SciPy-dev] Possible PPC float bug with special.ndtr In-Reply-To: <1193026746.471c24ba2936b@astrosun2.astro.cornell.edu> References: <1193026746.471c24ba2936b@astrosun2.astro.cornell.edu> Message-ID: <5A580A40-D4B9-4853-BD51-26108AF36D9E@nist.gov> On Oct 22, 2007, at 12:19 AM, Tom Loredo wrote: > I.e, there is a sliver of arguments near 1.0 where ndtr (or perhaps > erf or erfc, which it relies on) misbehaves (giving nan), but only > on PPC. > > Anyone else see this on PPC? Yes, we have trouble with erf on PPC Mac's, too. From chanley at stsci.edu Mon Oct 22 09:50:57 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 22 Oct 2007 09:50:57 -0400 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> Message-ID: <471CAAC1.7060400@stsci.edu> > Well - I will assume that the lines were trying to solve some obscure > compiler incompatibility, and delete them in SVN this evening, in the > hope that either a) no-one has any problems or b) we find the previous > problem and fix it in a better way. Please someone let me know if > they think that's a bad idea. > > Matthew > Hi, I now have one of the ndimage unittest failing. ====================================================================== FAIL: zoom 1 ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/chanley/dev/site-packages/lib/python/scipy/ndimage/tests/test_ndi mage.py", line 2128, in test_zoom1 assert numpy.all(arr <= 24) AssertionError ---------------------------------------------------------------------- This was done I may Intel Mac running OS X 10.4.10. I will see if I can reproduce this error in other locations. 
-- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From chanley at stsci.edu Mon Oct 22 09:54:56 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 22 Oct 2007 09:54:56 -0400 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <471CAAC1.7060400@stsci.edu> References: <1190837209.46fabbd97850a@astrosun2.astro.cornell.edu> <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> <471CAAC1.7060400@stsci.edu> Message-ID: <471CABB0.90902@stsci.edu> Christopher Hanley wrote: >> Well - I will assume that the lines were trying to solve some obscure >> compiler incompatibility, and delete them in SVN this evening, in the >> hope that either a) no-one has any problems or b) we find the previous >> problem and fix it in a better way. Please someone let me know if >> they think that's a bad idea. >> >> Matthew >> > Hi, > > I now have one of the ndimage unittest failing. > > ====================================================================== > FAIL: zoom 1 > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/chanley/dev/site-packages/lib/python/scipy/ndimage/tests/test_ndi > mage.py", line 2128, in test_zoom1 > assert numpy.all(arr <= 24) > AssertionError > > ---------------------------------------------------------------------- > > This was done I may Intel Mac running OS X 10.4.10. > > I will see if I can reproduce this error in other locations. > > > This is also an issue on my RHE 3 Linux box. ====================================================================== FAIL: zoom 1 ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/sparty1/dev/site-packages/lib/python/scipy/ndimage/tests/test_ndimage.py", line 2125, in test_zoom1 assert numpy.all(arr[-1,:] >= 20) AssertionError ---------------------------------------------------------------------- I build both scipy and numpy nightly from svn so I have the bleeding edge of both versions. 
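In case it helps anyone bisect outside the full test suite, here is a standalone
snippet that mirrors the two assertions quoted in the tracebacks (the array
construction follows the test; whether the checks hold apparently depends on the
recent ndimage changes under discussion):

import numpy
from scipy import ndimage

arr = numpy.array(range(25)).reshape((5, 5)).astype(float)
out = ndimage.zoom(arr, 2, order=2)

# The two checks from test_zoom1 reported as failing above:
print('all(out <= 24)        -> %s' % numpy.all(out <= 24))
print('all(out[-1, :] >= 20) -> %s' % numpy.all(out[-1, :] >= 20))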
Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From stefan at sun.ac.za Mon Oct 22 12:09:20 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Mon, 22 Oct 2007 18:09:20 +0200 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <471CAAC1.7060400@stsci.edu> References: <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> <471CAAC1.7060400@stsci.edu> Message-ID: <20071022160920.GQ19079@mentat.za.net> Hi Chris, Would you please send me the output of from scipy import ndimage import numpy arr = numpy.array(range(25)).reshape((5,5)).astype(float) arr = ndimage.zoom(arr, 2, order=2) print arr, '\n' This isn't due to Matthew's change, but to mine. Thanks, St?fan On Mon, Oct 22, 2007 at 09:50:57AM -0400, Christopher Hanley wrote: > > > Well - I will assume that the lines were trying to solve some obscure > > compiler incompatibility, and delete them in SVN this evening, in the > > hope that either a) no-one has any problems or b) we find the previous > > problem and fix it in a better way. Please someone let me know if > > they think that's a bad idea. > > > > Matthew > > From chanley at stsci.edu Mon Oct 22 12:27:56 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 22 Oct 2007 12:27:56 -0400 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <20071022160920.GQ19079@mentat.za.net> References: <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> <471CAAC1.7060400@stsci.edu> <20071022160920.GQ19079@mentat.za.net> Message-ID: <471CCF8C.3090101@stsci.edu> Stefan van der Walt wrote: > Hi Chris, > > Would you please send me the output of > > from scipy import ndimage > import numpy > > arr = numpy.array(range(25)).reshape((5,5)).astype(float) > arr = ndimage.zoom(arr, 2, order=2) > print arr, '\n' > > This isn't due to Matthew's change, but to mine. 
> > Thanks, > St?fan > > Hi Stefan, This is what I get on my RHE 3 machine: >>> from scipy import ndimage >>> import numpy >>> >>> arr = numpy.array(range(25)).reshape((5,5)).astype(float) >>> arr = ndimage.zoom(arr, 2, order=2) >>> print arr, '\n' [[ -1.04083409e-16 2.78867102e-01 8.66376180e-01 1.36601307e+00 1.79084967e+00 2.20915033e+00 2.63398693e+00 3.13362382e+00 3.72113290e+00 4.00000000e+00] [ 1.39433551e+00 1.67320261e+00 2.26071169e+00 2.76034858e+00 3.18518519e+00 3.60348584e+00 4.02832244e+00 4.52795933e+00 5.11546841e+00 5.39433551e+00] [ 4.33188090e+00 4.61074800e+00 5.19825708e+00 5.69789397e+00 6.12273057e+00 6.54103123e+00 6.96586783e+00 7.46550472e+00 8.05301380e+00 8.33188090e+00] [ 6.83006536e+00 7.10893246e+00 7.69644154e+00 8.19607843e+00 8.62091503e+00 9.03921569e+00 9.46405229e+00 9.96368918e+00 1.05511983e+01 1.08300654e+01] [ 8.95424837e+00 9.23311547e+00 9.82062455e+00 1.03202614e+01 1.07450980e+01 1.11633987e+01 1.15882353e+01 1.20878722e+01 1.26753813e+01 1.29542484e+01] [ 1.10457516e+01 1.13246187e+01 1.19121278e+01 1.24117647e+01 1.28366013e+01 1.32549020e+01 1.36797386e+01 1.41793755e+01 1.47668845e+01 1.50457516e+01] [ 1.31699346e+01 1.34488017e+01 1.40363108e+01 1.45359477e+01 1.49607843e+01 1.53790850e+01 1.58039216e+01 1.63035585e+01 1.68910675e+01 1.71699346e+01] [ 1.56681191e+01 1.59469862e+01 1.65344953e+01 1.70341322e+01 1.74589688e+01 1.78772694e+01 1.83021060e+01 1.88017429e+01 1.93892520e+01 1.96681191e+01] [ 1.86056645e+01 1.88845316e+01 1.94720407e+01 1.99716776e+01 2.03965142e+01 2.08148148e+01 2.12396514e+01 2.17392883e+01 2.23267974e+01 2.26056645e+01] [ 2.00000000e+01 2.02788671e+01 2.08663762e+01 2.13660131e+01 2.17908497e+01 2.22091503e+01 2.26339869e+01 2.31336238e+01 2.37211329e+01 2.40000000e+01]] >>> Thanks, Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From chanley at stsci.edu Mon Oct 22 12:31:36 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 22 Oct 2007 12:31:36 -0400 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <20071022160920.GQ19079@mentat.za.net> References: <20070926221107.GH32704@mentat.za.net> <200710181235.41279.bgoli@sun.ac.za> <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> <471CAAC1.7060400@stsci.edu> <20071022160920.GQ19079@mentat.za.net> Message-ID: <471CD068.9050208@stsci.edu> Stefan van der Walt wrote: > Hi Chris, > > Would you please send me the output of > > from scipy import ndimage > import numpy > > arr = numpy.array(range(25)).reshape((5,5)).astype(float) > arr = ndimage.zoom(arr, 2, order=2) > print arr, '\n' > > This isn't due to Matthew's change, but to mine. 
> > Thanks, > St?fan > Below is what I get from my Intel Mac: >>> from scipy import ndimage import numpy>>> import numpy >>> >>> arr = numpy.array(range(25)).reshape((5,5)).astype(float) >>> arr = ndimage.zoom(arr,2,order=2) >>> print arr, '\n' [[ 1.55431223e-15 2.78867102e-01 8.66376180e-01 1.36601307e+00 1.79084967e+00 2.20915033e+00 2.63398693e+00 3.13362382e+00 3.72113290e+00 4.00000000e+00] [ 1.39433551e+00 1.67320261e+00 2.26071169e+00 2.76034858e+00 3.18518519e+00 3.60348584e+00 4.02832244e+00 4.52795933e+00 5.11546841e+00 5.39433551e+00] [ 4.33188090e+00 4.61074800e+00 5.19825708e+00 5.69789397e+00 6.12273057e+00 6.54103123e+00 6.96586783e+00 7.46550472e+00 8.05301380e+00 8.33188090e+00] [ 6.83006536e+00 7.10893246e+00 7.69644154e+00 8.19607843e+00 8.62091503e+00 9.03921569e+00 9.46405229e+00 9.96368918e+00 1.05511983e+01 1.08300654e+01] [ 8.95424837e+00 9.23311547e+00 9.82062455e+00 1.03202614e+01 1.07450980e+01 1.11633987e+01 1.15882353e+01 1.20878722e+01 1.26753813e+01 1.29542484e+01] [ 1.10457516e+01 1.13246187e+01 1.19121278e+01 1.24117647e+01 1.28366013e+01 1.32549020e+01 1.36797386e+01 1.41793755e+01 1.47668845e+01 1.50457516e+01] [ 1.31699346e+01 1.34488017e+01 1.40363108e+01 1.45359477e+01 1.49607843e+01 1.53790850e+01 1.58039216e+01 1.63035585e+01 1.68910675e+01 1.71699346e+01] [ 1.56681191e+01 1.59469862e+01 1.65344953e+01 1.70341322e+01 1.74589688e+01 1.78772694e+01 1.83021060e+01 1.88017429e+01 1.93892520e+01 1.96681191e+01] [ 1.86056645e+01 1.88845316e+01 1.94720407e+01 1.99716776e+01 2.03965142e+01 2.08148148e+01 2.12396514e+01 2.17392883e+01 2.23267974e+01 2.26056645e+01] [ 2.00000000e+01 2.02788671e+01 2.08663762e+01 2.13660131e+01 2.17908497e+01 2.22091503e+01 2.26339869e+01 2.31336238e+01 2.37211329e+01 2.40000000e+01]] Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From stefan at sun.ac.za Mon Oct 22 16:34:58 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Mon, 22 Oct 2007 22:34:58 +0200 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <471CCF8C.3090101@stsci.edu> References: <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> <471CAAC1.7060400@stsci.edu> <20071022160920.GQ19079@mentat.za.net> <471CCF8C.3090101@stsci.edu> Message-ID: <20071022203458.GR19079@mentat.za.net> Thanks, Chris. Does the test pass now? Regards St?fan On Mon, Oct 22, 2007 at 12:27:56PM -0400, Christopher Hanley wrote: > > > Stefan van der Walt wrote: > > Hi Chris, > > > > Would you please send me the output of > > > > from scipy import ndimage > > import numpy > > > > arr = numpy.array(range(25)).reshape((5,5)).astype(float) > > arr = ndimage.zoom(arr, 2, order=2) > > print arr, '\n' > > > > This isn't due to Matthew's change, but to mine. 
> > > > Thanks, > > St?fan > > > > > > Hi Stefan, > > This is what I get on my RHE 3 machine: > >>> from scipy import ndimage > >>> import numpy > >>> > >>> arr = numpy.array(range(25)).reshape((5,5)).astype(float) > >>> arr = ndimage.zoom(arr, 2, order=2) > >>> print arr, '\n' > [[ -1.04083409e-16 2.78867102e-01 8.66376180e-01 1.36601307e+00 > 1.79084967e+00 2.20915033e+00 2.63398693e+00 3.13362382e+00 > 3.72113290e+00 4.00000000e+00] > [ 1.39433551e+00 1.67320261e+00 2.26071169e+00 2.76034858e+00 > 3.18518519e+00 3.60348584e+00 4.02832244e+00 4.52795933e+00 > 5.11546841e+00 5.39433551e+00] > [ 4.33188090e+00 4.61074800e+00 5.19825708e+00 5.69789397e+00 > 6.12273057e+00 6.54103123e+00 6.96586783e+00 7.46550472e+00 > 8.05301380e+00 8.33188090e+00] > [ 6.83006536e+00 7.10893246e+00 7.69644154e+00 8.19607843e+00 > 8.62091503e+00 9.03921569e+00 9.46405229e+00 9.96368918e+00 > 1.05511983e+01 1.08300654e+01] > [ 8.95424837e+00 9.23311547e+00 9.82062455e+00 1.03202614e+01 > 1.07450980e+01 1.11633987e+01 1.15882353e+01 1.20878722e+01 > 1.26753813e+01 1.29542484e+01] > [ 1.10457516e+01 1.13246187e+01 1.19121278e+01 1.24117647e+01 > 1.28366013e+01 1.32549020e+01 1.36797386e+01 1.41793755e+01 > 1.47668845e+01 1.50457516e+01] > [ 1.31699346e+01 1.34488017e+01 1.40363108e+01 1.45359477e+01 > 1.49607843e+01 1.53790850e+01 1.58039216e+01 1.63035585e+01 > 1.68910675e+01 1.71699346e+01] > [ 1.56681191e+01 1.59469862e+01 1.65344953e+01 1.70341322e+01 > 1.74589688e+01 1.78772694e+01 1.83021060e+01 1.88017429e+01 > 1.93892520e+01 1.96681191e+01] > [ 1.86056645e+01 1.88845316e+01 1.94720407e+01 1.99716776e+01 > 2.03965142e+01 2.08148148e+01 2.12396514e+01 2.17392883e+01 > 2.23267974e+01 2.26056645e+01] > [ 2.00000000e+01 2.02788671e+01 2.08663762e+01 2.13660131e+01 > 2.17908497e+01 2.22091503e+01 2.26339869e+01 2.31336238e+01 > 2.37211329e+01 2.40000000e+01]] > > >>> > > Thanks, > Chris From chanley at stsci.edu Mon Oct 22 16:51:24 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 22 Oct 2007 16:51:24 -0400 Subject: [SciPy-dev] Illegal instruction in ndimage test In-Reply-To: <20071022203458.GR19079@mentat.za.net> References: <1e2af89e0710180338r265365d4nf89c4b5fe1296f4b@mail.gmail.com> <1e2af89e0710181724lc09569ft11c55c52607896af@mail.gmail.com> <4717FDC7.2090002@ar.media.kyoto-u.ac.jp> <471869BC.1040607@ar.media.kyoto-u.ac.jp> <1e2af89e0710190703u5ae40a01yaf5bb13961b5150a@mail.gmail.com> <1e2af89e0710200237v33227fc9u578781d3353b7d0a@mail.gmail.com> <471CAAC1.7060400@stsci.edu> <20071022160920.GQ19079@mentat.za.net> <471CCF8C.3090101@stsci.edu> <20071022203458.GR19079@mentat.za.net> Message-ID: <9CFDC971-1761-42E4-B20D-24C026EA7A58@stsci.edu> Yes. Thanks. Cheers, Chris On Oct 22, 2007, at 4:34 PM, Stefan van der Walt wrote: > Thanks, Chris. Does the test pass now? > > Regards > St?fan > > On Mon, Oct 22, 2007 at 12:27:56PM -0400, Christopher Hanley wrote: >> >> >> Stefan van der Walt wrote: >>> Hi Chris, >>> >>> Would you please send me the output of >>> >>> from scipy import ndimage >>> import numpy >>> >>> arr = numpy.array(range(25)).reshape((5,5)).astype(float) >>> arr = ndimage.zoom(arr, 2, order=2) >>> print arr, '\n' >>> >>> This isn't due to Matthew's change, but to mine. 
>>> >>> Thanks, >>> St?fan >>> >>> >> >> Hi Stefan, >> >> This is what I get on my RHE 3 machine: >>>>> from scipy import ndimage >>>>> import numpy >>>>> >>>>> arr = numpy.array(range(25)).reshape((5,5)).astype(float) >>>>> arr = ndimage.zoom(arr, 2, order=2) >>>>> print arr, '\n' >> [[ -1.04083409e-16 2.78867102e-01 8.66376180e-01 1.36601307e+00 >> 1.79084967e+00 2.20915033e+00 2.63398693e+00 3.13362382e+00 >> 3.72113290e+00 4.00000000e+00] >> [ 1.39433551e+00 1.67320261e+00 2.26071169e+00 2.76034858e+00 >> 3.18518519e+00 3.60348584e+00 4.02832244e+00 4.52795933e+00 >> 5.11546841e+00 5.39433551e+00] >> [ 4.33188090e+00 4.61074800e+00 5.19825708e+00 5.69789397e+00 >> 6.12273057e+00 6.54103123e+00 6.96586783e+00 7.46550472e+00 >> 8.05301380e+00 8.33188090e+00] >> [ 6.83006536e+00 7.10893246e+00 7.69644154e+00 8.19607843e+00 >> 8.62091503e+00 9.03921569e+00 9.46405229e+00 9.96368918e+00 >> 1.05511983e+01 1.08300654e+01] >> [ 8.95424837e+00 9.23311547e+00 9.82062455e+00 1.03202614e+01 >> 1.07450980e+01 1.11633987e+01 1.15882353e+01 1.20878722e+01 >> 1.26753813e+01 1.29542484e+01] >> [ 1.10457516e+01 1.13246187e+01 1.19121278e+01 1.24117647e+01 >> 1.28366013e+01 1.32549020e+01 1.36797386e+01 1.41793755e+01 >> 1.47668845e+01 1.50457516e+01] >> [ 1.31699346e+01 1.34488017e+01 1.40363108e+01 1.45359477e+01 >> 1.49607843e+01 1.53790850e+01 1.58039216e+01 1.63035585e+01 >> 1.68910675e+01 1.71699346e+01] >> [ 1.56681191e+01 1.59469862e+01 1.65344953e+01 1.70341322e+01 >> 1.74589688e+01 1.78772694e+01 1.83021060e+01 1.88017429e+01 >> 1.93892520e+01 1.96681191e+01] >> [ 1.86056645e+01 1.88845316e+01 1.94720407e+01 1.99716776e+01 >> 2.03965142e+01 2.08148148e+01 2.12396514e+01 2.17392883e+01 >> 2.23267974e+01 2.26056645e+01] >> [ 2.00000000e+01 2.02788671e+01 2.08663762e+01 2.13660131e+01 >> 2.17908497e+01 2.22091503e+01 2.26339869e+01 2.31336238e+01 >> 2.37211329e+01 2.40000000e+01]] >> >>>>> >> >> Thanks, >> Chris > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev From cimrman3 at ntc.zcu.cz Tue Oct 23 09:54:49 2007 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 23 Oct 2007 15:54:49 +0200 Subject: [SciPy-dev] ANN: SFE-00.31.06 release Message-ID: <471DFD29.5040007@ntc.zcu.cz> I am happy to announce the version 00.31.06 of SFE, featuring acoustic band gaps computation, rigid body motion constraints, new solver classes and reorganization, and regular bug fixes and updates, see http://ui505p06-mbs.ntc.zcu.cz/sfe. SFE is a finite element analysis software written almost entirely in Python. This version is released under BSD license. best wishes, r. From dmitrey.kroshko at scipy.org Fri Oct 26 06:57:01 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Fri, 26 Oct 2007 13:57:01 +0300 Subject: [SciPy-dev] scipy sparse documentation Message-ID: <4721C7FD.5090105@scipy.org> hi all, could you inform me where is the most appropriate documentation of using sparse matrices? is it possible to add the one to http://www.scipy.org/SciPy_packages? (currently it has just an empty page related to sparse) in FAQ there are 2 packages mentioned: sparse and sandbox pysparse. What about the latter - will it be next default scipy sparse toolbox or the one is obsolete? http://www.scipy.org/FAQ?highlight=%28sparse%29 Regards, D. 
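To make the request concrete: even a few lines like the following, showing the
data, indices and indptr arrays behind csr_matrix, would already be a useful
start for the wiki page (just a sketch built from a small dense array):

import numpy as np
from scipy import sparse

A = np.array([[1, 0, 2],
              [0, 0, 3],
              [4, 5, 6]], dtype=float)

S = sparse.csr_matrix(A)

print(S.data)     # nonzero values stored row by row: 1. 2. 3. 4. 5. 6.
print(S.indices)  # column index of each stored value: [0 2 2 0 1 2]
print(S.indptr)   # row i occupies data[indptr[i]:indptr[i+1]]: [0 2 3 6]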
From wnbell at gmail.com Fri Oct 26 11:31:28 2007 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 26 Oct 2007 10:31:28 -0500 Subject: [SciPy-dev] scipy sparse documentation In-Reply-To: <4721C7FD.5090105@scipy.org> References: <4721C7FD.5090105@scipy.org> Message-ID: On 10/26/07, dmitrey wrote: > could you inform me where is the most appropriate documentation of using > sparse matrices? I think this is the best we currently have: http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/sparse/info.py We should probably add some information about the indptr,indices, and data arrays which constitute the CSR/CSC formats. Let us know if anything else is missing. > in FAQ there are 2 packages mentioned: sparse and sandbox pysparse. > What about the latter - will it be next default scipy sparse toolbox or > the one is obsolete? > http://www.scipy.org/FAQ?highlight=%28sparse%29 PySparse was dropped a few months ago, so scipy.sparse is the only option. -- Nathan Bell wnbell at gmail.com From wnbell at gmail.com Fri Oct 26 16:02:35 2007 From: wnbell at gmail.com (Nathan Bell) Date: Fri, 26 Oct 2007 15:02:35 -0500 Subject: [SciPy-dev] changes to scipy.sparse Message-ID: There have been some recent changes to scipy.sparse that may potentially affect others. http://projects.scipy.org/scipy/scipy/changeset/3465 Sparse matrices (csr_matrix, csc_matrix, and coo_matrix) now support 64-bit indices and the following dtypes: int8, uint8, int16, int32, int64, float32, float64, complex64, complex128 Unsupported dtypes will be accepted by the sparse objects, but they'll be upcasted (by SWIG) to one of the above types when sent to one of the sparsetools functions. Operations between different data types should follow the expected upcasting procedure (e.g. complex64 * float64 -> complex128). The only issue I found when using non-floating point dtypes occurred in linsolve.spsolve() which assume a FP dtype. I created a method spmatrix.asfptype() that returns a sparse matrix withan appropriate FP type when necessary. In addition, the member variable spmatrix.fptype was removed since it pertained only to SuperLU. A few small edits to linsolve.spsolve were made to account for these changes. I have not tested the UMFPACK bindings yet, so let me know if there are any problems. I encountered an unexpected bug related to io.mio (matlab IO). The mio methods were creating sparse matrices with non-native byte order. The SWIG magic in sparsetools accounts for non-contiguous arrays, but not endianness. As a temporary fix I cast all input arrays to native byte order using: 44 def to_native(A): 45 if not A.dtype.isnative: 46 return A.astype(A.dtype.newbyteorder('native')) 47 else: 48 return A This seems to work, however I'd like to improve my SWIG to account for such issues instead. I believe the current SWIG bindings in the numpy SVN check for, but do not change byteorder. Does anyone have SWIG that accounts for this? With the additional data types, the SWIG output for sparsetools now weighs in a nearly 3MB and takes about 2.5 minutes to compile on my 1.8 GHz Athlon 64. While this is a rather large amount of C++ I don't think we can assume that end-users will have SWIG, let alone the a recent SVN version required by sparsetools for some systems. The size and compile time are due to the large number of functions (23*9*2 = 414) generated by the templates. A while ago I wrote a small C++ wrapper for numpy complex types so templated code (like sparsetools) can make sense of T + T and T == 0 etc. 
The numpy SWIG examples seem to lack complex support, so I thought this may be valuable to others. http://projects.scipy.org/scipy/scipy/browser/trunk/scipy/sparse/sparsetools/complex_ops.h Also, I plan to add support for block CSR and CSC formats to sparse in the near future. As always, comments and questions are welcome. -- Nathan Bell wnbell at gmail.com From nwagner at iam.uni-stuttgart.de Sat Oct 27 12:13:03 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Sat, 27 Oct 2007 18:13:03 +0200 Subject: [SciPy-dev] Bug in sparse.py Message-ID: Nathan, scipy.test(1) reveals a bug in sparse.py (recent svn) ====================================================================== ERROR: check loadmat case sparse ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 84, in cc self._check_case(name, files, expected) File "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", line 74, in _check_case matdict = loadmat(file_name) File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", line 96, in loadmat matfile_dict = MR.get_variables() File "/usr/local/lib64/python2.5/site-packages/scipy/io/miobase.py", line 277, in get_variables res = getter.get_array() File "/usr/local/lib64/python2.5/site-packages/scipy/io/miobase.py", line 317, in get_array arr = self.get_raw_array() File "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", line 174, in get_raw_array return scipy.sparse.csc_matrix((vals,ij), dims) File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/sparse.py", line 937, in __init__ dtype=self.dtype).tocsc() File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/sparse.py", line 2264, in __init__ self._check() File "/usr/local/lib64/python2.5/site-packages/scipy/sparse/sparse.py", line 2277, in _check raise TypeError,'row array has invalid dtype' TypeError: row array has invalid dtype Nils From matthew.brett at gmail.com Sat Oct 27 12:36:39 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 27 Oct 2007 12:36:39 -0400 Subject: [SciPy-dev] Bug in sparse.py In-Reply-To: References: Message-ID: <1e2af89e0710270936y133c046en28f6309558796b65@mail.gmail.com> Hi, I'm guessing that's my fault (mio) rather than Nathan's - I'll take a look. 
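While we dig, a standalone sketch of the failure mode may help others hitting it:
the sparse constructors now expect index arrays with an integer dtype (and data in
native byte order), so a reader that hands them float or byte-swapped arrays trips
exactly this kind of dtype check. The values and the coo-style construction below
are made up for illustration; the real code is in scipy/io/mio4.py:

import numpy as np
from scipy import sparse

# Stand-ins for what a .mat reader might produce: big-endian data,
# indices read back as floats.
vals = np.array([1.0, 2.0, 3.0], dtype='>f8')
rows = np.array([0.0, 1.0, 2.0])
cols = np.array([0.0, 2.0, 1.0])

# Normalise byte order (same idea as the to_native helper quoted earlier).
if not vals.dtype.isnative:
    vals = vals.astype(vals.dtype.newbyteorder('='))

# Give the index arrays an explicit integer dtype before building the matrix.
rows = rows.astype(np.int32)
cols = cols.astype(np.int32)

M = sparse.coo_matrix((vals, (rows, cols)), (3, 3)).tocsc()
print(M.todense())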
Matthew On 10/27/07, Nils Wagner wrote: > Nathan, > > scipy.test(1) reveals a bug in sparse.py (recent svn) > > ====================================================================== > ERROR: check loadmat case sparse > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", > line 84, in cc > self._check_case(name, files, expected) > File > "/usr/local/lib64/python2.5/site-packages/scipy/io/tests/test_mio.py", > line 74, in _check_case > matdict = loadmat(file_name) > File > "/usr/local/lib64/python2.5/site-packages/scipy/io/mio.py", > line 96, in loadmat > matfile_dict = MR.get_variables() > File > "/usr/local/lib64/python2.5/site-packages/scipy/io/miobase.py", > line 277, in get_variables > res = getter.get_array() > File > "/usr/local/lib64/python2.5/site-packages/scipy/io/miobase.py", > line 317, in get_array > arr = self.get_raw_array() > File > "/usr/local/lib64/python2.5/site-packages/scipy/io/mio4.py", > line 174, in get_raw_array > return scipy.sparse.csc_matrix((vals,ij), dims) > File > "/usr/local/lib64/python2.5/site-packages/scipy/sparse/sparse.py", > line 937, in __init__ > dtype=self.dtype).tocsc() > File > "/usr/local/lib64/python2.5/site-packages/scipy/sparse/sparse.py", > line 2264, in __init__ > self._check() > File > "/usr/local/lib64/python2.5/site-packages/scipy/sparse/sparse.py", > line 2277, in _check > raise TypeError,'row array has invalid dtype' > TypeError: row array has invalid dtype > > Nils > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > From wnbell at gmail.com Sat Oct 27 13:13:23 2007 From: wnbell at gmail.com (Nathan Bell) Date: Sat, 27 Oct 2007 12:13:23 -0500 Subject: [SciPy-dev] Bug in sparse.py In-Reply-To: <1e2af89e0710270936y133c046en28f6309558796b65@mail.gmail.com> References: <1e2af89e0710270936y133c046en28f6309558796b65@mail.gmail.com> Message-ID: On 10/27/07, Matthew Brett wrote: > Hi, > > I'm guessing that's my fault (mio) rather than Nathan's - I'll take a look. Sorry, I forgot to commit my (trivial) change to mio4.py :) http://projects.scipy.org/scipy/scipy/changeset/3467 -- Nathan Bell wnbell at gmail.com From wnbell at gmail.com Sat Oct 27 14:30:47 2007 From: wnbell at gmail.com (Nathan Bell) Date: Sat, 27 Oct 2007 13:30:47 -0500 Subject: [SciPy-dev] Bug in sparse.py In-Reply-To: References: <1e2af89e0710270936y133c046en28f6309558796b65@mail.gmail.com> Message-ID: On 10/27/07, Nathan Bell wrote: > > http://projects.scipy.org/scipy/scipy/changeset/3467 > Since the previous implementation was more permissive in what data types it would accept (it always cast to int32) I replaced the exception you saw with a warning followed by a cast to int32. -- Nathan Bell wnbell at gmail.com From zelbier at gmail.com Sun Oct 28 08:42:03 2007 From: zelbier at gmail.com (Olivier Verdier) Date: Sun, 28 Oct 2007 13:42:03 +0100 Subject: [SciPy-dev] Scipy Logos Message-ID: Hi! Are there any scipy logos somewhere? I came across some scipy logos on the first page of scipy.org but nothing else. Could someone point me to a page where the scipy logos are? Then I think it would be a good idea to add those logos on the scipy.org site. thanks! == Olivier -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From charlesr.harris at gmail.com Mon Oct 29 01:46:31 2007 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 28 Oct 2007 23:46:31 -0600 Subject: [SciPy-dev] Setting up your editor for NumPy/SciPy In-Reply-To: <20071003124421.GE7241@mentat.za.net> References: <20071003124421.GE7241@mentat.za.net> Message-ID: Hi Stefan, On 10/3/07, Stefan van der Walt wrote: > Hi all, > > Since we are busy cleaning up the NumPy and SciPy sources, I'd like to > draw your attention to the guidelines regarding whitespace. > > We use 4 spaces per indentation level, in Python and C code alike (see > PEP 7: http://www.python.org/dev/peps/pep-0007/ under the heading > Python 3000). Lines should be a maximum of 79 characters long (to > facilitate reading in text terminals), and must not have trailing > whitespace. > > PEP 8 (http://www.python.org/dev/peps/pep-0008/) states: > > """ > The preferred way of wrapping long lines is by using Python's > implied line continuation inside parentheses, brackets and braces. > If necessary, you can add an extra pair of parentheses around an > expression, but sometimes using a backslash looks better. Make > sure to indent the continued line appropriately. > """ > > I attach a file, containing some common errors, which you can use to > setup your editor. I also attach the settings I use under Emacs to > highlight the problems. > I've attached a perl program borrowed from the LKML that removes trailing whitespace. Maybe we should put it on the scipy site somewhere. Chuck -------------- next part -------------- A non-text attachment was scrubbed... Name: cleanfile Type: application/octet-stream Size: 1122 bytes Desc: not available URL: From millman at berkeley.edu Mon Oct 29 11:05:46 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Mon, 29 Oct 2007 08:05:46 -0700 Subject: [SciPy-dev] Setting up your editor for NumPy/SciPy In-Reply-To: References: <20071003124421.GE7241@mentat.za.net> Message-ID: On 10/28/07, Charles R Harris wrote: > I've attached a perl program borrowed from the LKML that removes > trailing whitespace. Maybe we should put it on the scipy site > somewhere. The Python developer's tools provide a number of useful scripts including reindent.py: http://svn.python.org/view/python/trunk/Tools/scripts/reindent.py?rev=55804&view=markup -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From stefan at sun.ac.za Tue Oct 30 02:48:08 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 30 Oct 2007 08:48:08 +0200 Subject: [SciPy-dev] Setting up your editor for NumPy/SciPy In-Reply-To: References: <20071003124421.GE7241@mentat.za.net> Message-ID: <20071030064808.GL30790@mentat.za.net> On Mon, Oct 29, 2007 at 08:05:46AM -0700, Jarrod Millman wrote: > On 10/28/07, Charles R Harris wrote: > > I've attached a perl program borrowed from the LKML that removes > > trailing whitespace. Maybe we should put it on the scipy site > > somewhere. > > The Python developer's tools provide a number of useful scripts > including reindent.py: > http://svn.python.org/view/python/trunk/Tools/scripts/reindent.py... Thanks Charles, Jarrod. 
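For anyone without Perl handy, the same trailing-whitespace cleanup fits in a few
lines of Python. This is only a rough sketch: it rewrites files in place,
normalises line endings to '\n' and appends a final newline, so run it on a
clean checkout:

import sys

def strip_trailing_whitespace(path):
    # Rewrite `path` in place with trailing spaces/tabs removed from each line.
    f = open(path, 'r')
    lines = f.read().splitlines()
    f.close()
    out = open(path, 'w')
    out.write('\n'.join([line.rstrip() for line in lines]) + '\n')
    out.close()

if __name__ == '__main__':
    for name in sys.argv[1:]:
        strip_trailing_whitespace(name)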
I have added these to http://scipy.org/scipy/numpy/wiki

Regards
Stéfan

From nwagner at iam.uni-stuttgart.de Tue Oct 30 03:36:46 2007
From: nwagner at iam.uni-stuttgart.de (Nils Wagner)
Date: Tue, 30 Oct 2007 08:36:46 +0100
Subject: [SciPy-dev] python2.3 and scipy
Message-ID: 

Hi all,

I get some warnings and errors with recent svn and python2.3 e.g.

Warning: FAILURE importing tests for /data/home/nwagner/local/lib64/python2.3/site-packages/scipy/io/tests/test_mio.py:235: ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all() (in TestMIOArray)

Warning: FAILURE importing tests for /data/home/nwagner/local/lib64/python2.3/site-packages/numpy/testing/numpytest.py:389: SyntaxError: invalid syntax (test_pilutil.py, line 38) (in _get_module_tests)

ERROR: test_factor1 (scipy.stats.models.tests.test_formula.TestFormula)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/data/home/nwagner/local/lib64/python2.3/site-packages/scipy/stats/models/tests/test_formula.py", line 190, in test_factor1
    fac = formula.Factor('ff', set(f))
NameError: global name 'set' is not defined

However I cannot reproduce them with python2.5.

Any comments ?

Nils
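One comment on the test_formula failure: set only became a builtin in Python 2.4,
so any module that uses it bare will raise exactly that NameError on 2.3. The
usual compatibility shim from that era looks like the following (an illustration,
not necessarily how scipy.stats.models will end up fixing it):

# Fall back to the sets module on Python 2.3, where set() is not yet a builtin.
try:
    set
except NameError:
    from sets import Set as set

print(set(['a', 'b', 'a']))   # sets.Set on 2.3, the builtin set on 2.4+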