From gshuetrim at kpmg.com.au Tue Jan 2 01:56:21 2001 From: gshuetrim at kpmg.com.au (Shuetrim, Geoff) Date: Tue, 2 Jan 2001 17:56:21 +1100 Subject: [Numpy-discussion] Question: LAPACK-LITE consequences for Numpy speed (win32 platfor m) Message-ID: Apologies for asking an overly vague question like this but: on an Intel/win32 platform (I only have a windows version of Gauss), I am comparing Numpy matrix inversion with that of Gauss (very much the same type of software as Matlab which at least some of you used to work with). As the size of the matrix to be inverted increases, the speed of Numpy appears to asymptote (on my machine) to about half that of Gauss. For small matrices, it is much worse than that because of the various overheads that are dealt with by Numpy. Would this speed differential be largely eliminated if I was not using LAPACK-LITE? If so I will try to figure my way through hooking into Intel's MKL - anyone got hints on doing this - I saw mention of it in the mailing list archives. Would I be better off, speed-wise, eschewing win32 altogether and using native LAPACK and BLAS libraries on my Linux box? This is relevant to me in the context of a multivariate Kalman filtering module that I am working up to replace one I have been using on the Gauss platform for years. The Numpy version of my filter has a very similar logic structure to that of Gauss but is drastically slower. I have only been working with Numpy for a month or so which may mean that my code is relatively innefficient. I have been assuming that Numpy - as an interpreted language is mainly slowed by looping structures. Thanks in advance, Geoff Shuetrim ____________________________________________________________________________ ________ A simple version of the filter is given below: (Note that I have modified Matrix.py in my installation to include a transpose method for the Matrix class, T()). # *************************************************************** # kalman.py module by Geoff Shuetrim # # Please note - this code is thoroughly untested at this stage # # You may copy and use this module as you see fit with no # guarantee implied provided you keep this notice in all copies. # *************************************************************** # Minimization routines """kalman.py Routines to implement Kalman filtering and smoothing for multivariate linear state space representations of time-series models. Notation follows that contained in Harvey, A.C. (1989) "Forecasting Structural Time Series Models and the Kalman Filter". Filter --- Filter - condition on data to date """ # Import the necessary modules to use NumPy import math from Numeric import * from LinearAlgebra import * from RandomArray import * import MLab from Matrix import * # Initialise module constants Num = Numeric max = MLab.max min = MLab.min abs = Num.absolute __version__="0.0.1" # filtration constants _obs = 100 _k = 1 _n = 1 _s = 1 # Filtration global arrays _y = Matrix(cumsum(standard_normal((_obs,1,_n)))) _v = Matrix(zeros((_obs,1,_n),Float64)) _Z = Matrix(ones((_obs,_k,_n),Float64)) + 1.0 _d = Matrix(zeros((_obs,1,_n),Float64)) _H = Matrix(zeros((_obs,_n,_n),Float64)) + 1.0 _T = Matrix(zeros((_obs,_k,_k),Float64)) + 1.0 _c = Matrix(zeros((_obs,1,_k),Float64)) _R = Matrix(zeros((_obs,_k,_s),Float64)) + 1.0 _Q = Matrix(zeros((_obs,_s,_s),Float64)) + 1.0 _a = Matrix(zeros((_obs,1,_k),Float64)) _a0 = Matrix(zeros((_k,1),Float64)) _ap = _a _as = _a _P = Matrix(zeros((_obs,_k,_k),Float64)) _P0 = Matrix(zeros((_k,_k),Float64)) _Pp = _P _Ps = _P _LL = Matrix(zeros((_obs,1,1),Float64)) def Filter(): # Kalman filtering routine _ap[0] = _T[0] * _a0 + _c[0] _Pp[0] = _T[0] * _P0 * _T[0].T() + _R[0] * _Q[0] * _R[0].T() for t in range(1,_obs-1): _ap[t] = _T[t] * _a[t-1] + _c[t] _Pp[t] = _T[t] * _P0 * _T[t].T() + _R[t] * _Q[t] * _R[t].T() Ft = _Z[t] * _Pp[t] * _Z[t].T() + _H[t] Ft_inverse = inverse(Ft) _v[t] = _y[t] - _Z[t] * _ap[t] - _d[t] _a[t] = _ap[t] + _Pp[t] * _Z[t].T() * Ft_inverse * _v[t].T() _P[t] = _Pp[t] - _Pp[t].T() * _Z[t].T() * Ft_inverse * _Z[t] * _Pp[t] _LL[t] = -0.5 * (log(2*pi) + log(determinant(Ft)) + _v[t] * Ft_inverse * _v[t].T()) Filter() ____________________________________________________________________________ ________ ********************************************************************** " This email is intended only for the use of the individual or entity named above and may contain information that is confidential and privileged. If you are not the intended recipient, you are hereby notified that any dissemination, distribution or copying of this Email is strictly prohibited. When addressed to our clients, any opinions or advice contained in this Email are subject to the terms and conditions expressed in the governing KPMG client engagement letter. If you have received this Email in error, please notify us immediately by return email or telephone +61 2 93357000 and destroy the original message. Thank You. " **********************************************************************... From marimont at nxpdata.com Tue Jan 2 10:06:06 2001 From: marimont at nxpdata.com (David H. Marimont) Date: Tue, 02 Jan 2001 07:06:06 -0800 Subject: [Numpy-discussion] Da Blas References: <39C2409E.51659120@cfa.harvard.edu> Message-ID: <3A51EE5E.4367EF78@nxpdata.com> I have a problem that seems related to the one below. I'm trying to build and install Numeric 17.2.0 using the lapack and blas libraries under Python 2.0 on Red Hat 6.2 with the The build and install go fine, but when I run python and import LinearAlgebra, I get the following message: Python 2.0 (#1, Nov 13 2000, 14:15:52) [GCC egcs-2.91.66 19990314/Linux (egcs-1.1.2 release)] on linux2 Type "copyright", "credits" or "license" for more information. >>> from Numeric import * >>> from LinearAlgebra import * Traceback (most recent call last): File "", line 1, in ? File "/usr/local/lib/python2.0/site-packages/Numeric/LinearAlgebra.py", line 8, in ? import lapack_lite ImportError: /usr/local/lib/python2.0/site-packages/Numeric/lapack_lite.so: undefined symbol: dgesvd_ >>> I did try relinking with g77, as Scott's earlier message suggested: [root at harmony LALITE]# g77 -shared build/temp.linux-i686-2.0/Src/lapack_litemodule.o -L/usr/lib/lapack -o build/lib.linux-i686-2.0/lapack_lite.so ld: cannot open crtbeginS.o: No such file or directory [root at harmony LALITE]# But I don't know much about g77 (or gcc, for that matter), so I didn't know how to diagnose the error. I'm also not sure I'd know what to do next if the link step had worked! I'd sure appreciate some help with this... Thanks! David "Scott M. Ransom" wrote: > > Frank Horowitz wrote: > > > > However, when I > > coerced the distutils system to get around that bug (by specifying > > "/usr/lib " with a trailing blank for the BLASLIBDIR and LAPACKLIBDIR > > variables in setup.py) the same problem (i.e. an "ImportError: > > /usr/lib/liblapack.so.3: undefined symbol: e_wsfe" in importing > > lapack_lite) ultimately manifested itself. > > This problem is easily fixed (at least on linux) by performing the link > of lapack_lite.so with g77 instead of gcc (this is required because the > lapack and/or blas libraries are based on fortran object files...). > > For instance the out-of-box link command on my machine (Debian 2.2) is: > > gcc -shared build/temp.linux2/Src/lapack_litemodule.o -L/usr/local/lib > -L/usr/lib -llapack -lblas -o build/lib.linux2/lapack_lite.so > > Simply change the 'gcc' to 'g77' and everything works nicely. > > Not sure if this is specific to Linux or not... > > Scott > > -- > Scott M. Ransom Address: Harvard-Smithsonian CfA > Phone: (617) 495-4142 60 Garden St. MS 10 > email: ransom at cfa.harvard.edu Cambridge, MA 02138 > GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/mailman/listinfo/numpy-discussion From jhauser at ifm.uni-kiel.de Tue Jan 2 10:17:48 2001 From: jhauser at ifm.uni-kiel.de (Janko Hauser) Date: Tue, 2 Jan 2001 16:17:48 +0100 (CET) Subject: [Numpy-discussion] Question: LAPACK-LITE consequences for Numpy speed (win32 platfor m) In-Reply-To: References: Message-ID: <20010102151748.26222.qmail@lisboa.ifm.uni-kiel.de> First, have you tested the execution time of inverse() alone in NumPy and Gauss, or only with the appended function. There are some ways to speed up the function without using a specialized library. The function also has a possible error. (see comments in the function). In general the penality for array creation (memory allocation) is rather big in Numpy, especially if you use long expressions, as the result of every term is stored in a newly created array. To speed up these kinds of operations use the inplace operations with some workspace arrays, like: >>> # Translating _ap[0] = _T[0] * _a0 + _c[0] >>> _ap[0]=add(multiply(_T[0],_a0,_ap[0]),_c[0],_ap[0]) Then it should be possible to code this without a loop at all. This would also speed up the function. HTH, __Janko Shuetrim, Geoff writes: > ________ > > A simple version of the filter is given below: > (Note that I have modified Matrix.py in my installation to include a > transpose method for the Matrix class, T()). > > # *************************************************************** > # kalman.py module by Geoff Shuetrim > # > # Please note - this code is thoroughly untested at this stage > # > # You may copy and use this module as you see fit with no > # guarantee implied provided you keep this notice in all copies. > # *************************************************************** > > # Minimization routines > """kalman.py > > Routines to implement Kalman filtering and smoothing > for multivariate linear state space representations > of time-series models. > > Notation follows that contained in Harvey, A.C. (1989) > "Forecasting Structural Time Series Models and the Kalman Filter". > > Filter --- Filter - condition on data to date > > """ > > # Import the necessary modules to use NumPy > import math > from Numeric import * > from LinearAlgebra import * > from RandomArray import * > import MLab > from Matrix import * > > # Initialise module constants > Num = Numeric > max = MLab.max > min = MLab.min > abs = Num.absolute > __version__="0.0.1" > > # filtration constants > _obs = 100 > _k = 1 > _n = 1 > _s = 1 > > # Filtration global arrays > _y = Matrix(cumsum(standard_normal((_obs,1,_n)))) > _v = Matrix(zeros((_obs,1,_n),Float64)) > _Z = Matrix(ones((_obs,_k,_n),Float64)) + 1.0 > _d = Matrix(zeros((_obs,1,_n),Float64)) > _H = Matrix(zeros((_obs,_n,_n),Float64)) + 1.0 > _T = Matrix(zeros((_obs,_k,_k),Float64)) + 1.0 > _c = Matrix(zeros((_obs,1,_k),Float64)) > _R = Matrix(zeros((_obs,_k,_s),Float64)) + 1.0 > _Q = Matrix(zeros((_obs,_s,_s),Float64)) + 1.0 > _a = Matrix(zeros((_obs,1,_k),Float64)) > _a0 = Matrix(zeros((_k,1),Float64)) > _ap = _a !!! Are you sure? This does not copy, but only makes a new reference > _as = _a !!! Same here > _P = Matrix(zeros((_obs,_k,_k),Float64)) > _P0 = Matrix(zeros((_k,_k),Float64)) > _Pp = _P > _Ps = _P > _LL = Matrix(zeros((_obs,1,1),Float64)) > > def Filter(): # Kalman filtering routine > > _ap[0] = _T[0] * _a0 + _c[0] > _Pp[0] = _T[0] * _P0 * _T[0].T() + _R[0] * _Q[0] * _R[0].T() > > for t in range(1,_obs-1): > > _ap[t] = _T[t] * _a[t-1] + _c[t] !!! You are changing _a and _as and _ap at the same time > _Pp[t] = _T[t] * _P0 * _T[t].T() + _R[t] * _Q[t] * _R[t].T() > > Ft = _Z[t] * _Pp[t] * _Z[t].T() + _H[t] > Ft_inverse = inverse(Ft) > _v[t] = _y[t] - _Z[t] * _ap[t] - _d[t] > > _a[t] = _ap[t] + _Pp[t] * _Z[t].T() * Ft_inverse * _v[t].T() > _P[t] = _Pp[t] - _Pp[t].T() * _Z[t].T() * Ft_inverse * _Z[t] * > _Pp[t] > _LL[t] = -0.5 * (log(2*pi) + log(determinant(Ft)) + _v[t] * > Ft_inverse * _v[t].T()) > > Filter() > ____________________________________________________________________________ > ________ > > > > ********************************************************************** > " This email is intended only for the use of the individual or entity > named above and may contain information that is confidential and > privileged. If you are not the intended recipient, you are hereby > notified that any dissemination, distribution or copying of this > Email is strictly prohibited. When addressed to our clients, any > opinions or advice contained in this Email are subject to the > terms and conditions expressed in the governing KPMG client > engagement letter. If you have received this Email in error, please > notify us immediately by return email or telephone +61 2 93357000 > and destroy the original message. Thank You. " > **********************************************************************... > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/mailman/listinfo/numpy-discussion From anthony.seward at ieee.org Tue Jan 2 11:45:55 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Tue, 2 Jan 2001 09:45:55 -0700 (MST) Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: Message-ID: On Thu, 28 Dec 2000, Paul F. Dubois wrote: > A millenium-end report from the Head Nummie (this name is a joke; see the > DEVELOPERS file): > > There have been a steady set of messages on the subject of I should do this > or that to make it easier to make RPMs. It is impossible for me to act on > these: I don't know much about RPMs, and if I did, I don't know if making > the change suggested is good or bad for someone doing something else, like > making Windows installers. Therefore my policy is to rely on the Distutils > people to work this out. Those who wish to make it easier to make a binary > installer for platform xyz should figure out what would be required by the > Distutils bdist family of commands. > > That is not to say that I don't appreciate people trying to help. I'm > grateful for all the support I get from the community. I think that relying > on division of labor in this case is the right thing to do, so that we take > advantage of the Distutils effort. If I'm wrong, I'll listen. > The problem that I pointed out is not a problem with building a binary package. Invoking './setup_all.py build' on a clean machine does not work. The numpy core is built, but the packages are not. The reason is that all of the packages are looking for 'Numeric/arrayobject.h' which does not exist until the numpy core has been installed at least once. Even then, the packages will use an old version of arrayobject.h. I see two solutions: 1) Have the setup script make a symbolic link in the package's include directory to the include directory of the numpy core. Call the symbolic link 'Numeric.' 2) Move the include files for the core to a subdirectory called 'Numeric.' I would prefer the first solution, but I'm not aware of a way for the non-unix versions of python to create a link to a directory. Tony From beausol at hpl.hp.com Tue Jan 2 11:57:35 2001 From: beausol at hpl.hp.com (Raymond Beausoleil) Date: Tue, 02 Jan 2001 08:57:35 -0800 Subject: [Numpy-discussion] Question: LAPACK-LITE consequences for Numpy speed (win32 platfor m) In-Reply-To: Message-ID: <5.0.2.1.2.20010102083525.00aac000@hplex1.hpl.hp.com> I use Windows NT/2000 (at gunpoint), and I've gotten much better performance than NumPy's Lite code by using the Intel MKL. I keep meaning to post these results to an external HP Labs web page, but we're currently engaged in an internal snit about what those pages are allowed to look like. Essentially, with NumPy 17 and Python 1.5, I've been able to: (1) Replace NumPy BLAS with Intel's MKL BLAS, after a number of routine renaming edits in the NumPy code. (Apparently "cblas" is an irresistible prefix.) This turns out to be pretty easy. (2) Replace portions of LAPACK Lite with the corresponding routines from Intel's MKL. I've been hacking away at this on a piecemeal basis, because the MKL uses Fortran calling conventions and array-ordering. In most of the MKL routines, the usual LAPACK flags for transposition are available to compensate. (3) Add new LAPACK functionality using LAPACK routines from the MKL that are not included in LAPACK Lite. I've been meaning to pick this project up again, since I want to use the Intel FFT routines in some new code that I'm writing. The chief benefit of the MKL libraries is the ability to easily use multiple processors and threads simply by setting a couple of environment variables. Until recently (when MATLAB finally started using LAPACK), I was able to gain significantly better performance than MATLAB using the Intel BLAS/LAPACK, even without multiple processors. Now the Intel libraries are only about 10-25% faster, if my watch can be believed. I have _no idea_ how the Intel MKL performs relative to the native Linux BLAS/LAPACK distributions, but I know that (for example) the MKL is about 50% faster than the linear algebra routines shipped by NAG. Although I haven't figured out the standard NumPy distribution process yet (I still use that pesky Visual Studio IDE), I'll elevate the development of a general-purpose integration of the Intel MKL into NumPy to an Official New Year's Resolution. I made my last ONYR about ten years ago, and was able to give up channel-surfing using the TV remote control. So, I have demonstrated an ability to deliver in this department; the realization that we've just started a new millennium only adds to the pressure. Presumably, when I get this finished (I'll target the end of the month), I'll be able to find a place to post it. ============================ Ray Beausoleil Hewlett-Packard Laboratories mailto:beausol at hpl.hp.com 425-883-6648 Office 425-957-4951 Telnet 425-941-2566 Mobile ============================ At 05:56 PM 1/2/2001 +1100, Shuetrim, Geoff wrote: >Apologies for asking an overly vague question like this but: >on an Intel/win32 platform (I only have a windows version of >Gauss), I am comparing Numpy matrix inversion with that >of Gauss (very much the same type of software as Matlab >which at least some of you used to work with). As the size >of the matrix to be inverted increases, the speed of Numpy >appears to asymptote (on my machine) to about half that of >Gauss. For small matrices, it is much worse than that >because of the various overheads that are dealt with by Numpy. > >Would this speed differential be largely eliminated if I was not >using LAPACK-LITE? If so I will try to figure my way through >hooking into Intel's MKL - anyone got hints on doing this - I saw >mention of it in the mailing list archives. Would I be better off, >speed-wise, eschewing win32 altogether and using native LAPACK >and BLAS libraries on my Linux box? > >This is relevant to me in the context of a multivariate Kalman filtering >module that I am working up to replace one I have been using on >the Gauss platform for years. The Numpy version of my filter has a >very similar logic structure to that of Gauss but is drastically slower. >I have only been working with Numpy for a month or so which may >mean that my code is relatively innefficient. I have been assuming >that Numpy - as an interpreted language is mainly slowed by >looping structures. > >Thanks in advance, > >Geoff Shuetrim > >____________________________________________________________________________ >________ > >A simple version of the filter is given below: >(Note that I have modified Matrix.py in my installation to include a >transpose method for the Matrix class, T()). > ># *************************************************************** ># kalman.py module by Geoff Shuetrim ># ># Please note - this code is thoroughly untested at this stage ># ># You may copy and use this module as you see fit with no ># guarantee implied provided you keep this notice in all copies. ># *************************************************************** > ># Minimization routines >"""kalman.py > >Routines to implement Kalman filtering and smoothing >for multivariate linear state space representations >of time-series models. > >Notation follows that contained in Harvey, A.C. (1989) >"Forecasting Structural Time Series Models and the Kalman Filter". > >Filter --- Filter - condition on data to date > >""" > ># Import the necessary modules to use NumPy >import math >from Numeric import * >from LinearAlgebra import * >from RandomArray import * >import MLab >from Matrix import * > ># Initialise module constants >Num = Numeric >max = MLab.max >min = MLab.min >abs = Num.absolute >__version__="0.0.1" > ># filtration constants >_obs = 100 >_k = 1 >_n = 1 >_s = 1 > ># Filtration global arrays >_y = Matrix(cumsum(standard_normal((_obs,1,_n)))) >_v = Matrix(zeros((_obs,1,_n),Float64)) >_Z = Matrix(ones((_obs,_k,_n),Float64)) + 1.0 >_d = Matrix(zeros((_obs,1,_n),Float64)) >_H = Matrix(zeros((_obs,_n,_n),Float64)) + 1.0 >_T = Matrix(zeros((_obs,_k,_k),Float64)) + 1.0 >_c = Matrix(zeros((_obs,1,_k),Float64)) >_R = Matrix(zeros((_obs,_k,_s),Float64)) + 1.0 >_Q = Matrix(zeros((_obs,_s,_s),Float64)) + 1.0 >_a = Matrix(zeros((_obs,1,_k),Float64)) >_a0 = Matrix(zeros((_k,1),Float64)) >_ap = _a >_as = _a >_P = Matrix(zeros((_obs,_k,_k),Float64)) >_P0 = Matrix(zeros((_k,_k),Float64)) >_Pp = _P >_Ps = _P >_LL = Matrix(zeros((_obs,1,1),Float64)) > >def Filter(): # Kalman filtering routine > > _ap[0] = _T[0] * _a0 + _c[0] > _Pp[0] = _T[0] * _P0 * _T[0].T() + _R[0] * _Q[0] * _R[0].T() > > for t in range(1,_obs-1): > > _ap[t] = _T[t] * _a[t-1] + _c[t] > _Pp[t] = _T[t] * _P0 * _T[t].T() + _R[t] * _Q[t] * _R[t].T() > > Ft = _Z[t] * _Pp[t] * _Z[t].T() + _H[t] > Ft_inverse = inverse(Ft) > _v[t] = _y[t] - _Z[t] * _ap[t] - _d[t] > > _a[t] = _ap[t] + _Pp[t] * _Z[t].T() * Ft_inverse * _v[t].T() > _P[t] = _Pp[t] - _Pp[t].T() * _Z[t].T() * Ft_inverse * _Z[t] * >_Pp[t] > _LL[t] = -0.5 * (log(2*pi) + log(determinant(Ft)) + _v[t] * >Ft_inverse * _v[t].T()) > >Filter() >____________________________________________________________________________ >________ > > > >********************************************************************** >" This email is intended only for the use of the individual or entity >named above and may contain information that is confidential and >privileged. If you are not the intended recipient, you are hereby >notified that any dissemination, distribution or copying of this >Email is strictly prohibited. When addressed to our clients, any >opinions or advice contained in this Email are subject to the >terms and conditions expressed in the governing KPMG client >engagement letter. If you have received this Email in error, please >notify us immediately by return email or telephone +61 2 93357000 >and destroy the original message. Thank You. " >**********************************************************************... > >_______________________________________________ >Numpy-discussion mailing list >Numpy-discussion at lists.sourceforge.net >http://lists.sourceforge.net/mailman/listinfo/numpy-discussion ============================ Ray Beausoleil Hewlett-Packard Laboratories mailto:beausol at hpl.hp.com 425-883-6648 Office 425-957-4951 Telnet 425-941-2566 Mobile ============================ From hinsen at cnrs-orleans.fr Wed Jan 3 08:14:23 2001 From: hinsen at cnrs-orleans.fr (Konrad Hinsen) Date: Wed, 3 Jan 2001 14:14:23 +0100 Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: (message from Tony Seward on Tue, 2 Jan 2001 09:45:55 -0700 (MST)) References: Message-ID: <200101031314.OAA10152@chinon.cnrs-orleans.fr> > I see two solutions: > 1) Have the setup script make a symbolic link in the package's include > directory to the include directory of the numpy core. Call the symbolic > link 'Numeric.' > > 2) Move the include files for the core to a subdirectory called 'Numeric.' There's a third one: 3) Have the "build" stage of the packages copy the core header files into their private include directories. This doesn't require links at the cost of wasting (temporarily) a tiny amount of disk space. Konrad. -- ------------------------------------------------------------------------------- Konrad Hinsen | E-Mail: hinsen at cnrs-orleans.fr Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.56.24 Rue Charles Sadron | Fax: +33-2.38.63.15.17 45071 Orleans Cedex 2 | Deutsch/Esperanto/English/ France | Nederlands/Francais ------------------------------------------------------------------------------- From anthony.seward at ieee.org Wed Jan 3 12:38:38 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Wed, 3 Jan 2001 10:38:38 -0700 (MST) Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: <200101031314.OAA10152@chinon.cnrs-orleans.fr> Message-ID: On Wed, 3 Jan 2001, Konrad Hinsen wrote: > > I see two solutions: > > 1) Have the setup script make a symbolic link in the package's include > > directory to the include directory of the numpy core. Call the symbolic > > link 'Numeric.' > > > > 2) Move the include files for the core to a subdirectory called 'Numeric.' > > There's a third one: > > 3) Have the "build" stage of the packages copy the core header files > into their private include directories. > > This doesn't require links at the cost of wasting (temporarily) a tiny > amount of disk space. > > Konrad. > I have attached a patch that impliments your solution. So far it is working for me. When I've finished with the RPM spec file I will post that to the list as well. Tony From anthony.seward at ieee.org Wed Jan 3 12:47:09 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Wed, 3 Jan 2001 10:47:09 -0700 (MST) Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: Message-ID: On Wed, 3 Jan 2001, Tony Seward wrote: > On Wed, 3 Jan 2001, Konrad Hinsen wrote: > > > I have attached a patch that impliments your solution. So far it is working > for me. When I've finished with the RPM spec file I will post that to the > list as well. > > Tony > oops. The patch is attached to this message. No really. -------------- next part -------------- diff -ru orig-Numeric-17.1.2/Packages/FFT/setup.py Numeric-17.1.2/Packages/FFT/setup.py --- orig-Numeric-17.1.2/Packages/FFT/setup.py Fri Sep 29 09:02:20 2000 +++ Numeric-17.1.2/Packages/FFT/setup.py Wed Jan 3 09:55:44 2001 @@ -8,6 +8,24 @@ except: raise SystemExit, "Distutils problem, see Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + setup (name = "FFTPACK", version = "1.0", maintainer = "Numerical Python Developers", diff -ru orig-Numeric-17.1.2/Packages/LALITE/setup.py Numeric-17.1.2/Packages/LALITE/setup.py --- orig-Numeric-17.1.2/Packages/LALITE/setup.py Fri Sep 29 09:02:34 2000 +++ Numeric-17.1.2/Packages/LALITE/setup.py Wed Jan 3 09:57:17 2001 @@ -13,6 +13,24 @@ except: raise SystemExit, "Distutils problem. See Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + headers = glob (os.path.join ("Include","*.h")) # delete all but the first one in this list if using your own LAPACK/BLAS diff -ru orig-Numeric-17.1.2/Packages/RANLIB/setup.py Numeric-17.1.2/Packages/RANLIB/setup.py --- orig-Numeric-17.1.2/Packages/RANLIB/setup.py Fri Sep 29 09:02:55 2000 +++ Numeric-17.1.2/Packages/RANLIB/setup.py Wed Jan 3 09:57:37 2001 @@ -13,6 +13,24 @@ except: raise SystemExit, "Distutils problem, see Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + headers = glob (os.path.join ("Include","*.h")) setup (name = "Numeric", packages = [''], diff -ru orig-Numeric-17.1.2/Packages/RNG/setup.py Numeric-17.1.2/Packages/RNG/setup.py --- orig-Numeric-17.1.2/Packages/RNG/setup.py Fri Sep 29 09:03:05 2000 +++ Numeric-17.1.2/Packages/RNG/setup.py Wed Jan 3 09:58:02 2001 @@ -16,6 +16,24 @@ except: raise SystemExit, "Distutils problem, see Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + setup (name = "RNG", version = "3.0", maintainer = "Paul Dubois", From Oliphant.Travis at mayo.edu Wed Jan 3 17:30:29 2001 From: Oliphant.Travis at mayo.edu (Travis Oliphant) Date: Wed, 3 Jan 2001 16:30:29 -0600 (CST) Subject: [Numpy-discussion] Re: Numpy-discussion digest, Vol 1 #152 - 2 msgs In-Reply-To: Message-ID: > > A millenium-end report from the Head Nummie (this name is a joke; see the > DEVELOPERS file): > Our nummie-ears are listening.... > There have been a steady set of messages on the subject of I should do this > or that to make it easier to make RPMs. It is impossible for me to act on > these: I don't know much about RPMs, and if I did, I don't know if making > the change suggested is good or bad for someone doing something else, like > making Windows installers. > Therefore my policy is to rely on the Distutils > people to work this out. Those who wish to make it easier to make a binary > installer for platform xyz should figure out what would be required by the > Distutils bdist family of commands. Good idea to go with the distutils for doing this. I've delayed RPM's for this reason until I figure out how to interact with the distutils better (I haven't spent much time doing it yet). > > That is not to say that I don't appreciate people trying to help. I'm > grateful for all the support I get from the community. I think that relying > on division of labor in this case is the right thing to do, so that we take > advantage of the Distutils effort. If I'm wrong, I'll listen. > > There are a number of bug reports on the sourceforge site. I would be > grateful for patches. In particular there are two reports dealing with FFTs. > I lack the expertise to deal with these. I'll look into these unless somebody has them done. > > The masked array package MA has been getting more of a workout as I put it > into production at my site. I believe that it fills not only the immediate > need for dealing with missing values, but can serve as a model for how to > make a "Numeric-workalike" with extra properties, since we can't inherit > from Numeric's array. Since MA is improved fairly often, I recommend keeping > up via cvs if you are a user. > > I have new manuals but have had difficulty with the transport up to > SourceForge. Anybody else having such a problem? I used scp from a Linux box > and it sat there and didn't do anything. > > The rest of this is for developers. > > Actually, once you get into it, it isn't all that clear that inheritance > would help very much. For example, suppose you want an array class F that > has masked values but also has a special behavior f() controlled by a > parameter set at creation, beta. Suppose therefore you decide to inherit > from class MA. Thus the constructor of your new class must take the same > arguments as an MA array but add a beta=somedefault. OK, we do that. Now we > can construct a couple of F's: > f1 = F([1.,2.,3.], beta=1.) > f2 = F([4.,2.,1.], beta=2.) > Great. Now we can do f1.f(), f2.f(). Maybe we redefine __str__ so we can > print beta when we print f1. > > Now try to do something. Anything. Say, > f3 = f1 + f2 > > Oops. f3 is an MA, not an F. We might have written __add__ in MA so the it > used self.__class__ to construct the answer. But since the constructor now > needs a new parameter, there is no way MA.__add__ can make an F. It doesn't > know how. Doesn't know how to call F(), doesn't know what value to use for > beta anyway. > > So now we redefine all the methods in MA. Besides muttering that maybe > inheriting didn't buy me a lot, I am still nowhere, for the next thing I > realize is that every function f(a) that takes an MA as an argument and > returns an MA, still returns an MA. If any of these make sense for an > instance of F, I have to replace them, just as MA replaced sin, cos, sqrt, > take, etc. from Numeric. > > I have therefore come to the conclusion that we have been barking up the > wrong tree. There might be a few cases where inheritance would buy you > something, but essentially Numeric and MA are useless as parents. Instead, > what would be good to have is a python class Numeric built upon a suite of C > routines that did all the real work, such as adding, transposing, iterating > over elements, applying a function to each element, etc. I think this is what I've been trying to do in the "rewrite." Paul Barrett has made some excellent progress here. > Since it is now > possible to build a Python class with C methods, which it was not when > Numeric was created, we ought to think about it. What does this mean? What feature gives this ability? I'm not sure I see when this changed? > Such an API could be used > to make other classes with good performance. We could lose the artificial > layer that is in there now that makes it so tedious to add a function. (I > counted something like five or six places I had to modify when I added > "put".) I'd love to talk more with you about this. I'm now at my new place for anyone wishing to contact me. Travis Oliphant 437 CB Brigham Young University Provo, UT 84602 oliphant at ee.byu.edu (801) 378-3108 Thanks for you great efforts Paul. From Barrett at stsci.edu Wed Jan 3 18:04:34 2001 From: Barrett at stsci.edu (Paul Barrett) Date: Wed, 3 Jan 2001 18:04:34 -0500 (EST) Subject: [Numpy-discussion] Re: Numpy-discussion digest, Vol 1 #152 - 2 msgs In-Reply-To: References: Message-ID: <14931.44574.756591.67934@nem-srvr.stsci.edu> Travis Oliphant writes: [snip snip] > > > > I have therefore come to the conclusion that we have been barking up the > > wrong tree. There might be a few cases where inheritance would buy you > > something, but essentially Numeric and MA are useless as parents. Instead, > > what would be good to have is a python class Numeric built upon a suite of C > > routines that did all the real work, such as adding, transposing, iterating > > over elements, applying a function to each element, etc. > > I think this is what I've been trying to do in the "rewrite." Paul > Barrett has made some excellent progress here. I am currently writing the PEP 209: Multidimensional Arrays documentation and hope to submit the initial draft by the end of the week for comments. The proposed design is along the lines Paul Dubois has suggested. > > Since it is now > > possible to build a Python class with C methods, which it was not when > > Numeric was created, we ought to think about it. > > What does this mean? What feature gives this ability? I'm not sure I see > when this changed? I'd also like to know what Paul Dubois means by this. > > Such an API could be used > > to make other classes with good performance. We could lose the artificial > > layer that is in there now that makes it so tedious to add a function. (I > > counted something like five or six places I had to modify when I added > > "put".) > > I'd love to talk more with you about this. Ditto! -- Dr. Paul Barrett Space Telescope Science Institute Phone: 410-338-4475 ESS/Science Software Group FAX: 410-338-4767 Baltimore, MD 21218 From pplumlee at omnigon.com Thu Jan 4 13:43:42 2001 From: pplumlee at omnigon.com (Phlip) Date: Thu, 4 Jan 2001 10:43:42 -0800 Subject: [Numpy-discussion] STL wrapper for PyArrayObject In-Reply-To: <14931.44574.756591.67934@nem-srvr.stsci.edu> References: <14931.44574.756591.67934@nem-srvr.stsci.edu> Message-ID: <01010410434300.06404@localhost.localdomain> Nummies Where I work, our official pipe dreamer has decided to wrap a multiarray up as an STL container. This means we can write convenient high-level code in both C++ and Python, and use this wrapper in the bridge between them. E-searches for a pre-existing cut of this yield negative. Has anyone done this yet? Or am I (gulp) the first? --Phlip http://c2.com/cgi/wiki?PhlIp From paulp at ActiveState.com Thu Jan 4 20:22:06 2001 From: paulp at ActiveState.com (Paul Prescod) Date: Thu, 04 Jan 2001 17:22:06 -0800 Subject: [Numpy-discussion] Multi-distribution distributions Message-ID: <3A5521BE.1E8DF72@ActiveState.com> I'm somewhat surprised by the fact that some distributions (e.g. Numpy, Zodb) have multiple setup.py programs. As far as I know, these setup.py's do not share information so there is no way to do a bdist_wininst or bdist_rpm that builds a single distribution for these multiple sub-packages. I see this as a fairly large problem! The bdist_ functions are an important part of Distutils functionality. Paul Prescod From pauldubois at home.com Fri Jan 5 00:27:05 2001 From: pauldubois at home.com (Paul F. Dubois) Date: Thu, 4 Jan 2001 21:27:05 -0800 Subject: [Numpy-discussion] Multi-distribution distributions In-Reply-To: <3A5521BE.1E8DF72@ActiveState.com> Message-ID: Explanation: we made these packages optional partly because they ought to be optional but partly because one of them depends on LAPACK and many people wish to configure the packages to use a different LAPACK than the limited "lite" one supplied. It would be correct in the spirit of SourceForge and Python packages to make every single one of these optional packages a separate "project" or at least a separate download. That would raise the overhead. Technically the manual should be split up into pieces too. I don't think all that trouble is worth undergoing in order to "solve" this problem. So, you could just build separate rpms for each of the optional packages. They are NOT subpackages in the true sense of the word, except that a couple of them install into Numeric's directory for backward compatibility. Numeric is not a true package either. Again, people have argued (and I agree) that purity of thought is trumped by backward compatibility and that we should leave well enough alone. You are welcome to add a script which runs the bdist functions on all the optional packages, in much the same way setup_all.py works. You do need to face the issue if making a bdist for the public of which LAPACK you use on which platform. I believe I was one of the earliest and hardest pushers for Distutils, so I have no trouble agreeing with your goals. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Paul Prescod Sent: Thursday, January 04, 2001 5:22 PM To: distutils-sig at python.org Cc: akuchlin at mems-exchange.org; numpy-discussion at lists.sourceforge.net Subject: [Numpy-discussion] Multi-distribution distributions I'm somewhat surprised by the fact that some distributions (e.g. Numpy, Zodb) have multiple setup.py programs. As far as I know, these setup.py's do not share information so there is no way to do a bdist_wininst or bdist_rpm that builds a single distribution for these multiple sub-packages. I see this as a fairly large problem! The bdist_ functions are an important part of Distutils functionality. Paul Prescod _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/mailman/listinfo/numpy-discussion From Roy.Dragseth at cc.uit.no Fri Jan 5 08:53:49 2001 From: Roy.Dragseth at cc.uit.no (Roy Dragseth) Date: Fri, 05 Jan 2001 14:53:49 +0100 Subject: [Numpy-discussion] Numeric and swig (or, how to wrap functions). Message-ID: <200101051353.OAA25994@paiute.cc.uit.no> Hello all. I'm trying to figure out a (simple) way to wrap extension functions using Numeric and swig. I want to use swig because I find the method described i ch. 12 of the user manual a bit too elaborate. The problem I'm facing with swig is that I cannot figure out a way to pass arrays to the functions expecting double pointers as input. Say, for example, that I want to wrap daxpy: 1. I put the function call interface into a swig file: void daxpy(int* n,double* a, double* x,int* incx, double* y, int* incy) 2. I want the function call visible to python to be something like: x = Numeric.arange(0.0,10.0) y = a[:] a = 1.0 daxpy(a,x,y) 3. To achieve 2. I make a python function that does the real call to daxpy: def daxpy(a,x,y): n = len(x) d_x = GetDataPointer(x) inc_x = . . . daxpy_c(n,a,d_x,inc_x,d_y,inc_y) (daxpy_c is the real daxpy) The problem I'm facing is that I cannot grab the internal data pointer from Numeric arrays and pass it to a function. Is there a simple way to do that without having to write a wrapper function in c for every function I want to use? How should I write the function GetDataPointer()? Any hints is greatly appreciated. Best regards, Roy. The Computer Center, University of Troms?, N-9037 TROMS?, Norway. phone:+47 77 64 41 07, fax:+47 77 64 41 00 Roy Dragseth, High Perfomance Computing System Administrator Direct call: +47 77 64 62 56. email: royd at cc.uit.no From rogerha at ifi.uio.no Fri Jan 5 09:48:15 2001 From: rogerha at ifi.uio.no (Roger Hansen) Date: 05 Jan 2001 15:48:15 +0100 Subject: [Numpy-discussion] Numeric and swig (or, how to wrap functions). In-Reply-To: <200101051353.OAA25994@paiute.cc.uit.no> References: <200101051353.OAA25994@paiute.cc.uit.no> Message-ID: * Roy Dragseth [snip Numeric and swig questions about how to give a numpy array to a swigged c function] > Any hints is greatly appreciated. I've done this a few times. What you need to do is help swig understand that a numpy array is input and how to treat this as a C array. With swig you can do this with a typemap. Here's an example: we can create a swig interface file like %module exPyC %{ #include #include /* Remember this one */ #include %} %init %{ import_array() /* Initial function for NumPy */ %} %typemap(python,in) double * { PyArrayObject *py_arr; /* first check if it is a NumPy array */ if (!PyArray_Check($source)) { PyErr_SetString(PyExc_TypeError, "Not a NumPy array"); return NULL; } if (PyArray_ObjectType($source,0) != PyArray_DOUBLE) { PyErr_SetString(PyExc_ValueError, "Array must be of type double"); return NULL; } /* check that array is 1D */ py_arr = PyArray_ContiguousFromObject($source, PyArray_DOUBLE, 1, 1); /* set a double pointer to the NumPy allocated memory */ $target = py_arr->data; } /* end of typemap */ %inline %{ void arrayArg(double *arr, int n) { int i; for (i = 0; i < n; i++) { arr[i] = f(i*1.0/n); } } %} Now, compile your extension, and test >>> import exPyC >>> import Numeric >>> a = Numeric.zeros(10,'d') >>> a array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]) >>> exPyC.arrayArg(a,10) >>> a array([ 0. , 0.00996671, 0.0394695 , 0.08733219, 0.15164665, 0.22984885, 0.31882112, 0.41501643, 0.51459976, 0.61360105]) You can also get rid of the length argument with a typemap(python,ignore), see the swig docs. For more info about the safety checks in the typemap check the numpy docs. Lykke til! :-) HTH, Roger From nwagner at isd.uni-stuttgart.de Fri Jan 5 08:16:45 2001 From: nwagner at isd.uni-stuttgart.de (Nils Wagner) Date: Fri, 05 Jan 2001 14:16:45 +0100 Subject: [Numpy-discussion] Extensions of Numpy Message-ID: <3A55C93D.526BDF7D@isd.uni-stuttgart.de> Hi, I am looking for some functions for the computation of Matrix functions : matrix exponential matrix square root matrix logarithm Eigenvalue problems : generalized eigenvalue problems polynomial eigenvalue problems I am also interested in ODE (ordinary differential equation) solvers. Is there any progress ? Nils From paulp at ActiveState.com Fri Jan 5 12:10:20 2001 From: paulp at ActiveState.com (Paul Prescod) Date: Fri, 05 Jan 2001 09:10:20 -0800 Subject: [Distutils] RE: [Numpy-discussion] Multi-distribution distributions References: Message-ID: <3A55FFFC.3DB3872D@ActiveState.com> "Paul F. Dubois" wrote: > > Explanation: we made these packages optional partly because they ought to be > optional but partly because one of them depends on LAPACK and many people > wish to configure the packages to use a different LAPACK than the limited > "lite" one supplied. Wow, your "lite" module is half a meg. I'd hate to see the heavy version. :) I may not understand the extension so please tell me if I'm off-base: Would it be simpler to have a single setup.py and a set of flags to turn on and off each of the "secondary" extensions? > You are welcome to add a script which runs the bdist functions on all the > optional packages, in much the same way setup_all.py works. If I did this, I would consider a different strategy. I would suggest that each of the setup.py's could move their "setup" function into an if __name__=="__main__" block. Then setup_all.py could import each one (a little trickery needed there) and then combine the symbols like sourcelist, headers, ext_modules and so forth. > You do need to > face the issue if making a bdist for the public of which LAPACK you use on > which platform. I would expect to use lapack_lite.pyd. It's easy enough to override it by copying a custom lapack on top. Paul Prescod From manka at us.edu.pl Sat Jan 6 15:20:33 2001 From: manka at us.edu.pl (Ryszard Manka) Date: Sat, 6 Jan 2001 20:20:33 +0000 Subject: [Numpy-discussion] Chebyshev In-Reply-To: <3A55C93D.526BDF7D@isd.uni-stuttgart.de> References: <3A55C93D.526BDF7D@isd.uni-stuttgart.de> Message-ID: <01010620203300.00936@manka> >Nils Wagner writes: > > I am also interested in ODE (ordinary differential equation) solvers. > Hi, Some examples of solving ODE using the Chebyshev polynomial I put on my web home page : http://uranos.cto.us.edu.pl/~manka/math/newmath.html numpy, pyfort and gist are needed. -- Ryszard Manka manka at us.edu.pl From phlip_cpp at my-deja.com Sat Jan 6 20:58:25 2001 From: phlip_cpp at my-deja.com (Phlip TheProgrammer) Date: Sat, 6 Jan 2001 17:58:25 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject Message-ID: <200101070158.RAA24234@mail19.bigmailbox.com> An embedded and charset-unspecified text was scrubbed... Name: not available URL: From phlip_cpp at my-deja.com Sat Jan 6 01:10:52 2001 From: phlip_cpp at my-deja.com (phlip) Date: Fri, 5 Jan 2001 22:10:52 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject In-Reply-To: References: Message-ID: <01010522105200.01585@cuzco.concentric.net> On Friday 05 January 2001 06:24, pfenn at mmm.com wrote: > Be sure to look at the CXX on sourceforge. Ja. This is a sweet combination. The Joy of Swig is we can declare some parameters as basic types, but others as raw PyObjects. Then we can have our way with these. And using CXX, we wrap these objects in high-level C++ methods. Not the low-level fragile C. The effect compares favorably to ATL for VC++ and ActiveX. But there's just one little question here. (Hence the buckshot of list servers.) If we pass a multiarray into a function expecting a PyArrayObject, how then do we add new elements to it? I tried things like 'push_back()' and 'setItem()', but they either did not exist or did not extend the array. Am I going to have to generate a new array and copy the old one into the new? --Phlip From pauldubois at home.com Mon Jan 8 12:23:10 2001 From: pauldubois at home.com (Paul F. Dubois) Date: Mon, 8 Jan 2001 09:23:10 -0800 Subject: [Numpy-discussion] RE: [Numpy-developers] [Bug #128025] MA seems to leak memory In-Reply-To: Message-ID: The helpful person who submitted this report asks for a reply but does not give their name. Thank you for finding this problem. It turns out that this is and this isn't a problem. On the whole, it is. I have a fix but need to warn people that it will break any existing code that takes advantage of a feature I did not advertise. (:-> Here's the story: class MA inherits from a class ActiveAttributes, supplied with MA as an announced module in the package. ActiveAttributes is designed to encapsulate the behavior that Numeric has, of having attributes that are not really attributes but are trios of functions. For an example of what I mean by this, consider the .shape attribute. print x.shape x.shape = (3, 9) del x.shape shape appears to be the name of an attribute but in fact it is not. There are actually a trio of handlers, so that the above actually execute more like they were print x.__getshape() x.__setshape((3,9)) x.__delattr('shape') Now for the problem. In implementing ActiveAttributes, I set up an indirect system of handlers for each "active" attribute like 'shape', and in remembering what handlers to use I carelessly used *bound* methods rather than *unbound* methods. This meant that each instance contained a reference cycle. However, Python 2.0 has a cyclic garbage collector that runs periodically so over the course of a long routine like my test routine, the garbage was in fact being collected. The bug-poster's patch reveals the problem by doing it a lot of memory operations in a very few statements. The fix is to change ActiveAttributes to save unbound rather than bound methods. This works and prevents the observed growth. I will check it in to CVS with a new release number 4.0 when I have finished testing another idea that may also be an improvement. Current users may wish to invoke the facilities of the "gc" module in 2.0 to increase the frequency of collection if they are currently experiencing a problem. To anyone out there who had noticed the activeattr.py module and used it, this change will require an incompatible change to your code when you switch to the new version. -----Original Message----- From: numpy-developers-admin at lists.sourceforge.net [mailto:numpy-developers-admin at lists.sourceforge.net]On Behalf Of noreply at sourceforge.net Sent: Monday, January 08, 2001 4:27 AM To: noreply at sourceforge.net; noreply at sourceforge.net; numpy-developers at sourceforge.net Subject: [Numpy-developers] [Bug #128025] MA seems to leak memory Bug #128025, was updated on 2001-Jan-08 04:26 Here is a current snapshot of the bug. Project: Numerical Python Category: Fatal Error Status: Open Resolution: None Bug Group: Robustness lacking Priority: 5 Submitted by: nobody Assigned to : nobody Summary: MA seems to leak memory Details: I executed the following code in the interpreter: >>> import MA >>> a = MA.arange(10000000) >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] After array a was created, memory consumption was around 40 MB. After the b slices, it's now 190 MB. Even if MA does make a copy-on-slice, it should IMHO free the slice after there are no more references to it. (I am not a Sourceforge user or anything, so I would appreciate if someone would at least let me know if this is a known problem.) For detailed info, follow this link: http://sourceforge.net/bugs/?func=detailbug&bug_id=128025&group_id=1369 _______________________________________________ Numpy-developers mailing list Numpy-developers at lists.sourceforge.net http://lists.sourceforge.net/mailman/listinfo/numpy-developers From guido at python.org Tue Jan 9 08:58:21 2001 From: guido at python.org (Guido van Rossum) Date: Tue, 09 Jan 2001 08:58:21 -0500 Subject: [Numpy-discussion] Need help fixing NumPy links on www.python.org Message-ID: <200101091358.IAA09254@cj20424-a.reston1.va.home.com> It appears that all the links to NumPy for Download and Documentation on www.python.org are broken. This looks bad for www.python.org and for Numerical Python! In particular the links on these pages are all broken: http://www.python.org/topics/scicomp/numpy_download.html link to ftp://ftp-icf.llnl.gov/pub/python/README.html http://www.python.org/topics/scicomp/documentation.html links to ftp://ftp-icf.llnl.gov/pub/python/numericalpython.pdf, ftp://ftp-icf.llnl.gov/pub/python/NumTut.tgz, and the FAQ (http://www.python.org/cgi-bin/numpy-faq) Can anybody "in the know" mail me updated links? If you have updated HTML for the pages containign the links that would be even better! Please mail directly to guido at python.org; I'm not subscribed to this list. --Guido van Rossum (home page: http://www.python.org/~guido/) From cbarker at jps.net Tue Jan 9 13:55:11 2001 From: cbarker at jps.net (Chris Barker) Date: Tue, 09 Jan 2001 10:55:11 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject References: <200101070158.RAA24234@mail19.bigmailbox.com> Message-ID: <3A5B5E8F.5D206B7@jps.net> Phlip TheProgrammer wrote: > And using CXX, we wrap these objects in high-level C++ methods. Not the low-level fragile C. The effect compares favorably to ATL for VC++ and ActiveX. > If we pass a multiarray into a function expecting a PyArrayObject, how then do we add new elements to it? I tried things like 'push_back()' and 'setItem()', but they either did not exist or did not extend the array. > > Am I going to have to generate a new array and copy the old one into the new? I waited a little while before answering this, because there are certainly people more qualified to do so that me. I am only on the NumPy list, so it may have been answered on a different list. The short answer is yes, you will have to generate a new a array and copy the old one into the new. MultiArray objects were created to provide efficient storage of lots of numbers (among other things). Because of this requirement, the numbers are stored as a large single array, and so they cannot be re-sized without re-creating that array. You may be able to change just the data array itself (and a few properties), rather than creating a new structure entirely, but it probably wouldn't be worth it. By the way, I'd like to hear how this all works out. Being able to use NumPy Arrays in extensions more easily would be great! -Chris -- Christopher Barker, Ph.D. cbarker at jps.net --- --- --- http://www.jps.net/cbarker -----@@ -----@@ -----@@ ------@@@ ------@@@ ------@@@ Water Resources Engineering ------ @ ------ @ ------ @ Coastal and Fluvial Hydrodynamics ------- --------- -------- ------------------------------------------------------------------------ ------------------------------------------------------------------------ From pplumlee at omnigon.com Thu Jan 11 14:48:28 2001 From: pplumlee at omnigon.com (Phlip) Date: Thu, 11 Jan 2001 11:48:28 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject In-Reply-To: <3A5B5E8F.5D206B7@jps.net> References: <200101070158.RAA24234@mail19.bigmailbox.com> <3A5B5E8F.5D206B7@jps.net> Message-ID: <01010911244501.01263@localhost.localdomain> Proclaimed Chris Barker from the mountaintops: > I waited a little while before answering this, because there are > certainly people more qualified to do so that me. I am only on the NumPy > list, so it may have been answered on a different list. The irritation is, without a CXX list server, I'm having to molest the off-topic fora where Dubois et al are reputed to hang out. > The short answer is yes, you will have to generate a new a array and > copy the old one into the new. MultiArray objects were created to > provide efficient storage of lots of numbers (among other things). > Because of this requirement, the numbers are stored as a large single > array, and so they cannot be re-sized without re-creating that array. > You may be able to change just the data array itself (and a few > properties), rather than creating a new structure entirely, but it > probably wouldn't be worth it. Here's the state of the system: static void copyData ( Py::Array & ma, vector > & database, int maxFields ) { #if 1 Py::Sequence shape (Py::Int (2)); // <-- pow shape[0] = Py::Int (int (database.size())); shape[1] = Py::Int (maxFields); PyArray_Reshape ((PyArrayObject*)ma.ptr(), shape.ptr()); #else int zone[] = { database.size(), maxFields }; Py::Object mo ((PyObject*)PyArray_FromDims (2, zone, PyArray_OBJECT) ); ma = mo; assert (ma.dimension(1) == database.size()); assert (ma.dimension(2) == maxFields); for (int idxRow (0); idxRow < maxRows; ++idxRow) { Py::Array row (ma[idxRow]); for (int idxCol (0); idxCol < maxFields; ++idxCol) { string const & str (database[idxRow][idxCol]); Py::String pyStr (str.c_str()); Py::Object obj (pyStr); row [idxCol] = obj; // <-- pow } } #endif } Both versions crash on the line marked "pow". The top one crashes when I think I'm trying to do the equivalent of the Python array = (2.4) The bottom one crashes after creating a new array, right when I try to copy in an element. The Pythonic equivalent of matrixi [row][col] = "8" Everyone remember I'm not trying to presenve the old contents of the array - just return from the extension a new array full of stuff. > By the way, I'd like to hear how this all works out. Being able to use > NumPy Arrays in extensions more easily would be great! Our Chief Drive-by Architect has ordered me to use them like an in-memory database. >Sigh< --Phlip "...fanatical devotion to the Pope, and cute red uniforms..." From pfenn at mmm.com Thu Jan 11 17:29:22 2001 From: pfenn at mmm.com (pfenn at mmm.com) Date: Thu, 11 Jan 2001 16:29:22 -0600 Subject: [Numpy-discussion] Another CXX question Message-ID: This is probably a stupid coding error, but the following code generates the error: C:\Program Files\Microsoft Visual Studio\MyProjects\Xrawdatafile\Xrawdatafile.cpp(32) : error C2065: 'FromAPI' : undeclared identifier I've marked the line in the listing that causes the error. It occurs in the function that creates a new datafile_type object to return to the interpreter. Right now the datafile_type object is very simple, and has but one method that prints a hardcoded string. I'm using CXX-4.?? with Visual C++ 6.0 and SP3. I'm going to download SP4 tonight and see if that helps, but any hints would be appreciated. I haven't been doing C++ for very long, so I'm probably mising something obvious. //**************************************************************************************************************** // Xrawdatafile.cpp : Defines the entry point for the DLL application. // #include "stdafx.h" #include #include #include "Xrawdatafile.h" #include "datafile_type.h" BOOL APIENTRY DllMain(HANDLE hModule, DWORD ul_reason_for_call, LPVOID lpReserved) { switch (ul_reason_for_call) { case DLL_PROCESS_ATTACH: case DLL_THREAD_ATTACH: case DLL_THREAD_DETACH: case DLL_PROCESS_DETACH: break; } return TRUE; } class datafile_module : Py::ExtensionModule { public: datafile_module() : Py::ExtensionModule("datafile") { datafile_type::init(); add_varargs_method("datafile", &datafile_module::new_datafile,"datafile(filename)"); initialize("interface to ThermoFinnigan XRaw data files."); } virtual ~datafile_module() { }; private: Py::Object new_datafile(const Py::Tuple& args) { return Py::asObject((new datafile_type())); // THIS LINE CAUSES THE ERROR } }; extern "C" __declspec(dllexport) void initdatafile(void) { static datafile_module *dfm = new datafile_module; } //**************************************************************************************************************************** From arne.keller at ppm.u-psud.fr Tue Jan 16 07:55:40 2001 From: arne.keller at ppm.u-psud.fr (Arne Keller) Date: Tue, 16 Jan 2001 13:55:40 +0100 Subject: [Numpy-discussion] multipack Message-ID: <3A6444CC.74E221C4@ppm.u-psud.fr> I'm looking for the Multipack and cephes python modules by Travis Oliphant. The Multipack Home page link on numpy.sourceforge gives a 404... -- Arne Keller Laboratoire de Photophysique Moleculaire du CNRS, Bat. 213. Universite de Paris-Sud, 91405 Orsay Cedex, France. tel.: (33) 1 69 15 82 83 -- fax. : (33) 1 69 15 67 77 From phrxy at csv.warwick.ac.uk Tue Jan 16 14:15:36 2001 From: phrxy at csv.warwick.ac.uk (John J. Lee) Date: Tue, 16 Jan 2001 19:15:36 +0000 (GMT) Subject: [Numpy-discussion] multipack In-Reply-To: <3A6444CC.74E221C4@ppm.u-psud.fr> Message-ID: On Tue, 16 Jan 2001, Arne Keller wrote: > I'm looking for the Multipack and cephes python modules by Travis > Oliphant. > > The Multipack Home page link on numpy.sourceforge gives a 404... http://oliphant.netpedia.net/ http://oliphant.netpedia.net/multipack.html BTW, I've partly written a distutils script for this with the intention of making it easier to compile on windows (the setup.py works on linux but I need to change some things to make it set up the setup script according to the way the source has been set up). It seems distutils doesn't have explicit support for FORTRAN, so something needs changing there for it to work cross-platform. I haven't got any feedback yet from the Distutils people on how best to do this, but if anyone reading this who is practiced at compiling FORTRAN and C (especially on windows compilers) and wouldn't mind adding the little bits required to support FORTRAN for their system once this has been decided, mail me. John From jbaddor at physics.mcgill.ca Tue Jan 16 15:29:01 2001 From: jbaddor at physics.mcgill.ca (Jean-Bernard Addor) Date: Tue, 16 Jan 2001 15:29:01 -0500 (EST) Subject: [Numpy-discussion] strange underflows Message-ID: Hey Numpy people! I just compiled the last NumPy. Underflows don't give zero but generate OverflowError: Python 2.0 (#2, Dec 15 2000, 15:04:31) [GCC 2.95.2 19991024 (release)] on linux2 Type "copyright", "credits" or "license" for more information. Hello from .pythonrc.py >>> import Numeric >>> Numeric.__version__ '17.2.0' >>> Numeric.exp(-1.42676746e+12) Traceback (most recent call last): File "", line 1, in ? OverflowError: math range error >>> How to suppress these? Any hint? Jean-Bernard From jhauser at ifm.uni-kiel.de Tue Jan 16 16:07:09 2001 From: jhauser at ifm.uni-kiel.de (Janko Hauser) Date: Tue, 16 Jan 2001 22:07:09 +0100 (CET) Subject: [Numpy-discussion] multipack In-Reply-To: References: <3A6444CC.74E221C4@ppm.u-psud.fr> Message-ID: <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> These pages are depreciated, as Travis has moved. I could not find the announcement for the new pages. Menawhile you can check out these packages from the public cvs server of Pearu Peterson. For an overview of the cvs content look at http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ and for instructions to get it out of cvs look at http://cens.ioc.ee/projects/pysymbolic/ HTH, __Janko John J. Lee writes: > On Tue, 16 Jan 2001, Arne Keller wrote: > > > I'm looking for the Multipack and cephes python modules by Travis > > Oliphant. > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > http://oliphant.netpedia.net/ > > http://oliphant.netpedia.net/multipack.html > > BTW, I've partly written a distutils script for this with the intention of > making it easier to compile on windows (the setup.py works on linux but I > need to change some things to make it set up the setup script according to > the way the source has been set up). > > It seems distutils doesn't have explicit support for FORTRAN, so something > needs changing there for it to work cross-platform. I haven't got any > feedback yet from the Distutils people on how best to do this, but if > anyone reading this who is practiced at compiling FORTRAN and C > (especially on windows compilers) and wouldn't mind adding the little bits > required to support FORTRAN for their system once this has been decided, > mail me. > > > John > > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/lists/listinfo/numpy-discussion From arne at cybercable.fr Wed Jan 17 04:30:10 2001 From: arne at cybercable.fr (arne keller) Date: Wed, 17 Jan 2001 10:30:10 +0100 Subject: [Numpy-discussion] multipack References: Message-ID: <3A656622.739ADC4B@cybercable.fr> "John J. Lee" wrote: > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > I'm looking for the Multipack and cephes python modules by Travis > > Oliphant. > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > http://oliphant.netpedia.net/ > > http://oliphant.netpedia.net/multipack.html did you try these links? For me it returns the following message: Not Found The requested URL / was not found on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to > > BTW, I've partly written a distutils script for this with the intention of > making it easier to compile on windows (the setup.py works on linux but I > need to change some things to make it set up the setup script according to > the way the source has been set up). > > It seems distutils doesn't have explicit support for FORTRAN, so something > needs changing there for it to work cross-platform. I haven't got any > feedback yet from the Distutils people on how best to do this, but if > anyone reading this who is practiced at compiling FORTRAN and C > (especially on windows compilers) and wouldn't mind adding the little bits > required to support FORTRAN for their system once this has been decided, > mail me. > > John > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/lists/listinfo/numpy-discussion -- Arne Keller Laboratoire de Photophysique Moleculaire du CNRS, Bat. 213. Universite de Paris-Sud, 91405 Orsay Cedex, France. tel.: (33) 1 69 15 82 83 -- fax. : (33) 1 69 15 67 77 From boyle5 at llnl.gov Thu Jan 18 15:02:52 2001 From: boyle5 at llnl.gov (Jim Boyle) Date: Thu, 18 Jan 2001 12:02:52 -0800 Subject: [Numpy-discussion] multipack In-Reply-To: <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> References: <3A6444CC.74E221C4@ppm.u-psud.fr> <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> Message-ID: I pulled the multipack code from the CVS server and built it for my new installation of Python 2.0. When I import Multipack I get: WARNING: Python C API version mismatch for module cephes: This Python has API version 1009, module cephes has version 1007 this warning occurs for the numpyio and sigtools modules. Have I done something screwy in the installation, or do the modules have to be updated? If the latter is there any guidance on what to fix? What are the routines/procedures that changed in going from version 1007 to 1009? So far everything appears to provide the correct answers, so the API mismatch is not crippling. However, from past experience I know that ignoring warnings often ends in tears. Jim >These pages are depreciated, as Travis has moved. I could not find the >announcement for the new pages. Menawhile you can check out these >packages from the public cvs server of Pearu Peterson. > >For an overview of the cvs content look at > >http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ > >and for instructions to get it out of cvs look at > >http://cens.ioc.ee/projects/pysymbolic/ > >HTH, >__Janko > > >John J. Lee writes: > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > > > I'm looking for the Multipack and cephes python modules by Travis > > > Oliphant. > > > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > > > http://oliphant.netpedia.net/ > > > > http://oliphant.netpedia.net/multipack.html > > > > BTW, I've partly written a distutils script for this with the intention of > > making it easier to compile on windows (the setup.py works on linux but I > > need to change some things to make it set up the setup script according to > > the way the source has been set up). > > > > It seems distutils doesn't have explicit support for FORTRAN, so something > > needs changing there for it to work cross-platform. I haven't got any > > feedback yet from the Distutils people on how best to do this, but if > > anyone reading this who is practiced at compiling FORTRAN and C > > (especially on windows compilers) and wouldn't mind adding the little bits > > required to support FORTRAN for their system once this has been decided, > > mail me. From arne at cybercable.fr Thu Jan 18 15:24:54 2001 From: arne at cybercable.fr (arne keller) Date: Thu, 18 Jan 2001 21:24:54 +0100 Subject: [Numpy-discussion] multipack References: <3A6444CC.74E221C4@ppm.u-psud.fr> <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> Message-ID: <3A675116.C47AEB56@cybercable.fr> I install the multipack-0.7 with my new installation of Python2.0 and I don't have any warning : maybe a problem with PYTHONPATH if your old Multipack under your old Pyhon1.5 is still present? Jim Boyle wrote: > > I pulled the multipack code from the CVS server and built it for my > new installation of Python 2.0. > When I import Multipack I get: > > WARNING: Python C API version mismatch for module cephes: > This Python has API version 1009, module cephes has version 1007 > > this warning occurs for the numpyio and sigtools modules. > > Have I done something screwy in the installation, or do the modules > have to be updated? > If the latter is there any guidance on what to fix? What are the > routines/procedures that changed in going from version 1007 to 1009? > > So far everything appears to provide the correct answers, so the API > mismatch is not crippling. However, from past experience I know that > ignoring warnings often ends in tears. > > Jim > > >These pages are depreciated, as Travis has moved. I could not find the > >announcement for the new pages. Menawhile you can check out these > >packages from the public cvs server of Pearu Peterson. > > > >For an overview of the cvs content look at > > > >http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ > > > >and for instructions to get it out of cvs look at > > > >http://cens.ioc.ee/projects/pysymbolic/ > > > >HTH, > >__Janko > > > > > >John J. Lee writes: > > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > > > > > I'm looking for the Multipack and cephes python modules by Travis > > > > Oliphant. > > > > > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > > > > > http://oliphant.netpedia.net/ > > > > > > http://oliphant.netpedia.net/multipack.html > > > > > > BTW, I've partly written a distutils script for this with the intention of > > > making it easier to compile on windows (the setup.py works on linux but I > > > need to change some things to make it set up the setup script according to > > > the way the source has been set up). > > > > > > It seems distutils doesn't have explicit support for FORTRAN, so something > > > needs changing there for it to work cross-platform. I haven't got any > > > feedback yet from the Distutils people on how best to do this, but if > > > anyone reading this who is practiced at compiling FORTRAN and C > > > (especially on windows compilers) and wouldn't mind adding the little bits > > > required to support FORTRAN for their system once this has been decided, > > > mail me. > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/lists/listinfo/numpy-discussion -- Arne Keller Laboratoire de Photophysique Moleculaire du CNRS, Bat. 213. Universite de Paris-Sud, 91405 Orsay Cedex, France. tel.: (33) 1 69 15 82 83 -- fax. : (33) 1 69 15 67 77 From paul at pfdubois.com Thu Jan 18 17:30:03 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Thu, 18 Jan 2001 14:30:03 -0800 Subject: [Numpy-discussion] multipack In-Reply-To: Message-ID: This would typically mean that you are importing the package into a different python than the one you used to build it. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Jim Boyle Sent: Thursday, January 18, 2001 12:03 PM To: numpy-discussion at lists.sourceforge.net Subject: Re: [Numpy-discussion] multipack I pulled the multipack code from the CVS server and built it for my new installation of Python 2.0. When I import Multipack I get: WARNING: Python C API version mismatch for module cephes: This Python has API version 1009, module cephes has version 1007 this warning occurs for the numpyio and sigtools modules. Have I done something screwy in the installation, or do the modules have to be updated? If the latter is there any guidance on what to fix? What are the routines/procedures that changed in going from version 1007 to 1009? So far everything appears to provide the correct answers, so the API mismatch is not crippling. However, from past experience I know that ignoring warnings often ends in tears. Jim >These pages are depreciated, as Travis has moved. I could not find the >announcement for the new pages. Menawhile you can check out these >packages from the public cvs server of Pearu Peterson. > >For an overview of the cvs content look at > >http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ > >and for instructions to get it out of cvs look at > >http://cens.ioc.ee/projects/pysymbolic/ > >HTH, >__Janko > > >John J. Lee writes: > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > > > I'm looking for the Multipack and cephes python modules by Travis > > > Oliphant. > > > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > > > http://oliphant.netpedia.net/ > > > > http://oliphant.netpedia.net/multipack.html > > > > BTW, I've partly written a distutils script for this with the intention of > > making it easier to compile on windows (the setup.py works on linux but I > > need to change some things to make it set up the setup script according to > > the way the source has been set up). > > > > It seems distutils doesn't have explicit support for FORTRAN, so something > > needs changing there for it to work cross-platform. I haven't got any > > feedback yet from the Distutils people on how best to do this, but if > > anyone reading this who is practiced at compiling FORTRAN and C > > (especially on windows compilers) and wouldn't mind adding the little bits > > required to support FORTRAN for their system once this has been decided, > > mail me. _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/lists/listinfo/numpy-discussion From boyle5 at llnl.gov Fri Jan 19 12:33:22 2001 From: boyle5 at llnl.gov (Jim Boyle) Date: Fri, 19 Jan 2001 09:33:22 -0800 Subject: [Numpy-discussion] multipack In-Reply-To: <3A675116.C47AEB56@cybercable.fr> References: <3A6444CC.74E221C4@ppm.u-psud.fr> <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> <3A675116.C47AEB56@cybercable.fr> Message-ID: Thanks to Arne and Paul I took the time to find out what was going on. It turns out that the CVS has multipack 0.8 and Travis added the cephes, numpyio and sigtools modules to multipack in going from 0.7 to 0.8. I did the usual things to account for a non-standard python location at the topmost level of Multipack as I had before with 0.7 but should have also tweaked the Makefiles in the cephes, numpyio and sigtools directories. The result was that these modules were built with Python 1.5 and all the others with 2.0 and I confused myself as to where the problem was. But when I tried to build the cephes module using Python 2.0 I got the following error: gcc -O2 -I/usr/local/include/python2.0 -c -o amos_wrappers.o amos_wrappers.c In file included from cephes/mconf.h:162, from amos_wrappers.h:12, from amos_wrappers.c:8: cephes/protos.h:67: parse error before `sizeof' cephes/protos.h:68: parse error before `sizeof' cephes/protos.h:69: parse error before `sizeof' make: *** [amos_wrappers.o] Error 1 the line: gcc -O2 -I/usr/local/include/python1.5 -c -o amos_wrappers.o amos_wrappers.c works fine. I have tried this on another machine with Python 2.0 and got the same error. The Python.h includes changed quite a bit in going from 1.5 to 2.0. Any ideas as to what is wrong would be welcome. Has anyone installed multipack 0.8 using Python 2.0? Jim >I install the multipack-0.7 with my new installation of Python2.0 and I >don't have any >warning : >maybe a problem with PYTHONPATH if your old Multipack under your old >Pyhon1.5 is still present? > > >Jim Boyle wrote: >> >> I pulled the multipack code from the CVS server and built it for my >> new installation of Python 2.0. >> When I import Multipack I get: >> >> WARNING: Python C API version mismatch for module cephes: >> This Python has API version 1009, module cephes has version 1007 >> >> this warning occurs for the numpyio and sigtools modules. >> >> Have I done something screwy in the installation, or do the modules >> have to be updated? >> If the latter is there any guidance on what to fix? What are the >> routines/procedures that changed in going from version 1007 to 1009? >> >> So far everything appears to provide the correct answers, so the API >> mismatch is not crippling. However, from past experience I know that > > ignoring warnings often ends in tears. >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From Edward_A._Damerau at dadebehring.com Fri Jan 19 15:09:47 2001 From: Edward_A._Damerau at dadebehring.com (Edward_A._Damerau at dadebehring.com) Date: Fri, 19 Jan 2001 15:09:47 -0500 Subject: [Numpy-discussion] calling numpy from C++ Message-ID: I've got Python 20 and numerical python installed and running, i.e., I can run python scripts and call import module Numerical and use arrays,etc. I've also been successful at calling python scripts from C/C++ code (MSVC++ 6.0) - Py_Initialize(), Py_BuildValue() etc. and building the de-bug version of python20.dll (python20_d.dll) so I can debug-step through the code. However, I can't seem to be able to call numpy functions from C/C++ - PyArray_FromDims(), etc. My code will compile and link but, at runtime, I get an exception error. It seems that either I need to create a numpy.dll that will live in winnt/system32 or somehow need to modify python20.dll to include the numpy functions. How do I do that? Thanks. From anthony.seward at ieee.org Sun Jan 21 00:12:59 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Sat, 20 Jan 2001 22:12:59 -0700 (MST) Subject: [Numpy-discussion] Questions about organization Message-ID: I've been trying to get the setup.py scripts in the Numeric distribution to build binary distributions. I have tried several things but what always ends up being a problem is the current Numeric organization, specifically LALITE and RANLIB. They just don't fit well with the distutils paradigm. If they were packaged as part of the Numeric core but were a build-time option, I think that I could get things to work much more easily. The configuration part of distutils is still a bit imature but overcomming that would just require a slightly more involved setup.py. I don't want to go ahead if there is no chance of it being accepted. Tony From phrxy at csv.warwick.ac.uk Sun Jan 21 12:49:30 2001 From: phrxy at csv.warwick.ac.uk (John J. Lee) Date: Sun, 21 Jan 2001 17:49:30 +0000 (GMT) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sat, 20 Jan 2001, Tony Seward wrote: > I've been trying to get the setup.py scripts in the Numeric distribution to > build binary distributions. I have tried several things but what always > ends up being a problem is the current Numeric organization, specifically > LALITE and RANLIB. They just don't fit well with the distutils paradigm. > > If they were packaged as part of the Numeric core but were a build-time > option, I think that I could get things to work much more easily. The [...] Can't you just edit the list of extensions in setup_all.py? Or has that changed? John From anthony.seward at ieee.org Sun Jan 21 17:38:11 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Sun, 21 Jan 2001 15:38:11 -0700 (MST) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sun, 21 Jan 2001, John J. Lee wrote: > On Sat, 20 Jan 2001, Tony Seward wrote: > > > I've been trying to get the setup.py scripts in the Numeric distribution to > > build binary distributions. I have tried several things but what always > > ends up being a problem is the current Numeric organization, specifically > > LALITE and RANLIB. They just don't fit well with the distutils paradigm. > > > > If they were packaged as part of the Numeric core but were a build-time > > option, I think that I could get things to work much more easily. The > [...] > > Can't you just edit the list of extensions in setup_all.py? > > Or has that changed? > > > John > I'm not sure what you mean. Even './setup_all.py build' doen't work for FFT, LALITE, RANLIB and RNG. Right now one has to have already installed the headers from the core of Numeric in order to even build these modules. The sequence is thus 1) get source 2) install the core 3) build or install the submodules As is stands, even if the Numeric core is already installed, it is not possible to make a binary distribution of the LALITE package (and maybe others). This could be fixed easily of the LALITE package, but I havn't looked to see if there are other problems with the other packages. The purpose of distutils is that the sequence for a given distribution is 1) get the source 2) build, or install, or build a binary distribution and you're done This can't be done with the way that things are organized right now. There are probably several ways to deal with this, but first a decision needs to be made as to what the goals are. As I understand it the goals are 1) keep all of the current packages in the distribution 2) the following imports will work individually or together: a) import Numeric b) import FFT c) import LinearAlgebra d) import MLab e) import Matrix f) import Precision g) import RandomArray h) import UserArray i) import ranlib j) import umath k) import RNG l) import MA 3) minimal changes in the source I just need some direction so that I can get things working. I think that the simplest thing is to merge the RANLIB and LALITE packages with the core of Numeric and make the building and installation of them an option. This fits beter with the paradigm of distutils. Tony From phrxy at csv.warwick.ac.uk Sun Jan 21 18:51:44 2001 From: phrxy at csv.warwick.ac.uk (John J. Lee) Date: Sun, 21 Jan 2001 23:51:44 +0000 (GMT) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sun, 21 Jan 2001, Tony Seward wrote: > On Sun, 21 Jan 2001, John J. Lee wrote: [...] > I'm not sure what you mean. Even './setup_all.py build' doen't work for > FFT, LALITE, RANLIB and RNG. > > Right now one has to have already installed the headers from the core of > Numeric in order to even build these modules. The sequence is thus > > 1) get source > 2) install the core > 3) build or install the submodules Ah, sorry, I didn't read you post properly. I thought you were just having trouble compiling something you didn't really need, and wanted to avoid compiling it. You may already know this, but this has been discussed on the distutils-sig mailing list (and maybe this one too) just recently. I don't recall what conclusions they came to, if any - things are held up at the moment because Greg Ward just moved house and is effectively off-line. I guess when he appears again he'll be swamped, so don't hold your breath for changes in distutils itself. John From paul at pfdubois.com Sun Jan 21 18:48:50 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Sun, 21 Jan 2001 15:48:50 -0800 Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: You wrote: ...As I understand it the goals are 1) keep all of the current packages in the distribution 2) the following imports will work individually or together: a) import Numeric b) import FFT c) import LinearAlgebra d) import MLab e) import Matrix f) import Precision g) import RandomArray h) import UserArray i) import ranlib j) import umath k) import RNG l) import MA 3) minimal changes in the source I just need some direction so that I can get things working. I think that the simplest thing is to merge the RANLIB and LALITE packages with the core of Numeric and make the building and installation of them an option. This fits beter with the paradigm of distutils. --- You are describing exactly what we used to have. We changed it. We had reasons that haven't disappeared. (1) is not necessarily a goal. We had a long argument about that, too. If anything, (1) is an anti-goal. Under no circumstances am I willing to move packages back to the core. I think it more likely that all the optional packages will go somewhere else entirely. We need a more organized structure. For historical reasons people did not want to change the way some of the above are NOT packages. (2) No, it is not possible to import those modules independently. Some depend on the others. (3) is not a goal, but it is not permissible to require changes in CLIENT code, such as changing where the include files get installed. (4) Goal: enable people to easily modify LinearAlgebra so that a user-specified BLAS and/or LAPACK library is used. (5) Goal: not make it harder to make distributions on the Mac or Windows just to make RPMs easier. From anthony.seward at ieee.org Sun Jan 21 21:09:27 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Sun, 21 Jan 2001 19:09:27 -0700 (MST) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sun, 21 Jan 2001, Paul F. Dubois wrote: > You wrote: > > ...As I understand it the goals are > > 1) keep all of the current packages in the distribution > 2) the following imports will work individually or together: > a) import Numeric > b) import FFT > c) import LinearAlgebra > d) import MLab > e) import Matrix > f) import Precision > g) import RandomArray > h) import UserArray > i) import ranlib > j) import umath > k) import RNG > l) import MA > 3) minimal changes in the source > > I just need some direction so that I can get things working. I think that > the simplest thing is to merge the RANLIB and LALITE packages with the core > of Numeric and make the building and installation of them an option. This > fits beter with the paradigm of distutils. > > --- > You are describing exactly what we used to have. We changed it. We had > reasons that haven't disappeared. > > (1) is not necessarily a goal. We had a long argument about that, too. If > anything, (1) is an anti-goal. > Under no circumstances am I willing to move packages back to the core. I > think it more likely that > all the optional packages will go somewhere else entirely. We need a > more organized structure. > For historical reasons people did not want to change the way some of the > above are NOT packages. OK, I guess I mis-rememberd the conclusion. If we have a more organized structure than I think that most of the problems would be solved. Whether RANLIB and LALITE are sub-packages of Numeric, or they are installed parallel to Numeric the setup.py scripts would be simple. If they are parallel to Numeric than they can be imported like they are now if that is a concern. > (2) No, it is not possible to import those modules independently. Some > depend on the others. All of the import lines that I gave work with the Numeric that I have installed: I left out the ones that didn't. Let me know which ones should be able to be imported and if there are any missing. > (3) is not a goal, but it is not permissible to require changes in CLIENT > code, such as changing where the include files get installed. By client code do you mean non-python code using the libraries? > (4) Goal: enable people to easily modify LinearAlgebra so that a > user-specified BLAS and/or LAPACK library is used. I've been playing with this and on *nix types systems it shouldn't be too hard to do as options to setup.py. I don't know how BLAS and LAPAK get installed on non *nix systems. My plan has been to do minimal autoconf type scanning (taking into consideration command line options) to look for existing libraries. > (5) Goal: not make it harder to make distributions on the Mac or Windows > just to make RPMs easier. > I'm trying to go through distutils when the required functionality is already present so it shouldn't be mch of an issue. I do have a W2k laptop that I can test things on in extremis. The main roadblock is trying to figure out what to do with RANLIB and LALITE. Tony From Edward_A._Damerau at dadebehring.com Mon Jan 22 12:50:54 2001 From: Edward_A._Damerau at dadebehring.com (Edward_A._Damerau at dadebehring.com) Date: Mon, 22 Jan 2001 12:50:54 -0500 Subject: [Numpy-discussion] Debug version of numpy for Win32 Message-ID: Has anyone built a debug version of _numpy.pyd (_numpy_d.pyd) for Windows NT (MS Visual Studio C++ 6.0)? From HoumanG at ActiveState.com Mon Jan 22 13:18:47 2001 From: HoumanG at ActiveState.com (Houman G) Date: Mon, 22 Jan 2001 10:18:47 -0800 Subject: [Numpy-discussion] Questions about organization (fwd) In-Reply-To: <20010121125510.A30408@ActiveState.com> Message-ID: <000701c0849f$c3c26f60$ba03a8c0@TANK> On Sun, Jan 21, 2001 at 12:36:37PM -0800, Greg Stein wrote: > FYI about what? I'm not sure that I see the point here. > > Are you trying to say that ActiveState can provide tools for packaging? > Maybe that we can get NumPy packaged for people? This one, I believe. HoumanG has been working on PyPM (like PPM for Python) for which he is creating appropriate setup.py scripts for packages such as NumPy. I wrote a new setup.py script for NumPy a week ago. The script I wrote builds the RANLIB and the LALITE as part of the core NumPy; and the binary distribution of NumPy built with my script also installs the RANLIB and LALITE. If the question is that Distutils supports a source tree structure like the one NumPy has, the question is no. However, I made a patch (for Distutils) and submitted to SourceForge which allows the source tree to be spread in more than one directory. A copy of the patch is in the attachment. Houman -------------- next part -------------- A non-text attachment was scrubbed... Name: Distutils_multidir_patch.diff.dos Type: application/octet-stream Size: 10643 bytes Desc: not available URL: From pplumlee at omnigon.com Mon Jan 22 13:26:41 2001 From: pplumlee at omnigon.com (Phlip) Date: Mon, 22 Jan 2001 10:26:41 -0800 Subject: [Numpy-discussion] Ever hear of ROOT? In-Reply-To: References: Message-ID: <01012210264100.01330@rrm> Nummies: A colleague recently asked me to look up something called "ROOT". I complained "I can't e-search for 'root'! I'd get a million false hits!" ROOT's here: http://root.cern.ch It's a high-end version of programs like GnuPlot or OpenDx; it's written for physicists at CERN, and it can handle terabyte databases. The bad news it bonds to a questionable "C++ Interpreter" for its scripting end. We all know that interpreting a language designed to be compiled represents the worst of both worlds. Has anyone plugged their NumericPython into ROOT? How hard could that be? -- Phlip phlip_cpp at my-deja.com ============ http://c2.com/cgi/wiki?PhlIp ============ -- http://users.deltanet.com/~tegan/home.html -- From paul at pfdubois.com Mon Jan 22 14:24:51 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Mon, 22 Jan 2001 11:24:51 -0800 Subject: [Numpy-discussion] Questions about organization (fwd) In-Reply-To: <000701c0849f$c3c26f60$ba03a8c0@TANK> Message-ID: Just so you know: it is not ok for the official script to install RANLIB and LALITE as a default part of the build. In fact, that used to be the case and we changed it. There are people who have to change the way these packages are built to use custom LAPACK and BLAS scripts. I won't be able to accept a patch to NumPy that does this. NumPy's current clumsy structure is a result of a decision not to break existing client scripts. These "Optional Packages" are distributed with NumPy. The don't have to be, but again, there were many people who thought it wasn't worth the confusion of having to go get all of them separately. This arrangement appears to annoy distutils purists, but distutils purity isn't the only consideration. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Houman G Sent: Monday, January 22, 2001 10:19 AM To: dev-py at ActiveState.com Cc: numpy-discussion at lists.sourceforge.net Subject: RE: [Numpy-discussion] Questions about organization (fwd) On Sun, Jan 21, 2001 at 12:36:37PM -0800, Greg Stein wrote: > FYI about what? I'm not sure that I see the point here. > > Are you trying to say that ActiveState can provide tools for packaging? > Maybe that we can get NumPy packaged for people? This one, I believe. HoumanG has been working on PyPM (like PPM for Python) for which he is creating appropriate setup.py scripts for packages such as NumPy. I wrote a new setup.py script for NumPy a week ago. The script I wrote builds the RANLIB and the LALITE as part of the core NumPy; and the binary distribution of NumPy built with my script also installs the RANLIB and LALITE. If the question is that Distutils supports a source tree structure like the one NumPy has, the question is no. However, I made a patch (for Distutils) and submitted to SourceForge which allows the source tree to be spread in more than one directory. A copy of the patch is in the attachment. Houman From turner at blueskystudios.com Tue Jan 23 11:33:56 2001 From: turner at blueskystudios.com (John A. Turner) Date: Tue, 23 Jan 2001 11:33:56 -0500 Subject: [Numpy-discussion] numpy.org redirection Message-ID: <14957.45684.132068.755774@denmark.blueskystudios.com> some time ago I registered the domain numpy.org with the intention of donating it to the group just thought about it again today, and realized I could easily redirect it, so my question is: o does anyone object to my doing that? o where should it point, here? http://numpy.sourceforge.net/ or here? http://sourceforge.net/projects/numpy if someone feels strongly that I should transfer ownership of the domain to someone more central to development, etc., I'll be happy to do that - otherwise I'm happy just redirecting and renewing it as it comes up, or at some point setting up real hosting of the domain somewhere (sourceforge?) -- John A. Turner, Ph.D. Senior Research Associate Blue Sky Studios, 44 S. Broadway, White Plains, NY 10601 http://www.blueskystudios.com/ (914) 259-6319 From paul at pfdubois.com Tue Jan 23 11:56:31 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Tue, 23 Jan 2001 08:56:31 -0800 Subject: [Numpy-discussion] numpy.org redirection In-Reply-To: <14957.45684.132068.755774@denmark.blueskystudios.com> Message-ID: This is very generous of you. I don't want to own the domain. I'm not even sure as a govt employee if you could give it to me. The new Python Foundation, when it is up and running, would be appropriate. For now, I suggest you just leave it as is. I don't understand your remark about "real hosting". numpy.sourceforge.net really is at sourceforge (? guess it is obvious I'm not familiar with this term). Both pages have pointers to each other. So technically it is a wash. People looking for docs need to be on one, for releases on the other. My guess is that the link on the project page back to the home page is a little more confusing for a newbie than the clearly spelled-out link on the home page to the project page. When I first used SF I thought "Home" would be the project page. So my guess is that numpy.sourceforge.net would be best. The operative word is "guess". -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of John A. Turner Sent: Tuesday, January 23, 2001 8:34 AM To: Numpy-discussion at lists.sourceforge.net Subject: [Numpy-discussion] numpy.org redirection some time ago I registered the domain numpy.org with the intention of donating it to the group just thought about it again today, and realized I could easily redirect it, so my question is: o does anyone object to my doing that? o where should it point, here? http://numpy.sourceforge.net/ or here? http://sourceforge.net/projects/numpy if someone feels strongly that I should transfer ownership of the domain to someone more central to development, etc., I'll be happy to do that - otherwise I'm happy just redirecting and renewing it as it comes up, or at some point setting up real hosting of the domain somewhere (sourceforge?) -- John A. Turner, Ph.D. Senior Research Associate Blue Sky Studios, 44 S. Broadway, White Plains, NY 10601 http://www.blueskystudios.com/ (914) 259-6319 _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/lists/listinfo/numpy-discussion From turner at blueskystudios.com Tue Jan 23 12:19:35 2001 From: turner at blueskystudios.com (John A. Turner) Date: Tue, 23 Jan 2001 12:19:35 -0500 Subject: [Numpy-discussion] numpy.org redirection In-Reply-To: References: <14957.45684.132068.755774@denmark.blueskystudios.com> Message-ID: <14957.48423.359577.770038@denmark.blueskystudios.com> >>>>> "PFD" == Paul F Dubois : PFD> This is very generous of you. I don't want to own the domain. I'm not even PFD> sure as a govt employee if you could give it to me. The new Python PFD> Foundation, when it is up and running, would be appropriate. For now, I PFD> suggest you just leave it as is. PFD> I don't understand your remark about "real hosting". numpy.sourceforge.net PFD> really is at sourceforge (? guess it is obvious I'm not familiar with this PFD> term). I just meant to have numpy.org hosted by sourceforge - but I'm not even sure they do that PFD> Both pages have pointers to each other. So technically it is a wash. PFD> People looking for docs need to be on one, for releases on the other. PFD> PFD> My guess is that the link on the project page back to the home page is a PFD> little more confusing for a newbie than the clearly spelled-out link on the PFD> home page to the project page. When I first used SF I thought "Home" would PFD> be the project page. So my guess is that numpy.sourceforge.net would be PFD> best. The operative word is "guess". that's what I was thinking too - just thought I'd ask for other opinions (I do think it's a little odd that the title of that page is still "LLNL Python Extensions" rather than "Numerical Python") ok, so the redirection is set up - will probably take a while to propogate, of course -JT From turner at blueskystudios.com Tue Jan 23 12:28:39 2001 From: turner at blueskystudios.com (John A. Turner) Date: Tue, 23 Jan 2001 12:28:39 -0500 Subject: [Numpy-discussion] numpy.org redirection In-Reply-To: References: <14957.45684.132068.755774@denmark.blueskystudios.com> Message-ID: <14957.48967.738242.832224@denmark.blueskystudios.com> >>>>> "PFD" == Paul F Dubois : PFD> This is very generous of you. by the way, it isn't such a big deal any more since the registrars have opened up - I use http://www.directnic.com, which is only $15/year and has nice easy admin of the domains - far superior to networksolutions so I've done this for several open source projects (gnuplot, octave, and latex2html) - usually when I use but haven't been able to contribute as much in the way of development as I would have liked -- John A. Turner, Ph.D. Senior Research Associate Blue Sky Studios, 44 S. Broadway, White Plains, NY 10601 http://www.blueskystudios.com/ (914) 259-6319 From paul at pfdubois.com Wed Jan 24 10:46:07 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Wed, 24 Jan 2001 07:46:07 -0800 Subject: [Numpy-discussion] [ANN] Numeric-17.3.0.tar.gz available Message-ID: Numeric Python 17.3.0 is available in source form at http://sourceforge.net/projects/numpy This is the last planned release in the 17 family. Binary versions will appear later -- developers, please help. Release 18 will require Python 2.1. Several of the new features in Python 2.1 are specifically meant to enable improvements in Numeric and we are going to try to take advantage of them. Paul F. Dubois Program for Climate Model Diagnosis and Intercomparison Lawrence Livermore National Laboratory From pete at shinners.org Wed Jan 24 12:18:04 2001 From: pete at shinners.org (Pete Shinners) Date: Wed, 24 Jan 2001 09:18:04 -0800 Subject: [Numpy-discussion] [ANN] Numeric-17.3.0.tar.gz available References: Message-ID: <000701c08629$9ea26d80$0200a8c0@petebox> > This is the last planned release in the 17 family. Binary versions will > appear later -- developers, please help. hello paul. i didn't see a changelog or anything in the new release? regardless, i've finally put together my precompiled ZIP for win32 and python2.0. it is available here; http://pygame.seul.org/ftp/contrib/Numeric-17.3-win32-2.0.zip once this file is available from the sourceforge site i will remove it from my little server, so please don't go linking this URL all over the world :] also note that i finally broke from the win32 binary naming convention that had been in use since the 16.0 release. i don't know if that was actually some sort of convention, or just an old tradition that got accidentally started :] either way, feel free to rename this file however you see fit. one last thing. in the "setup.py" for Numeric, one of the extensions included "Libraries=['m']", but this is not needed on windows. (in fact, there is no math library, so the compile was failing). i took this library out for my build, and everything seems happy. finally. unlike my previous release, i also have finally included some quick information in the readme, i'll just snip out the new information from that file... """ This build of Numeric 17.3.0 was built by Pete Shinners (pete at shinners.org) on January 24, 2001 To install, extract the ZIP file into your Python folder. (for example, on my system this is C:\python20\) Unfortunately, I do not make use of the scientific packages like FFT, MA, and RNG, so I cannot test that they work, but I do know the base ArrayObject works well, and these extra packages have worked fine on my previous releases. I am also the maintainer of the pygame library. Pygame is a great library for writing games in python, and includes a numeric module for directly accessing pixel values with numeric arrays. http://pygame.seul.org Following is the original README included with the Numeric source ----------------------------------------------------------------- """ + open('numeric-17.3/readme').read() From mhagger at alum.mit.edu Sat Jan 27 12:10:43 2001 From: mhagger at alum.mit.edu (Michael Haggerty) Date: Sat, 27 Jan 2001 12:10:43 -0500 (EST) Subject: [Numpy-discussion] Gnuplot.py version 1.5 released Message-ID: <14963.275.224865.406047@freak.kaiserty.com> This is to announce the release of version 1.5 of Gnuplot.py. Gnuplot.py is a Python [1] package that allows you to create graphs from within Python using the gnuplot [2] plotting program. Gnuplot.py can be obtained from http://gnuplot-py.sourceforge.net/ Prerequisites (see footnotes): the Python interpreter [1] the Python Numeric module [3] the gnuplot program [2] Some ways this package can be used: 1. Interactive data processing: Use Python's excellent Numeric package to create and manipulate arrays of numbers, and use Gnuplot.py to visualize the results. 2. Web graphics: write CGI scripts in Python that use gnuplot to output plots in GIF format and return them to the client. 3. Glue for numerical applications (this is my favorite): wrap your C++/C/Fortran subroutines so that they are callable from Python, then you can perform numerical computations interactively from scripts or from the command line and use Gnuplot.py to plot the output on the fly. 4. Compute a series of datasets in Python and plot them one after the other using Gnuplot.py to produce a crude animation. New features in this version: + Added distutils support. + Broke up the module a bit for better maintainability. The most commonly-used facilities are still available through "import Gnuplot", but some specialized things have been moved to separate modules, in particular funcutils.py and PlotItems.py. + funcutils.tabulate_function() can be used to evaluate a function on a 1-D or 2-D grid of points (this replaces grid_function, which only worked with 2-D grids). + Added two helper functions, funcutils.compute_Data and funcutils.compute_GridData, which compute a function's values on a set of points and package the results into a PlotItem. + GridFunc is no longer an independent class; it is now a factory function that returns a GridData. GridFunc is deprecated in favor of funcutils.compute_GridData. + Changed set_option to work from a table, so that it doesn't need to be overloaded so often. + Implemented test_persist for each platform to make it easier for users to determine whether the `-persist' option is supported. + Added a prefer_persist option to serve as the default `persist' choice. + Following a suggestion by Jannie Hofmeyr, use "from os import popen" for Python 2.0 under Windows. I don't use Windows, so let me know how this works. + Added support for the `axes' and `smooth' options of the `plot' command. + Reworked the comment strings in an effort to make them work nicely with happydoc. Features already present in older versions: + Two and three-dimensional plots. + Plot data from memory, from a file, or from an expression. + Support for multiple simultaneous gnuplot sessions. + Can pass arbitrary commands to the gnuplot program. + Object oriented, extensible design with several built-in types of plot items. + Portable and easy to install (nothing to compile except on Windows). + Support for MS Windows, using the `pgnuplot.exe' program. + Support for sending data to gnuplot as `inline' or `binary' data. These are optimizations that also remove the need for temporary files. Temporary files are still the default. Footnotes: ---------- [1] Python is an excellent object-oriented scripting/rapid development language that is also especially good at gluing programs together. [2] gnuplot is a free, popular, very portable plotting program with a command-line interface. It can make 2-d and 3-d plots and can output to myriad printers and graphics terminals. [3] The Numeric Python extension is a Python module that adds fast and convenient array manipulations to the Python language. -- Michael Haggerty mhagger at alum.mit.edu From jsaenz at wm.lc.ehu.es Mon Jan 29 09:46:44 2001 From: jsaenz at wm.lc.ehu.es (Jon Saenz) Date: Mon, 29 Jan 2001 15:46:44 +0100 (MET) Subject: [Numpy-discussion] Is this a wheel? Message-ID: Hello, there. I needed last Saturday a function which returns the greatest/smallest element of a NumPy array. I seeked through the documentation and found the argmax/argmin functions. However, they must be called recursively to find the greatest(smallest) element of a multidimendional array. As I needed to process a BIG dataset of multidimensional arrays, I wrote a function in C which returns as a NumPy array shaped (2,) the [smallest one,biggest one] elements in an arbitrarily shaped NumPy array. It is pure C and works for multidimensional arrays. The return typecode is the same of the input array (except with complex numbers, which compare numbers through their modules). I can make this function available to general public by means of my WEB page or my starship account as a module. However, I wonder: a) Is this a wheel already invented some 40,000 years ago? May be I missed something in the manual? b) If the answer to the previous question is NO, would you (main developers) be interested in making it available as one of the "general purpose" NumPy functions? It is quite general-purpose, indeed. I have needed it five times or so in the last two years... Looking after your comments. Jon Saenz. | Tfno: +34 946012470 Depto. Fisica Aplicada II | Fax: +34 944648500 Facultad de Ciencias. \\ Universidad del Pais Vasco \\ Apdo. 644 \\ 48080 - Bilbao \\ SPAIN From paul at pfdubois.com Mon Jan 29 10:20:48 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Mon, 29 Jan 2001 07:20:48 -0800 Subject: [Numpy-discussion] Is this a wheel? In-Reply-To: Message-ID: >>> import Numeric >>> x=Numeric.arange(24) >>> x.shape=(3,2,4) >>> print Numeric.maximum.reduce(x.flat) 23 The .flat operator does not copy the data. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Jon Saenz Sent: Monday, January 29, 2001 6:47 AM To: Numpy-Discussion at Lists. Sourceforge. Net Subject: [Numpy-discussion] Is this a wheel? Hello, there. I needed last Saturday a function which returns the greatest/smallest element of a NumPy array. I seeked through the documentation and found the argmax/argmin functions. However, they must be called recursively to find the greatest(smallest) element of a multidimendional array. As I needed to process a BIG dataset of multidimensional arrays, I wrote a function in C which returns as a NumPy array shaped (2,) the [smallest one,biggest one] elements in an arbitrarily shaped NumPy array. It is pure C and works for multidimensional arrays. The return typecode is the same of the input array (except with complex numbers, which compare numbers through their modules). I can make this function available to general public by means of my WEB page or my starship account as a module. However, I wonder: a) Is this a wheel already invented some 40,000 years ago? May be I missed something in the manual? b) If the answer to the previous question is NO, would you (main developers) be interested in making it available as one of the "general purpose" NumPy functions? It is quite general-purpose, indeed. I have needed it five times or so in the last two years... Looking after your comments. Jon Saenz. | Tfno: +34 946012470 Depto. Fisica Aplicada II | Fax: +34 944648500 Facultad de Ciencias. \\ Universidad del Pais Vasco \\ Apdo. 644 \\ 48080 - Bilbao \\ SPAIN _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/lists/listinfo/numpy-discussion From dpgrote at lbl.gov Mon Jan 29 14:29:24 2001 From: dpgrote at lbl.gov (David P Grote) Date: Mon, 29 Jan 2001 11:29:24 -0800 Subject: [Numpy-discussion] Is this a wheel? References: Message-ID: <3A75C494.70200@lbl.gov> An HTML attachment was scrubbed... URL: From pete at shinners.org Tue Jan 30 03:25:55 2001 From: pete at shinners.org (Pete Shinners) Date: Tue, 30 Jan 2001 00:25:55 -0800 Subject: [Numpy-discussion] [ANN] Numeric-17.3.0.tar.gz available References: <000701c08629$9ea26d80$0200a8c0@petebox> Message-ID: <002901c08a96$458bef20$0200a8c0@petebox> just curious about the status on this. i've got this file still available, but i notice it hasn't shown up on the Numpy downloads on sourceforge. Last week, I wrote: > > This is the last planned release in the 17 family. Binary versions will > > appear later -- developers, please help. > > hello paul. i didn't see a changelog or anything in the new release? > > regardless, i've finally put together my precompiled ZIP for win32 > and python2.0. it is available here; > http://pygame.seul.org/ftp/contrib/Numeric-17.3-win32-2.0.zip > > once this file is available from the sourceforge site i will > remove it from my little server, so please don't go linking > this URL all over the world :] > > also note that i finally broke from the win32 binary naming > convention that had been in use since the 16.0 release. i > don't know if that was actually some sort of convention, or > just an old tradition that got accidentally started :] > either way, feel free to rename this file however you see fit. > > one last thing. in the "setup.py" for Numeric, one of the > extensions included "Libraries=['m']", but this is not needed > on windows. (in fact, there is no math library, so the > compile was failing). i took this library out for my build, > and everything seems happy. > > finally. unlike my previous release, i also have finally included > some quick information in the readme, i'll just snip out the new > information from that file... From gshuetrim at kpmg.com.au Tue Jan 2 01:56:21 2001 From: gshuetrim at kpmg.com.au (Shuetrim, Geoff) Date: Tue, 2 Jan 2001 17:56:21 +1100 Subject: [Numpy-discussion] Question: LAPACK-LITE consequences for Numpy speed (win32 platfor m) Message-ID: Apologies for asking an overly vague question like this but: on an Intel/win32 platform (I only have a windows version of Gauss), I am comparing Numpy matrix inversion with that of Gauss (very much the same type of software as Matlab which at least some of you used to work with). As the size of the matrix to be inverted increases, the speed of Numpy appears to asymptote (on my machine) to about half that of Gauss. For small matrices, it is much worse than that because of the various overheads that are dealt with by Numpy. Would this speed differential be largely eliminated if I was not using LAPACK-LITE? If so I will try to figure my way through hooking into Intel's MKL - anyone got hints on doing this - I saw mention of it in the mailing list archives. Would I be better off, speed-wise, eschewing win32 altogether and using native LAPACK and BLAS libraries on my Linux box? This is relevant to me in the context of a multivariate Kalman filtering module that I am working up to replace one I have been using on the Gauss platform for years. The Numpy version of my filter has a very similar logic structure to that of Gauss but is drastically slower. I have only been working with Numpy for a month or so which may mean that my code is relatively innefficient. I have been assuming that Numpy - as an interpreted language is mainly slowed by looping structures. Thanks in advance, Geoff Shuetrim ____________________________________________________________________________ ________ A simple version of the filter is given below: (Note that I have modified Matrix.py in my installation to include a transpose method for the Matrix class, T()). # *************************************************************** # kalman.py module by Geoff Shuetrim # # Please note - this code is thoroughly untested at this stage # # You may copy and use this module as you see fit with no # guarantee implied provided you keep this notice in all copies. # *************************************************************** # Minimization routines """kalman.py Routines to implement Kalman filtering and smoothing for multivariate linear state space representations of time-series models. Notation follows that contained in Harvey, A.C. (1989) "Forecasting Structural Time Series Models and the Kalman Filter". Filter --- Filter - condition on data to date """ # Import the necessary modules to use NumPy import math from Numeric import * from LinearAlgebra import * from RandomArray import * import MLab from Matrix import * # Initialise module constants Num = Numeric max = MLab.max min = MLab.min abs = Num.absolute __version__="0.0.1" # filtration constants _obs = 100 _k = 1 _n = 1 _s = 1 # Filtration global arrays _y = Matrix(cumsum(standard_normal((_obs,1,_n)))) _v = Matrix(zeros((_obs,1,_n),Float64)) _Z = Matrix(ones((_obs,_k,_n),Float64)) + 1.0 _d = Matrix(zeros((_obs,1,_n),Float64)) _H = Matrix(zeros((_obs,_n,_n),Float64)) + 1.0 _T = Matrix(zeros((_obs,_k,_k),Float64)) + 1.0 _c = Matrix(zeros((_obs,1,_k),Float64)) _R = Matrix(zeros((_obs,_k,_s),Float64)) + 1.0 _Q = Matrix(zeros((_obs,_s,_s),Float64)) + 1.0 _a = Matrix(zeros((_obs,1,_k),Float64)) _a0 = Matrix(zeros((_k,1),Float64)) _ap = _a _as = _a _P = Matrix(zeros((_obs,_k,_k),Float64)) _P0 = Matrix(zeros((_k,_k),Float64)) _Pp = _P _Ps = _P _LL = Matrix(zeros((_obs,1,1),Float64)) def Filter(): # Kalman filtering routine _ap[0] = _T[0] * _a0 + _c[0] _Pp[0] = _T[0] * _P0 * _T[0].T() + _R[0] * _Q[0] * _R[0].T() for t in range(1,_obs-1): _ap[t] = _T[t] * _a[t-1] + _c[t] _Pp[t] = _T[t] * _P0 * _T[t].T() + _R[t] * _Q[t] * _R[t].T() Ft = _Z[t] * _Pp[t] * _Z[t].T() + _H[t] Ft_inverse = inverse(Ft) _v[t] = _y[t] - _Z[t] * _ap[t] - _d[t] _a[t] = _ap[t] + _Pp[t] * _Z[t].T() * Ft_inverse * _v[t].T() _P[t] = _Pp[t] - _Pp[t].T() * _Z[t].T() * Ft_inverse * _Z[t] * _Pp[t] _LL[t] = -0.5 * (log(2*pi) + log(determinant(Ft)) + _v[t] * Ft_inverse * _v[t].T()) Filter() ____________________________________________________________________________ ________ ********************************************************************** " This email is intended only for the use of the individual or entity named above and may contain information that is confidential and privileged. If you are not the intended recipient, you are hereby notified that any dissemination, distribution or copying of this Email is strictly prohibited. When addressed to our clients, any opinions or advice contained in this Email are subject to the terms and conditions expressed in the governing KPMG client engagement letter. If you have received this Email in error, please notify us immediately by return email or telephone +61 2 93357000 and destroy the original message. Thank You. " **********************************************************************... From marimont at nxpdata.com Tue Jan 2 10:06:06 2001 From: marimont at nxpdata.com (David H. Marimont) Date: Tue, 02 Jan 2001 07:06:06 -0800 Subject: [Numpy-discussion] Da Blas References: <39C2409E.51659120@cfa.harvard.edu> Message-ID: <3A51EE5E.4367EF78@nxpdata.com> I have a problem that seems related to the one below. I'm trying to build and install Numeric 17.2.0 using the lapack and blas libraries under Python 2.0 on Red Hat 6.2 with the The build and install go fine, but when I run python and import LinearAlgebra, I get the following message: Python 2.0 (#1, Nov 13 2000, 14:15:52) [GCC egcs-2.91.66 19990314/Linux (egcs-1.1.2 release)] on linux2 Type "copyright", "credits" or "license" for more information. >>> from Numeric import * >>> from LinearAlgebra import * Traceback (most recent call last): File "", line 1, in ? File "/usr/local/lib/python2.0/site-packages/Numeric/LinearAlgebra.py", line 8, in ? import lapack_lite ImportError: /usr/local/lib/python2.0/site-packages/Numeric/lapack_lite.so: undefined symbol: dgesvd_ >>> I did try relinking with g77, as Scott's earlier message suggested: [root at harmony LALITE]# g77 -shared build/temp.linux-i686-2.0/Src/lapack_litemodule.o -L/usr/lib/lapack -o build/lib.linux-i686-2.0/lapack_lite.so ld: cannot open crtbeginS.o: No such file or directory [root at harmony LALITE]# But I don't know much about g77 (or gcc, for that matter), so I didn't know how to diagnose the error. I'm also not sure I'd know what to do next if the link step had worked! I'd sure appreciate some help with this... Thanks! David "Scott M. Ransom" wrote: > > Frank Horowitz wrote: > > > > However, when I > > coerced the distutils system to get around that bug (by specifying > > "/usr/lib " with a trailing blank for the BLASLIBDIR and LAPACKLIBDIR > > variables in setup.py) the same problem (i.e. an "ImportError: > > /usr/lib/liblapack.so.3: undefined symbol: e_wsfe" in importing > > lapack_lite) ultimately manifested itself. > > This problem is easily fixed (at least on linux) by performing the link > of lapack_lite.so with g77 instead of gcc (this is required because the > lapack and/or blas libraries are based on fortran object files...). > > For instance the out-of-box link command on my machine (Debian 2.2) is: > > gcc -shared build/temp.linux2/Src/lapack_litemodule.o -L/usr/local/lib > -L/usr/lib -llapack -lblas -o build/lib.linux2/lapack_lite.so > > Simply change the 'gcc' to 'g77' and everything works nicely. > > Not sure if this is specific to Linux or not... > > Scott > > -- > Scott M. Ransom Address: Harvard-Smithsonian CfA > Phone: (617) 495-4142 60 Garden St. MS 10 > email: ransom at cfa.harvard.edu Cambridge, MA 02138 > GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/mailman/listinfo/numpy-discussion From jhauser at ifm.uni-kiel.de Tue Jan 2 10:17:48 2001 From: jhauser at ifm.uni-kiel.de (Janko Hauser) Date: Tue, 2 Jan 2001 16:17:48 +0100 (CET) Subject: [Numpy-discussion] Question: LAPACK-LITE consequences for Numpy speed (win32 platfor m) In-Reply-To: References: Message-ID: <20010102151748.26222.qmail@lisboa.ifm.uni-kiel.de> First, have you tested the execution time of inverse() alone in NumPy and Gauss, or only with the appended function. There are some ways to speed up the function without using a specialized library. The function also has a possible error. (see comments in the function). In general the penality for array creation (memory allocation) is rather big in Numpy, especially if you use long expressions, as the result of every term is stored in a newly created array. To speed up these kinds of operations use the inplace operations with some workspace arrays, like: >>> # Translating _ap[0] = _T[0] * _a0 + _c[0] >>> _ap[0]=add(multiply(_T[0],_a0,_ap[0]),_c[0],_ap[0]) Then it should be possible to code this without a loop at all. This would also speed up the function. HTH, __Janko Shuetrim, Geoff writes: > ________ > > A simple version of the filter is given below: > (Note that I have modified Matrix.py in my installation to include a > transpose method for the Matrix class, T()). > > # *************************************************************** > # kalman.py module by Geoff Shuetrim > # > # Please note - this code is thoroughly untested at this stage > # > # You may copy and use this module as you see fit with no > # guarantee implied provided you keep this notice in all copies. > # *************************************************************** > > # Minimization routines > """kalman.py > > Routines to implement Kalman filtering and smoothing > for multivariate linear state space representations > of time-series models. > > Notation follows that contained in Harvey, A.C. (1989) > "Forecasting Structural Time Series Models and the Kalman Filter". > > Filter --- Filter - condition on data to date > > """ > > # Import the necessary modules to use NumPy > import math > from Numeric import * > from LinearAlgebra import * > from RandomArray import * > import MLab > from Matrix import * > > # Initialise module constants > Num = Numeric > max = MLab.max > min = MLab.min > abs = Num.absolute > __version__="0.0.1" > > # filtration constants > _obs = 100 > _k = 1 > _n = 1 > _s = 1 > > # Filtration global arrays > _y = Matrix(cumsum(standard_normal((_obs,1,_n)))) > _v = Matrix(zeros((_obs,1,_n),Float64)) > _Z = Matrix(ones((_obs,_k,_n),Float64)) + 1.0 > _d = Matrix(zeros((_obs,1,_n),Float64)) > _H = Matrix(zeros((_obs,_n,_n),Float64)) + 1.0 > _T = Matrix(zeros((_obs,_k,_k),Float64)) + 1.0 > _c = Matrix(zeros((_obs,1,_k),Float64)) > _R = Matrix(zeros((_obs,_k,_s),Float64)) + 1.0 > _Q = Matrix(zeros((_obs,_s,_s),Float64)) + 1.0 > _a = Matrix(zeros((_obs,1,_k),Float64)) > _a0 = Matrix(zeros((_k,1),Float64)) > _ap = _a !!! Are you sure? This does not copy, but only makes a new reference > _as = _a !!! Same here > _P = Matrix(zeros((_obs,_k,_k),Float64)) > _P0 = Matrix(zeros((_k,_k),Float64)) > _Pp = _P > _Ps = _P > _LL = Matrix(zeros((_obs,1,1),Float64)) > > def Filter(): # Kalman filtering routine > > _ap[0] = _T[0] * _a0 + _c[0] > _Pp[0] = _T[0] * _P0 * _T[0].T() + _R[0] * _Q[0] * _R[0].T() > > for t in range(1,_obs-1): > > _ap[t] = _T[t] * _a[t-1] + _c[t] !!! You are changing _a and _as and _ap at the same time > _Pp[t] = _T[t] * _P0 * _T[t].T() + _R[t] * _Q[t] * _R[t].T() > > Ft = _Z[t] * _Pp[t] * _Z[t].T() + _H[t] > Ft_inverse = inverse(Ft) > _v[t] = _y[t] - _Z[t] * _ap[t] - _d[t] > > _a[t] = _ap[t] + _Pp[t] * _Z[t].T() * Ft_inverse * _v[t].T() > _P[t] = _Pp[t] - _Pp[t].T() * _Z[t].T() * Ft_inverse * _Z[t] * > _Pp[t] > _LL[t] = -0.5 * (log(2*pi) + log(determinant(Ft)) + _v[t] * > Ft_inverse * _v[t].T()) > > Filter() > ____________________________________________________________________________ > ________ > > > > ********************************************************************** > " This email is intended only for the use of the individual or entity > named above and may contain information that is confidential and > privileged. If you are not the intended recipient, you are hereby > notified that any dissemination, distribution or copying of this > Email is strictly prohibited. When addressed to our clients, any > opinions or advice contained in this Email are subject to the > terms and conditions expressed in the governing KPMG client > engagement letter. If you have received this Email in error, please > notify us immediately by return email or telephone +61 2 93357000 > and destroy the original message. Thank You. " > **********************************************************************... > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/mailman/listinfo/numpy-discussion From anthony.seward at ieee.org Tue Jan 2 11:45:55 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Tue, 2 Jan 2001 09:45:55 -0700 (MST) Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: Message-ID: On Thu, 28 Dec 2000, Paul F. Dubois wrote: > A millenium-end report from the Head Nummie (this name is a joke; see the > DEVELOPERS file): > > There have been a steady set of messages on the subject of I should do this > or that to make it easier to make RPMs. It is impossible for me to act on > these: I don't know much about RPMs, and if I did, I don't know if making > the change suggested is good or bad for someone doing something else, like > making Windows installers. Therefore my policy is to rely on the Distutils > people to work this out. Those who wish to make it easier to make a binary > installer for platform xyz should figure out what would be required by the > Distutils bdist family of commands. > > That is not to say that I don't appreciate people trying to help. I'm > grateful for all the support I get from the community. I think that relying > on division of labor in this case is the right thing to do, so that we take > advantage of the Distutils effort. If I'm wrong, I'll listen. > The problem that I pointed out is not a problem with building a binary package. Invoking './setup_all.py build' on a clean machine does not work. The numpy core is built, but the packages are not. The reason is that all of the packages are looking for 'Numeric/arrayobject.h' which does not exist until the numpy core has been installed at least once. Even then, the packages will use an old version of arrayobject.h. I see two solutions: 1) Have the setup script make a symbolic link in the package's include directory to the include directory of the numpy core. Call the symbolic link 'Numeric.' 2) Move the include files for the core to a subdirectory called 'Numeric.' I would prefer the first solution, but I'm not aware of a way for the non-unix versions of python to create a link to a directory. Tony From beausol at hpl.hp.com Tue Jan 2 11:57:35 2001 From: beausol at hpl.hp.com (Raymond Beausoleil) Date: Tue, 02 Jan 2001 08:57:35 -0800 Subject: [Numpy-discussion] Question: LAPACK-LITE consequences for Numpy speed (win32 platfor m) In-Reply-To: Message-ID: <5.0.2.1.2.20010102083525.00aac000@hplex1.hpl.hp.com> I use Windows NT/2000 (at gunpoint), and I've gotten much better performance than NumPy's Lite code by using the Intel MKL. I keep meaning to post these results to an external HP Labs web page, but we're currently engaged in an internal snit about what those pages are allowed to look like. Essentially, with NumPy 17 and Python 1.5, I've been able to: (1) Replace NumPy BLAS with Intel's MKL BLAS, after a number of routine renaming edits in the NumPy code. (Apparently "cblas" is an irresistible prefix.) This turns out to be pretty easy. (2) Replace portions of LAPACK Lite with the corresponding routines from Intel's MKL. I've been hacking away at this on a piecemeal basis, because the MKL uses Fortran calling conventions and array-ordering. In most of the MKL routines, the usual LAPACK flags for transposition are available to compensate. (3) Add new LAPACK functionality using LAPACK routines from the MKL that are not included in LAPACK Lite. I've been meaning to pick this project up again, since I want to use the Intel FFT routines in some new code that I'm writing. The chief benefit of the MKL libraries is the ability to easily use multiple processors and threads simply by setting a couple of environment variables. Until recently (when MATLAB finally started using LAPACK), I was able to gain significantly better performance than MATLAB using the Intel BLAS/LAPACK, even without multiple processors. Now the Intel libraries are only about 10-25% faster, if my watch can be believed. I have _no idea_ how the Intel MKL performs relative to the native Linux BLAS/LAPACK distributions, but I know that (for example) the MKL is about 50% faster than the linear algebra routines shipped by NAG. Although I haven't figured out the standard NumPy distribution process yet (I still use that pesky Visual Studio IDE), I'll elevate the development of a general-purpose integration of the Intel MKL into NumPy to an Official New Year's Resolution. I made my last ONYR about ten years ago, and was able to give up channel-surfing using the TV remote control. So, I have demonstrated an ability to deliver in this department; the realization that we've just started a new millennium only adds to the pressure. Presumably, when I get this finished (I'll target the end of the month), I'll be able to find a place to post it. ============================ Ray Beausoleil Hewlett-Packard Laboratories mailto:beausol at hpl.hp.com 425-883-6648 Office 425-957-4951 Telnet 425-941-2566 Mobile ============================ At 05:56 PM 1/2/2001 +1100, Shuetrim, Geoff wrote: >Apologies for asking an overly vague question like this but: >on an Intel/win32 platform (I only have a windows version of >Gauss), I am comparing Numpy matrix inversion with that >of Gauss (very much the same type of software as Matlab >which at least some of you used to work with). As the size >of the matrix to be inverted increases, the speed of Numpy >appears to asymptote (on my machine) to about half that of >Gauss. For small matrices, it is much worse than that >because of the various overheads that are dealt with by Numpy. > >Would this speed differential be largely eliminated if I was not >using LAPACK-LITE? If so I will try to figure my way through >hooking into Intel's MKL - anyone got hints on doing this - I saw >mention of it in the mailing list archives. Would I be better off, >speed-wise, eschewing win32 altogether and using native LAPACK >and BLAS libraries on my Linux box? > >This is relevant to me in the context of a multivariate Kalman filtering >module that I am working up to replace one I have been using on >the Gauss platform for years. The Numpy version of my filter has a >very similar logic structure to that of Gauss but is drastically slower. >I have only been working with Numpy for a month or so which may >mean that my code is relatively innefficient. I have been assuming >that Numpy - as an interpreted language is mainly slowed by >looping structures. > >Thanks in advance, > >Geoff Shuetrim > >____________________________________________________________________________ >________ > >A simple version of the filter is given below: >(Note that I have modified Matrix.py in my installation to include a >transpose method for the Matrix class, T()). > ># *************************************************************** ># kalman.py module by Geoff Shuetrim ># ># Please note - this code is thoroughly untested at this stage ># ># You may copy and use this module as you see fit with no ># guarantee implied provided you keep this notice in all copies. ># *************************************************************** > ># Minimization routines >"""kalman.py > >Routines to implement Kalman filtering and smoothing >for multivariate linear state space representations >of time-series models. > >Notation follows that contained in Harvey, A.C. (1989) >"Forecasting Structural Time Series Models and the Kalman Filter". > >Filter --- Filter - condition on data to date > >""" > ># Import the necessary modules to use NumPy >import math >from Numeric import * >from LinearAlgebra import * >from RandomArray import * >import MLab >from Matrix import * > ># Initialise module constants >Num = Numeric >max = MLab.max >min = MLab.min >abs = Num.absolute >__version__="0.0.1" > ># filtration constants >_obs = 100 >_k = 1 >_n = 1 >_s = 1 > ># Filtration global arrays >_y = Matrix(cumsum(standard_normal((_obs,1,_n)))) >_v = Matrix(zeros((_obs,1,_n),Float64)) >_Z = Matrix(ones((_obs,_k,_n),Float64)) + 1.0 >_d = Matrix(zeros((_obs,1,_n),Float64)) >_H = Matrix(zeros((_obs,_n,_n),Float64)) + 1.0 >_T = Matrix(zeros((_obs,_k,_k),Float64)) + 1.0 >_c = Matrix(zeros((_obs,1,_k),Float64)) >_R = Matrix(zeros((_obs,_k,_s),Float64)) + 1.0 >_Q = Matrix(zeros((_obs,_s,_s),Float64)) + 1.0 >_a = Matrix(zeros((_obs,1,_k),Float64)) >_a0 = Matrix(zeros((_k,1),Float64)) >_ap = _a >_as = _a >_P = Matrix(zeros((_obs,_k,_k),Float64)) >_P0 = Matrix(zeros((_k,_k),Float64)) >_Pp = _P >_Ps = _P >_LL = Matrix(zeros((_obs,1,1),Float64)) > >def Filter(): # Kalman filtering routine > > _ap[0] = _T[0] * _a0 + _c[0] > _Pp[0] = _T[0] * _P0 * _T[0].T() + _R[0] * _Q[0] * _R[0].T() > > for t in range(1,_obs-1): > > _ap[t] = _T[t] * _a[t-1] + _c[t] > _Pp[t] = _T[t] * _P0 * _T[t].T() + _R[t] * _Q[t] * _R[t].T() > > Ft = _Z[t] * _Pp[t] * _Z[t].T() + _H[t] > Ft_inverse = inverse(Ft) > _v[t] = _y[t] - _Z[t] * _ap[t] - _d[t] > > _a[t] = _ap[t] + _Pp[t] * _Z[t].T() * Ft_inverse * _v[t].T() > _P[t] = _Pp[t] - _Pp[t].T() * _Z[t].T() * Ft_inverse * _Z[t] * >_Pp[t] > _LL[t] = -0.5 * (log(2*pi) + log(determinant(Ft)) + _v[t] * >Ft_inverse * _v[t].T()) > >Filter() >____________________________________________________________________________ >________ > > > >********************************************************************** >" This email is intended only for the use of the individual or entity >named above and may contain information that is confidential and >privileged. If you are not the intended recipient, you are hereby >notified that any dissemination, distribution or copying of this >Email is strictly prohibited. When addressed to our clients, any >opinions or advice contained in this Email are subject to the >terms and conditions expressed in the governing KPMG client >engagement letter. If you have received this Email in error, please >notify us immediately by return email or telephone +61 2 93357000 >and destroy the original message. Thank You. " >**********************************************************************... > >_______________________________________________ >Numpy-discussion mailing list >Numpy-discussion at lists.sourceforge.net >http://lists.sourceforge.net/mailman/listinfo/numpy-discussion ============================ Ray Beausoleil Hewlett-Packard Laboratories mailto:beausol at hpl.hp.com 425-883-6648 Office 425-957-4951 Telnet 425-941-2566 Mobile ============================ From hinsen at cnrs-orleans.fr Wed Jan 3 08:14:23 2001 From: hinsen at cnrs-orleans.fr (Konrad Hinsen) Date: Wed, 3 Jan 2001 14:14:23 +0100 Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: (message from Tony Seward on Tue, 2 Jan 2001 09:45:55 -0700 (MST)) References: Message-ID: <200101031314.OAA10152@chinon.cnrs-orleans.fr> > I see two solutions: > 1) Have the setup script make a symbolic link in the package's include > directory to the include directory of the numpy core. Call the symbolic > link 'Numeric.' > > 2) Move the include files for the core to a subdirectory called 'Numeric.' There's a third one: 3) Have the "build" stage of the packages copy the core header files into their private include directories. This doesn't require links at the cost of wasting (temporarily) a tiny amount of disk space. Konrad. -- ------------------------------------------------------------------------------- Konrad Hinsen | E-Mail: hinsen at cnrs-orleans.fr Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.56.24 Rue Charles Sadron | Fax: +33-2.38.63.15.17 45071 Orleans Cedex 2 | Deutsch/Esperanto/English/ France | Nederlands/Francais ------------------------------------------------------------------------------- From anthony.seward at ieee.org Wed Jan 3 12:38:38 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Wed, 3 Jan 2001 10:38:38 -0700 (MST) Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: <200101031314.OAA10152@chinon.cnrs-orleans.fr> Message-ID: On Wed, 3 Jan 2001, Konrad Hinsen wrote: > > I see two solutions: > > 1) Have the setup script make a symbolic link in the package's include > > directory to the include directory of the numpy core. Call the symbolic > > link 'Numeric.' > > > > 2) Move the include files for the core to a subdirectory called 'Numeric.' > > There's a third one: > > 3) Have the "build" stage of the packages copy the core header files > into their private include directories. > > This doesn't require links at the cost of wasting (temporarily) a tiny > amount of disk space. > > Konrad. > I have attached a patch that impliments your solution. So far it is working for me. When I've finished with the RPM spec file I will post that to the list as well. Tony From anthony.seward at ieee.org Wed Jan 3 12:47:09 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Wed, 3 Jan 2001 10:47:09 -0700 (MST) Subject: [Numpy-discussion] Happy New Year Nummies -- and some discussion about the future In-Reply-To: Message-ID: On Wed, 3 Jan 2001, Tony Seward wrote: > On Wed, 3 Jan 2001, Konrad Hinsen wrote: > > > I have attached a patch that impliments your solution. So far it is working > for me. When I've finished with the RPM spec file I will post that to the > list as well. > > Tony > oops. The patch is attached to this message. No really. -------------- next part -------------- diff -ru orig-Numeric-17.1.2/Packages/FFT/setup.py Numeric-17.1.2/Packages/FFT/setup.py --- orig-Numeric-17.1.2/Packages/FFT/setup.py Fri Sep 29 09:02:20 2000 +++ Numeric-17.1.2/Packages/FFT/setup.py Wed Jan 3 09:55:44 2001 @@ -8,6 +8,24 @@ except: raise SystemExit, "Distutils problem, see Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + setup (name = "FFTPACK", version = "1.0", maintainer = "Numerical Python Developers", diff -ru orig-Numeric-17.1.2/Packages/LALITE/setup.py Numeric-17.1.2/Packages/LALITE/setup.py --- orig-Numeric-17.1.2/Packages/LALITE/setup.py Fri Sep 29 09:02:34 2000 +++ Numeric-17.1.2/Packages/LALITE/setup.py Wed Jan 3 09:57:17 2001 @@ -13,6 +13,24 @@ except: raise SystemExit, "Distutils problem. See Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + headers = glob (os.path.join ("Include","*.h")) # delete all but the first one in this list if using your own LAPACK/BLAS diff -ru orig-Numeric-17.1.2/Packages/RANLIB/setup.py Numeric-17.1.2/Packages/RANLIB/setup.py --- orig-Numeric-17.1.2/Packages/RANLIB/setup.py Fri Sep 29 09:02:55 2000 +++ Numeric-17.1.2/Packages/RANLIB/setup.py Wed Jan 3 09:57:37 2001 @@ -13,6 +13,24 @@ except: raise SystemExit, "Distutils problem, see Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + headers = glob (os.path.join ("Include","*.h")) setup (name = "Numeric", packages = [''], diff -ru orig-Numeric-17.1.2/Packages/RNG/setup.py Numeric-17.1.2/Packages/RNG/setup.py --- orig-Numeric-17.1.2/Packages/RNG/setup.py Fri Sep 29 09:03:05 2000 +++ Numeric-17.1.2/Packages/RNG/setup.py Wed Jan 3 09:58:02 2001 @@ -16,6 +16,24 @@ except: raise SystemExit, "Distutils problem, see Numeric README." +# __ See if we are being built 'in place' +if os.path.exists(os.path.join('..','..','Include','arrayobject.h')): + # __ We are building in place + if not os.path.exists(os.path.join('Include','Numeric')): + os.mkdir(os.path.join('Include','Numeric')); + + # __ Copy the core numeric header to the package's include directory + import shutil + headers = glob(os.path.join('..','..','Include','*.h')); + for h in headers: + # __ Use 'copy2' because it preserves modification times + shutil.copy2(h, os.path.join('Include','Numeric')); + +else: + # __ We are not building in place so assume that the headers + # from the numeric core are installed in their proper place + pass + setup (name = "RNG", version = "3.0", maintainer = "Paul Dubois", From Oliphant.Travis at mayo.edu Wed Jan 3 17:30:29 2001 From: Oliphant.Travis at mayo.edu (Travis Oliphant) Date: Wed, 3 Jan 2001 16:30:29 -0600 (CST) Subject: [Numpy-discussion] Re: Numpy-discussion digest, Vol 1 #152 - 2 msgs In-Reply-To: Message-ID: > > A millenium-end report from the Head Nummie (this name is a joke; see the > DEVELOPERS file): > Our nummie-ears are listening.... > There have been a steady set of messages on the subject of I should do this > or that to make it easier to make RPMs. It is impossible for me to act on > these: I don't know much about RPMs, and if I did, I don't know if making > the change suggested is good or bad for someone doing something else, like > making Windows installers. > Therefore my policy is to rely on the Distutils > people to work this out. Those who wish to make it easier to make a binary > installer for platform xyz should figure out what would be required by the > Distutils bdist family of commands. Good idea to go with the distutils for doing this. I've delayed RPM's for this reason until I figure out how to interact with the distutils better (I haven't spent much time doing it yet). > > That is not to say that I don't appreciate people trying to help. I'm > grateful for all the support I get from the community. I think that relying > on division of labor in this case is the right thing to do, so that we take > advantage of the Distutils effort. If I'm wrong, I'll listen. > > There are a number of bug reports on the sourceforge site. I would be > grateful for patches. In particular there are two reports dealing with FFTs. > I lack the expertise to deal with these. I'll look into these unless somebody has them done. > > The masked array package MA has been getting more of a workout as I put it > into production at my site. I believe that it fills not only the immediate > need for dealing with missing values, but can serve as a model for how to > make a "Numeric-workalike" with extra properties, since we can't inherit > from Numeric's array. Since MA is improved fairly often, I recommend keeping > up via cvs if you are a user. > > I have new manuals but have had difficulty with the transport up to > SourceForge. Anybody else having such a problem? I used scp from a Linux box > and it sat there and didn't do anything. > > The rest of this is for developers. > > Actually, once you get into it, it isn't all that clear that inheritance > would help very much. For example, suppose you want an array class F that > has masked values but also has a special behavior f() controlled by a > parameter set at creation, beta. Suppose therefore you decide to inherit > from class MA. Thus the constructor of your new class must take the same > arguments as an MA array but add a beta=somedefault. OK, we do that. Now we > can construct a couple of F's: > f1 = F([1.,2.,3.], beta=1.) > f2 = F([4.,2.,1.], beta=2.) > Great. Now we can do f1.f(), f2.f(). Maybe we redefine __str__ so we can > print beta when we print f1. > > Now try to do something. Anything. Say, > f3 = f1 + f2 > > Oops. f3 is an MA, not an F. We might have written __add__ in MA so the it > used self.__class__ to construct the answer. But since the constructor now > needs a new parameter, there is no way MA.__add__ can make an F. It doesn't > know how. Doesn't know how to call F(), doesn't know what value to use for > beta anyway. > > So now we redefine all the methods in MA. Besides muttering that maybe > inheriting didn't buy me a lot, I am still nowhere, for the next thing I > realize is that every function f(a) that takes an MA as an argument and > returns an MA, still returns an MA. If any of these make sense for an > instance of F, I have to replace them, just as MA replaced sin, cos, sqrt, > take, etc. from Numeric. > > I have therefore come to the conclusion that we have been barking up the > wrong tree. There might be a few cases where inheritance would buy you > something, but essentially Numeric and MA are useless as parents. Instead, > what would be good to have is a python class Numeric built upon a suite of C > routines that did all the real work, such as adding, transposing, iterating > over elements, applying a function to each element, etc. I think this is what I've been trying to do in the "rewrite." Paul Barrett has made some excellent progress here. > Since it is now > possible to build a Python class with C methods, which it was not when > Numeric was created, we ought to think about it. What does this mean? What feature gives this ability? I'm not sure I see when this changed? > Such an API could be used > to make other classes with good performance. We could lose the artificial > layer that is in there now that makes it so tedious to add a function. (I > counted something like five or six places I had to modify when I added > "put".) I'd love to talk more with you about this. I'm now at my new place for anyone wishing to contact me. Travis Oliphant 437 CB Brigham Young University Provo, UT 84602 oliphant at ee.byu.edu (801) 378-3108 Thanks for you great efforts Paul. From Barrett at stsci.edu Wed Jan 3 18:04:34 2001 From: Barrett at stsci.edu (Paul Barrett) Date: Wed, 3 Jan 2001 18:04:34 -0500 (EST) Subject: [Numpy-discussion] Re: Numpy-discussion digest, Vol 1 #152 - 2 msgs In-Reply-To: References: Message-ID: <14931.44574.756591.67934@nem-srvr.stsci.edu> Travis Oliphant writes: [snip snip] > > > > I have therefore come to the conclusion that we have been barking up the > > wrong tree. There might be a few cases where inheritance would buy you > > something, but essentially Numeric and MA are useless as parents. Instead, > > what would be good to have is a python class Numeric built upon a suite of C > > routines that did all the real work, such as adding, transposing, iterating > > over elements, applying a function to each element, etc. > > I think this is what I've been trying to do in the "rewrite." Paul > Barrett has made some excellent progress here. I am currently writing the PEP 209: Multidimensional Arrays documentation and hope to submit the initial draft by the end of the week for comments. The proposed design is along the lines Paul Dubois has suggested. > > Since it is now > > possible to build a Python class with C methods, which it was not when > > Numeric was created, we ought to think about it. > > What does this mean? What feature gives this ability? I'm not sure I see > when this changed? I'd also like to know what Paul Dubois means by this. > > Such an API could be used > > to make other classes with good performance. We could lose the artificial > > layer that is in there now that makes it so tedious to add a function. (I > > counted something like five or six places I had to modify when I added > > "put".) > > I'd love to talk more with you about this. Ditto! -- Dr. Paul Barrett Space Telescope Science Institute Phone: 410-338-4475 ESS/Science Software Group FAX: 410-338-4767 Baltimore, MD 21218 From pplumlee at omnigon.com Thu Jan 4 13:43:42 2001 From: pplumlee at omnigon.com (Phlip) Date: Thu, 4 Jan 2001 10:43:42 -0800 Subject: [Numpy-discussion] STL wrapper for PyArrayObject In-Reply-To: <14931.44574.756591.67934@nem-srvr.stsci.edu> References: <14931.44574.756591.67934@nem-srvr.stsci.edu> Message-ID: <01010410434300.06404@localhost.localdomain> Nummies Where I work, our official pipe dreamer has decided to wrap a multiarray up as an STL container. This means we can write convenient high-level code in both C++ and Python, and use this wrapper in the bridge between them. E-searches for a pre-existing cut of this yield negative. Has anyone done this yet? Or am I (gulp) the first? --Phlip http://c2.com/cgi/wiki?PhlIp From paulp at ActiveState.com Thu Jan 4 20:22:06 2001 From: paulp at ActiveState.com (Paul Prescod) Date: Thu, 04 Jan 2001 17:22:06 -0800 Subject: [Numpy-discussion] Multi-distribution distributions Message-ID: <3A5521BE.1E8DF72@ActiveState.com> I'm somewhat surprised by the fact that some distributions (e.g. Numpy, Zodb) have multiple setup.py programs. As far as I know, these setup.py's do not share information so there is no way to do a bdist_wininst or bdist_rpm that builds a single distribution for these multiple sub-packages. I see this as a fairly large problem! The bdist_ functions are an important part of Distutils functionality. Paul Prescod From pauldubois at home.com Fri Jan 5 00:27:05 2001 From: pauldubois at home.com (Paul F. Dubois) Date: Thu, 4 Jan 2001 21:27:05 -0800 Subject: [Numpy-discussion] Multi-distribution distributions In-Reply-To: <3A5521BE.1E8DF72@ActiveState.com> Message-ID: Explanation: we made these packages optional partly because they ought to be optional but partly because one of them depends on LAPACK and many people wish to configure the packages to use a different LAPACK than the limited "lite" one supplied. It would be correct in the spirit of SourceForge and Python packages to make every single one of these optional packages a separate "project" or at least a separate download. That would raise the overhead. Technically the manual should be split up into pieces too. I don't think all that trouble is worth undergoing in order to "solve" this problem. So, you could just build separate rpms for each of the optional packages. They are NOT subpackages in the true sense of the word, except that a couple of them install into Numeric's directory for backward compatibility. Numeric is not a true package either. Again, people have argued (and I agree) that purity of thought is trumped by backward compatibility and that we should leave well enough alone. You are welcome to add a script which runs the bdist functions on all the optional packages, in much the same way setup_all.py works. You do need to face the issue if making a bdist for the public of which LAPACK you use on which platform. I believe I was one of the earliest and hardest pushers for Distutils, so I have no trouble agreeing with your goals. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Paul Prescod Sent: Thursday, January 04, 2001 5:22 PM To: distutils-sig at python.org Cc: akuchlin at mems-exchange.org; numpy-discussion at lists.sourceforge.net Subject: [Numpy-discussion] Multi-distribution distributions I'm somewhat surprised by the fact that some distributions (e.g. Numpy, Zodb) have multiple setup.py programs. As far as I know, these setup.py's do not share information so there is no way to do a bdist_wininst or bdist_rpm that builds a single distribution for these multiple sub-packages. I see this as a fairly large problem! The bdist_ functions are an important part of Distutils functionality. Paul Prescod _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/mailman/listinfo/numpy-discussion From Roy.Dragseth at cc.uit.no Fri Jan 5 08:53:49 2001 From: Roy.Dragseth at cc.uit.no (Roy Dragseth) Date: Fri, 05 Jan 2001 14:53:49 +0100 Subject: [Numpy-discussion] Numeric and swig (or, how to wrap functions). Message-ID: <200101051353.OAA25994@paiute.cc.uit.no> Hello all. I'm trying to figure out a (simple) way to wrap extension functions using Numeric and swig. I want to use swig because I find the method described i ch. 12 of the user manual a bit too elaborate. The problem I'm facing with swig is that I cannot figure out a way to pass arrays to the functions expecting double pointers as input. Say, for example, that I want to wrap daxpy: 1. I put the function call interface into a swig file: void daxpy(int* n,double* a, double* x,int* incx, double* y, int* incy) 2. I want the function call visible to python to be something like: x = Numeric.arange(0.0,10.0) y = a[:] a = 1.0 daxpy(a,x,y) 3. To achieve 2. I make a python function that does the real call to daxpy: def daxpy(a,x,y): n = len(x) d_x = GetDataPointer(x) inc_x = . . . daxpy_c(n,a,d_x,inc_x,d_y,inc_y) (daxpy_c is the real daxpy) The problem I'm facing is that I cannot grab the internal data pointer from Numeric arrays and pass it to a function. Is there a simple way to do that without having to write a wrapper function in c for every function I want to use? How should I write the function GetDataPointer()? Any hints is greatly appreciated. Best regards, Roy. The Computer Center, University of Troms?, N-9037 TROMS?, Norway. phone:+47 77 64 41 07, fax:+47 77 64 41 00 Roy Dragseth, High Perfomance Computing System Administrator Direct call: +47 77 64 62 56. email: royd at cc.uit.no From rogerha at ifi.uio.no Fri Jan 5 09:48:15 2001 From: rogerha at ifi.uio.no (Roger Hansen) Date: 05 Jan 2001 15:48:15 +0100 Subject: [Numpy-discussion] Numeric and swig (or, how to wrap functions). In-Reply-To: <200101051353.OAA25994@paiute.cc.uit.no> References: <200101051353.OAA25994@paiute.cc.uit.no> Message-ID: * Roy Dragseth [snip Numeric and swig questions about how to give a numpy array to a swigged c function] > Any hints is greatly appreciated. I've done this a few times. What you need to do is help swig understand that a numpy array is input and how to treat this as a C array. With swig you can do this with a typemap. Here's an example: we can create a swig interface file like %module exPyC %{ #include #include /* Remember this one */ #include %} %init %{ import_array() /* Initial function for NumPy */ %} %typemap(python,in) double * { PyArrayObject *py_arr; /* first check if it is a NumPy array */ if (!PyArray_Check($source)) { PyErr_SetString(PyExc_TypeError, "Not a NumPy array"); return NULL; } if (PyArray_ObjectType($source,0) != PyArray_DOUBLE) { PyErr_SetString(PyExc_ValueError, "Array must be of type double"); return NULL; } /* check that array is 1D */ py_arr = PyArray_ContiguousFromObject($source, PyArray_DOUBLE, 1, 1); /* set a double pointer to the NumPy allocated memory */ $target = py_arr->data; } /* end of typemap */ %inline %{ void arrayArg(double *arr, int n) { int i; for (i = 0; i < n; i++) { arr[i] = f(i*1.0/n); } } %} Now, compile your extension, and test >>> import exPyC >>> import Numeric >>> a = Numeric.zeros(10,'d') >>> a array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]) >>> exPyC.arrayArg(a,10) >>> a array([ 0. , 0.00996671, 0.0394695 , 0.08733219, 0.15164665, 0.22984885, 0.31882112, 0.41501643, 0.51459976, 0.61360105]) You can also get rid of the length argument with a typemap(python,ignore), see the swig docs. For more info about the safety checks in the typemap check the numpy docs. Lykke til! :-) HTH, Roger From nwagner at isd.uni-stuttgart.de Fri Jan 5 08:16:45 2001 From: nwagner at isd.uni-stuttgart.de (Nils Wagner) Date: Fri, 05 Jan 2001 14:16:45 +0100 Subject: [Numpy-discussion] Extensions of Numpy Message-ID: <3A55C93D.526BDF7D@isd.uni-stuttgart.de> Hi, I am looking for some functions for the computation of Matrix functions : matrix exponential matrix square root matrix logarithm Eigenvalue problems : generalized eigenvalue problems polynomial eigenvalue problems I am also interested in ODE (ordinary differential equation) solvers. Is there any progress ? Nils From paulp at ActiveState.com Fri Jan 5 12:10:20 2001 From: paulp at ActiveState.com (Paul Prescod) Date: Fri, 05 Jan 2001 09:10:20 -0800 Subject: [Distutils] RE: [Numpy-discussion] Multi-distribution distributions References: Message-ID: <3A55FFFC.3DB3872D@ActiveState.com> "Paul F. Dubois" wrote: > > Explanation: we made these packages optional partly because they ought to be > optional but partly because one of them depends on LAPACK and many people > wish to configure the packages to use a different LAPACK than the limited > "lite" one supplied. Wow, your "lite" module is half a meg. I'd hate to see the heavy version. :) I may not understand the extension so please tell me if I'm off-base: Would it be simpler to have a single setup.py and a set of flags to turn on and off each of the "secondary" extensions? > You are welcome to add a script which runs the bdist functions on all the > optional packages, in much the same way setup_all.py works. If I did this, I would consider a different strategy. I would suggest that each of the setup.py's could move their "setup" function into an if __name__=="__main__" block. Then setup_all.py could import each one (a little trickery needed there) and then combine the symbols like sourcelist, headers, ext_modules and so forth. > You do need to > face the issue if making a bdist for the public of which LAPACK you use on > which platform. I would expect to use lapack_lite.pyd. It's easy enough to override it by copying a custom lapack on top. Paul Prescod From manka at us.edu.pl Sat Jan 6 15:20:33 2001 From: manka at us.edu.pl (Ryszard Manka) Date: Sat, 6 Jan 2001 20:20:33 +0000 Subject: [Numpy-discussion] Chebyshev In-Reply-To: <3A55C93D.526BDF7D@isd.uni-stuttgart.de> References: <3A55C93D.526BDF7D@isd.uni-stuttgart.de> Message-ID: <01010620203300.00936@manka> >Nils Wagner writes: > > I am also interested in ODE (ordinary differential equation) solvers. > Hi, Some examples of solving ODE using the Chebyshev polynomial I put on my web home page : http://uranos.cto.us.edu.pl/~manka/math/newmath.html numpy, pyfort and gist are needed. -- Ryszard Manka manka at us.edu.pl From phlip_cpp at my-deja.com Sat Jan 6 20:58:25 2001 From: phlip_cpp at my-deja.com (Phlip TheProgrammer) Date: Sat, 6 Jan 2001 17:58:25 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject Message-ID: <200101070158.RAA24234@mail19.bigmailbox.com> An embedded and charset-unspecified text was scrubbed... Name: not available URL: From phlip_cpp at my-deja.com Sat Jan 6 01:10:52 2001 From: phlip_cpp at my-deja.com (phlip) Date: Fri, 5 Jan 2001 22:10:52 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject In-Reply-To: References: Message-ID: <01010522105200.01585@cuzco.concentric.net> On Friday 05 January 2001 06:24, pfenn at mmm.com wrote: > Be sure to look at the CXX on sourceforge. Ja. This is a sweet combination. The Joy of Swig is we can declare some parameters as basic types, but others as raw PyObjects. Then we can have our way with these. And using CXX, we wrap these objects in high-level C++ methods. Not the low-level fragile C. The effect compares favorably to ATL for VC++ and ActiveX. But there's just one little question here. (Hence the buckshot of list servers.) If we pass a multiarray into a function expecting a PyArrayObject, how then do we add new elements to it? I tried things like 'push_back()' and 'setItem()', but they either did not exist or did not extend the array. Am I going to have to generate a new array and copy the old one into the new? --Phlip From pauldubois at home.com Mon Jan 8 12:23:10 2001 From: pauldubois at home.com (Paul F. Dubois) Date: Mon, 8 Jan 2001 09:23:10 -0800 Subject: [Numpy-discussion] RE: [Numpy-developers] [Bug #128025] MA seems to leak memory In-Reply-To: Message-ID: The helpful person who submitted this report asks for a reply but does not give their name. Thank you for finding this problem. It turns out that this is and this isn't a problem. On the whole, it is. I have a fix but need to warn people that it will break any existing code that takes advantage of a feature I did not advertise. (:-> Here's the story: class MA inherits from a class ActiveAttributes, supplied with MA as an announced module in the package. ActiveAttributes is designed to encapsulate the behavior that Numeric has, of having attributes that are not really attributes but are trios of functions. For an example of what I mean by this, consider the .shape attribute. print x.shape x.shape = (3, 9) del x.shape shape appears to be the name of an attribute but in fact it is not. There are actually a trio of handlers, so that the above actually execute more like they were print x.__getshape() x.__setshape((3,9)) x.__delattr('shape') Now for the problem. In implementing ActiveAttributes, I set up an indirect system of handlers for each "active" attribute like 'shape', and in remembering what handlers to use I carelessly used *bound* methods rather than *unbound* methods. This meant that each instance contained a reference cycle. However, Python 2.0 has a cyclic garbage collector that runs periodically so over the course of a long routine like my test routine, the garbage was in fact being collected. The bug-poster's patch reveals the problem by doing it a lot of memory operations in a very few statements. The fix is to change ActiveAttributes to save unbound rather than bound methods. This works and prevents the observed growth. I will check it in to CVS with a new release number 4.0 when I have finished testing another idea that may also be an improvement. Current users may wish to invoke the facilities of the "gc" module in 2.0 to increase the frequency of collection if they are currently experiencing a problem. To anyone out there who had noticed the activeattr.py module and used it, this change will require an incompatible change to your code when you switch to the new version. -----Original Message----- From: numpy-developers-admin at lists.sourceforge.net [mailto:numpy-developers-admin at lists.sourceforge.net]On Behalf Of noreply at sourceforge.net Sent: Monday, January 08, 2001 4:27 AM To: noreply at sourceforge.net; noreply at sourceforge.net; numpy-developers at sourceforge.net Subject: [Numpy-developers] [Bug #128025] MA seems to leak memory Bug #128025, was updated on 2001-Jan-08 04:26 Here is a current snapshot of the bug. Project: Numerical Python Category: Fatal Error Status: Open Resolution: None Bug Group: Robustness lacking Priority: 5 Submitted by: nobody Assigned to : nobody Summary: MA seems to leak memory Details: I executed the following code in the interpreter: >>> import MA >>> a = MA.arange(10000000) >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] >>> b = a[:5000000] After array a was created, memory consumption was around 40 MB. After the b slices, it's now 190 MB. Even if MA does make a copy-on-slice, it should IMHO free the slice after there are no more references to it. (I am not a Sourceforge user or anything, so I would appreciate if someone would at least let me know if this is a known problem.) For detailed info, follow this link: http://sourceforge.net/bugs/?func=detailbug&bug_id=128025&group_id=1369 _______________________________________________ Numpy-developers mailing list Numpy-developers at lists.sourceforge.net http://lists.sourceforge.net/mailman/listinfo/numpy-developers From guido at python.org Tue Jan 9 08:58:21 2001 From: guido at python.org (Guido van Rossum) Date: Tue, 09 Jan 2001 08:58:21 -0500 Subject: [Numpy-discussion] Need help fixing NumPy links on www.python.org Message-ID: <200101091358.IAA09254@cj20424-a.reston1.va.home.com> It appears that all the links to NumPy for Download and Documentation on www.python.org are broken. This looks bad for www.python.org and for Numerical Python! In particular the links on these pages are all broken: http://www.python.org/topics/scicomp/numpy_download.html link to ftp://ftp-icf.llnl.gov/pub/python/README.html http://www.python.org/topics/scicomp/documentation.html links to ftp://ftp-icf.llnl.gov/pub/python/numericalpython.pdf, ftp://ftp-icf.llnl.gov/pub/python/NumTut.tgz, and the FAQ (http://www.python.org/cgi-bin/numpy-faq) Can anybody "in the know" mail me updated links? If you have updated HTML for the pages containign the links that would be even better! Please mail directly to guido at python.org; I'm not subscribed to this list. --Guido van Rossum (home page: http://www.python.org/~guido/) From cbarker at jps.net Tue Jan 9 13:55:11 2001 From: cbarker at jps.net (Chris Barker) Date: Tue, 09 Jan 2001 10:55:11 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject References: <200101070158.RAA24234@mail19.bigmailbox.com> Message-ID: <3A5B5E8F.5D206B7@jps.net> Phlip TheProgrammer wrote: > And using CXX, we wrap these objects in high-level C++ methods. Not the low-level fragile C. The effect compares favorably to ATL for VC++ and ActiveX. > If we pass a multiarray into a function expecting a PyArrayObject, how then do we add new elements to it? I tried things like 'push_back()' and 'setItem()', but they either did not exist or did not extend the array. > > Am I going to have to generate a new array and copy the old one into the new? I waited a little while before answering this, because there are certainly people more qualified to do so that me. I am only on the NumPy list, so it may have been answered on a different list. The short answer is yes, you will have to generate a new a array and copy the old one into the new. MultiArray objects were created to provide efficient storage of lots of numbers (among other things). Because of this requirement, the numbers are stored as a large single array, and so they cannot be re-sized without re-creating that array. You may be able to change just the data array itself (and a few properties), rather than creating a new structure entirely, but it probably wouldn't be worth it. By the way, I'd like to hear how this all works out. Being able to use NumPy Arrays in extensions more easily would be great! -Chris -- Christopher Barker, Ph.D. cbarker at jps.net --- --- --- http://www.jps.net/cbarker -----@@ -----@@ -----@@ ------@@@ ------@@@ ------@@@ Water Resources Engineering ------ @ ------ @ ------ @ Coastal and Fluvial Hydrodynamics ------- --------- -------- ------------------------------------------------------------------------ ------------------------------------------------------------------------ From pplumlee at omnigon.com Thu Jan 11 14:48:28 2001 From: pplumlee at omnigon.com (Phlip) Date: Thu, 11 Jan 2001 11:48:28 -0800 Subject: [Numpy-discussion] Re: STL wrapper for PyArrayObject In-Reply-To: <3A5B5E8F.5D206B7@jps.net> References: <200101070158.RAA24234@mail19.bigmailbox.com> <3A5B5E8F.5D206B7@jps.net> Message-ID: <01010911244501.01263@localhost.localdomain> Proclaimed Chris Barker from the mountaintops: > I waited a little while before answering this, because there are > certainly people more qualified to do so that me. I am only on the NumPy > list, so it may have been answered on a different list. The irritation is, without a CXX list server, I'm having to molest the off-topic fora where Dubois et al are reputed to hang out. > The short answer is yes, you will have to generate a new a array and > copy the old one into the new. MultiArray objects were created to > provide efficient storage of lots of numbers (among other things). > Because of this requirement, the numbers are stored as a large single > array, and so they cannot be re-sized without re-creating that array. > You may be able to change just the data array itself (and a few > properties), rather than creating a new structure entirely, but it > probably wouldn't be worth it. Here's the state of the system: static void copyData ( Py::Array & ma, vector > & database, int maxFields ) { #if 1 Py::Sequence shape (Py::Int (2)); // <-- pow shape[0] = Py::Int (int (database.size())); shape[1] = Py::Int (maxFields); PyArray_Reshape ((PyArrayObject*)ma.ptr(), shape.ptr()); #else int zone[] = { database.size(), maxFields }; Py::Object mo ((PyObject*)PyArray_FromDims (2, zone, PyArray_OBJECT) ); ma = mo; assert (ma.dimension(1) == database.size()); assert (ma.dimension(2) == maxFields); for (int idxRow (0); idxRow < maxRows; ++idxRow) { Py::Array row (ma[idxRow]); for (int idxCol (0); idxCol < maxFields; ++idxCol) { string const & str (database[idxRow][idxCol]); Py::String pyStr (str.c_str()); Py::Object obj (pyStr); row [idxCol] = obj; // <-- pow } } #endif } Both versions crash on the line marked "pow". The top one crashes when I think I'm trying to do the equivalent of the Python array = (2.4) The bottom one crashes after creating a new array, right when I try to copy in an element. The Pythonic equivalent of matrixi [row][col] = "8" Everyone remember I'm not trying to presenve the old contents of the array - just return from the extension a new array full of stuff. > By the way, I'd like to hear how this all works out. Being able to use > NumPy Arrays in extensions more easily would be great! Our Chief Drive-by Architect has ordered me to use them like an in-memory database. >Sigh< --Phlip "...fanatical devotion to the Pope, and cute red uniforms..." From pfenn at mmm.com Thu Jan 11 17:29:22 2001 From: pfenn at mmm.com (pfenn at mmm.com) Date: Thu, 11 Jan 2001 16:29:22 -0600 Subject: [Numpy-discussion] Another CXX question Message-ID: This is probably a stupid coding error, but the following code generates the error: C:\Program Files\Microsoft Visual Studio\MyProjects\Xrawdatafile\Xrawdatafile.cpp(32) : error C2065: 'FromAPI' : undeclared identifier I've marked the line in the listing that causes the error. It occurs in the function that creates a new datafile_type object to return to the interpreter. Right now the datafile_type object is very simple, and has but one method that prints a hardcoded string. I'm using CXX-4.?? with Visual C++ 6.0 and SP3. I'm going to download SP4 tonight and see if that helps, but any hints would be appreciated. I haven't been doing C++ for very long, so I'm probably mising something obvious. //**************************************************************************************************************** // Xrawdatafile.cpp : Defines the entry point for the DLL application. // #include "stdafx.h" #include #include #include "Xrawdatafile.h" #include "datafile_type.h" BOOL APIENTRY DllMain(HANDLE hModule, DWORD ul_reason_for_call, LPVOID lpReserved) { switch (ul_reason_for_call) { case DLL_PROCESS_ATTACH: case DLL_THREAD_ATTACH: case DLL_THREAD_DETACH: case DLL_PROCESS_DETACH: break; } return TRUE; } class datafile_module : Py::ExtensionModule { public: datafile_module() : Py::ExtensionModule("datafile") { datafile_type::init(); add_varargs_method("datafile", &datafile_module::new_datafile,"datafile(filename)"); initialize("interface to ThermoFinnigan XRaw data files."); } virtual ~datafile_module() { }; private: Py::Object new_datafile(const Py::Tuple& args) { return Py::asObject((new datafile_type())); // THIS LINE CAUSES THE ERROR } }; extern "C" __declspec(dllexport) void initdatafile(void) { static datafile_module *dfm = new datafile_module; } //**************************************************************************************************************************** From arne.keller at ppm.u-psud.fr Tue Jan 16 07:55:40 2001 From: arne.keller at ppm.u-psud.fr (Arne Keller) Date: Tue, 16 Jan 2001 13:55:40 +0100 Subject: [Numpy-discussion] multipack Message-ID: <3A6444CC.74E221C4@ppm.u-psud.fr> I'm looking for the Multipack and cephes python modules by Travis Oliphant. The Multipack Home page link on numpy.sourceforge gives a 404... -- Arne Keller Laboratoire de Photophysique Moleculaire du CNRS, Bat. 213. Universite de Paris-Sud, 91405 Orsay Cedex, France. tel.: (33) 1 69 15 82 83 -- fax. : (33) 1 69 15 67 77 From phrxy at csv.warwick.ac.uk Tue Jan 16 14:15:36 2001 From: phrxy at csv.warwick.ac.uk (John J. Lee) Date: Tue, 16 Jan 2001 19:15:36 +0000 (GMT) Subject: [Numpy-discussion] multipack In-Reply-To: <3A6444CC.74E221C4@ppm.u-psud.fr> Message-ID: On Tue, 16 Jan 2001, Arne Keller wrote: > I'm looking for the Multipack and cephes python modules by Travis > Oliphant. > > The Multipack Home page link on numpy.sourceforge gives a 404... http://oliphant.netpedia.net/ http://oliphant.netpedia.net/multipack.html BTW, I've partly written a distutils script for this with the intention of making it easier to compile on windows (the setup.py works on linux but I need to change some things to make it set up the setup script according to the way the source has been set up). It seems distutils doesn't have explicit support for FORTRAN, so something needs changing there for it to work cross-platform. I haven't got any feedback yet from the Distutils people on how best to do this, but if anyone reading this who is practiced at compiling FORTRAN and C (especially on windows compilers) and wouldn't mind adding the little bits required to support FORTRAN for their system once this has been decided, mail me. John From jbaddor at physics.mcgill.ca Tue Jan 16 15:29:01 2001 From: jbaddor at physics.mcgill.ca (Jean-Bernard Addor) Date: Tue, 16 Jan 2001 15:29:01 -0500 (EST) Subject: [Numpy-discussion] strange underflows Message-ID: Hey Numpy people! I just compiled the last NumPy. Underflows don't give zero but generate OverflowError: Python 2.0 (#2, Dec 15 2000, 15:04:31) [GCC 2.95.2 19991024 (release)] on linux2 Type "copyright", "credits" or "license" for more information. Hello from .pythonrc.py >>> import Numeric >>> Numeric.__version__ '17.2.0' >>> Numeric.exp(-1.42676746e+12) Traceback (most recent call last): File "", line 1, in ? OverflowError: math range error >>> How to suppress these? Any hint? Jean-Bernard From jhauser at ifm.uni-kiel.de Tue Jan 16 16:07:09 2001 From: jhauser at ifm.uni-kiel.de (Janko Hauser) Date: Tue, 16 Jan 2001 22:07:09 +0100 (CET) Subject: [Numpy-discussion] multipack In-Reply-To: References: <3A6444CC.74E221C4@ppm.u-psud.fr> Message-ID: <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> These pages are depreciated, as Travis has moved. I could not find the announcement for the new pages. Menawhile you can check out these packages from the public cvs server of Pearu Peterson. For an overview of the cvs content look at http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ and for instructions to get it out of cvs look at http://cens.ioc.ee/projects/pysymbolic/ HTH, __Janko John J. Lee writes: > On Tue, 16 Jan 2001, Arne Keller wrote: > > > I'm looking for the Multipack and cephes python modules by Travis > > Oliphant. > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > http://oliphant.netpedia.net/ > > http://oliphant.netpedia.net/multipack.html > > BTW, I've partly written a distutils script for this with the intention of > making it easier to compile on windows (the setup.py works on linux but I > need to change some things to make it set up the setup script according to > the way the source has been set up). > > It seems distutils doesn't have explicit support for FORTRAN, so something > needs changing there for it to work cross-platform. I haven't got any > feedback yet from the Distutils people on how best to do this, but if > anyone reading this who is practiced at compiling FORTRAN and C > (especially on windows compilers) and wouldn't mind adding the little bits > required to support FORTRAN for their system once this has been decided, > mail me. > > > John > > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/lists/listinfo/numpy-discussion From arne at cybercable.fr Wed Jan 17 04:30:10 2001 From: arne at cybercable.fr (arne keller) Date: Wed, 17 Jan 2001 10:30:10 +0100 Subject: [Numpy-discussion] multipack References: Message-ID: <3A656622.739ADC4B@cybercable.fr> "John J. Lee" wrote: > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > I'm looking for the Multipack and cephes python modules by Travis > > Oliphant. > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > http://oliphant.netpedia.net/ > > http://oliphant.netpedia.net/multipack.html did you try these links? For me it returns the following message: Not Found The requested URL / was not found on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to > > BTW, I've partly written a distutils script for this with the intention of > making it easier to compile on windows (the setup.py works on linux but I > need to change some things to make it set up the setup script according to > the way the source has been set up). > > It seems distutils doesn't have explicit support for FORTRAN, so something > needs changing there for it to work cross-platform. I haven't got any > feedback yet from the Distutils people on how best to do this, but if > anyone reading this who is practiced at compiling FORTRAN and C > (especially on windows compilers) and wouldn't mind adding the little bits > required to support FORTRAN for their system once this has been decided, > mail me. > > John > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/lists/listinfo/numpy-discussion -- Arne Keller Laboratoire de Photophysique Moleculaire du CNRS, Bat. 213. Universite de Paris-Sud, 91405 Orsay Cedex, France. tel.: (33) 1 69 15 82 83 -- fax. : (33) 1 69 15 67 77 From boyle5 at llnl.gov Thu Jan 18 15:02:52 2001 From: boyle5 at llnl.gov (Jim Boyle) Date: Thu, 18 Jan 2001 12:02:52 -0800 Subject: [Numpy-discussion] multipack In-Reply-To: <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> References: <3A6444CC.74E221C4@ppm.u-psud.fr> <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> Message-ID: I pulled the multipack code from the CVS server and built it for my new installation of Python 2.0. When I import Multipack I get: WARNING: Python C API version mismatch for module cephes: This Python has API version 1009, module cephes has version 1007 this warning occurs for the numpyio and sigtools modules. Have I done something screwy in the installation, or do the modules have to be updated? If the latter is there any guidance on what to fix? What are the routines/procedures that changed in going from version 1007 to 1009? So far everything appears to provide the correct answers, so the API mismatch is not crippling. However, from past experience I know that ignoring warnings often ends in tears. Jim >These pages are depreciated, as Travis has moved. I could not find the >announcement for the new pages. Menawhile you can check out these >packages from the public cvs server of Pearu Peterson. > >For an overview of the cvs content look at > >http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ > >and for instructions to get it out of cvs look at > >http://cens.ioc.ee/projects/pysymbolic/ > >HTH, >__Janko > > >John J. Lee writes: > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > > > I'm looking for the Multipack and cephes python modules by Travis > > > Oliphant. > > > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > > > http://oliphant.netpedia.net/ > > > > http://oliphant.netpedia.net/multipack.html > > > > BTW, I've partly written a distutils script for this with the intention of > > making it easier to compile on windows (the setup.py works on linux but I > > need to change some things to make it set up the setup script according to > > the way the source has been set up). > > > > It seems distutils doesn't have explicit support for FORTRAN, so something > > needs changing there for it to work cross-platform. I haven't got any > > feedback yet from the Distutils people on how best to do this, but if > > anyone reading this who is practiced at compiling FORTRAN and C > > (especially on windows compilers) and wouldn't mind adding the little bits > > required to support FORTRAN for their system once this has been decided, > > mail me. From arne at cybercable.fr Thu Jan 18 15:24:54 2001 From: arne at cybercable.fr (arne keller) Date: Thu, 18 Jan 2001 21:24:54 +0100 Subject: [Numpy-discussion] multipack References: <3A6444CC.74E221C4@ppm.u-psud.fr> <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> Message-ID: <3A675116.C47AEB56@cybercable.fr> I install the multipack-0.7 with my new installation of Python2.0 and I don't have any warning : maybe a problem with PYTHONPATH if your old Multipack under your old Pyhon1.5 is still present? Jim Boyle wrote: > > I pulled the multipack code from the CVS server and built it for my > new installation of Python 2.0. > When I import Multipack I get: > > WARNING: Python C API version mismatch for module cephes: > This Python has API version 1009, module cephes has version 1007 > > this warning occurs for the numpyio and sigtools modules. > > Have I done something screwy in the installation, or do the modules > have to be updated? > If the latter is there any guidance on what to fix? What are the > routines/procedures that changed in going from version 1007 to 1009? > > So far everything appears to provide the correct answers, so the API > mismatch is not crippling. However, from past experience I know that > ignoring warnings often ends in tears. > > Jim > > >These pages are depreciated, as Travis has moved. I could not find the > >announcement for the new pages. Menawhile you can check out these > >packages from the public cvs server of Pearu Peterson. > > > >For an overview of the cvs content look at > > > >http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ > > > >and for instructions to get it out of cvs look at > > > >http://cens.ioc.ee/projects/pysymbolic/ > > > >HTH, > >__Janko > > > > > >John J. Lee writes: > > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > > > > > I'm looking for the Multipack and cephes python modules by Travis > > > > Oliphant. > > > > > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > > > > > http://oliphant.netpedia.net/ > > > > > > http://oliphant.netpedia.net/multipack.html > > > > > > BTW, I've partly written a distutils script for this with the intention of > > > making it easier to compile on windows (the setup.py works on linux but I > > > need to change some things to make it set up the setup script according to > > > the way the source has been set up). > > > > > > It seems distutils doesn't have explicit support for FORTRAN, so something > > > needs changing there for it to work cross-platform. I haven't got any > > > feedback yet from the Distutils people on how best to do this, but if > > > anyone reading this who is practiced at compiling FORTRAN and C > > > (especially on windows compilers) and wouldn't mind adding the little bits > > > required to support FORTRAN for their system once this has been decided, > > > mail me. > > _______________________________________________ > Numpy-discussion mailing list > Numpy-discussion at lists.sourceforge.net > http://lists.sourceforge.net/lists/listinfo/numpy-discussion -- Arne Keller Laboratoire de Photophysique Moleculaire du CNRS, Bat. 213. Universite de Paris-Sud, 91405 Orsay Cedex, France. tel.: (33) 1 69 15 82 83 -- fax. : (33) 1 69 15 67 77 From paul at pfdubois.com Thu Jan 18 17:30:03 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Thu, 18 Jan 2001 14:30:03 -0800 Subject: [Numpy-discussion] multipack In-Reply-To: Message-ID: This would typically mean that you are importing the package into a different python than the one you used to build it. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Jim Boyle Sent: Thursday, January 18, 2001 12:03 PM To: numpy-discussion at lists.sourceforge.net Subject: Re: [Numpy-discussion] multipack I pulled the multipack code from the CVS server and built it for my new installation of Python 2.0. When I import Multipack I get: WARNING: Python C API version mismatch for module cephes: This Python has API version 1009, module cephes has version 1007 this warning occurs for the numpyio and sigtools modules. Have I done something screwy in the installation, or do the modules have to be updated? If the latter is there any guidance on what to fix? What are the routines/procedures that changed in going from version 1007 to 1009? So far everything appears to provide the correct answers, so the API mismatch is not crippling. However, from past experience I know that ignoring warnings often ends in tears. Jim >These pages are depreciated, as Travis has moved. I could not find the >announcement for the new pages. Menawhile you can check out these >packages from the public cvs server of Pearu Peterson. > >For an overview of the cvs content look at > >http://cens.ioc.ee/cgi-bin/cvsweb/python/multipack/ > >and for instructions to get it out of cvs look at > >http://cens.ioc.ee/projects/pysymbolic/ > >HTH, >__Janko > > >John J. Lee writes: > > On Tue, 16 Jan 2001, Arne Keller wrote: > > > > > I'm looking for the Multipack and cephes python modules by Travis > > > Oliphant. > > > > > > The Multipack Home page link on numpy.sourceforge gives a 404... > > > > http://oliphant.netpedia.net/ > > > > http://oliphant.netpedia.net/multipack.html > > > > BTW, I've partly written a distutils script for this with the intention of > > making it easier to compile on windows (the setup.py works on linux but I > > need to change some things to make it set up the setup script according to > > the way the source has been set up). > > > > It seems distutils doesn't have explicit support for FORTRAN, so something > > needs changing there for it to work cross-platform. I haven't got any > > feedback yet from the Distutils people on how best to do this, but if > > anyone reading this who is practiced at compiling FORTRAN and C > > (especially on windows compilers) and wouldn't mind adding the little bits > > required to support FORTRAN for their system once this has been decided, > > mail me. _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/lists/listinfo/numpy-discussion From boyle5 at llnl.gov Fri Jan 19 12:33:22 2001 From: boyle5 at llnl.gov (Jim Boyle) Date: Fri, 19 Jan 2001 09:33:22 -0800 Subject: [Numpy-discussion] multipack In-Reply-To: <3A675116.C47AEB56@cybercable.fr> References: <3A6444CC.74E221C4@ppm.u-psud.fr> <20010116210709.5729.qmail@lisboa.ifm.uni-kiel.de> <3A675116.C47AEB56@cybercable.fr> Message-ID: Thanks to Arne and Paul I took the time to find out what was going on. It turns out that the CVS has multipack 0.8 and Travis added the cephes, numpyio and sigtools modules to multipack in going from 0.7 to 0.8. I did the usual things to account for a non-standard python location at the topmost level of Multipack as I had before with 0.7 but should have also tweaked the Makefiles in the cephes, numpyio and sigtools directories. The result was that these modules were built with Python 1.5 and all the others with 2.0 and I confused myself as to where the problem was. But when I tried to build the cephes module using Python 2.0 I got the following error: gcc -O2 -I/usr/local/include/python2.0 -c -o amos_wrappers.o amos_wrappers.c In file included from cephes/mconf.h:162, from amos_wrappers.h:12, from amos_wrappers.c:8: cephes/protos.h:67: parse error before `sizeof' cephes/protos.h:68: parse error before `sizeof' cephes/protos.h:69: parse error before `sizeof' make: *** [amos_wrappers.o] Error 1 the line: gcc -O2 -I/usr/local/include/python1.5 -c -o amos_wrappers.o amos_wrappers.c works fine. I have tried this on another machine with Python 2.0 and got the same error. The Python.h includes changed quite a bit in going from 1.5 to 2.0. Any ideas as to what is wrong would be welcome. Has anyone installed multipack 0.8 using Python 2.0? Jim >I install the multipack-0.7 with my new installation of Python2.0 and I >don't have any >warning : >maybe a problem with PYTHONPATH if your old Multipack under your old >Pyhon1.5 is still present? > > >Jim Boyle wrote: >> >> I pulled the multipack code from the CVS server and built it for my >> new installation of Python 2.0. >> When I import Multipack I get: >> >> WARNING: Python C API version mismatch for module cephes: >> This Python has API version 1009, module cephes has version 1007 >> >> this warning occurs for the numpyio and sigtools modules. >> >> Have I done something screwy in the installation, or do the modules >> have to be updated? >> If the latter is there any guidance on what to fix? What are the >> routines/procedures that changed in going from version 1007 to 1009? >> >> So far everything appears to provide the correct answers, so the API >> mismatch is not crippling. However, from past experience I know that > > ignoring warnings often ends in tears. >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From Edward_A._Damerau at dadebehring.com Fri Jan 19 15:09:47 2001 From: Edward_A._Damerau at dadebehring.com (Edward_A._Damerau at dadebehring.com) Date: Fri, 19 Jan 2001 15:09:47 -0500 Subject: [Numpy-discussion] calling numpy from C++ Message-ID: I've got Python 20 and numerical python installed and running, i.e., I can run python scripts and call import module Numerical and use arrays,etc. I've also been successful at calling python scripts from C/C++ code (MSVC++ 6.0) - Py_Initialize(), Py_BuildValue() etc. and building the de-bug version of python20.dll (python20_d.dll) so I can debug-step through the code. However, I can't seem to be able to call numpy functions from C/C++ - PyArray_FromDims(), etc. My code will compile and link but, at runtime, I get an exception error. It seems that either I need to create a numpy.dll that will live in winnt/system32 or somehow need to modify python20.dll to include the numpy functions. How do I do that? Thanks. From anthony.seward at ieee.org Sun Jan 21 00:12:59 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Sat, 20 Jan 2001 22:12:59 -0700 (MST) Subject: [Numpy-discussion] Questions about organization Message-ID: I've been trying to get the setup.py scripts in the Numeric distribution to build binary distributions. I have tried several things but what always ends up being a problem is the current Numeric organization, specifically LALITE and RANLIB. They just don't fit well with the distutils paradigm. If they were packaged as part of the Numeric core but were a build-time option, I think that I could get things to work much more easily. The configuration part of distutils is still a bit imature but overcomming that would just require a slightly more involved setup.py. I don't want to go ahead if there is no chance of it being accepted. Tony From phrxy at csv.warwick.ac.uk Sun Jan 21 12:49:30 2001 From: phrxy at csv.warwick.ac.uk (John J. Lee) Date: Sun, 21 Jan 2001 17:49:30 +0000 (GMT) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sat, 20 Jan 2001, Tony Seward wrote: > I've been trying to get the setup.py scripts in the Numeric distribution to > build binary distributions. I have tried several things but what always > ends up being a problem is the current Numeric organization, specifically > LALITE and RANLIB. They just don't fit well with the distutils paradigm. > > If they were packaged as part of the Numeric core but were a build-time > option, I think that I could get things to work much more easily. The [...] Can't you just edit the list of extensions in setup_all.py? Or has that changed? John From anthony.seward at ieee.org Sun Jan 21 17:38:11 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Sun, 21 Jan 2001 15:38:11 -0700 (MST) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sun, 21 Jan 2001, John J. Lee wrote: > On Sat, 20 Jan 2001, Tony Seward wrote: > > > I've been trying to get the setup.py scripts in the Numeric distribution to > > build binary distributions. I have tried several things but what always > > ends up being a problem is the current Numeric organization, specifically > > LALITE and RANLIB. They just don't fit well with the distutils paradigm. > > > > If they were packaged as part of the Numeric core but were a build-time > > option, I think that I could get things to work much more easily. The > [...] > > Can't you just edit the list of extensions in setup_all.py? > > Or has that changed? > > > John > I'm not sure what you mean. Even './setup_all.py build' doen't work for FFT, LALITE, RANLIB and RNG. Right now one has to have already installed the headers from the core of Numeric in order to even build these modules. The sequence is thus 1) get source 2) install the core 3) build or install the submodules As is stands, even if the Numeric core is already installed, it is not possible to make a binary distribution of the LALITE package (and maybe others). This could be fixed easily of the LALITE package, but I havn't looked to see if there are other problems with the other packages. The purpose of distutils is that the sequence for a given distribution is 1) get the source 2) build, or install, or build a binary distribution and you're done This can't be done with the way that things are organized right now. There are probably several ways to deal with this, but first a decision needs to be made as to what the goals are. As I understand it the goals are 1) keep all of the current packages in the distribution 2) the following imports will work individually or together: a) import Numeric b) import FFT c) import LinearAlgebra d) import MLab e) import Matrix f) import Precision g) import RandomArray h) import UserArray i) import ranlib j) import umath k) import RNG l) import MA 3) minimal changes in the source I just need some direction so that I can get things working. I think that the simplest thing is to merge the RANLIB and LALITE packages with the core of Numeric and make the building and installation of them an option. This fits beter with the paradigm of distutils. Tony From phrxy at csv.warwick.ac.uk Sun Jan 21 18:51:44 2001 From: phrxy at csv.warwick.ac.uk (John J. Lee) Date: Sun, 21 Jan 2001 23:51:44 +0000 (GMT) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sun, 21 Jan 2001, Tony Seward wrote: > On Sun, 21 Jan 2001, John J. Lee wrote: [...] > I'm not sure what you mean. Even './setup_all.py build' doen't work for > FFT, LALITE, RANLIB and RNG. > > Right now one has to have already installed the headers from the core of > Numeric in order to even build these modules. The sequence is thus > > 1) get source > 2) install the core > 3) build or install the submodules Ah, sorry, I didn't read you post properly. I thought you were just having trouble compiling something you didn't really need, and wanted to avoid compiling it. You may already know this, but this has been discussed on the distutils-sig mailing list (and maybe this one too) just recently. I don't recall what conclusions they came to, if any - things are held up at the moment because Greg Ward just moved house and is effectively off-line. I guess when he appears again he'll be swamped, so don't hold your breath for changes in distutils itself. John From paul at pfdubois.com Sun Jan 21 18:48:50 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Sun, 21 Jan 2001 15:48:50 -0800 Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: You wrote: ...As I understand it the goals are 1) keep all of the current packages in the distribution 2) the following imports will work individually or together: a) import Numeric b) import FFT c) import LinearAlgebra d) import MLab e) import Matrix f) import Precision g) import RandomArray h) import UserArray i) import ranlib j) import umath k) import RNG l) import MA 3) minimal changes in the source I just need some direction so that I can get things working. I think that the simplest thing is to merge the RANLIB and LALITE packages with the core of Numeric and make the building and installation of them an option. This fits beter with the paradigm of distutils. --- You are describing exactly what we used to have. We changed it. We had reasons that haven't disappeared. (1) is not necessarily a goal. We had a long argument about that, too. If anything, (1) is an anti-goal. Under no circumstances am I willing to move packages back to the core. I think it more likely that all the optional packages will go somewhere else entirely. We need a more organized structure. For historical reasons people did not want to change the way some of the above are NOT packages. (2) No, it is not possible to import those modules independently. Some depend on the others. (3) is not a goal, but it is not permissible to require changes in CLIENT code, such as changing where the include files get installed. (4) Goal: enable people to easily modify LinearAlgebra so that a user-specified BLAS and/or LAPACK library is used. (5) Goal: not make it harder to make distributions on the Mac or Windows just to make RPMs easier. From anthony.seward at ieee.org Sun Jan 21 21:09:27 2001 From: anthony.seward at ieee.org (Tony Seward) Date: Sun, 21 Jan 2001 19:09:27 -0700 (MST) Subject: [Numpy-discussion] Questions about organization In-Reply-To: Message-ID: On Sun, 21 Jan 2001, Paul F. Dubois wrote: > You wrote: > > ...As I understand it the goals are > > 1) keep all of the current packages in the distribution > 2) the following imports will work individually or together: > a) import Numeric > b) import FFT > c) import LinearAlgebra > d) import MLab > e) import Matrix > f) import Precision > g) import RandomArray > h) import UserArray > i) import ranlib > j) import umath > k) import RNG > l) import MA > 3) minimal changes in the source > > I just need some direction so that I can get things working. I think that > the simplest thing is to merge the RANLIB and LALITE packages with the core > of Numeric and make the building and installation of them an option. This > fits beter with the paradigm of distutils. > > --- > You are describing exactly what we used to have. We changed it. We had > reasons that haven't disappeared. > > (1) is not necessarily a goal. We had a long argument about that, too. If > anything, (1) is an anti-goal. > Under no circumstances am I willing to move packages back to the core. I > think it more likely that > all the optional packages will go somewhere else entirely. We need a > more organized structure. > For historical reasons people did not want to change the way some of the > above are NOT packages. OK, I guess I mis-rememberd the conclusion. If we have a more organized structure than I think that most of the problems would be solved. Whether RANLIB and LALITE are sub-packages of Numeric, or they are installed parallel to Numeric the setup.py scripts would be simple. If they are parallel to Numeric than they can be imported like they are now if that is a concern. > (2) No, it is not possible to import those modules independently. Some > depend on the others. All of the import lines that I gave work with the Numeric that I have installed: I left out the ones that didn't. Let me know which ones should be able to be imported and if there are any missing. > (3) is not a goal, but it is not permissible to require changes in CLIENT > code, such as changing where the include files get installed. By client code do you mean non-python code using the libraries? > (4) Goal: enable people to easily modify LinearAlgebra so that a > user-specified BLAS and/or LAPACK library is used. I've been playing with this and on *nix types systems it shouldn't be too hard to do as options to setup.py. I don't know how BLAS and LAPAK get installed on non *nix systems. My plan has been to do minimal autoconf type scanning (taking into consideration command line options) to look for existing libraries. > (5) Goal: not make it harder to make distributions on the Mac or Windows > just to make RPMs easier. > I'm trying to go through distutils when the required functionality is already present so it shouldn't be mch of an issue. I do have a W2k laptop that I can test things on in extremis. The main roadblock is trying to figure out what to do with RANLIB and LALITE. Tony From Edward_A._Damerau at dadebehring.com Mon Jan 22 12:50:54 2001 From: Edward_A._Damerau at dadebehring.com (Edward_A._Damerau at dadebehring.com) Date: Mon, 22 Jan 2001 12:50:54 -0500 Subject: [Numpy-discussion] Debug version of numpy for Win32 Message-ID: Has anyone built a debug version of _numpy.pyd (_numpy_d.pyd) for Windows NT (MS Visual Studio C++ 6.0)? From HoumanG at ActiveState.com Mon Jan 22 13:18:47 2001 From: HoumanG at ActiveState.com (Houman G) Date: Mon, 22 Jan 2001 10:18:47 -0800 Subject: [Numpy-discussion] Questions about organization (fwd) In-Reply-To: <20010121125510.A30408@ActiveState.com> Message-ID: <000701c0849f$c3c26f60$ba03a8c0@TANK> On Sun, Jan 21, 2001 at 12:36:37PM -0800, Greg Stein wrote: > FYI about what? I'm not sure that I see the point here. > > Are you trying to say that ActiveState can provide tools for packaging? > Maybe that we can get NumPy packaged for people? This one, I believe. HoumanG has been working on PyPM (like PPM for Python) for which he is creating appropriate setup.py scripts for packages such as NumPy. I wrote a new setup.py script for NumPy a week ago. The script I wrote builds the RANLIB and the LALITE as part of the core NumPy; and the binary distribution of NumPy built with my script also installs the RANLIB and LALITE. If the question is that Distutils supports a source tree structure like the one NumPy has, the question is no. However, I made a patch (for Distutils) and submitted to SourceForge which allows the source tree to be spread in more than one directory. A copy of the patch is in the attachment. Houman -------------- next part -------------- A non-text attachment was scrubbed... Name: Distutils_multidir_patch.diff.dos Type: application/octet-stream Size: 10643 bytes Desc: not available URL: From pplumlee at omnigon.com Mon Jan 22 13:26:41 2001 From: pplumlee at omnigon.com (Phlip) Date: Mon, 22 Jan 2001 10:26:41 -0800 Subject: [Numpy-discussion] Ever hear of ROOT? In-Reply-To: References: Message-ID: <01012210264100.01330@rrm> Nummies: A colleague recently asked me to look up something called "ROOT". I complained "I can't e-search for 'root'! I'd get a million false hits!" ROOT's here: http://root.cern.ch It's a high-end version of programs like GnuPlot or OpenDx; it's written for physicists at CERN, and it can handle terabyte databases. The bad news it bonds to a questionable "C++ Interpreter" for its scripting end. We all know that interpreting a language designed to be compiled represents the worst of both worlds. Has anyone plugged their NumericPython into ROOT? How hard could that be? -- Phlip phlip_cpp at my-deja.com ============ http://c2.com/cgi/wiki?PhlIp ============ -- http://users.deltanet.com/~tegan/home.html -- From paul at pfdubois.com Mon Jan 22 14:24:51 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Mon, 22 Jan 2001 11:24:51 -0800 Subject: [Numpy-discussion] Questions about organization (fwd) In-Reply-To: <000701c0849f$c3c26f60$ba03a8c0@TANK> Message-ID: Just so you know: it is not ok for the official script to install RANLIB and LALITE as a default part of the build. In fact, that used to be the case and we changed it. There are people who have to change the way these packages are built to use custom LAPACK and BLAS scripts. I won't be able to accept a patch to NumPy that does this. NumPy's current clumsy structure is a result of a decision not to break existing client scripts. These "Optional Packages" are distributed with NumPy. The don't have to be, but again, there were many people who thought it wasn't worth the confusion of having to go get all of them separately. This arrangement appears to annoy distutils purists, but distutils purity isn't the only consideration. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Houman G Sent: Monday, January 22, 2001 10:19 AM To: dev-py at ActiveState.com Cc: numpy-discussion at lists.sourceforge.net Subject: RE: [Numpy-discussion] Questions about organization (fwd) On Sun, Jan 21, 2001 at 12:36:37PM -0800, Greg Stein wrote: > FYI about what? I'm not sure that I see the point here. > > Are you trying to say that ActiveState can provide tools for packaging? > Maybe that we can get NumPy packaged for people? This one, I believe. HoumanG has been working on PyPM (like PPM for Python) for which he is creating appropriate setup.py scripts for packages such as NumPy. I wrote a new setup.py script for NumPy a week ago. The script I wrote builds the RANLIB and the LALITE as part of the core NumPy; and the binary distribution of NumPy built with my script also installs the RANLIB and LALITE. If the question is that Distutils supports a source tree structure like the one NumPy has, the question is no. However, I made a patch (for Distutils) and submitted to SourceForge which allows the source tree to be spread in more than one directory. A copy of the patch is in the attachment. Houman From turner at blueskystudios.com Tue Jan 23 11:33:56 2001 From: turner at blueskystudios.com (John A. Turner) Date: Tue, 23 Jan 2001 11:33:56 -0500 Subject: [Numpy-discussion] numpy.org redirection Message-ID: <14957.45684.132068.755774@denmark.blueskystudios.com> some time ago I registered the domain numpy.org with the intention of donating it to the group just thought about it again today, and realized I could easily redirect it, so my question is: o does anyone object to my doing that? o where should it point, here? http://numpy.sourceforge.net/ or here? http://sourceforge.net/projects/numpy if someone feels strongly that I should transfer ownership of the domain to someone more central to development, etc., I'll be happy to do that - otherwise I'm happy just redirecting and renewing it as it comes up, or at some point setting up real hosting of the domain somewhere (sourceforge?) -- John A. Turner, Ph.D. Senior Research Associate Blue Sky Studios, 44 S. Broadway, White Plains, NY 10601 http://www.blueskystudios.com/ (914) 259-6319 From paul at pfdubois.com Tue Jan 23 11:56:31 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Tue, 23 Jan 2001 08:56:31 -0800 Subject: [Numpy-discussion] numpy.org redirection In-Reply-To: <14957.45684.132068.755774@denmark.blueskystudios.com> Message-ID: This is very generous of you. I don't want to own the domain. I'm not even sure as a govt employee if you could give it to me. The new Python Foundation, when it is up and running, would be appropriate. For now, I suggest you just leave it as is. I don't understand your remark about "real hosting". numpy.sourceforge.net really is at sourceforge (? guess it is obvious I'm not familiar with this term). Both pages have pointers to each other. So technically it is a wash. People looking for docs need to be on one, for releases on the other. My guess is that the link on the project page back to the home page is a little more confusing for a newbie than the clearly spelled-out link on the home page to the project page. When I first used SF I thought "Home" would be the project page. So my guess is that numpy.sourceforge.net would be best. The operative word is "guess". -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of John A. Turner Sent: Tuesday, January 23, 2001 8:34 AM To: Numpy-discussion at lists.sourceforge.net Subject: [Numpy-discussion] numpy.org redirection some time ago I registered the domain numpy.org with the intention of donating it to the group just thought about it again today, and realized I could easily redirect it, so my question is: o does anyone object to my doing that? o where should it point, here? http://numpy.sourceforge.net/ or here? http://sourceforge.net/projects/numpy if someone feels strongly that I should transfer ownership of the domain to someone more central to development, etc., I'll be happy to do that - otherwise I'm happy just redirecting and renewing it as it comes up, or at some point setting up real hosting of the domain somewhere (sourceforge?) -- John A. Turner, Ph.D. Senior Research Associate Blue Sky Studios, 44 S. Broadway, White Plains, NY 10601 http://www.blueskystudios.com/ (914) 259-6319 _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/lists/listinfo/numpy-discussion From turner at blueskystudios.com Tue Jan 23 12:19:35 2001 From: turner at blueskystudios.com (John A. Turner) Date: Tue, 23 Jan 2001 12:19:35 -0500 Subject: [Numpy-discussion] numpy.org redirection In-Reply-To: References: <14957.45684.132068.755774@denmark.blueskystudios.com> Message-ID: <14957.48423.359577.770038@denmark.blueskystudios.com> >>>>> "PFD" == Paul F Dubois : PFD> This is very generous of you. I don't want to own the domain. I'm not even PFD> sure as a govt employee if you could give it to me. The new Python PFD> Foundation, when it is up and running, would be appropriate. For now, I PFD> suggest you just leave it as is. PFD> I don't understand your remark about "real hosting". numpy.sourceforge.net PFD> really is at sourceforge (? guess it is obvious I'm not familiar with this PFD> term). I just meant to have numpy.org hosted by sourceforge - but I'm not even sure they do that PFD> Both pages have pointers to each other. So technically it is a wash. PFD> People looking for docs need to be on one, for releases on the other. PFD> PFD> My guess is that the link on the project page back to the home page is a PFD> little more confusing for a newbie than the clearly spelled-out link on the PFD> home page to the project page. When I first used SF I thought "Home" would PFD> be the project page. So my guess is that numpy.sourceforge.net would be PFD> best. The operative word is "guess". that's what I was thinking too - just thought I'd ask for other opinions (I do think it's a little odd that the title of that page is still "LLNL Python Extensions" rather than "Numerical Python") ok, so the redirection is set up - will probably take a while to propogate, of course -JT From turner at blueskystudios.com Tue Jan 23 12:28:39 2001 From: turner at blueskystudios.com (John A. Turner) Date: Tue, 23 Jan 2001 12:28:39 -0500 Subject: [Numpy-discussion] numpy.org redirection In-Reply-To: References: <14957.45684.132068.755774@denmark.blueskystudios.com> Message-ID: <14957.48967.738242.832224@denmark.blueskystudios.com> >>>>> "PFD" == Paul F Dubois : PFD> This is very generous of you. by the way, it isn't such a big deal any more since the registrars have opened up - I use http://www.directnic.com, which is only $15/year and has nice easy admin of the domains - far superior to networksolutions so I've done this for several open source projects (gnuplot, octave, and latex2html) - usually when I use but haven't been able to contribute as much in the way of development as I would have liked -- John A. Turner, Ph.D. Senior Research Associate Blue Sky Studios, 44 S. Broadway, White Plains, NY 10601 http://www.blueskystudios.com/ (914) 259-6319 From paul at pfdubois.com Wed Jan 24 10:46:07 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Wed, 24 Jan 2001 07:46:07 -0800 Subject: [Numpy-discussion] [ANN] Numeric-17.3.0.tar.gz available Message-ID: Numeric Python 17.3.0 is available in source form at http://sourceforge.net/projects/numpy This is the last planned release in the 17 family. Binary versions will appear later -- developers, please help. Release 18 will require Python 2.1. Several of the new features in Python 2.1 are specifically meant to enable improvements in Numeric and we are going to try to take advantage of them. Paul F. Dubois Program for Climate Model Diagnosis and Intercomparison Lawrence Livermore National Laboratory From pete at shinners.org Wed Jan 24 12:18:04 2001 From: pete at shinners.org (Pete Shinners) Date: Wed, 24 Jan 2001 09:18:04 -0800 Subject: [Numpy-discussion] [ANN] Numeric-17.3.0.tar.gz available References: Message-ID: <000701c08629$9ea26d80$0200a8c0@petebox> > This is the last planned release in the 17 family. Binary versions will > appear later -- developers, please help. hello paul. i didn't see a changelog or anything in the new release? regardless, i've finally put together my precompiled ZIP for win32 and python2.0. it is available here; http://pygame.seul.org/ftp/contrib/Numeric-17.3-win32-2.0.zip once this file is available from the sourceforge site i will remove it from my little server, so please don't go linking this URL all over the world :] also note that i finally broke from the win32 binary naming convention that had been in use since the 16.0 release. i don't know if that was actually some sort of convention, or just an old tradition that got accidentally started :] either way, feel free to rename this file however you see fit. one last thing. in the "setup.py" for Numeric, one of the extensions included "Libraries=['m']", but this is not needed on windows. (in fact, there is no math library, so the compile was failing). i took this library out for my build, and everything seems happy. finally. unlike my previous release, i also have finally included some quick information in the readme, i'll just snip out the new information from that file... """ This build of Numeric 17.3.0 was built by Pete Shinners (pete at shinners.org) on January 24, 2001 To install, extract the ZIP file into your Python folder. (for example, on my system this is C:\python20\) Unfortunately, I do not make use of the scientific packages like FFT, MA, and RNG, so I cannot test that they work, but I do know the base ArrayObject works well, and these extra packages have worked fine on my previous releases. I am also the maintainer of the pygame library. Pygame is a great library for writing games in python, and includes a numeric module for directly accessing pixel values with numeric arrays. http://pygame.seul.org Following is the original README included with the Numeric source ----------------------------------------------------------------- """ + open('numeric-17.3/readme').read() From mhagger at alum.mit.edu Sat Jan 27 12:10:43 2001 From: mhagger at alum.mit.edu (Michael Haggerty) Date: Sat, 27 Jan 2001 12:10:43 -0500 (EST) Subject: [Numpy-discussion] Gnuplot.py version 1.5 released Message-ID: <14963.275.224865.406047@freak.kaiserty.com> This is to announce the release of version 1.5 of Gnuplot.py. Gnuplot.py is a Python [1] package that allows you to create graphs from within Python using the gnuplot [2] plotting program. Gnuplot.py can be obtained from http://gnuplot-py.sourceforge.net/ Prerequisites (see footnotes): the Python interpreter [1] the Python Numeric module [3] the gnuplot program [2] Some ways this package can be used: 1. Interactive data processing: Use Python's excellent Numeric package to create and manipulate arrays of numbers, and use Gnuplot.py to visualize the results. 2. Web graphics: write CGI scripts in Python that use gnuplot to output plots in GIF format and return them to the client. 3. Glue for numerical applications (this is my favorite): wrap your C++/C/Fortran subroutines so that they are callable from Python, then you can perform numerical computations interactively from scripts or from the command line and use Gnuplot.py to plot the output on the fly. 4. Compute a series of datasets in Python and plot them one after the other using Gnuplot.py to produce a crude animation. New features in this version: + Added distutils support. + Broke up the module a bit for better maintainability. The most commonly-used facilities are still available through "import Gnuplot", but some specialized things have been moved to separate modules, in particular funcutils.py and PlotItems.py. + funcutils.tabulate_function() can be used to evaluate a function on a 1-D or 2-D grid of points (this replaces grid_function, which only worked with 2-D grids). + Added two helper functions, funcutils.compute_Data and funcutils.compute_GridData, which compute a function's values on a set of points and package the results into a PlotItem. + GridFunc is no longer an independent class; it is now a factory function that returns a GridData. GridFunc is deprecated in favor of funcutils.compute_GridData. + Changed set_option to work from a table, so that it doesn't need to be overloaded so often. + Implemented test_persist for each platform to make it easier for users to determine whether the `-persist' option is supported. + Added a prefer_persist option to serve as the default `persist' choice. + Following a suggestion by Jannie Hofmeyr, use "from os import popen" for Python 2.0 under Windows. I don't use Windows, so let me know how this works. + Added support for the `axes' and `smooth' options of the `plot' command. + Reworked the comment strings in an effort to make them work nicely with happydoc. Features already present in older versions: + Two and three-dimensional plots. + Plot data from memory, from a file, or from an expression. + Support for multiple simultaneous gnuplot sessions. + Can pass arbitrary commands to the gnuplot program. + Object oriented, extensible design with several built-in types of plot items. + Portable and easy to install (nothing to compile except on Windows). + Support for MS Windows, using the `pgnuplot.exe' program. + Support for sending data to gnuplot as `inline' or `binary' data. These are optimizations that also remove the need for temporary files. Temporary files are still the default. Footnotes: ---------- [1] Python is an excellent object-oriented scripting/rapid development language that is also especially good at gluing programs together. [2] gnuplot is a free, popular, very portable plotting program with a command-line interface. It can make 2-d and 3-d plots and can output to myriad printers and graphics terminals. [3] The Numeric Python extension is a Python module that adds fast and convenient array manipulations to the Python language. -- Michael Haggerty mhagger at alum.mit.edu From jsaenz at wm.lc.ehu.es Mon Jan 29 09:46:44 2001 From: jsaenz at wm.lc.ehu.es (Jon Saenz) Date: Mon, 29 Jan 2001 15:46:44 +0100 (MET) Subject: [Numpy-discussion] Is this a wheel? Message-ID: Hello, there. I needed last Saturday a function which returns the greatest/smallest element of a NumPy array. I seeked through the documentation and found the argmax/argmin functions. However, they must be called recursively to find the greatest(smallest) element of a multidimendional array. As I needed to process a BIG dataset of multidimensional arrays, I wrote a function in C which returns as a NumPy array shaped (2,) the [smallest one,biggest one] elements in an arbitrarily shaped NumPy array. It is pure C and works for multidimensional arrays. The return typecode is the same of the input array (except with complex numbers, which compare numbers through their modules). I can make this function available to general public by means of my WEB page or my starship account as a module. However, I wonder: a) Is this a wheel already invented some 40,000 years ago? May be I missed something in the manual? b) If the answer to the previous question is NO, would you (main developers) be interested in making it available as one of the "general purpose" NumPy functions? It is quite general-purpose, indeed. I have needed it five times or so in the last two years... Looking after your comments. Jon Saenz. | Tfno: +34 946012470 Depto. Fisica Aplicada II | Fax: +34 944648500 Facultad de Ciencias. \\ Universidad del Pais Vasco \\ Apdo. 644 \\ 48080 - Bilbao \\ SPAIN From paul at pfdubois.com Mon Jan 29 10:20:48 2001 From: paul at pfdubois.com (Paul F. Dubois) Date: Mon, 29 Jan 2001 07:20:48 -0800 Subject: [Numpy-discussion] Is this a wheel? In-Reply-To: Message-ID: >>> import Numeric >>> x=Numeric.arange(24) >>> x.shape=(3,2,4) >>> print Numeric.maximum.reduce(x.flat) 23 The .flat operator does not copy the data. -----Original Message----- From: numpy-discussion-admin at lists.sourceforge.net [mailto:numpy-discussion-admin at lists.sourceforge.net]On Behalf Of Jon Saenz Sent: Monday, January 29, 2001 6:47 AM To: Numpy-Discussion at Lists. Sourceforge. Net Subject: [Numpy-discussion] Is this a wheel? Hello, there. I needed last Saturday a function which returns the greatest/smallest element of a NumPy array. I seeked through the documentation and found the argmax/argmin functions. However, they must be called recursively to find the greatest(smallest) element of a multidimendional array. As I needed to process a BIG dataset of multidimensional arrays, I wrote a function in C which returns as a NumPy array shaped (2,) the [smallest one,biggest one] elements in an arbitrarily shaped NumPy array. It is pure C and works for multidimensional arrays. The return typecode is the same of the input array (except with complex numbers, which compare numbers through their modules). I can make this function available to general public by means of my WEB page or my starship account as a module. However, I wonder: a) Is this a wheel already invented some 40,000 years ago? May be I missed something in the manual? b) If the answer to the previous question is NO, would you (main developers) be interested in making it available as one of the "general purpose" NumPy functions? It is quite general-purpose, indeed. I have needed it five times or so in the last two years... Looking after your comments. Jon Saenz. | Tfno: +34 946012470 Depto. Fisica Aplicada II | Fax: +34 944648500 Facultad de Ciencias. \\ Universidad del Pais Vasco \\ Apdo. 644 \\ 48080 - Bilbao \\ SPAIN _______________________________________________ Numpy-discussion mailing list Numpy-discussion at lists.sourceforge.net http://lists.sourceforge.net/lists/listinfo/numpy-discussion From dpgrote at lbl.gov Mon Jan 29 14:29:24 2001 From: dpgrote at lbl.gov (David P Grote) Date: Mon, 29 Jan 2001 11:29:24 -0800 Subject: [Numpy-discussion] Is this a wheel? References: Message-ID: <3A75C494.70200@lbl.gov> An HTML attachment was scrubbed... URL: From pete at shinners.org Tue Jan 30 03:25:55 2001 From: pete at shinners.org (Pete Shinners) Date: Tue, 30 Jan 2001 00:25:55 -0800 Subject: [Numpy-discussion] [ANN] Numeric-17.3.0.tar.gz available References: <000701c08629$9ea26d80$0200a8c0@petebox> Message-ID: <002901c08a96$458bef20$0200a8c0@petebox> just curious about the status on this. i've got this file still available, but i notice it hasn't shown up on the Numpy downloads on sourceforge. Last week, I wrote: > > This is the last planned release in the 17 family. Binary versions will > > appear later -- developers, please help. > > hello paul. i didn't see a changelog or anything in the new release? > > regardless, i've finally put together my precompiled ZIP for win32 > and python2.0. it is available here; > http://pygame.seul.org/ftp/contrib/Numeric-17.3-win32-2.0.zip > > once this file is available from the sourceforge site i will > remove it from my little server, so please don't go linking > this URL all over the world :] > > also note that i finally broke from the win32 binary naming > convention that had been in use since the 16.0 release. i > don't know if that was actually some sort of convention, or > just an old tradition that got accidentally started :] > either way, feel free to rename this file however you see fit. > > one last thing. in the "setup.py" for Numeric, one of the > extensions included "Libraries=['m']", but this is not needed > on windows. (in fact, there is no math library, so the > compile was failing). i took this library out for my build, > and everything seems happy. > > finally. unlike my previous release, i also have finally included > some quick information in the readme, i'll just snip out the new > information from that file...