From hugelmopf at web.de Sun May 1 04:47:05 2005 From: hugelmopf at web.de (Frank) Date: Sun, 1 May 2005 10:47:05 +0200 Subject: [SciPy-user] ImportError for scipy.xplt Message-ID: <200505011047.06024.hugelmopf@web.de> Hi all, I have a problem with importing the xplt part of scipy: Python 2.4.1 (#2, Apr 2 2005, 04:26:17) [GCC 3.3.5 (Debian 1:3.3.5-12)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> from scipy.xplt import * Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.4/site-packages/scipy_base/ppimport.py", line 303, in __getattr__ module = self._ppimport_importer() File "/usr/lib/python2.4/site-packages/scipy_base/ppimport.py", line 258, in _ppimport_importer raise ImportError,self.__dict__.get('_ppimport_exc_info')[1] ImportError: Array can not be safely cast to required type >>> Importing other parts of scipy works fine (from scipy.integrate import *). This happens on Debian Sarge (AMD64) with scipy-0.3.2 under python-2.4.1 and python-2.3.5 Has anybody experienced the same or a solution? Thanks, Frank From u_kazu at nifty.com Mon May 2 03:39:19 2005 From: u_kazu at nifty.com (Kazuhiko UEBAYASHI) Date: Mon, 2 May 2005 16:39:19 +0900 (JST) Subject: [SciPy-user] How to make the matrix include variable "x" Message-ID: <5511131.1115019559796.u_kazu@nifty.com> I'm trying to use SciPy module. Is it possble to make the following Matrix using SciPy ? A(x) = (x**0 x**1 x**2 ). I tried in Python with SciPy >> from scipy import * >> x0 = 10; x1 = 8; x2 = 3 >> print mat('[x0; x1; x2]') Matrix([[0], [1], [2]]) . I expected the answer Matrix([[10], [8], [3]]) but it didn't work well. --- Kazuhiko UEBAYASHI u_kazu(a)nifty.com From rkern at ucsd.edu Mon May 2 03:55:20 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 02 May 2005 00:55:20 -0700 Subject: [SciPy-user] How to make the matrix include variable "x" In-Reply-To: <5511131.1115019559796.u_kazu@nifty.com> References: <5511131.1115019559796.u_kazu@nifty.com> Message-ID: <4275DCE8.1010000@ucsd.edu> Kazuhiko UEBAYASHI wrote: > I'm trying to use SciPy module. > > Is it possble to make the following Matrix using SciPy ? > > A(x) = (x**0 x**1 x**2 ). > > > I tried in Python with SciPy > >> from scipy import * > >> x0 = 10; x1 = 8; x2 = 3 > >> print mat('[x0; x1; x2]') > Matrix([[0], > [1], > [2]]) > . I expected the answer > Matrix([[10], > [8], > [3]]) > but it didn't work well. When mat() is interpreting a string, it doesn't evaluate variables. In this case, it's ignoring the 'x' characters. You want mat([[x0], [x1], [x2]]) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From david.grant at telus.net Mon May 2 05:37:13 2005 From: david.grant at telus.net (David Grant) Date: Mon, 02 May 2005 02:37:13 -0700 Subject: [SciPy-user] How to make the matrix include variable "x" In-Reply-To: <5511131.1115019559796.u_kazu@nifty.com> References: <5511131.1115019559796.u_kazu@nifty.com> Message-ID: <200505020237.13486.david.grant@telus.net> On May 2, 2005 12:39 am, Kazuhiko UEBAYASHI wrote: > I'm trying to use SciPy module. > > Is it possble to make the following Matrix using SciPy ? > > A(x) = (x**0 x**1 x**2 ). I'm not sure what you're trying to do here, but it doesn't seem relevant for what you do below. 
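For what it's worth, if the goal really is the row of powers A(x) = (x**0 x**1 x**2), it can be built directly once x is an ordinary Python variable, with no need for mat()'s string form. A minimal sketch along those lines (scipy 0.3.x with Numeric assumed; the names are only illustrative):

    from scipy import mat

    x = 10
    A_row = mat([[x**0, x**1, x**2]])      # 1x3 row: Matrix([[1, 10, 100]])
    A_col = mat([[x**0], [x**1], [x**2]])  # 3x1 column variant

Python evaluates the expressions before mat() ever sees them, so any variables in scope can be used.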
> I tried in Python with SciPy > > >> from scipy import * > >> > >> x0 = 10; x1 = 8; x2 = 3 > >> print mat('[x0; x1; x2]') You probably meant to do: print mat([x0, x1, x2]) Your first error was putting the quotes '...' and your second error was the semicolons. I think that comes from a column vector notation in matlab but I'm not sure. > Matrix([[0], > [1], > [2]]) > . I expected the answer > Matrix([[10], > [8], > [3]]) > but it didn't work well. > I get the following In [1]: from scipy import * In [2]: x0=10;x1=8;x2=3; In [3]: print mat([x0,x1,x2]) Matrix([ [10, 8, 3]]) (ugly formatting on the Matrix object there!!!) David From zollars at caltech.edu Mon May 2 14:11:33 2005 From: zollars at caltech.edu (Eric Zollars) Date: Mon, 02 May 2005 11:11:33 -0700 Subject: [SciPy-user] install problems Message-ID: <1115057493.27359.4.camel@gaijin> There are ATLAS libraries installed on the system that are incompatible with the scipy install. I have built my own ATLAS/LAPACK libraries but how do I block setup.py from looking in /usr/lib and /usr/local/lib? I have tried setting $BLAS and $LAPACK, but that doesn't help. I still get Import Errors with the wrong library listed. Any help is appreciated. Eric From dd55 at cornell.edu Mon May 2 14:26:31 2005 From: dd55 at cornell.edu (Darren Dale) Date: Mon, 2 May 2005 14:26:31 -0400 Subject: [SciPy-user] install problems In-Reply-To: <1115057493.27359.4.camel@gaijin> References: <1115057493.27359.4.camel@gaijin> Message-ID: <200505021426.31670.dd55@cornell.edu> Hi Eric, On Monday 02 May 2005 2:11 pm, Eric Zollars wrote: > There are ATLAS libraries installed on the system that are incompatible > with the scipy install. I have built my own ATLAS/LAPACK libraries but > how do I block setup.py from looking in /usr/lib and /usr/local/lib? I > have tried setting $BLAS and $LAPACK, but that doesn't help. I still > get Import Errors with the wrong library listed. > Try editing scipy_core/scipy_distutils/sample_site.cfg and save it as site.cfg. Also, if you are not installing from CVS, you need to edit six lines in system_info.py, in that same directory, starting at line #285 ? ? ? ? ?try: ? ? ? ? ? ? ?f = __file__ ? ? ? ? ?except NameError,msg: ? ? ? ? ? ? ?f = sys.argv[0] ? ? ? ? ?cf = os.path.join(os.path.split(os.path.abspath(f))[0], ? ? ? ? ? ? ? ? ? ? ? ? ? ?'site.cfg') hope this helps, Darren -- Darren S. Dale Bard Hall Department of Materials Science and Engineering Cornell University Ithaca, NY. 14850 dd55 at cornell.edu From zollars at caltech.edu Mon May 2 20:42:28 2005 From: zollars at caltech.edu (Eric Zollars) Date: Mon, 02 May 2005 17:42:28 -0700 Subject: [SciPy-user] install problems In-Reply-To: <200505021426.31670.dd55@cornell.edu> References: <1115057493.27359.4.camel@gaijin> <200505021426.31670.dd55@cornell.edu> Message-ID: <1115080948.27954.18.camel@gaijin> On Mon, 2005-05-02 at 11:26, Darren Dale wrote: > Hi Eric, > > On Monday 02 May 2005 2:11 pm, Eric Zollars wrote: > > There are ATLAS libraries installed on the system that are incompatible > > with the scipy install. I have built my own ATLAS/LAPACK libraries but > > how do I block setup.py from looking in /usr/lib and /usr/local/lib? I > > have tried setting $BLAS and $LAPACK, but that doesn't help. I still > > get Import Errors with the wrong library listed. > > > > Try editing scipy_core/scipy_distutils/sample_site.cfg and save it as > site.cfg. 
Also, if you are not installing from CVS, you need to edit six > lines in system_info.py, in that same directory, starting at line #285 > > try: > f = __file__ > except NameError,msg: > f = sys.argv[0] > cf = os.path.join(os.path.split(os.path.abspath(f))[0], > 'site.cfg') > > hope this helps, > Darren Thanks, that blocked setup.py from getting the wrong libs but nothing I do seems to get scipy installed. I had to modify site.cfg: [DEFAULT] library_dirs = /ul/zollars/lib include_dirs = /ul/zollars/include because setup continued to find the bad libs when the default system lib directories were listed. Will this be a problem? [atlas] library_dirs = /ul/zollars/opt/ATLAS/lib/Linux_g77_32 atlas_libs = lapack_g77_32, f77blas_g77_32, cblas_g77_32,atlas_g77_32 I am getting this with setup.py install: gcc -pthread -shared build/temp.linux-ppc64-2.4/build/src/atlas_version_-0x74e85a32.o -L/ul/zollars/opt/ATLAS/lib/Linux_g77_32 -latlas_g77_32 -o build/temp.linux-ppc64-2.4/atlas_version.so FOUND: libraries = ['lapack_g77_32', 'f77blas_g77_32', 'cblas_g77_32', 'atlas_g77_32'] library_dirs = ['/ul/zollars/opt/ATLAS/lib/Linux_g77_32'] language = c define_macros = [('ATLAS_INFO', '"\\"3.4.2\\""')] lapack_opt_info: atlas_threads_info: scipy_distutils.system_info.atlas_threads_info scipy_core/scipy_distutils/system_info.py:598: UserWarning: ********************************************************************* Could not find lapack library within the ATLAS installation. ********************************************************************* I built the full lapack library according to the instructions with ATLAS. What else can I check? I have also tried setting the $BLAS and $LAPACK variables which do lead to a successful scipy install, but then fails the t=scipy.test(level=1). In this case the errors indicate problems with the scipy.linalg module, further suggesting problems with LAPACK. But I don't know what else to check. Any suggestions? Eric From igorcarron at gmail.com Mon May 2 21:04:51 2005 From: igorcarron at gmail.com (Igor Carron) Date: Mon, 2 May 2005 20:04:51 -0500 Subject: [SciPy-user] Install problem using windows binaries. Message-ID: <3ff18af05050218043d0a43de@mail.gmail.com> Hi, I am a newbie. I have an install problem. I am using Win XP, on a celeron 1.2 Ghz. I just tested the install I got from the binaries section ( Windows Binaries, PIII on python 2.2.x at http://www.scipy.org/download/) I have looked at the files that are not found (see below), yet they are there. What did I do wrong ? Thank you in advance, Igor. >>> t=scipy.test() !! No test file 'test_Mplot.py' found for !! No test file 'test_lena.py' found for !! No test file 'test_build_py.py' found for !! No test file 'test_gistC.py' found for Found 4 tests for scipy.io.array_import Found 23 tests for scipy_base.function_base !! No test file 'test_ltisys.py' found for !! No test file 'test_info_integrate.py' found for !! No test file 'test_vq.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_machar.py' found for Found 128 tests for scipy.linalg.fblas !! No test file 'test_ccompiler.py' found for !! No test file 'test_info_interpolate.py' found for !! No test file 'test_gist.py' found for !! No test file 'test_info_cow.py' found for !! No test file 'test_spam.py' found for !! No test file 'test_write_style.py' found for !! No test file 'test_quadpack.py' found for !! No test file 'test_info_gplt.py' found for !! 
No test file 'test_tree.py' found for Found 92 tests for scipy.stats.stats Found 36 tests for scipy.linalg.decomp !! No test file 'test_display_test.py' found for !! No test file 'test_config_compiler.py' found for !! No test file 'test_quadrature.py' found for Found 20 tests for scipy.fftpack.pseudo_diffs !! No test file 'test_sigtools.py' found for !! No test file 'test_optimize.py' found for !! No test file 'test_colorbar.py' found for !! No test file 'test__dsuperlu.py' found for !! No test file 'test_scipy_test_version.py' found for !! No test file 'test_build_clib.py' found for !! No test file 'test_specfun.py' found for !! No test file 'test__compiled_base.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_common_routines.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_pexec.py' found for !! No test file 'test_sparsetools.py' found for !! No test file 'test_iterative.py' found for !! No test file 'test__fftpack.py' found for Found 5 tests for scipy.interpolate.fitpack !! No test file 'test_calc_lwork.py' found for !! No test file 'test_ode.py' found for !! No test file 'test___cvs_version__.py' found for !! No test file 'test__ssuperlu.py' found for !! No test file 'test___init__.py' found for !! No test file 'test__zeros.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_cblas.py' found for !! No test file 'test___cvs_version__.py' found for Found 12 tests for scipy.io.mmio !! No test file 'test_plwf.py' found for !! No test file 'test__flinalg.py' found for Found 2 tests for scipy_base.fastumath !! No test file 'test_linesearch.py' found for !! No test file 'test_waveforms.py' found for !! No test file 'test_convolve.py' found for Found 10 tests for scipy.stats.morestats Found 4 tests for scipy.linalg.lapack !! No test file 'test_ga_list.py' found for !! No test file 'test_special_version.py' found for Found 19 tests for scipy.fftpack.basic !! No test file 'test_anneal.py' found for !! No test file 'test_fitpack2.py' found for !! No test file 'test_gene.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_bsplines.py' found for !! No test file 'test_numpyio.py' found for !! No test file 'test_extension.py' found for !! No test file 'test_data_store.py' found for !! No test file 'test_slice3.py' found for !! No test file 'test_scipy_base_version.py' found for !! No test file 'test_algorithm.py' found for Found 1 tests for scipy.optimize.zeros !! No test file 'test_vode.py' found for !! No test file 'test_info_xplt.py' found for !! No test file 'test___init__.py' found for !! No test file 'test___cvs_version__.py' found for !! No test file 'test_info_cluster.py' found for !! No test file 'test_mio.py' found for Found 4 tests for scipy_base.index_tricks Found 4 tests for scipy.fftpack.helper !! No test file 'test_fftpack_version.py' found for !! No test file 'test_movie.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_scaling.py' found for !! No test file 'test_core.py' found for !! No test file 'test_info_signal.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_selection.py' found for Found 41 tests for scipy.linalg.basic !! No test file 'test_odepack.py' found for !! No test file 'test_exec_command.py' found for !! No test file 'test_common_routines.py' found for !! No test file 'test_interpolate.py' found for !! No test file 'test_common_routines.py' found for !! 
No test file 'test_shapetest.py' found for !! No test file 'test_lbfgsb.py' found for !! No test file 'test_language.py' found for !! No test file 'test_info_xxx.py' found for Found 43 tests for scipy_base.shape_base !! No test file 'test_linalg_version.py' found for !! No test file 'test_interface.py' found for !! No test file 'test___init__.py' found for Found 9 tests for scipy_base.matrix_base !! No test file 'test_minpack2.py' found for !! No test file 'test_info_scipy_base.py' found for Found 342 tests for scipy.special.basic !! No test file 'test_unixccompiler.py' found for !! No test file 'test__minpack.py' found for !! No test file 'test_clapack.py' found for !! No test file 'test_interface.py' found for !! No test file 'test_scimath.py' found for !! No test file 'test_new_plot.py' found for !! No test file 'test_moduleTNC.py' found for Found 6 tests for scipy.linalg.matfuncs !! No test file 'test___init__.py' found for !! No test file 'test_info_plt.py' found for !! No test file 'test___init__.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_install_data.py' found for !! No test file 'test___init__.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_sdist.py' found for !! No test file 'test_sync_cluster.py' found for !! No test file 'test__quadpack.py' found for !! No test file 'test_bdist_rpm.py' found for !! No test file 'test_yorick.py' found for !! No test file 'test_testing.py' found for !! No test file 'test_scipy_version.py' found for !! No test file 'test_flapack.py' found for !! No test file 'test_info_fftpack.py' found for !! No test file 'test_dumb_shelve.py' found for !! No test file 'test_build_src.py' found for !! No test file 'test__odepack.py' found for !! No test file 'test_build.py' found for !! No test file 'test_futil.py' found for !! No test file 'test_tnc.py' found for !! No test file 'test_population.py' found for !! No test file 'test_cephes.py' found for !! No test file 'test_helpmod.py' found for !! No test file 'test__support.py' found for Found 39 tests for scipy_base.type_check Found 2 tests for scipy_base.limits !! No test file 'test__iterative.py' found for !! No test file 'test_ga_util.py' found for !! No test file 'test_genome.py' found for !! No test file 'test___init__.py' found for Found 26 tests for scipy.sparse.Sparse !! No test file 'test_info_linalg.py' found for !! No test file 'test_install.py' found for !! No test file 'test_dist.py' found for !! No test file 'test___init__.py' found for Found 1 tests for scipy.optimize.cobyla !! No test file 'test_pl3d.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_log.py' found for !! No test file 'test_from_template.py' found for !! No test file 'test_info_special.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_flinalg.py' found for Found 14 tests for scipy.linalg.blas !! No test file 'test_filter_design.py' found for !! No test file 'test_rv.py' found for !! No test file 'test_tree_opt.py' found for !! No test file 'test_minpack.py' found for !! No test file 'test_spline.py' found for !! No test file 'test___init__.py' found for Found 2 tests for scipy.xxx.foo !! No test file 'test_pilutil.py' found for !! No test file 'test__fitpack.py' found for !! No test file 'test__zsuperlu.py' found for !! No test file 'test__csuperlu.py' found for !! No test file 'test__lbfgsb.py' found for !! No test file 'test_orthogonal.py' found for !! No test file 'test_dfitpack.py' found for !! 
No test file 'test_scipy_distutils_version.py' found for !! No test file 'test_info_io.py' found for !! No test file 'test__superlu.py' found for Found 70 tests for scipy.stats.distributions !! No test file 'test_rand.py' found for !! No test file 'test_info_stats.py' found for !! No test file 'test___init__.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_info_sparse.py' found for !! No test file 'test_pyPlot.py' found for !! No test file 'test_statlib.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_plot_utility.py' found for !! No test file 'test_info_optimize.py' found for !! No test file 'test_pickler.py' found for !! No test file 'test_line_endings.py' found for !! No test file 'test_install_headers.py' found for !! No test file 'test_polynomial.py' found for !! No test file 'test_ppimport.py' found for !! No test file 'test_misc_util.py' found for !! No test file 'test_cow.py' found for !! No test file 'test_build_ext.py' found for Found 3 tests for scipy.signal.signaltools !! No test file 'test__cobyla.py' found for Found 0 tests for __main__ Don't worry about a warning regarding the number of bytes read. ...............................................................................................................................................................................................................................................................................................................................E....Ties preclude use of exact statistic. ..Ties preclude use of exact statistic. .............................TESTING CONVERGENCE zero should be 1 function f2 cc.bisect : 1.0000000000001952 cc.ridder : 1.0000000000004661 cc.brenth : 0.9999999999999997 cc.brentq : 0.9999999999999577 function f3 cc.bisect : 1.0000000000001952 cc.ridder : 1.0000000000000000 cc.brenth : 1.0000000000000009 cc.brentq : 1.0000000000000011 function f4 cc.bisect : 1.0000000000001952 cc.ridder : 1.0000000000001454 cc.brenth : 0.9999999999993339 cc.brentq : 0.9999999999993339 function f5 cc.bisect : 1.0000000000001952 cc.ridder : 1.0000000000004574 cc.brenth : 0.9999999999991444 cc.brentq : 0.9999999999991444 function f6 cc.bisect : 1.0000000000001952 cc.ridder : 1.0000000000002294 cc.brenth : 0.9999999999990271 cc.brentq : 1.0000000000010412 .........................................................................................................................................................................................................................................................................................................................................Gegenbauer, a = 2.6210331981 .......................................................................................shifted jacobi p,q = 1.58344541285 1.50395456195 ............................Result may be inaccurate, approximate err = 1.99093559359e-008 ...Result may be inaccurate, approximate err = 1.50259560743e-010 ..................................................E.E..........EE...............Result: [ 4.95535778 0.6666553 ] (exact result = 4.955356249106168, 0.666666666666666) .................Testing alpha .Testing anglit .Testing arcsine ..Testing beta .Testing betaprime ..Testing bradford .Testing burr .Testing cauchy .Testing chi .Testing chi2 .Testing dgamma ..Testing dweibull .Testing erlang .Testing expon .Testing exponpow .Testing exponweib .Testing f .Testing fatiguelife .Testing fisk .Testing foldcauchy .Testing foldnorm .Testing frechet_l .Testing 
frechet_r .Testing gamma .Testing genextreme .Testing gengamma .Testing genhalflogistic .Testing genlogistic .Testing genpareto ..Testing gilbrat .Testing gompertz .Testing gumbel_l .Testing gumbel_r .Testing halfcauchy .Testing halflogistic .Testing halfnorm ..Testing hypsecant .Testing laplace .Testing loggamma .Testing logistic .Testing lognorm ..Testing lomax .Testing maxwell .Testing nakagami ..Testing ncf .Testing nct .Testing ncx2 .Testing norm .Testing pareto ..Testing powerlaw ....Testing rayleigh .Testing reciprocal .Testing t .Testing triang .Testing tukeylambda .Testing uniform .Testing weibull_max .Testing weibull_min ..... ====================================================================== ERROR: check_simple_todense (scipy.io.mmio.test_mmio.test_mmio_coordinate) ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python22\Lib\site-packages\scipy\io\tests\test_mmio.py", line 152, in check_simple_todense b = mmread(fn).todense() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 203, in todense csc = self.tocsc() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1336, in tocsc data, row, col = self._normalize() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1318, in _normalize import itertools ImportError: No module named itertools ====================================================================== ERROR: check_tocoo (scipy.sparse.Sparse.test_Sparse.test_csc) ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python22\Lib\site-packages\scipy\sparse\tests\test_Sparse.py", line 75, in check_tocoo assert_array_almost_equal(a.todense(),self.dat) File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 203, in todense csc = self.tocsc() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1336, in tocsc data, row, col = self._normalize() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1318, in _normalize import itertools ImportError: No module named itertools ====================================================================== ERROR: check_tocsr (scipy.sparse.Sparse.test_Sparse.test_csc) ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python22\Lib\site-packages\scipy\sparse\tests\test_Sparse.py", line 82, in check_tocsr a = self.datsp.tocsr() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 586, in tocsr return self.tocoo().tocsr() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1344, in tocsr data,row,col = self._normalize(rowfirst=1) File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1311, in _normalize import itertools ImportError: No module named itertools ====================================================================== ERROR: check_tocoo (scipy.sparse.Sparse.test_Sparse.test_csr) ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python22\Lib\site-packages\scipy\sparse\tests\test_Sparse.py", line 75, in check_tocoo assert_array_almost_equal(a.todense(),self.dat) File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 203, in todense csc = self.tocsc() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1336, in tocsc data, row, col = self._normalize() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1318, in _normalize import 
itertools ImportError: No module named itertools ====================================================================== ERROR: check_tocsc (scipy.sparse.Sparse.test_Sparse.test_csr) ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python22\Lib\site-packages\scipy\sparse\tests\test_Sparse.py", line 78, in check_tocsc a = self.datsp.tocsc() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 983, in tocsc return self.tocoo().tocsc() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1336, in tocsc data, row, col = self._normalize() File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line 1318, in _normalize import itertools ImportError: No module named itertools ---------------------------------------------------------------------- Ran 972 tests in 3.856s From rkern at ucsd.edu Mon May 2 21:17:57 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 02 May 2005 18:17:57 -0700 Subject: [SciPy-user] Install problem using windows binaries. In-Reply-To: <3ff18af05050218043d0a43de@mail.gmail.com> References: <3ff18af05050218043d0a43de@mail.gmail.com> Message-ID: <4276D145.1030803@ucsd.edu> Igor Carron wrote: > Hi, > > I am a newbie. I have an install problem. > I am using Win XP, on a celeron 1.2 Ghz. > I just tested the install I got from the binaries section ( Windows > Binaries, PIII on python 2.2.x at http://www.scipy.org/download/) > I have looked at the files that are not found (see below), yet they > are there. What did I do wrong ? Thank you in advance, > > Igor. > > >>>>t=scipy.test() > > !! No test file 'test_Mplot.py' found for 'scipy.xplt.Mplot' from '...-packages\\scipy\\xplt\\Mplot.pyc'> These are just informational warnings saying that there are no tests for these particular modules. Ignore them. > ====================================================================== > ERROR: check_simple_todense (scipy.io.mmio.test_mmio.test_mmio_coordinate) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "C:\Python22\Lib\site-packages\scipy\io\tests\test_mmio.py", > line 152, in check_simple_todense > b = mmread(fn).todense() > File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line > 203, in todense > csc = self.tocsc() > File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line > 1336, in tocsc > data, row, col = self._normalize() > File "C:\Python22\Lib\site-packages\scipy\sparse\Sparse.py", line > 1318, in _normalize > import itertools > ImportError: No module named itertools This is our fault, not yours. itertools is a standard library module introduced in Python 2.3. If you don't need to use sparse matrices, you can just ignore this. If you do need sparse matrices, you will need to upgrade to Python 2.3. I recommend using Enthought's distribution of Python 2.3.5 which has lots of modules, including Scipy, already compiled and tested for you. http://download.enthought.com/enthought_python-2.3.5-1069.exe -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From joe at enthought.com Mon May 2 21:25:15 2005 From: joe at enthought.com (Joe Cooper) Date: Mon, 02 May 2005 20:25:15 -0500 Subject: [SciPy-user] Install problem using windows binaries. 
In-Reply-To: <4276D145.1030803@ucsd.edu> References: <3ff18af05050218043d0a43de@mail.gmail.com> <4276D145.1030803@ucsd.edu> Message-ID: <4276D2FB.7060406@enthought.com> Robert Kern wrote: > > This is our fault, not yours. itertools is a standard library module > introduced in Python 2.3. If you don't need to use sparse matrices, you > can just ignore this. If you do need sparse matrices, you will need to > upgrade to Python 2.3. I recommend using Enthought's distribution of > Python 2.3.5 which has lots of modules, including Scipy, already > compiled and tested for you. The "tested" part might be stretching it a little. ;-) We're working on being able to call it "tested", though. And I will fix problems that get reported. From igorcarron at gmail.com Tue May 3 05:04:19 2005 From: igorcarron at gmail.com (Igor Carron) Date: Tue, 3 May 2005 04:04:19 -0500 Subject: [SciPy-user] Install problem using windows binaries. In-Reply-To: <3ff18af05050218043d0a43de@mail.gmail.com> References: <3ff18af05050218043d0a43de@mail.gmail.com> Message-ID: <3ff18af0505030204551e4ad2@mail.gmail.com> Robert and Joe, Thanks. "..!! No test file 'test_Mplot.py' found for 'scipy.xplt.Mplot' from '...-packages\\scipy\\xplt\\Mplot.pyc'> These are just informational warnings saying that there are no tests for these particular modules. Ignore them..." I guess if these are just warnings, they are not well phrased. Mplot.py does exist and is in the same directory as Mplot.pyc. Couldn't a better phrase be used like: "The test file 'test_Mplot.py' currently does not have a test" or something to that effect. Thank you again for your swift response. Igor. -- Igor Carron, Ph.D. Pegasus Team, DARPA Grand Challenge 2005: http://pegasusbridge.blogspot.com/ From rkern at ucsd.edu Tue May 3 05:16:21 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 03 May 2005 02:16:21 -0700 Subject: [SciPy-user] Install problem using windows binaries. In-Reply-To: <3ff18af0505030204551e4ad2@mail.gmail.com> References: <3ff18af05050218043d0a43de@mail.gmail.com> <3ff18af0505030204551e4ad2@mail.gmail.com> Message-ID: <42774165.3070505@ucsd.edu> Igor Carron wrote: > Robert and Joe, > > Thanks. > > "..!! No test file 'test_Mplot.py' found for >>'scipy.xplt.Mplot' from '...-packages\\scipy\\xplt\\Mplot.pyc'> > > > These are just informational warnings saying that there are no tests for > these particular modules. Ignore them..." > > > I guess if these are just warnings, they are not well phrased. > Mplot.py does exist and is in the same directory as Mplot.pyc. > Couldn't a better phrase be used like: > "The test file 'test_Mplot.py' currently does not have a test" or > something to that effect. The file "Mplot.py" does exist, but the message is not saying that it does not exist. The message is saying, correctly, that the file "test_Mplot.py" does not exist. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From u_kazu at nifty.com Tue May 3 09:00:52 2005 From: u_kazu at nifty.com (Kazuhiko UEBAYASHI) Date: Tue, 03 May 2005 22:00:52 +0900 Subject: [SciPy-user] How to make the matrix include variable "x" In-Reply-To: <200505020237.13486.david.grant@telus.net> References: <5511131.1115019559796.u_kazu@nifty.com> <200505020237.13486.david.grant@telus.net> Message-ID: <42777604.3000300@nifty.com> Hi. Thank you very much for your replies.
Robert Kern wrote: > When mat() is interpreting a string, it doesn't evaluate variables. I'm newbie about python & scipy. How did you know this point? Could you tell me a reference site? David Grant wrote: > it doesn't seem relevant for what you do below. Absolutely. I was confused what I wanted to say. (& I'm not good at English) Now I can make the variable Matrix. (3x1 matrix = colum vector; 1x3 matrix = row vector). Then next I'm going to make the N x N matrix from Nth sets of 1xN matirix. Is it possible? Thanks. -- Kazuhiko UEBAYASHI mailto:u_kazu(a)nifty.com From gpajer at rider.edu Tue May 3 09:51:17 2005 From: gpajer at rider.edu (Gary Pajer) Date: Tue, 03 May 2005 09:51:17 -0400 Subject: [SciPy-user] pyvtk ? In-Reply-To: <426E7091.6050607@rider.edu> References: <426E7091.6050607@rider.edu> Message-ID: <427781D5.2050305@rider.edu> Gary Pajer wrote: > Pearu, et al, > > I have been looking at MayaVi, and I want to explore further. pyvtk > is suggested (and Pearu's name is attached to it) but ... > > http://cens.ioc.ee/projects/pyvtk/ has been unresponsive for a few > days. Is this a server problem, or has it moved? Or has it been > withdrawn? this site is still unresponsive. Anyone have a clue? From dd55 at cornell.edu Tue May 3 14:17:13 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 3 May 2005 14:17:13 -0400 Subject: [SciPy-user] install problems In-Reply-To: <1115080948.27954.18.camel@gaijin> References: <1115057493.27359.4.camel@gaijin> <200505021426.31670.dd55@cornell.edu> <1115080948.27954.18.camel@gaijin> Message-ID: <200505031417.14211.dd55@cornell.edu> On Monday 02 May 2005 8:42 pm, you wrote: > > Thanks, that blocked setup.py from getting the wrong libs but nothing I > do seems to get scipy installed. > > I had to modify site.cfg: > [DEFAULT] > library_dirs = /ul/zollars/lib > include_dirs = /ul/zollars/include > because setup continued to find the bad libs when the default system lib > directories were listed. Will this be a problem? > > [atlas] > library_dirs = /ul/zollars/opt/ATLAS/lib/Linux_g77_32 > atlas_libs = lapack_g77_32, f77blas_g77_32, cblas_g77_32,atlas_g77_32 > I am getting this with setup.py install: > > gcc -pthread -shared > build/temp.linux-ppc64-2.4/build/src/atlas_version_-0x74e85a32.o > -L/ul/zollars/opt/ATLAS/lib/Linux_g77_32 -latlas_g77_32 -o > build/temp.linux-ppc64-2.4/atlas_version.so > FOUND: > libraries = ['lapack_g77_32', 'f77blas_g77_32', 'cblas_g77_32', > 'atlas_g77_32'] > library_dirs = ['/ul/zollars/opt/ATLAS/lib/Linux_g77_32'] > language = c > define_macros = [('ATLAS_INFO', '"\\"3.4.2\\""')] > > lapack_opt_info: > atlas_threads_info: > scipy_distutils.system_info.atlas_threads_info > scipy_core/scipy_distutils/system_info.py:598: UserWarning: > ********************************************************************* > Could not find lapack library within the ATLAS installation. > ********************************************************************* > > I built the full lapack library according to the instructions with > ATLAS. What else can I check? > > I have also tried setting the $BLAS and $LAPACK variables which do lead > to a successful scipy install, but then fails the > t=scipy.test(level=1). In this case the errors indicate problems with > the scipy.linalg module, further suggesting problems with LAPACK. But I > don't know what else to check. > It looks like there is a problem with some of your libraries. Are you sure you built the full libraries, and successfully? Just for kicks, how big are they? 
I havent tried building Scipy against 3.4.2, can you upgrade? $ ls -lh /usr/lib/lapack/atlas/ total 9.6M -rw-r--r-- 1 root root 5.5M Apr 11 15:31 liblapack.a -rwxr-xr-x 1 root root 855 Apr 11 15:31 liblapack.la lrwxrwxrwx 1 root root 18 Apr 11 15:31 liblapack.so -> liblapack.so.0.0.0 lrwxrwxrwx 1 root root 18 Apr 11 15:31 liblapack.so.0 -> liblapack.so.0.0.0 -rwxr-xr-x 1 root root 4.1M Apr 11 15:31 liblapack.so.0.0.0 $ ls -lh /usr/lib/blas/atlas/ total 944K -rw-r--r-- 1 root root 356K Apr 11 15:17 libblas.a -rwxr-xr-x 1 root root 824 Apr 11 15:17 libblas.la lrwxrwxrwx 1 root root 16 Apr 11 15:17 libblas.so -> libblas.so.0.0.0 lrwxrwxrwx 1 root root 16 Apr 11 15:17 libblas.so.0 -> libblas.so.0.0.0 -rwxr-xr-x 1 root root 139K Apr 11 15:17 libblas.so.0.0.0 -rw-r--r-- 1 root root 299K Apr 11 15:17 libcblas.a -rwxr-xr-x 1 root root 825 Apr 11 15:17 libcblas.la lrwxrwxrwx 1 root root 17 Apr 11 15:17 libcblas.so -> libcblas.so.0.0.0 lrwxrwxrwx 1 root root 17 Apr 11 15:17 libcblas.so.0 -> libcblas.so.0.0.0 -rwxr-xr-x 1 root root 138K Apr 11 15:17 libcblas.so.0.0.0 -- Darren S. Dale Bard Hall Department of Materials Science and Engineering Cornell University Ithaca, NY. 14850 dd55 at cornell.edu From zollars at caltech.edu Tue May 3 14:43:21 2005 From: zollars at caltech.edu (Eric Zollars) Date: Tue, 03 May 2005 11:43:21 -0700 Subject: [SciPy-user] install problems In-Reply-To: <200505031417.14211.dd55@cornell.edu> References: <1115057493.27359.4.camel@gaijin> <1115080948.27954.18.camel@gaijin> <200505031417.14211.dd55@cornell.edu> Message-ID: <1115145801.28025.13.camel@gaijin> > It looks like there is a problem with some of your libraries. Are you sure you > built the full libraries, and successfully? Just for kicks, how big are they? > I havent tried building Scipy against 3.4.2, can you upgrade? > > $ ls -lh /usr/lib/lapack/atlas/ > total 9.6M > -rw-r--r-- 1 root root 5.5M Apr 11 15:31 liblapack.a > -rwxr-xr-x 1 root root 855 Apr 11 15:31 liblapack.la > lrwxrwxrwx 1 root root 18 Apr 11 15:31 liblapack.so -> liblapack.so.0.0.0 > lrwxrwxrwx 1 root root 18 Apr 11 15:31 liblapack.so.0 -> liblapack.so.0.0.0 > -rwxr-xr-x 1 root root 4.1M Apr 11 15:31 liblapack.so.0.0.0 > > $ ls -lh /usr/lib/blas/atlas/ > total 944K > -rw-r--r-- 1 root root 356K Apr 11 15:17 libblas.a > -rwxr-xr-x 1 root root 824 Apr 11 15:17 libblas.la > lrwxrwxrwx 1 root root 16 Apr 11 15:17 libblas.so -> libblas.so.0.0.0 > lrwxrwxrwx 1 root root 16 Apr 11 15:17 libblas.so.0 -> libblas.so.0.0.0 > -rwxr-xr-x 1 root root 139K Apr 11 15:17 libblas.so.0.0.0 > -rw-r--r-- 1 root root 299K Apr 11 15:17 libcblas.a > -rwxr-xr-x 1 root root 825 Apr 11 15:17 libcblas.la > lrwxrwxrwx 1 root root 17 Apr 11 15:17 libcblas.so -> libcblas.so.0.0.0 > lrwxrwxrwx 1 root root 17 Apr 11 15:17 libcblas.so.0 -> libcblas.so.0.0.0 > -rwxr-xr-x 1 root root 138K Apr 11 15:17 libcblas.so.0.0.0 Thank you for your continued help Darren. ls -lh -rw-r--r-- 1 zollars mayo 8.0M May 2 15:41 libatlas_g77_32.a -rw-r--r-- 1 zollars mayo 282K May 2 15:41 libcblas_g77_32.a -rw-r--r-- 1 zollars mayo 334K May 2 15:41 libf77blas_g77_32.a -rw-r--r-- 1 zollars mayo 6.6M May 2 15:41 liblapack_g77_32.a -rw-r--r-- 1 zollars mayo 319K May 2 15:41 libtstatlas_g77_32.a As you can see the lapack library has been supplemented with the full lapack set of routines. I agree I think the problem is here. I need to confirm that I am doing everything correct with g77/gcc interoperability. 
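One way to check that is to look at the mangled symbol names in the archives with nm, since trailing-underscore mismatches between ATLAS, LAPACK and the scipy wrappers are a common source of exactly this kind of failure. A rough sketch (the archive path is the one quoted in this thread; dgesv is just a convenient LAPACK routine to look for):

    # Print the mangled name of dgesv as it appears in the ATLAS/LAPACK archive.
    import commands

    archive = '/ul/zollars/opt/ATLAS/lib/Linux_g77_32/liblapack_g77_32.a'
    for line in commands.getoutput('nm %s' % archive).split('\n'):
        if 'dgesv' in line and ' T ' in line:
            print line      # expect something like '... T dgesv_'

Running the same check against the BLAS archives shows whether everything was built with a consistent naming convention; a mismatch between that suffix and what the scipy wrappers expect shows up later as undefined-symbol or import errors.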
In my ATLAS Makefile I have: F2CDEFS = -DAdd__ -DStringSunStyle The only libraries linked in are: LIBS = -lm (i.e. no g2c,f2c,etc.) The LAPACK makefile does not have any of these options. My suspicion is that -DAdd__ may have something to do with the problem. In the meantime I have built scipy with BLAS_SRC and LAPACK_SRC. Also, I am sticking with ATLAS 3.4.2 for now because 3.6 requires gcc-3.3 and I haven't been motivated to attempt a local build of gcc. Thanks again. Eric From dd55 at cornell.edu Tue May 3 15:44:13 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 3 May 2005 15:44:13 -0400 Subject: [SciPy-user] install problems In-Reply-To: <1115145801.28025.13.camel@gaijin> References: <1115057493.27359.4.camel@gaijin> <200505031417.14211.dd55@cornell.edu> <1115145801.28025.13.camel@gaijin> Message-ID: <200505031544.13516.dd55@cornell.edu> On Tuesday 03 May 2005 2:43 pm, Eric Zollars wrote: > > It looks like there is a problem with some of your libraries. Are you > > sure you built the full libraries, and successfully? Just for kicks, how > > big are they? I havent tried building Scipy against 3.4.2, can you > > upgrade? > > > > $ ls -lh /usr/lib/lapack/atlas/ > > total 9.6M > > -rw-r--r-- 1 root root 5.5M Apr 11 15:31 liblapack.a > > -rwxr-xr-x 1 root root 855 Apr 11 15:31 liblapack.la > > lrwxrwxrwx 1 root root 18 Apr 11 15:31 liblapack.so -> > > liblapack.so.0.0.0 lrwxrwxrwx 1 root root 18 Apr 11 15:31 > > liblapack.so.0 -> liblapack.so.0.0.0 -rwxr-xr-x 1 root root 4.1M Apr 11 > > 15:31 liblapack.so.0.0.0 > > > > $ ls -lh /usr/lib/blas/atlas/ > > total 944K > > -rw-r--r-- 1 root root 356K Apr 11 15:17 libblas.a > > -rwxr-xr-x 1 root root 824 Apr 11 15:17 libblas.la > > lrwxrwxrwx 1 root root 16 Apr 11 15:17 libblas.so -> libblas.so.0.0.0 > > lrwxrwxrwx 1 root root 16 Apr 11 15:17 libblas.so.0 -> > > libblas.so.0.0.0 -rwxr-xr-x 1 root root 139K Apr 11 15:17 > > libblas.so.0.0.0 > > -rw-r--r-- 1 root root 299K Apr 11 15:17 libcblas.a > > -rwxr-xr-x 1 root root 825 Apr 11 15:17 libcblas.la > > lrwxrwxrwx 1 root root 17 Apr 11 15:17 libcblas.so -> > > libcblas.so.0.0.0 lrwxrwxrwx 1 root root 17 Apr 11 15:17 libcblas.so.0 > > -> libcblas.so.0.0.0 -rwxr-xr-x 1 root root 138K Apr 11 15:17 > > libcblas.so.0.0.0 > > Thank you for your continued help Darren. > ls -lh > -rw-r--r-- 1 zollars mayo 8.0M May 2 15:41 libatlas_g77_32.a > -rw-r--r-- 1 zollars mayo 282K May 2 15:41 libcblas_g77_32.a > -rw-r--r-- 1 zollars mayo 334K May 2 15:41 libf77blas_g77_32.a > -rw-r--r-- 1 zollars mayo 6.6M May 2 15:41 liblapack_g77_32.a > -rw-r--r-- 1 zollars mayo 319K May 2 15:41 libtstatlas_g77_32.a > > As you can see the lapack library has been supplemented with the full > lapack set of routines. I agree I think the problem is here. I need to > confirm that I am doing everything correct with g77/gcc > interoperability. > > In my ATLAS Makefile I have: > F2CDEFS = -DAdd__ -DStringSunStyle > The only libraries linked in are: > LIBS = -lm (i.e. no g2c,f2c,etc.) > > The LAPACK makefile does not have any of these options. My suspicion is > that -DAdd__ may have something to do with the problem. > > In the meantime I have built scipy with BLAS_SRC and LAPACK_SRC. > Also, I am sticking with ATLAS 3.4.2 for now because 3.6 requires > gcc-3.3 and I haven't been motivated to attempt a local build of gcc. > I sorry I can't help you with the ATLAS compile issues. I use Gentoo, and for better or worse have never edited a makefile. Why do I feel ashamed of this? -- Darren S. 
Dale Bard Hall Department of Materials Science and Engineering Cornell University Ithaca, NY. 14850 dd55 at cornell.edu From pearu at scipy.org Tue May 3 16:55:02 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 3 May 2005 15:55:02 -0500 (CDT) Subject: [SciPy-user] pyvtk ? In-Reply-To: <427781D5.2050305@rider.edu> References: <426E7091.6050607@rider.edu> <427781D5.2050305@rider.edu> Message-ID: On Tue, 3 May 2005, Gary Pajer wrote: > Gary Pajer wrote: > >> Pearu, et al, >> >> I have been looking at MayaVi, and I want to explore further. pyvtk is >> suggested (and Pearu's name is attached to it) but ... >> >> http://cens.ioc.ee/projects/pyvtk/ has been unresponsive for a few >> days. Is this a server problem, or has it moved? Or has it been >> withdrawn? > > this site is still unresponsive. Anyone have a clue? Could you send me the IP of your machine off-list? This is to check whether it was added to hosts.deny list of cens.ioc.ee. cens is taking at least one attack per day and so its security level is set quite high these days. Pearu From malex at tagancha.org Tue May 3 17:56:22 2005 From: malex at tagancha.org (Oleksandr Moskalenko) Date: Tue, 3 May 2005 15:56:22 -0600 Subject: [SciPy-user] Plot window suppression in scipy.gplt Message-ID: <20050503215622.GA27552@mrb319.cvmbs.colostate.edu> Hi, First, thanks for scipy - a great piece of software. I've run into a problem I can't resolve with docs and reading the source code. I've been going through the mailing lists through the search interface and used Google, but I couldn't find information on how I could suppress the plot window that pops up when running gplt.plot. I am producing a png hardcopy of the plot with gplt.output and that is all I need, really. This script will be used as a back-end for a web app and I don't want to see that window pop up on every form submission. Thanks, Alex From bryan.cole at teraview.com Wed May 4 05:34:30 2005 From: bryan.cole at teraview.com (Bryan Cole) Date: Wed, 04 May 2005 10:34:30 +0100 Subject: [SciPy-user] Compiling C++ extensions using Enthon-2.3.5 Message-ID: Hi, I'm trying to wrap a small C++ library on windows, using Enthon-2.3.5 (the test version) and the version of GCC distributed with this. When I run: python setup.py build_ext --compiler=mingw32 it doesn't call gcc (or g++) as expected but calls 'cc' (presumably a placeholder for the correct compiler executable) which is undefined. How do I tell distutils the correct path to the CPP compiler? Note, compiling C-extensions works great right out-the-box. "c:\python2.3\enthought\mingw\bin\gcc.exe" is called, as expected. It's just *.cpp source files that don't work. C++ extensions work fine with MSVC-6. I've not tried this with the "stable" Enthon version yet. cheers, Bryan From H.FANGOHR at soton.ac.uk Wed May 4 06:25:51 2005 From: H.FANGOHR at soton.ac.uk (Hans Fangohr) Date: Wed, 4 May 2005 11:25:51 +0100 (BST) Subject: [SciPy-user] Plot window suppression in scipy.gplt In-Reply-To: <20050503215622.GA27552@mrb319.cvmbs.colostate.edu> References: <20050503215622.GA27552@mrb319.cvmbs.colostate.edu> Message-ID: Hi Alex, > First, thanks for scipy - a great piece of software. I've run into a problem I > can't resolve with docs and reading the source code. I've been going through > the mailing lists through the search interface and used Google, but I couldn't > find information on how I could suppress the plot window that pops up when > running gplt.plot. 
I am producing a png hardcopy of the plot with gplt.output > and that is all I need, really. This script will be used as a back-end for a > web app and I don't want to see that window pop up on every form submission. I don't know much about gplt. However, I know matplotlib (http://matplotlib.sourceforge.net/) which does 'off-line' rendering (as default). It's a young but promising project. Maybe this is useful, Hans From gruben at bigpond.net.au Wed May 4 06:51:50 2005 From: gruben at bigpond.net.au (Gary Ruben) Date: Wed, 04 May 2005 20:51:50 +1000 Subject: [SciPy-user] pyvtk ? In-Reply-To: <427781D5.2050305@rider.edu> References: <426E7091.6050607@rider.edu> <427781D5.2050305@rider.edu> Message-ID: <4278A946.2000500@bigpond.net.au> I've just sent Gary P the PyVTK tar offlist to help him out. I can get to Pearu's site fine. Gary R. Gary Pajer wrote: > Gary Pajer wrote: > >> Pearu, et al, >> >> I have been looking at MayaVi, and I want to explore further. pyvtk >> is suggested (and Pearu's name is attached to it) but ... >> >> http://cens.ioc.ee/projects/pyvtk/ has been unresponsive for a few >> days. Is this a server problem, or has it moved? Or has it been >> withdrawn? > > > this site is still unresponsive. Anyone have a clue? > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From nwagner at mecha.uni-stuttgart.de Wed May 4 07:11:01 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 04 May 2005 13:11:01 +0200 Subject: [SciPy-user] linalg.solve and singular systems Message-ID: <4278ADC5.1010605@mecha.uni-stuttgart.de> Hi all, I was surprised about the behaviour of linalg.solve in case of singular/nearly singular coefficient matrices. The user should get at least a warning. The singular values clearly show that the matrix is ill-conditioned. Any comment or suggestion would be appreciated. Thanks in advance Nils -------------- next part -------------- A non-text attachment was scrubbed... Name: strange.py Type: text/x-python Size: 191 bytes Desc: not available URL: From profeta at esrf.fr Wed May 4 08:19:15 2005 From: profeta at esrf.fr (Mickael Profeta) Date: Wed, 04 May 2005 14:19:15 +0200 Subject: [SciPy-user] Compilation Scipy 0.3.2 Message-ID: <4278BDC3.9060203@esrf.fr> Hi I try to compile scipy_complet 0.3.2 on a Linux platform with gcc 3.4.3, python 2.4, fftw 2.1.5 and F2PY-2.45.241_1926. I found some troubles in the compilation of ffpack, and after investigation, it seems to be related with all fortran libraries wrapped with scipy. 
The error is for example this one: gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-proto+types -I/scisoft/ESRF_sw/linux_i386_03/include -fPIC' compile options: '-DSCIPY_FFTW_H -I/scisoft/ESRF_sw/linux_i386_03/include -Ibuil+d/src -I/scisoft/ESRF_sw/linux_i386_03/include/python2.4 -c' /mntdirect/_scisoft/ESRF_sw/linux_i386_03/bin/g77 -L/scisoft/ESRF_sw/linux_i386_03/lib build/temp.linux-i686-2.4/build/src/Lib/fftpack/_fftpackmodule.o build/temp.linux-i686-2.4/Lib/fftpack/src/zfft.o build/temp.linux-i686-2.4/Lib/fftpack/src/drfft.o build/temp.linux-i686-2.4/Lib/fftpack/src/zrfft.o build/temp.linux-i686-2.4/Lib/fftpack/src/zfftnd.o build/temp.linux-i686-2.4/build/src/fortranobject.o -L/scisoft/ESRF_sw/linux_i386_03/lib -Lbuild/temp.linux-i686-2.4 -ldfftpack -lrfftw -lfftw -lg2c -o build/lib.linux-i686-2.4/scipy/fftpack/_fftpack.so build/temp.linux-i686-2.4/build/src/Lib/fftpack/_fftpackmodule.o(.text+0x23): In function `int_from_pyobj': build/src/Lib/fftpack/_fftpackmodule.c:100: undefined reference to `PyInt_Type' build/temp.linux-i686-2.4/build/src/Lib/fftpack/_fftpackmodule.o(.text+0x33):bui+ld/src/Lib/fftpack/_fftpackmodule.c:100: undefined reference to `PyType_IsSubtyp To solve the compilation problem, it is necessary to add -shared to link step. But I do not understand why this -shared is not present. This seems to be related with scipy-distutils: % python scipy_distutils/gnufcompiler.py config_fc --verbose customize GnuFCompiler GnuFCompiler instance properties: archiver = ['ar', '-cr'] compile_switch = '-c' compiler_f77 = ['/mntdirect/_scisoft/ESRF_sw/linux_i386_03/bin/g77', '-Wall', '-fno-second-underscore', '-fPIC', '-O3', '-funroll -loops'] compiler_f90 = None compiler_fix = None libraries = ['g2c'] library_dirs = [] linker_so = ['/mntdirect/_scisoft/ESRF_sw/linux_i386_03/bin/g77', '-L/scisoft/ESRF_sw/linux_i386_03/lib'] object_switch = '-o ' ranlib = ['ranlib'] version_cmd = ['/mntdirect/_scisoft/ESRF_sw/linux_i386_03/bin/g77', ' --version'] 3.4.3 linker_so does not contains the -shared flag. Does anyone got an idea on how to solve the problem, I get lost with all setup.py... Thanks for help Mickael -- Mickael Profeta Scientific Software Service European Synchrotron Radiation Facility Tel: 04.76.88.26.04 From bryan.cole at teraview.com Wed May 4 09:12:42 2005 From: bryan.cole at teraview.com (Bryan Cole) Date: Wed, 04 May 2005 14:12:42 +0100 Subject: [SciPy-user] Re: Compiling C++ extensions using Enthon-2.3.5 References: Message-ID: Problem Solved! In case anyone else cares, I needed to modify the Mingw32CCompiler class (defined in 'C:Python23\Lib\distutils\cygwinccompiler') to add a 'compiler_cxx="g++ -mno-cygwin ... etc. ..."' (copied from the compiler=... line with gcc replaced by g++). I can't remember the exact wording as I'm writing from memory: it's fairly obvious after browsing this file and reading the distutils.cygwinccompiler docs. From pajer at iname.com Thu May 5 15:27:39 2005 From: pajer at iname.com (Gary) Date: Thu, 05 May 2005 15:27:39 -0400 Subject: [SciPy-user] pyvtk ? In-Reply-To: References: <426E7091.6050607@rider.edu> <427781D5.2050305@rider.edu> Message-ID: <427A73AB.6040707@iname.com> Pearu Peterson wrote: > > > On Tue, 3 May 2005, Gary Pajer wrote: > >> Gary Pajer wrote: >> >>> Pearu, et al, >>> >>> I have been looking at MayaVi, and I want to explore further. pyvtk >>> is suggested (and Pearu's name is attached to it) but ... >>> >>> http://cens.ioc.ee/projects/pyvtk/ has been unresponsive for a >>> few days. 
Is this a server problem, or has it moved? Or has it >>> been withdrawn? >> >> >> this site is still unresponsive. Anyone have a clue? > > > Could you send me the IP of your machine off-list? I was able to download from home. (Gary R: thanks for your effort ... but somehow I never got your e-mail with the attachment). Pearu, if you want to persue the IP address thing, let me know. Otherwise, we can drop this issue. -gary From malex at tagancha.org Thu May 5 15:50:09 2005 From: malex at tagancha.org (Oleksandr Moskalenko) Date: Thu, 5 May 2005 13:50:09 -0600 Subject: [SciPy-user] Plot window suppression in scipy.gplt In-Reply-To: <20050503215622.GA27552@mrb319.cvmbs.colostate.edu> References: <20050503215622.GA27552@mrb319.cvmbs.colostate.edu> Message-ID: <20050505195009.GB15310@mrb319.cvmbs.colostate.edu> * Oleksandr Moskalenko [2005-05-03 15:56:22 -0600]: > Hi, > > First, thanks for scipy - a great piece of software. I've run into a problem I > can't resolve with docs and reading the source code. I've been going through > the mailing lists through the search interface and used Google, but I couldn't > find information on how I could suppress the plot window that pops up when > running gplt.plot. I am producing a png hardcopy of the plot with gplt.output > and that is all I need, really. This script will be used as a back-end for a > web app and I don't want to see that window pop up on every form submission. > > Thanks, > > Alex Well, I haven't received any replies concerning external window suppression when using gplt. However, I solved my problem by starting with a suggestion found in http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2003-July/001799.html. In case someone runs into the same problem later on I'm including a summary of the working solution. My final code: self.outfile = os.path.join(outdir,outname) g=gplt.current() g._send('set term png') self.outcommand = 'set output "%s"; replot' % (self.outfile) g._send(self.outcommand) gplt.hold('on') gplt.plot(self.initxval,self.yinitreal,"title 'Empirical-real' with points pt 4") g._send(self.outcommand) gplt.plot(self.initxval,self.yinitimag,"title 'Empirical-imag.' with points pt 3") g._send(self.outcommand) gplt.plot(self.intxval,self.intreal,"title 'Interpolated-real' with lines") g._send(self.outcommand) gplt.plot(self.intxval,self.intimag,"title 'Interpolated-imag.' with lines") g._send(self.outcommand) gplt.legend('left top Right enhanced') g._send(self.outcommand) gplt.title('Cubic-spline interpolation of optical data') g._send(self.outcommand) gplt.xtitle('Wavelength, nm') g._send(self.outcommand) gplt.ytitle('Refraction and Conductivity') gplt.close() As you can see it seems necessary to set the output of gnuplot on each plotting pass, otherwise only the first plot will be found in the output file. As I am working on a web application with Quixote (a Pythonic web framework) I also ran into a problem with subsequent runs that added to the previous plot, so "gplt.close()" was necessary for a clean separation between plots. This is all that was needed to work. Regards, Alex. 
From hugelmopf at web.de Fri May 6 08:20:47 2005 From: hugelmopf at web.de (Frank) Date: Fri, 6 May 2005 14:20:47 +0200 Subject: [SciPy-user] ImportError for scipy.xplt In-Reply-To: <200505011047.06024.hugelmopf@web.de> References: <200505011047.06024.hugelmopf@web.de> Message-ID: <200505061420.47744.hugelmopf@web.de> In the scipy list archives I found a thread[1] which addresses my problem and I tried the suggestion in Arnd Baecker's post, saying to edit the arguments of "find_mask" in slice3.py. Now scipy.xplt is working, although Numeric still has some problems that I can reproduce with the algorithm explained in that thread. So there still seems to exist some problem either with Numeric 23.8 or its Debian packages. I will keep an eye on it, but am happy that scipy.xplt is working now. Frank [1] http://www.scipy.net/pipermail/scipy-user/2004-October/003496.html From u_kazu at nifty.com Sun May 8 23:43:19 2005 From: u_kazu at nifty.com (Kazuhiko UEBAYASHI) Date: Mon, 9 May 2005 12:43:19 +0900 (JST) Subject: [SciPy-user] Changing the graph title with scipy - plt plot. Message-ID: <671603.1115610199542.u_kazu@nifty.com> Hi, According to http://www.scipy.org/documentation/plottutorial.html #"Using plt and gplt" , we can change the graph title, doing >>> plt.title('First Order Bessel Function') . But It does n't work. Python interactive shell says >>> plt.title('First Order Bessel Function') Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.3/site-packages/scipy/plt/interface.py", \ line 144, in title _active.title.text = name AttributeError: 'instancemethod' object has no attribute 'text' . But {x,y} title is changed by >> plt.xtitle('x axis'); plt.ytitle('y axis') . Does anyone know how to change the graph titile on python interactive shell using scipy, plt? --- Kazuhiko UEBAYASHI u_kazu(a)nifty.com From ryanfedora at comcast.net Mon May 9 22:29:15 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Mon, 09 May 2005 22:29:15 -0400 Subject: [SciPy-user] installation in FC3 Message-ID: <42801C7B.5000003@comcast.net> (my appologies to the moderator. I initially sent this message from the wrong account and it bounced back pending moderator approval.) Hey, I am having trouble getting scipy installed in Linux. I am a grad student in mechanical engineering and have been using Matlab in Windows to do much of my data analysis and some curve fitting/optimization. I would like to switch over to using scipy in linux instead, but I am having trouble getting things installed. I am fairly new to linux but have been using python for about 9 months. I am running Fedora Core 3 and kde on a Compaq Presario laptop (P4). I tried in stalling using rpms, but the Numeric rpm kept telling me that I needed python >=2.3, even though I am running 2.3.4. I finally got Numeric installed from source, but the scipy rpm gives a segmentation fault when I try import scipy from within python. So, I have beening doing everything to install scipy from source and thought things were going fairly well. I type python setup.py install and lots of stuff scrolls by and things are compiling and then I get the message: In file included from Lib/xplt/src/play/x11/colors.c:9: Lib/xplt/src/play/x11/playx.h:11:22: X11/Xlib.h: No such file or directory and then lots of related errors come up. I tried searching the scipy archives for X11 and Xlib.h, but couldn't find anything. I also tried "locate Xlib.h" but it is don't on my system. 
Any help would be appreciated, Ryan Krauss requested info from web site follows: [ryan at localhost ~]$ python -c 'from f2py2e.diagnose import run;run()' ------ os.name='posix' ------ sys.platform='linux2' ------ sys.version: 2.3.4 (#1, Oct 26 2004, 16:42:40) [GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)] ------ sys.prefix: /usr ------ sys.path=':/usr/lib/python23.zip:/usr/lib/python2.3:/usr/lib/python2.3/plat-linux2:/usr/lib/python2.3/lib-tk:/usr/lib/python2.3/lib-dynload:/usr/lib/python2.3/site-packages:/usr/lib/python2.3/site-packages/Numeric:/usr/lib/python2.3/site-packages/gtk-2.0:/usr/lib/python2.3/site-packages/wx-2.6-gtk2-unicode' ------ Failed to import numarray: No module named numarray Failed to import scipy_distutils: No module named scipy_distutils Found Numeric version '23.8' in /usr/lib/python2.3/site-packages/Numeric/Numeric.pyc Found f2py2e version '2.45.241_1926' in /usr/lib/python2.3/site-packages/f2py2e/f2py2e.pyc -------------------------------------- #1 -------------------------------------- [ryan at localhost ~]$ python -c 'import os,sys;print os.name,sys.platform' posix linux2 [ryan at localhost ~]$ uname -a Linux localhost.localdomain 2.6.11-1.14_FC3 #1 Thu Apr 7 19:23:49 EDT 2005 i686i686 i386 GNU/Linux ----------------------------------- #2 ----------------------------------- [ryan at localhost ~]$ gcc -v Reading specs from /usr/lib/gcc/i386-redhat-linux/3.4.2/specs Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --disable-checking --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-java-awt=gtk --host=i386-redhat-linux Thread model: posix gcc version 3.4.2 20041017 (Red Hat 3.4.2-6.fc3) [ryan at localhost ~]$ g77 --version GNU Fortran (GCC) 3.4.2 20041017 (Red Hat 3.4.2-6.fc3) Copyright (C) 2004 Free Software Foundation, Inc. --------------------------------- #3 --------------------------------- [ryan at localhost ~]$ python -c 'import sys;print sys.version' 2.3.4 (#1, Oct 26 2004, 16:42:40) [GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)] ---------------------------------- #4 ---------------------------------- [ryan at localhost ~]$ python -c 'import Numeric;print Numeric.__version__' 23.8 ----------------------------------- #5 ----------------------------------- [ryan at localhost ~]$ f2py -v 2.45.241_1926 ----------------------------------- #6 ----------------------------------- [ryan at localhost linalg]$ python setup_atlas_version.py build_ext --inplace --force Traceback (most recent call last): File "setup_atlas_version.py", line 5, in ? 
from scipy_distutils.misc_util import get_path, default_config_dict ImportError: No module named scipy_distutils.misc_util [ryan at localhost linalg]$ python -c 'import atlas_version' --------------------------------- #7 ---------------------------------\[ryan at localhost scipy_distutils]$ python system_info.py _pkg_config_info: NOT AVAILABLE agg2_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE atlas_blas_info: ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: /usr/local/lib/atlas/libf77blas.a ) ( paths: /usr/local/lib/atlas/libcblas.a ) ( paths: /usr/local/lib/atlas/libatlas.a ) ( paths: /usr/local/lib/atlas/libf77blas.a ) ( paths: /usr/local/lib/atlas/libcblas.a ) ( paths: /usr/local/lib/atlas/libatlas.a ) system_info.atlas_blas_info ( include_dirs = /usr/local/lib/atlas/:/usr/local/include:/usr/include ) ( paths: /usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h ) FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/local/lib/atlas'] language = c include_dirs = ['/usr/local/lib/atlas'] atlas_blas_threads_info: ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: /usr/local/lib/atlas/libatlas.a ) ( paths: /usr/local/lib/atlas/libatlas.a ) system_info.atlas_blas_threads_info NOT AVAILABLE atlas_info: ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: /usr/local/lib/atlas/libf77blas.a ) ( paths: /usr/local/lib/atlas/libcblas.a ) ( paths: /usr/local/lib/atlas/libatlas.a ) ( paths: /usr/local/lib/atlas/liblapack.a ) system_info.atlas_info ( include_dirs = /usr/local/lib/atlas/:/usr/local/include:/usr/include ) ( paths: /usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h ) FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/local/lib/atlas/'] language = f77 include_dirs = ['/usr/local/lib/atlas'] atlas_threads_info: ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: /usr/local/lib/atlas/libatlas.a ) ( paths: /usr/local/lib/atlas/libatlas.a ) system_info.atlas_threads_info NOT AVAILABLE blas_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/libblas.so ) FOUND: libraries = ['blas'] library_dirs = ['/usr/lib'] language = f77 blas_opt_info: Traceback (most recent call last): File "system_info.py", line 1430, in ? show_all() File "system_info.py", line 1427, in show_all r = c.get_info() File "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", line 314, in get_info self.calc_info() File "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", line 972, in calc_info atlas_version = get_atlas_version(**version_info) File "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", line 809, in get_atlas_version from core import Extension, setup File "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/core.py", line 5, in ? 
from scipy_distutils.dist import Distribution ImportError: No module named scipy_distutils.dist #7b: build_flib.py is not in this directory From falted at pytables.org Tue May 10 07:27:46 2005 From: falted at pytables.org (Francesc Altet) Date: Tue, 10 May 2005 13:27:46 +0200 Subject: [SciPy-user] PyTables 1.0 released Message-ID: <200505101327.46233.falted@pytables.org> ========================= Announcing PyTables 1.0 ========================= The Carabos crew is very proud to announce the immediate availability of **PyTables release 1.0**. On this release you will find a series of exciting new features, being the most important the Undo/Redo capabilities, support for objects (and indexes!) with more than 2**31 rows, better I/O performance for Numeric objects, new time datatypes (useful for time-stamping fields), support for Octave HDF5 files and improved support for HDF5 native files. What it is ========== **PyTables** is a package for managing hierarchical datasets and designed to efficiently cope with extremely large amounts of data (with support for full 64-bit file addressing). It features an object-oriented interface that, combined with C extensions for the performance-critical parts of the code, makes it a very easy-to-use tool for high performance data storage and retrieval. It is built on top of the HDF5 library and the numarray package, and provides containers for both heterogeneous data (``Table``) and homogeneous data (``Array``, ``EArray``) as well as containers for keeping lists of objects of variable length (like Unicode strings or general Python objects) in a very efficient way (``VLArray``). It also sports a series of filters allowing you to compress your data on-the-fly by using different compressors and compression enablers. But perhaps the more interesting features are its powerful browsing and searching capabilities that allow you to perform data selections over heterogeneous datasets exceeding gigabytes of data in just tenths of second. Besides, the PyTables I/O is buffered, implemented in C and carefully tuned so that you can reach much better performance with PyTables than with your own home-grown wrappings to the HDF5 library. Changes more in depth ===================== Improvements: - New Undo/Redo feature (i.e. integrated support for undoing and/or redoing actions). This functionality lets you to put marks in specific places of your data operations, so that you can make your HDF5 file pop back (undo) to a specific mark (for example for inspecting how your data looked at that point). You can also go forward to a more recent marker (redo). You can even do jumps to the marker you want using just one instruction. - Reading Numeric objects from ``*Array`` and ``Table`` (Numeric columns) objects have a 50-100x speedup. With that, Louis Wicker reported that a speed of 350 MB/s can be achieved with Numeric objects (on a SGI Altix with a Raid 5 disk array) while with numarrays, this speed approaches 900 MB/s. This improvement has been possible mainly due to a nice recipe from Todd Miller. Thanks Todd! - With PyTables 1.0 you can finally create Tables, EArrays and VLArrays with more than 2**31 (~ 2 thousand millions) rows, as well as retrieve them. Before PyTables 1.0, retrieving data on these beasts was not well supported, in part due to limitations in some slicing functions in Python (that rely on 32-bit adressing). So, we have made the necessary modifications in these functions to support 64-bit indexes and integrated them into PyTables. 
As a result, our tests show that this feature works just fine. - As a consequence of the above, you can now index columns of tables with more than 2**31 rows. For instance, indexes have been created for integer columns with 10**10 (yeah, 10 thousand million) rows in less than 1 hour using an Opteron @ 1.6 GHz system (~ 1 hour and a half with a Xeon Intel32 @ 2.5 GHz platform). Enjoy! - Now PyTables supports the native HDF5 time types, both 32-bit signed integer and 64-bit fixed point timestamps. They are mapped to ``Int32`` and ``Float64`` values for easy manipulation. See the documentation for the ``Time32Col`` and ``Time64Col`` classes. - Massive internal reorganization of the methods that deal with the hierarchy. Hopefully, that will enable a better understanding of the code for anybody wanting to add/modify features. - The opening and copying of files with a large number of objects has been made faster by correcting a typo in ``Table._open()``. Thanks to Ashley Walsh for sending a patch for this. - Now, one can modify rank-0 (scalar) ``EArray`` datasets. Thanks to Norbert Nemec for providing a patch for this. - From this version on you are allowed to use names that are not valid natural naming identifiers as node or attribute names. A warning is issued (but you are not forbidden) in such a case. Of course, you have to use the ``getattr()`` function so as to access such invalid natural names. - The indexes of ``Table`` and ``*Array`` datasets can be of long type besides integer type. However, indexes in slices are still restricted to regular integer type. - The concept of ``READ_ONLY`` system attributes has disappeared. You can change them now at your own risk! However, you still cannot remove or rename system attributes. - Now, one can do reads in-between write loops using ``table.row`` instances. This is thanks to a decoupling in I/O buffering: now there is one buffer for reading and another for writing, so that no collisions take place anymore. Fixes #1186892. - Support for Octave HDF5 output format. Even complex arrays are supported. Thanks to Edward C. Jones for reporting this format. Backward-incompatible changes: - The format of indexes has been changed and indexes in files created with PyTables pre-1.0 versions are ignored now. However, ``ptrepack`` can still save your life because it is able to convert your old files into the new indexing format. Also, if you copy the affected tables to other locations (by using ``Leaf.copy()``), it will regenerate your indexes with the new format for you. - The API has changed a little bit (nothing serious) for some methods. See ``RELEASE-NOTES.txt`` for more details. Bug fixes: - Added partial support for native HDF5 chunked datasets. They can be read now, and even extended, but only along the first extensible dimension. This limitation may be removed when multiple extensible dimensions are supported in PyTables. - Formerly, when the name of a column in a table was subsumed in another column name, PyTables crashed while retrieving information about the former column. That has been fixed. - A bug prevented the use of indexed columns of tables that were at a hierarchical level other than root. This is solved now. - When a ``Group`` was renamed you were not able to modify its attributes. This has been fixed. - When either ``Table.modifyColumns()`` or ``Table.modifyRows()`` was called, a subsequent call to ``Table.flush()`` didn't really flush the modified data to disk. This works as intended now. 
- Fixed some issues when iterating over ``*Array`` objects using the ``List`` or ``Tuple`` flavor. Important note for Python 2.4 and Windows users =============================================== If you are willing to use PyTables with Python 2.4 in Windows platforms, you will need to get the HDF5 library compiled for MSVC 7.1, aka .NET 2003. It can be found at: ftp://ftp.ncsa.uiuc.edu/HDF/HDF5/current/bin/windows/5-164-win-net.ZIP Users of Python 2.3 on Windows will have to download the version of HDF5 compiled with MSVC 6.0 available in: ftp://ftp.ncsa.uiuc.edu/HDF/HDF5/current/bin/windows/5-164-win.ZIP Where can PyTables be applied? ============================== PyTables is not designed to work as a relational database competitor, but rather as a teammate. If you want to work with large datasets of multidimensional data (for example, for multidimensional analysis), or just provide a categorized structure for some portions of your cluttered RDBS, then give PyTables a try. It works well for storing data from data acquisition systems (DAS), simulation software, network data monitoring systems (for example, traffic measurements of IP packets on routers), very large XML files, or for creating a centralized repository for system logs, to name only a few possible uses. What is a table? ================ A table is defined as a collection of records whose values are stored in fixed-length fields. All records have the same structure and all values in each field have the same data type. The terms "fixed-length" and "strict data types" seem to be quite a strange requirement for a language like Python that supports dynamic data types, but they serve a useful function if the goal is to save very large quantities of data (such as is generated by many scientific applications, for example) in an efficient manner that reduces demand on CPU time and I/O resources. What is HDF5? ============= For those people who know nothing about HDF5, it is a general purpose library and file format for storing scientific data made at NCSA. HDF5 can store two primary objects: datasets and groups. A dataset is essentially a multidimensional array of data elements, and a group is a structure for organizing objects in an HDF5 file. Using these two basic constructs, one can create and store almost any kind of scientific data structure, such as images, arrays of vectors, and structured and unstructured grids. You can also mix and match them in HDF5 files according to your needs. Platforms ========= We are using Linux on top of Intel32 as the main development platform, but PyTables should be easy to compile/install on other UNIX machines. This package has also been successfully compiled and tested on a FreeBSD 5.4 with Opteron64 processors, a UltraSparc platform with Solaris 7 and Solaris 8, a SGI Origin3000 with Itanium processors running IRIX 6.5 (using the gcc compiler), Microsoft Windows and MacOSX (10.2 although 10.3 should work fine as well). In particular, it has been thoroughly tested on 64-bit platforms, like Linux-64 on top of an Intel Itanium, AMD Opteron (in 64-bit mode) or PowerPC G5 (in 64-bit mode) where all the tests pass successfully. Regarding Windows platforms, PyTables has been tested with Windows 2000 and Windows XP (using the Microsoft Visual C compiler), but it should also work with other flavors as well. 
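As a rough illustration of the fixed-length, typed records described under "What is a table?" above, a table definition and fill loop might look like the sketch below. The file name, class name and column layout are invented; table.row, Table.flush() and the Time64Col class are mentioned in this announcement, while openFile, createTable, IsDescription, Int32Col and Float64Col are assumed from the PyTables API of that era.

import time
import tables

class Reading(tables.IsDescription):
    serial = tables.Int32Col()      # fixed-width integer field
    value  = tables.Float64Col()    # fixed-width float field
    stamp  = tables.Time64Col()     # one of the new time-stamp types

fileh = tables.openFile('readings.h5', mode='w')
table = fileh.createTable(fileh.root, 'readings', Reading, "Example table")
row = table.row
for i in range(10):
    row['serial'] = i
    row['value'] = i * 0.5
    row['stamp'] = time.time()
    row.append()                    # buffered append of one fixed-length record
table.flush()
fileh.close()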
Web site ======== Go to the PyTables web site for more details: http://pytables.sourceforge.net/ To know more about the company behind the PyTables development, see: http://www.carabos.com/ Share your experience ===================== Let us know of any bugs, suggestions, gripes, kudos, etc. you may have. ---- **Enjoy data!** -- The PyTables Team From jeremy at jeremysanders.net Tue May 10 11:29:09 2005 From: jeremy at jeremysanders.net (Jeremy Sanders) Date: Tue, 10 May 2005 16:29:09 +0100 (BST) Subject: [SciPy-user] Numarray image processing Message-ID: Hi - I'm trying to think of an efficient way to do the following in numarray. I have an image, and I want to caclulate the mean of the pixels at a particular radius in the image (from a fixed point), and subtract this mean away from those pixels at that radius. Can anyone provide insight into how this can be done without looping over the pixels? I've managed to generate an image containing the radius of each pixel, but this isn't much use. Thanks Jeremy -- Jeremy Sanders http://www.jeremysanders.net/ Cambridge, UK Public Key Server PGP Key ID: E1AAE053 From jmiller at stsci.edu Tue May 10 11:48:50 2005 From: jmiller at stsci.edu (Todd Miller) Date: Tue, 10 May 2005 11:48:50 -0400 Subject: [SciPy-user] Numarray image processing In-Reply-To: References: Message-ID: <1115740130.27493.18.camel@halloween.stsci.edu> I'm not an astronomer, so take this with a grain of salt, but this is what comes to mind for me: >>> import numarray >>> a = numarray.arange(100, shape=(10,10)) >>> indices = numarray.indices(a.shape) >>> indices array([[[0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 1, 1, 1, 1, 1, 1, 1, 1, 1], [2, 2, 2, 2, 2, 2, 2, 2, 2, 2], [3, 3, 3, 3, 3, 3, 3, 3, 3, 3], [4, 4, 4, 4, 4, 4, 4, 4, 4, 4], [5, 5, 5, 5, 5, 5, 5, 5, 5, 5], [6, 6, 6, 6, 6, 6, 6, 6, 6, 6], [7, 7, 7, 7, 7, 7, 7, 7, 7, 7], [8, 8, 8, 8, 8, 8, 8, 8, 8, 8], [9, 9, 9, 9, 9, 9, 9, 9, 9, 9]], [[0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]]]) >>> x0, y0 = 5, 5 >>> dx, dy = indices[0]-y0,indices[1]-x0 >>> numarray.sqrt(dx**2+dy**2)<3 array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 1, 1, 1, 1, 0, 0], [0, 0, 0, 1, 1, 1, 1, 1, 0, 0], [0, 0, 0, 1, 1, 1, 1, 1, 0, 0], [0, 0, 0, 1, 1, 1, 1, 1, 0, 0], [0, 0, 0, 1, 1, 1, 1, 1, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]], type=Bool) >>> mask = numarray.sqrt(dx**2+dy**2)<3 >>> a[mask] array([33, 34, 35, 36, 37, 43, 44, 45, 46, 47, 53, 54, 55, 56, 57, 63, 64, 65, 66, 67, 73, 74, 75, 76, 77]) >>> a[mask].mean() 55.0 >>> a[mask] -= a[mask].mean() >>> a[mask] array([-22, -21, -20, -19, -18, -12, -11, -10, -9, -8, -2, -1, 0, 1, 2, 8, 9, 10, 11, 12, 18, 19, 20, 21, 22]) >>> a array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9], [ 10, 11, 12, 13, 14, 15, 16, 17, 18, 19], [ 20, 21, 22, 23, 24, 25, 26, 27, 28, 29], [ 30, 31, 32, -22, -21, -20, -19, -18, 38, 39], [ 40, 41, 42, -12, -11, -10, -9, -8, 48, 49], [ 50, 51, 52, -2, -1, 0, 1, 2, 58, 59], [ 60, 61, 62, 8, 9, 10, 11, 12, 68, 69], [ 70, 71, 72, 18, 19, 20, 21, 22, 78, 79], [ 80, 81, 82, 83, 84, 85, 86, 87, 88, 89], [ 90, 91, 92, 93, 94, 95, 96, 97, 98, 99]]) Regards, Todd On Tue, 2005-05-10 at 11:29, Jeremy Sanders wrote: > Hi - > > I'm trying to think of an 
efficient way to do the following in numarray. I > have an image, and I want to caclulate the mean of the pixels at a > particular radius in the image (from a fixed point), and subtract this > mean away from those pixels at that radius. > > Can anyone provide insight into how this can be done without looping over > the pixels? > > I've managed to generate an image containing the radius of each pixel, but > this isn't much use. > > Thanks > > Jeremy -- From anthony.seward at ieee.org Tue May 10 12:24:09 2005 From: anthony.seward at ieee.org (Anthony Joseph Seward) Date: Tue, 10 May 2005 10:24:09 -0600 (MDT) Subject: [SciPy-user] installation in FC3 In-Reply-To: <42801C7B.5000003@comcast.net> References: <42801C7B.5000003@comcast.net> Message-ID: <11958.129.238.237.96.1115742249.squirrel@www.mza.com> Where are you getting the Numeric .rpm from? I've modified the one from Fedora Extras to use Numeric 23.8 and this seemed to work for me. An rpm for a python package that is to be installed on Fedora should require a python-abi version not a python version. Tony Ryan Krauss said: > (my appologies to the moderator. I initially sent this message from the > wrong account and it bounced back pending moderator approval.) > > Hey, > > I am having trouble getting scipy installed in Linux. I am a grad > student in mechanical engineering and have been using Matlab in Windows > to do much of my data analysis and some curve fitting/optimization. I > would like to switch over to using scipy in linux instead, but I am > having trouble getting things installed. I am fairly new to linux but > have been using python for about 9 months. I am running Fedora Core 3 > and kde on a Compaq Presario laptop (P4). I tried in stalling using > rpms, but the Numeric rpm kept telling me that I needed python >=2.3, > even though I am running 2.3.4. I finally got Numeric installed from > source, but the scipy rpm gives a segmentation fault when I try import > scipy from within python. > > So, I have beening doing everything to install scipy from source and > thought things were going fairly well. I type python setup.py install > and lots of stuff scrolls by and things are compiling and then I get the > message: > In file included from Lib/xplt/src/play/x11/colors.c:9: > Lib/xplt/src/play/x11/playx.h:11:22: X11/Xlib.h: No such file or directory > and then lots of related errors come up. I tried searching the scipy > archives for X11 and Xlib.h, but couldn't find anything. I also tried > "locate Xlib.h" but it is don't on my system. 
> > Any help would be appreciated, > > Ryan Krauss > > > > requested info from web site follows: > [ryan at localhost ~]$ python -c 'from f2py2e.diagnose import run;run()' > ------ > os.name='posix' > ------ > sys.platform='linux2' > ------ > sys.version: > 2.3.4 (#1, Oct 26 2004, 16:42:40) > [GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)] > ------ > sys.prefix: > /usr > ------ > sys.path=':/usr/lib/python23.zip:/usr/lib/python2.3:/usr/lib/python2.3/plat-linux2:/usr/lib/python2.3/lib-tk:/usr/lib/python2.3/lib-dynload:/usr/lib/python2.3/site-packages:/usr/lib/python2.3/site-packages/Numeric:/usr/lib/python2.3/site-packages/gtk-2.0:/usr/lib/python2.3/site-packages/wx-2.6-gtk2-unicode' > ------ > Failed to import numarray: No module named numarray > Failed to import scipy_distutils: No module named scipy_distutils > Found Numeric version '23.8' in > /usr/lib/python2.3/site-packages/Numeric/Numeric.pyc > Found f2py2e version '2.45.241_1926' in > /usr/lib/python2.3/site-packages/f2py2e/f2py2e.pyc > > -------------------------------------- > #1 > -------------------------------------- > [ryan at localhost ~]$ python -c 'import os,sys;print os.name,sys.platform' > posix linux2 > [ryan at localhost ~]$ uname -a > Linux localhost.localdomain 2.6.11-1.14_FC3 #1 Thu Apr 7 19:23:49 EDT > 2005 i686i686 i386 GNU/Linux > > ----------------------------------- > #2 > ----------------------------------- > [ryan at localhost ~]$ gcc -v > Reading specs from /usr/lib/gcc/i386-redhat-linux/3.4.2/specs > Configured with: ../configure --prefix=/usr --mandir=/usr/share/man > --infodir=/usr/share/info --enable-shared --enable-threads=posix > --disable-checking --with-system-zlib --enable-__cxa_atexit > --disable-libunwind-exceptions --enable-java-awt=gtk > --host=i386-redhat-linux > Thread model: posix > gcc version 3.4.2 20041017 (Red Hat 3.4.2-6.fc3) > [ryan at localhost ~]$ g77 --version > GNU Fortran (GCC) 3.4.2 20041017 (Red Hat 3.4.2-6.fc3) > Copyright (C) 2004 Free Software Foundation, Inc. > > --------------------------------- > #3 > --------------------------------- > [ryan at localhost ~]$ python -c 'import sys;print sys.version' > 2.3.4 (#1, Oct 26 2004, 16:42:40) > [GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)] > > ---------------------------------- > #4 > ---------------------------------- > [ryan at localhost ~]$ python -c 'import Numeric;print Numeric.__version__' > 23.8 > > ----------------------------------- > #5 > ----------------------------------- > [ryan at localhost ~]$ f2py -v > 2.45.241_1926 > > ----------------------------------- > #6 > ----------------------------------- > [ryan at localhost linalg]$ python setup_atlas_version.py build_ext > --inplace --force > Traceback (most recent call last): > File "setup_atlas_version.py", line 5, in ? 
> from scipy_distutils.misc_util import get_path, default_config_dict > ImportError: No module named scipy_distutils.misc_util > [ryan at localhost linalg]$ python -c 'import atlas_version' > > --------------------------------- > #7 > ---------------------------------\[ryan at localhost scipy_distutils]$ > python system_info.py > _pkg_config_info: > NOT AVAILABLE > > agg2_info: > ( src_dirs = .:/usr/local/src ) > NOT AVAILABLE > > atlas_blas_info: > ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) > ( paths: /usr/local/lib/atlas ) > ( paths: /usr/lib/sse2 ) > ( paths: /usr/local/lib/atlas/libf77blas.a ) > ( paths: /usr/local/lib/atlas/libcblas.a ) > ( paths: /usr/local/lib/atlas/libatlas.a ) > ( paths: /usr/local/lib/atlas/libf77blas.a ) > ( paths: /usr/local/lib/atlas/libcblas.a ) > ( paths: /usr/local/lib/atlas/libatlas.a ) > system_info.atlas_blas_info > ( include_dirs = /usr/local/lib/atlas/:/usr/local/include:/usr/include ) > ( paths: > /usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h > ) > FOUND: > libraries = ['f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/local/lib/atlas'] > language = c > include_dirs = ['/usr/local/lib/atlas'] > > atlas_blas_threads_info: > ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) > ( paths: /usr/local/lib/atlas ) > ( paths: /usr/lib/sse2 ) > ( paths: /usr/local/lib/atlas/libatlas.a ) > ( paths: /usr/local/lib/atlas/libatlas.a ) > system_info.atlas_blas_threads_info > NOT AVAILABLE > > atlas_info: > ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) > ( paths: /usr/local/lib/atlas ) > ( paths: /usr/lib/sse2 ) > ( paths: /usr/local/lib/atlas/libf77blas.a ) > ( paths: /usr/local/lib/atlas/libcblas.a ) > ( paths: /usr/local/lib/atlas/libatlas.a ) > ( paths: /usr/local/lib/atlas/liblapack.a ) > system_info.atlas_info > ( include_dirs = /usr/local/lib/atlas/:/usr/local/include:/usr/include ) > ( paths: > /usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h > ) > FOUND: > libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/local/lib/atlas/'] > language = f77 > include_dirs = ['/usr/local/lib/atlas'] > > atlas_threads_info: > ( library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) > ( paths: /usr/local/lib/atlas ) > ( paths: /usr/lib/sse2 ) > ( paths: /usr/local/lib/atlas/libatlas.a ) > ( paths: /usr/local/lib/atlas/libatlas.a ) > system_info.atlas_threads_info > NOT AVAILABLE > > blas_info: > ( library_dirs = /usr/local/lib:/usr/lib ) > ( paths: /usr/lib/libblas.so ) > FOUND: > libraries = ['blas'] > library_dirs = ['/usr/lib'] > language = f77 > > blas_opt_info: > Traceback (most recent call last): > File "system_info.py", line 1430, in ? > show_all() > File "system_info.py", line 1427, in show_all > r = c.get_info() > File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", > line 314, in get_info > self.calc_info() > File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", > line 972, in calc_info > atlas_version = get_atlas_version(**version_info) > File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", > line 809, in get_atlas_version > from core import Extension, setup > File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/core.py", > line 5, in ? 
> from scipy_distutils.dist import Distribution > ImportError: No module named scipy_distutils.dist > > #7b: > > build_flib.py is not in this directory > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > From ryanfedora at comcast.net Tue May 10 12:49:48 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Tue, 10 May 2005 12:49:48 -0400 Subject: [SciPy-user] installation in FC3 In-Reply-To: <11958.129.238.237.96.1115742249.squirrel@www.mza.com> References: <42801C7B.5000003@comcast.net> <11958.129.238.237.96.1115742249.squirrel@www.mza.com> Message-ID: <4280E62C.60305@comcast.net> An HTML attachment was scrubbed... URL: From anthony.seward at ieee.org Tue May 10 13:12:17 2005 From: anthony.seward at ieee.org (Anthony Joseph Seward) Date: Tue, 10 May 2005 11:12:17 -0600 (MDT) Subject: [SciPy-user] installation in FC3 In-Reply-To: <4280E62C.60305@comcast.net> References: <42801C7B.5000003@comcast.net> <11958.129.238.237.96.1115742249.squirrel@www.mza.com> <4280E62C.60305@comcast.net> Message-ID: <39868.129.238.237.96.1115745137.squirrel@www.mza.com> Get the python-numeric RPM from Fedora Extras and proceed from there. I don't use atrpms. I prefer fedora extras, freshrpms, dag an planetccrma. Tony Ryan Krauss said: > The Numeric .rpm was from sourceforge. I had already installed > pythonabi-2.3.4-1.rhfc3.at from atrpms.net. > > Ryan > > Anthony Joseph Seward wrote: Where are you getting the Numeric .rpm > from? I've modified the one from Fedora Extras to use Numeric 23.8 and > this seemed to work for me. An rpm for a python package that is to be > installed on Fedora should require a python-abi version not a python > version. Tony Ryan Krauss said: (my appologies to the > moderator. I initially sent this message from the wrong account and it > bounced back pending moderator approval.) Hey, I am having trouble > getting scipy installed in Linux. I am a grad student in mechanical > engineering and have been using Matlab in Windows to do much of my data > analysis and some curve fitting/optimization. I would like to switch > over to using scipy in linux instead, but I am having trouble getting > things installed. I am fairly new to linux but have been using python > for about 9 months. I am running Fedora Core 3 and kde on a Compaq > Presario laptop (P4). I tried in stalling using rpms, but the Numeric > rpm kept telling me that I needed python >=2.3, even though I am running > 2.3.4. I finally got Numeric installed from source, but the scipy rpm > gives a segmentation fault when I try import scipy from within python. > So, I have beening doing everything to install scipy from source and > thought things were going fairly well. I type python setup.py install > and lots of stuff scrolls by and things are compiling and then I get the > message: In file included from Lib/xplt/src/play/x11/colors.c:9: > Lib/xplt/src/play/x11/playx.h:11:22: X11/Xlib.h: No such file or > directory and then lots of related errors come up. I tried searching the > scipy archives for X11 and Xlib.h, but couldn't find anything. I also > tried "locate Xlib.h" but it is don't on my system. 
Any help would be > appreciated, Ryan Krauss requested info from web site follows: > [ryan at localhost ~]$ python -c 'from f2py2e.diagnose import run;run()' > ------ os.name='posix' ------ sys.platform='linux2' ------ sys.version: > 2.3.4 (#1, Oct 26 2004, 16:42:40) [GCC 3.4.2 20041017 (Red Hat > 3.4.2-6.fc3)] ------ sys.prefix: /usr ------ > sys.path=':/usr/lib/python23.zip:/usr/lib/python2.3:/usr/lib/python2.3/plat-linux2:/usr/lib/python2.3/lib-tk:/usr/lib/python2.3/lib-dynload:/usr/lib/python2.3/site-packages:/usr/lib/python2.3/site-packages/Numeric:/usr/lib/python2.3/site-packages/gtk-2.0:/usr/lib/python2.3/site-packages/wx-2.6-gtk2-unicode' > ------ Failed to import numarray: No module named numarray Failed to > import scipy_distutils: No module named scipy_distutils Found Numeric > version '23.8' in /usr/lib/python2.3/site-packages/Numeric/Numeric.pyc > Found f2py2e version '2.45.241_1926' in > /usr/lib/python2.3/site-packages/f2py2e/f2py2e.pyc > -------------------------------------- #1 > -------------------------------------- [ryan at localhost ~]$ python -c > 'import os,sys;print os.name,sys.platform' posix linux2 [ryan at localhost > ~]$ uname -a Linux localhost.localdomain 2.6.11-1.14_FC3 #1 Thu Apr 7 > 19:23:49 EDT 2005 i686i686 i386 GNU/Linux > ----------------------------------- #2 > ----------------------------------- [ryan at localhost ~]$ gcc -v Reading > specs from /usr/lib/gcc/i386-redhat-linux/3.4.2/specs Configured with: > ../configure --prefix=/usr --mandir=/usr/share/man > --infodir=/usr/share/info --enable-shared --enable-threads=posix > --disable-checking --with-system-zlib --enable-__cxa_atexit > --disable-libunwind-exceptions --enable-java-awt=gtk > --host=i386-redhat-linux Thread model: posix gcc version 3.4.2 20041017 > (Red Hat 3.4.2-6.fc3) [ryan at localhost ~]$ g77 --version GNU Fortran (GCC) > 3.4.2 20041017 (Red Hat 3.4.2-6.fc3) Copyright (C) 2004 Free Software > Foundation, Inc. --------------------------------- #3 > --------------------------------- [ryan at localhost ~]$ python -c 'import > sys;print sys.version' 2.3.4 (#1, Oct 26 2004, 16:42:40) [GCC 3.4.2 > 20041017 (Red Hat 3.4.2-6.fc3)] ---------------------------------- #4 > ---------------------------------- [ryan at localhost ~]$ python -c 'import > Numeric;print Numeric.__version__' 23.8 > ----------------------------------- #5 > ----------------------------------- [ryan at localhost ~]$ f2py -v > 2.45.241_1926 ----------------------------------- #6 > ----------------------------------- [ryan at localhost linalg]$ python > setup_atlas_version.py build_ext --inplace --force Traceback (most recent > call last): File "setup_atlas_version.py", line 5, in ? 
from > scipy_distutils.misc_util import get_path, default_config_dict > ImportError: No module named scipy_distutils.misc_util [ryan at localhost > linalg]$ python -c 'import atlas_version' > --------------------------------- #7 > ---------------------------------\[ryan at localhost scipy_distutils]$ > python system_info.py _pkg_config_info: NOT AVAILABLE agg2_info: ( > src_dirs = .:/usr/local/src ) NOT AVAILABLE atlas_blas_info: ( > library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: > /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: > /usr/local/lib/atlas/libf77blas.a ) ( paths: > /usr/local/lib/atlas/libcblas.a ) ( paths: > /usr/local/lib/atlas/libatlas.a ) ( paths: > /usr/local/lib/atlas/libf77blas.a ) ( paths: > /usr/local/lib/atlas/libcblas.a ) ( paths: > /usr/local/lib/atlas/libatlas.a ) system_info.atlas_blas_info ( > include_dirs = /usr/local/lib/atlas/:/usr/local/include:/usr/include ) ( > paths: > /usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h > ) FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs > = ['/usr/local/lib/atlas'] language = c include_dirs = > ['/usr/local/lib/atlas'] atlas_blas_threads_info: ( library_dirs = > /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: > /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: > /usr/local/lib/atlas/libatlas.a ) ( paths: > /usr/local/lib/atlas/libatlas.a ) system_info.atlas_blas_threads_info > NOT AVAILABLE atlas_info: ( library_dirs = > /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: > /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: > /usr/local/lib/atlas/libf77blas.a ) ( paths: > /usr/local/lib/atlas/libcblas.a ) ( paths: > /usr/local/lib/atlas/libatlas.a ) ( paths: > /usr/local/lib/atlas/liblapack.a ) system_info.atlas_info ( include_dirs > = /usr/local/lib/atlas/:/usr/local/include:/usr/include ) ( paths: > /usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h,/usr/local/lib/atlas/cblas.h > ) FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/local/lib/atlas/'] language = f77 > include_dirs = ['/usr/local/lib/atlas'] atlas_threads_info: ( > library_dirs = /usr/local/lib/atlas/:/usr/local/lib:/usr/lib ) ( paths: > /usr/local/lib/atlas ) ( paths: /usr/lib/sse2 ) ( paths: > /usr/local/lib/atlas/libatlas.a ) ( paths: > /usr/local/lib/atlas/libatlas.a ) system_info.atlas_threads_info NOT > AVAILABLE blas_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: > /usr/lib/libblas.so ) FOUND: libraries = ['blas'] library_dirs > = ['/usr/lib'] language = f77 blas_opt_info: Traceback (most recent > call last): File "system_info.py", line 1430, in ? show_all() > File "system_info.py", line 1427, in show_all r = c.get_info() File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", > line 314, in get_info self.calc_info() File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", > line 972, in calc_info atlas_version = > get_atlas_version(**version_info) File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py", > line 809, in get_atlas_version from core import Extension, setup > File > "/home/ryan/downloads/python/scipy/SciPy_complete-0.3.2/scipy_core/scipy_distutils/core.py", > line 5, in ? 
from scipy_distutils.dist import Distribution > ImportError: No module named scipy_distutils.dist #7b: build_flib.py is > not in this directory _______________________________________________ > SciPy-user mailing list SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > _______________________________________________ SciPy-user mailing list > SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user > From perry at stsci.edu Tue May 10 14:47:19 2005 From: perry at stsci.edu (Perry Greenfield) Date: Tue, 10 May 2005 14:47:19 -0400 Subject: [SciPy-user] Numarray image processing In-Reply-To: References: Message-ID: <2c27a7dd7ab03c1133b79cc5c44342c6@stsci.edu> On May 10, 2005, at 11:29 AM, Jeremy Sanders wrote: > Hi - > > I'm trying to think of an efficient way to do the following in > numarray. I have an image, and I want to caclulate the mean of the > pixels at a particular radius in the image (from a fixed point), and > subtract this mean away from those pixels at that radius. > > Can anyone provide insight into how this can be done without looping > over the pixels? > > I've managed to generate an image containing the radius of each pixel, > but this isn't much use. > > Thanks > > Jeremy How you want to do this depends on whether you want to do this for only one radius or many. As it turns out, I just did an example for a tutorial that computes the azimuthally-averaged radial. So here it is in the off-chance that it is useful. Note that it is not as fast as C code would be primarily because of the sort needed. It's a bit tricky. >>> # calculate average radial profile >>> y, x = indices((512,512)) # first determine radii of all pixels >>> r = sqrt((x-257.)**2+(y-258)**2 >>> ind = argsort(r.flat) # get sorted indices # needed to arange image values too >>> sr = r.flat[ind] # sorted radii >>> sim = im.flat[ind] # image values sorted by radii >>> ri = sr.astype(Int16) # integer part of radii (bin size = 1) >>> deltar = ri[1:] - ri[-1] # assume all radii represented # (more work if not) >>> rind = where(deltar)[0] # location of changed radius >>> nr = rind[1:] - rind[:-1] # number in radius bin >>> sim = sim*1. # turn into double to avoid integer overflow # (and single precision rounding) >>> csim = cumsum(sim) # cumulative sum to figure out sums for each radii bin >>> tbin = csim[rind[1:]] - csim[rind[:-1]] # sum for image values in radius bins >>> radialprofile = tbin/nr # the answer (note missing 0 radius value) This is only part of what you need. The profile here can be used to generate a profile image. How you do that depends on how much interpolation you want. If none, this would approximate by taking the nearest radius value >>> radiusimage = radialprofile[(r+.5).astype(Int16)] If you only need it for one radius then all you need do is construct a mask from the radius image and total all values in the masked image divided by the total of the mask. Is this what you were looking for? Perry Greenfield From ryanfedora at comcast.net Tue May 10 20:05:59 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Tue, 10 May 2005 20:05:59 -0400 Subject: [SciPy-user] weave and large arrays Message-ID: <42814C67.1070008@comcast.net> Does anyone have an example of using weave to read in large ASCII arrays? Is it possible to use weave to read large *.mat Matlab binary files? I have *.mat files that contain 10-15 column vectors that are 25000 rows long. For now I have written a Matlab script to convert them to ASCII. 
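Before going through ASCII at all, it may be worth trying the .mat file directly; a minimal sketch using the scipy.io.loadmat call that comes up later in this thread (the file and variable names here are placeholders):

from scipy import io
import Numeric

d = io.loadmat('run1.mat')        # placeholder file name
print d.keys()                    # one entry per MAT-file variable
x1 = Numeric.ravel(d['x1'])       # 'x1' stands for one of the 25000-element vectors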
Thanks, Ryan From tgray at princeton.edu Wed May 11 10:16:12 2005 From: tgray at princeton.edu (Tim Gray) Date: Wed, 11 May 2005 10:16:12 -0400 Subject: [SciPy-user] Problems with fftpack on OS X 10.4 Message-ID: Hi, I am a very happy (new) user of scipy (CVS), matplotlib (.74), and python (2.4.1) for analysis and plotting of scientific data. I have been running a CVS version of scipy on my computer at work, which is running OS X 10.3.8. Everything has been wonderful there. However, on my personal computer, I recently upgraded to OS X 10.4. I reinstalled all my python packages and got everything working, except for scipy (again, the current CVS build). Scipy seemed to build ok, but when I call anything in fftpack, I get the error at the end of this post. I recompiled fftw (2.1.5) and ran make check (or test? they all passed) before rebuilding scipy, but the error remains. Any help would be appreciated. Thanks! ------- /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site.py in __call__(self, *args, **kwds) 326 def __call__(self, *args, **kwds): 327 import pydoc --> 328 return pydoc.help(*args, **kwds) 329 330 def sethelper(): /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/ppimport.py in _ppimport_pydoc_help_call(self, *args, **kwds) 384 _old_pydoc_help_call = _pydoc.help.__class__.__call__ 385 def _ppimport_pydoc_help_call(self,*args,**kwds): --> 386 return _old_pydoc_help_call(self, *map(_ppresolve_ignore_failure,args), 387 **kwds) 388 _ppimport_pydoc_help_call.__doc__ = _old_pydoc_help_call.__doc__ /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/ppimport.py in _ppresolve_ignore_failure(a) 370 371 def _ppresolve_ignore_failure(a): --> 372 return ppresolve(a,ignore_failure=1) 373 374 try: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/ppimport.py in ppresolve(a, ignore_failure) 364 if hasattr(a,'_ppimport_importer') or \ 365 hasattr(a,'_ppimport_module'): --> 366 a = getattr(a,'_ppimport_module',a) 367 if hasattr(a,'_ppimport_attr'): 368 a = a._ppimport_attr /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/ppimport.py in __getattr__(self, name) 301 module = self.__dict__['_ppimport_module'] 302 except KeyError: --> 303 module = self._ppimport_importer() 304 return getattr(module, name) 305 /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/ppimport.py in _ppimport_importer(self) 256 module = sys.modules[name] 257 except KeyError: --> 258 raise ImportError,self.__dict__.get('_ppimport_exc_info')[1] 259 if module is not self: 260 exc_info = self.__dict__.get('_ppimport_exc_info') ImportError: Failure linking new module: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/_fftpack.so: Symbol not found: _fprintf$LDBLStub Referenced from: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/_fftpack.so Expected in: dynamic lookup From rkern at ucsd.edu Wed May 11 10:31:52 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 11 May 2005 07:31:52 -0700 Subject: [SciPy-user] Problems with fftpack on OS X 10.4 In-Reply-To: References: Message-ID: <42821758.1030101@ucsd.edu> Tim Gray wrote: > Hi, I am a very happy (new) user of scipy (CVS), matplotlib (.74), and python (2.4.1) for analysis and plotting of scientific data. 
I have been running a CVS version of scipy on my computer at work, which is running OS X 10.3.8. Everything has been wonderful there. > > However, on my personal computer, I recently upgraded to OS X 10.4. I reinstalled all my python packages and got everything working, except for scipy (again, the current CVS build). Scipy seemed to build ok, but when I call anything in fftpack, I get the error at the end of this post. I recompiled fftw (2.1.5) and ran make check (or test? they all passed) before rebuilding scipy, but the error remains. Any help would be appreciated. > ImportError: Failure linking new module: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/_fftpack.so: Symbol not found: _fprintf$LDBLStub > Referenced from: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/fftpack/_fftpack.so > Expected in: dynamic lookup Offhand, I'd suggest using gcc 3.3 to compile Scipy and FFTW, not gcc 4.0 which is the default on Tiger. Python 2.4.1 is compiled with gcc 3.3. $ sudo gcc_select 3.3 If that doesn't work, lemme know, and I'll look into it more deeply. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From stephen.walton at csun.edu Wed May 11 12:36:24 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 11 May 2005 09:36:24 -0700 Subject: [SciPy-user] weave and large arrays In-Reply-To: <42814C67.1070008@comcast.net> References: <42814C67.1070008@comcast.net> Message-ID: <42823488.9050700@csun.edu> Ryan Krauss wrote: > Does anyone have an example of using weave to read in large ASCII arrays? Is the load command too slow? (A bit confusingly, there appear to be at least two "load" commands; one in numarray and Numeric which reads ASCII data, and one in the top level of scipy which reads pickled data.) > Is it possible to use weave to read large *.mat Matlab binary files? Does scipy.io.loadmat do what you need? x=scipy.io.loadmat('file.mat') will create a dictionary with entries for each variable in the MAT file. From jeremy at jeremysanders.net Wed May 11 12:54:57 2005 From: jeremy at jeremysanders.net (Jeremy Sanders) Date: Wed, 11 May 2005 17:54:57 +0100 (BST) Subject: [SciPy-user] Numarray image processing In-Reply-To: <1115740130.27493.18.camel@halloween.stsci.edu> References: <1115740130.27493.18.camel@halloween.stsci.edu> Message-ID: On Tue, 10 May 2005, Todd Miller wrote: > I'm not an astronomer, so take this with a grain of salt, but this is > what comes to mind for me: That's quite a nice way of doing it. Thanks. The problem is that I need to do this for all radii (integerizing the radius). This is to show the deviations away from spherical symmetry. I can do it in two iterations over my image, iterating over each pixel, calculating its radius, and collecting the total value and number at each radius in an array. I can then go back over the image and subtract the mean value. I also had to ignore NaN pixels. Unfortunately it was taking over 15 minutes for my 3000x3000 image using numarray. I was able to code up a quick C++ program which did it in two seconds. This was with an Athlon 64 3400+. Numarray is obviously great if you can write the computation in a matrix form, but it's quite slow addressing individual pixels (or maybe it's the python around it). Quite often I find it hard to think in non pixel terms... 
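For the all-radii case, one sketch (not from the thread) keeps the per-pixel arithmetic inside numarray and only loops over the integer radii, so a 3000x3000 image needs a few thousand Python-level iterations instead of nine million; NaN handling is left out and the function name is invented:

import numarray

def subtract_radial_means(im, x0, y0):
    im = im.astype(numarray.Float64)
    y, x = numarray.indices(im.shape)
    ri = numarray.sqrt((x - x0)**2 + (y - y0)**2).astype(numarray.Int32)
    rmax = int(numarray.maximum.reduce(ri.flat))
    for b in range(rmax + 1):
        mask = numarray.equal(ri, b)                       # pixels in this radius bin
        n = numarray.sum(mask.astype(numarray.Int32).flat)
        if n:
            total = numarray.sum(numarray.where(mask, im, 0.0).flat)
            im = numarray.where(mask, im - total / n, im)  # subtract the bin mean
    return im

Each iteration is a handful of whole-array operations in C, which is usually enough to get out of the many-minutes regime, though the sort/cumsum recipe earlier in the thread should still be faster. One caveat when reusing that recipe: the line deltar = ri[1:] - ri[-1] is evidently meant to be deltar = ri[1:] - ri[:-1], and the sqrt((x-257.)**2+(y-258)**2 line is missing its closing parenthesis.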
Thanks Jeremy -- Jeremy Sanders http://www.jeremysanders.net/ Cambridge, UK Public Key Server PGP Key ID: E1AAE053 From jeremy at jeremysanders.net Wed May 11 12:58:04 2005 From: jeremy at jeremysanders.net (Jeremy Sanders) Date: Wed, 11 May 2005 17:58:04 +0100 (BST) Subject: [SciPy-user] Numarray image processing In-Reply-To: <2c27a7dd7ab03c1133b79cc5c44342c6@stsci.edu> References: <2c27a7dd7ab03c1133b79cc5c44342c6@stsci.edu> Message-ID: On Tue, 10 May 2005, Perry Greenfield wrote: > If you only need it for one radius then all you need do is construct a mask > from the radius image and total all values in the masked image divided by the > total of the mask. > > Is this what you were looking for? That's really great - thanks! If only I could get my brain into working in the numarray way all the time. Jeremy -- Jeremy Sanders http://www.jeremysanders.net/ Cambridge, UK Public Key Server PGP Key ID: E1AAE053 From tgray at princeton.edu Wed May 11 13:05:43 2005 From: tgray at princeton.edu (Tim Gray) Date: Wed, 11 May 2005 13:05:43 -0400 Subject: [SciPy-user] Problems with fftpack on OS X 10.4 In-Reply-To: <42821758.1030101@ucsd.edu> Message-ID: On 5/11/05, Robert Kern wrote: > Offhand, I'd suggest using gcc 3.3 to compile Scipy and FFTW, not gcc > 4.0 which is the default on Tiger. Python 2.4.1 is compiled with gcc 3.3. > > $ sudo gcc_select 3.3 > > If that doesn't work, lemme know, and I'll look into it more deeply. Thank you sir, that did indeed help. I was not aware gcc 4 was the default. Now I am getting an error on import of scipy (traced to scipy/special/cephes.py) that _numeric.nc_cephes doesn't exist. Descending down into the scipy/special/_numeric directory I only found nc_cephes.so... This isn't a huge deal for me since I had a copy of the CVS from April 12th (which doesn't have cephes.py, _numeric, or _numarray) and that builds and runs fine. Thanks again. Tim From Fernando.Perez at colorado.edu Wed May 11 13:27:14 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 11 May 2005 11:27:14 -0600 Subject: [SciPy-user] Numarray image processing In-Reply-To: References: <1115740130.27493.18.camel@halloween.stsci.edu> Message-ID: <42824072.5080403@colorado.edu> Jeremy Sanders wrote: > Unfortunately it was taking over 15 minutes for my 3000x3000 image using > numarray. I was able to code up a quick C++ program which did it in two > seconds. This was with an Athlon 64 3400+. > > Numarray is obviously great if you can write the computation in a matrix > form, but it's quite slow addressing individual pixels (or maybe it's the > python around it). Quite often I find it hard to think in non pixel > terms... You may want to look at weave.inline(), which can be a life-saver in these kinds of situations. I don't know if it works with numarray out of the box, though, since it was written back in the days of Numeric. In the near future that won't be an issue anyways. Regards, f From ryanfedora at comcast.net Wed May 11 14:02:26 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Wed, 11 May 2005 14:02:26 -0400 Subject: [SciPy-user] weave and large arrays In-Reply-To: <42823488.9050700@csun.edu> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> Message-ID: <428248B2.7080604@comcast.net> Either reading in the ascii data or using io.loadmat without using weave runs for over 15 minutes (I have shut the process down each time and don't know how long it takes to complete). The *.mat file contains 11 column vectors that are 25000 elements long. 
Stephen Walton wrote: > Ryan Krauss wrote: > >> Does anyone have an example of using weave to read in large ASCII >> arrays? > > > Is the load command too slow? (A bit confusingly, there appear to be > at least two "load" commands; one in numarray and Numeric which reads > ASCII data, and one in the top level of scipy which reads pickled data.) > >> Is it possible to use weave to read large *.mat Matlab binary files? > > > Does scipy.io.loadmat do what you need? > > x=scipy.io.loadmat('file.mat') > > will create a dictionary with entries for each variable in the MAT file. > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From falted at pytables.org Wed May 11 14:38:52 2005 From: falted at pytables.org (Francesc Altet) Date: Wed, 11 May 2005 20:38:52 +0200 Subject: [SciPy-user] weave and large arrays In-Reply-To: <428248B2.7080604@comcast.net> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> Message-ID: <200505112038.52500.falted@pytables.org> A Dimecres 11 Maig 2005 20:02, Ryan Krauss va escriure: > Either reading in the ascii data or using io.loadmat without using weave > runs for over 15 minutes (I have shut the process down each time and > don't know how long it takes to complete). The *.mat file contains 11 > column vectors that are 25000 elements long. I never tried it, but you can try saving your data in HDF5 format [1] and then reading it back with PyTables [2]. By default, PyTables will return to you a NumArray object, but you can easily convert it into a Numeric object using: >>> import tables >>> file = tables.openFile("yourfile.h5") >>> na = file.root.yourdataset[:] >>> file.close() >>> import Numeric >>> n=Numeric.array(na) Beware, if you use recent versions of numarray and Numeric (say 1.3.1 and 24.0b2), this conversion is very fast (100x than if using older versions). If you try this and succeed, please, tell me. If don't, feel free to send me a small matlab file so that I can look at its exact format and see whether I can add support for it in PyTables. [1] http://www.mathworks.com/access/helpdesk/help/techdoc/matlab_prog/ch_imp41.html [2] http://www.pytables.org Hope that helps, -- Francesc Altet From daniel.wheeler at nist.gov Wed May 11 15:01:14 2005 From: daniel.wheeler at nist.gov (Daniel Wheeler) Date: Wed, 11 May 2005 15:01:14 -0400 Subject: [SciPy-user] conference In-Reply-To: <20050511153650.36C013EB67@www.scipy.com> References: <20050511153650.36C013EB67@www.scipy.com> Message-ID: <1C2D7D56-4199-4102-AA13-9D375A77B195@nist.gov> Are there any plans for a SciPy conference this year? ------------------------------------- Daniel Wheeler Telephone: (301) 975-8358 -------------- next part -------------- An HTML attachment was scrubbed... URL: From ryanfedora at comcast.net Wed May 11 15:01:35 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Wed, 11 May 2005 15:01:35 -0400 Subject: [SciPy-user] weave and large arrays In-Reply-To: <200505112038.52500.falted@pytables.org> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> <200505112038.52500.falted@pytables.org> Message-ID: <4282568F.5080203@comcast.net> An HTML attachment was scrubbed... 
URL: From ryanfedora at comcast.net Wed May 11 15:16:05 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Wed, 11 May 2005 15:16:05 -0400 Subject: [SciPy-user] weave and large arrays In-Reply-To: <4282568F.5080203@comcast.net> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> <200505112038.52500.falted@pytables.org> <4282568F.5080203@comcast.net> Message-ID: <428259F5.6050503@comcast.net> An HTML attachment was scrubbed... URL: From stephen.walton at csun.edu Wed May 11 15:38:51 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 11 May 2005 12:38:51 -0700 Subject: [SciPy-user] weave and large arrays In-Reply-To: <428248B2.7080604@comcast.net> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> Message-ID: <42825F4B.6060908@csun.edu> Ryan Krauss wrote: > Either reading in the ascii data or using io.loadmat without using > weave runs for over 15 minutes The *.mat file contains 11 column > vectors that are 25000 elements long. There is something really wrong here then. I executed the following MATLAB code: x1=rand(25000,1); x2=rand(25000,1); ... x11=rand(25000,1); save foo -v4 x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 and then loaded 'foo.mat' using io.loadmat in a fraction of a second. I was using MATLAB version 7, hence the -v4 switch to get lowest common denominator MATLAB file. From ryanfedora at comcast.net Wed May 11 15:56:48 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Wed, 11 May 2005 15:56:48 -0400 Subject: [SciPy-user] weave and large arrays In-Reply-To: <42825F4B.6060908@csun.edu> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> <42825F4B.6060908@csun.edu> Message-ID: <42826380.9010605@comcast.net> hmmmmmmmmmm. I had written a Matlab script to load the *.mat files and resave them, but I put the -v4 switch at the end. I am using Matlab version 6.5. I will re-run the script using the -v4 switch first and see what happens. Stephen Walton wrote: > Ryan Krauss wrote: > >> Either reading in the ascii data or using io.loadmat without using >> weave runs for over 15 minutes The *.mat file contains 11 column >> vectors that are 25000 elements long. > > > There is something really wrong here then. I executed the following > MATLAB code: > > x1=rand(25000,1); > x2=rand(25000,1); > ... > x11=rand(25000,1); > save foo -v4 x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 > > and then loaded 'foo.mat' using io.loadmat in a fraction of a second. > I was using MATLAB version 7, hence the -v4 switch to get lowest > common denominator MATLAB file. > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From ryanfedora at comcast.net Wed May 11 16:44:37 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Wed, 11 May 2005 16:44:37 -0400 Subject: [SciPy-user] weave and large arrays In-Reply-To: <42825F4B.6060908@csun.edu> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> <42825F4B.6060908@csun.edu> Message-ID: <42826EB5.8040707@comcast.net> So after Stephen said that he was able to do this without a problem, I tried it again and my problem went away. I even seem to be able to load the *.mat files from Matlab 6.1 without saving them as -v4. I don't know what the problem was, but I don't seem to be able to recreate it anymore. Thanks to those who tried to help me. 
I don't know what to say. Stephen Walton wrote: > Ryan Krauss wrote: > >> Either reading in the ascii data or using io.loadmat without using >> weave runs for over 15 minutes The *.mat file contains 11 column >> vectors that are 25000 elements long. > > > There is something really wrong here then. I executed the following > MATLAB code: > > x1=rand(25000,1); > x2=rand(25000,1); > ... > x11=rand(25000,1); > save foo -v4 x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 > > and then loaded 'foo.mat' using io.loadmat in a fraction of a second. > I was using MATLAB version 7, hence the -v4 switch to get lowest > common denominator MATLAB file. > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From ryanfedora at comcast.net Wed May 11 17:45:02 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Wed, 11 May 2005 17:45:02 -0400 Subject: [SciPy-user] gui_thread problem In-Reply-To: <42826EB5.8040707@comcast.net> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> <42825F4B.6060908@csun.edu> <42826EB5.8040707@comcast.net> Message-ID: <42827CDE.9070108@comcast.net> So, I know I am posting to this list a lot, but I really have tried all I can think of to solve this problem. Sorry. When I try to start my gui_thread, I get a "NotImplementedError: " message. Full message: Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.3/site-packages/gui_thread/__init__.py", line 73, in start wxPython_thread() File "/usr/lib/python2.3/site-packages/gui_thread/wxPython_thread.py", line 164, in wxPython_thread sys.modules[name] = wrap_extmodule(module,call_holder) File "/usr/lib/python2.3/site-packages/gui_thread/wxPython_thread.py", line 62, in wrap_extmodule raise NotImplementedError,`t` NotImplementedError: I googled the error message and someone else had this problem, but I guess was able to solve it my starting the gui_thread before importing scipy. That doesn't solve the problem for me. The module that is being loaded by line 164: sys.modules[name] = wrap_extmodule(module,call_holder) when the error happens is I am running Python 2.3.4 and have installed the wx rpms: wxPython2.6-gtk2-unicode-2.6.0.0-fc2_py2.3.i386.rpm wxPython-common-gtk2-ansi-2.6.0.0-fc2_py2.3.i386.rpm from wxpython.org (I actually have a multi-version install of wxPython, but chaning versions doen't solve this problem. The other version is 2.6-gtk2-ansi) Thanks for any help you can offer, Ryan From rkern at ucsd.edu Wed May 11 17:54:18 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 11 May 2005 14:54:18 -0700 Subject: [SciPy-user] Problems with fftpack on OS X 10.4 In-Reply-To: References: Message-ID: <42827F0A.5010803@ucsd.edu> Tim Gray wrote: > On 5/11/05, Robert Kern wrote: > >>Offhand, I'd suggest using gcc 3.3 to compile Scipy and FFTW, not gcc >>4.0 which is the default on Tiger. Python 2.4.1 is compiled with gcc 3.3. >> >>$ sudo gcc_select 3.3 >> >>If that doesn't work, lemme know, and I'll look into it more deeply. > > Thank you sir, that did indeed help. I was not aware gcc 4 was the default. Now I am getting an error on import of scipy (traced to scipy/special/cephes.py) that _numeric.nc_cephes doesn't exist. Descending down into the scipy/special/_numeric directory I only found nc_cephes.so... 
> > This isn't a huge deal for me since I had a copy of the CVS from April 12th (which doesn't have cephes.py, _numeric, or _numarray) and that builds and runs fine.

You got caught in the middle of the port of this module to numarray. It looks like it just got fixed in CVS, though. Thanks, Todd! -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter

From prabhu_r at users.sf.net Thu May 12 04:09:46 2005 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Thu, 12 May 2005 13:39:46 +0530 Subject: [SciPy-user] gui_thread problem In-Reply-To: <42827CDE.9070108@comcast.net> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> <42825F4B.6060908@csun.edu> <42826EB5.8040707@comcast.net> <42827CDE.9070108@comcast.net> Message-ID: <17027.3914.223599.353812@monster.linux.in>

>>>>> "RK" == Ryan Krauss writes:
RK> So, I know I am posting to this list a lot, but I really have RK> tried all I can think of to solve this problem. Sorry. RK> When I try to start my gui_thread, I get a RK> "NotImplementedError: " message. [...] RK> from wxpython.org (I actually have a multi-version install of RK> wxPython, but changing versions doesn't solve this problem. The RK> other version is 2.6-gtk2-ansi)

My recommendation is that you dump gui_thread and use IPython (http://ipython.scipy.org) along with its -wthread option. Additionally, don't use plt; I don't know if anyone really uses it anymore. Use matplotlib (http://matplotlib.sf.net) instead. cheers, prabhu

From ryanfedora at comcast.net Thu May 12 06:37:05 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Thu, 12 May 2005 06:37:05 -0400 Subject: [SciPy-user] gui_thread problem In-Reply-To: <17027.3914.223599.353812@monster.linux.in> References: <42814C67.1070008@comcast.net> <42823488.9050700@csun.edu> <428248B2.7080604@comcast.net> <42825F4B.6060908@csun.edu> <42826EB5.8040707@comcast.net> <42827CDE.9070108@comcast.net> <17027.3914.223599.353812@monster.linux.in> Message-ID: <428331D1.30908@comcast.net>

An HTML attachment was scrubbed... URL:

From falted at pytables.org Thu May 12 05:06:47 2005 From: falted at pytables.org (Francesc Altet) Date: Thu, 12 May 2005 11:06:47 +0200 Subject: Fwd: Re: [SciPy-user] weave and large arrays Message-ID: <200505121106.47956.falted@pytables.org>

A Dimecres 11 Maig 2005 21:16, Ryan Krauss va escriure:
> I don't know anything about PyTables. The data is now in tab delimited > ASCII files where each column is one data vector. Can PyTables quickly read > that and give me either a matrix or 11 different column vectors?

No, PyTables doesn't support importing ASCII files directly. However, you can use a tool in the HDF5 distribution called h5import [1] that can read ASCII files and convert them into HDF5 files. [1] http://hdf.ncsa.uiuc.edu/HDF5/doc/Tools.html#Tools-Import

> The student version of Matlab R13 does not seem to support hdf5. It can > read hdf, but doesn't seem to be able to write at all.

Oops. Well, now that you have solved your problem I suppose that it does not matter anymore.
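For completeness: if the data stays in tab-delimited ASCII, scipy itself may be enough. Something along these lines should work (untested sketch; the filename is invented and the exact read_array arguments should be checked against your scipy version):

>>> from scipy import io
>>> data = io.read_array('test01.txt')   # whitespace/tab separated columns
>>> data.shape                           # should be (25000, 11) for the files described earlier
>>> x1 = data[:,0]                       # pull out individual column vectors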
Cheers, -- Francesc Altet From nwagner at mecha.uni-stuttgart.de Thu May 12 09:12:38 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 12 May 2005 15:12:38 +0200 Subject: [SciPy-user] Possible bug wrt bmat Message-ID: <42835646.5020005@mecha.uni-stuttgart.de> Hi all, >>> from scipy import * numerix Numeric 24.0b2 >>> I = identity(2) >>> I array([[1, 0], [0, 1]]) >>> bmat(r_[c_[I,I],c_[I,I]]) Traceback (most recent call last): File "", line 1, in ? File "/usr/local/lib/python2.4/site-packages/scipy_base/matrix_base.py", line 136, in bmat return Matrix.Matrix(obj) File "/usr/local/lib/python2.4/site-packages/scipy_base/ppimport.py", line 102, in __getattr__ return getattr(attr, name) AttributeError: class Matrix has no attribute 'Matrix' I guess it's a bug. Am I missing something ? Nils From nwagner at mecha.uni-stuttgart.de Thu May 12 10:45:26 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 12 May 2005 16:45:26 +0200 Subject: [SciPy-user] scipy.test(1,verbosity=10) failed Message-ID: <42836C06.3090202@mecha.uni-stuttgart.de> Hi all, A scipy.test(1,verbosity=10) using >> import scipy numerix Numeric 24.0b2 >>> scipy.__version__ '0.3.3_304.4613' check_random_complex (scipy.linalg.decomp.test_decomp.test_qr) ... ok check_simple (scipy.linalg.decomp.test_decomp.test_qr) ... ok check_simple_complex (scipy.linalg.decomp.test_decomp.test_qr) ... ok check_simple (scipy.linalg.decomp.test_decomp.test_schur) ... ok failed with check_random (scipy.linalg.decomp.test_decomp.test_svd) ** On entry to DGESDD parameter number 12 had an illegal value Any idea ? Nils From oliphant at ee.byu.edu Thu May 12 19:38:51 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 12 May 2005 17:38:51 -0600 Subject: [SciPy-user] Possible bug wrt bmat In-Reply-To: <42835646.5020005@mecha.uni-stuttgart.de> References: <42835646.5020005@mecha.uni-stuttgart.de> Message-ID: <4283E90B.8010906@ee.byu.edu> Nils Wagner wrote: > Hi all, > > >>> from scipy import * > numerix Numeric 24.0b2 This is an error that cropped up from the transition to using numerix -Travis From ryanfedora at comcast.net Thu May 12 21:53:14 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Thu, 12 May 2005 21:53:14 -0400 Subject: [SciPy-user] structure arrays in SciPy Message-ID: <4284088A.9040101@comcast.net> I have some data that I was storing in structure arrays in Matlab. It seems like I can do everything I need to do with mat_struct in SciPy, but I am wondering if there is a better way. Basically, I have some experimental data I need to turn into Bode plots and then store with some describing information. Each test would have a magnitude vector, a phase vector, and then some miscellanious information such as the input and output (which would be strings), and raw data filenames (a list of strings), and maybe a date and a description blurb. A group of these tests with different inputs and outputs would be one set that I would save as a datafile. In Matlab, each test was an element of a structure array. Each element of the structure array had unique magnitude and phase vectors and input and output labels. The rest of the data was the same but got repeated for each element in the structure array (I want to keep it all together in one structure and what was repeated was small compared to the data vectors). 
I think I can do all of this with a list of mat_struct's and I seem to be able to pickle and unpickle them with io.pickler.objload and objsave, but I didn't know if there is some more elegant way of doing this in SciPy. Any thoughts? Thanks, Ryan From nwagner at mecha.uni-stuttgart.de Fri May 13 01:55:54 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 13 May 2005 07:55:54 +0200 Subject: [SciPy-user] Possible bug wrt bmat In-Reply-To: <4283E90B.8010906@ee.byu.edu> References: <42835646.5020005@mecha.uni-stuttgart.de> <4283E90B.8010906@ee.byu.edu> Message-ID: On Thu, 12 May 2005 17:38:51 -0600 Travis Oliphant wrote: > Nils Wagner wrote: > >> Hi all, >> >> >>> from scipy import * >> numerix Numeric 24.0b2 > > This is an error that cropped up from the transition to >using numerix > > -Travis Travis, Thank you very much for your prompt reply. How about the following check_simple (scipy.linalg.decomp.test_decomp.test_qr) ... ok check_simple_complex (scipy.linalg.decomp.test_decomp.test_qr) ... ok check_simple (scipy.linalg.decomp.test_decomp.test_schur) ... ok check_random (scipy.linalg.decomp.test_decomp.test_svd) ** On entry to DGESDD pa rameter number 12 had an illegal value Is it related to the latest version of f2py ? I didn't receive this message before. f2py -v numerix Numeric 24.0b2 2.46.243_1985 Any pointer would be appreciated. Thanks in advance Nils > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From rosario.ruiloba at noveltis.fr Fri May 13 04:47:11 2005 From: rosario.ruiloba at noveltis.fr (Rosa Ruiloba) Date: 13 May 2005 10:47:11 +0200 Subject: [SciPy-user] Installing problems Message-ID: <1115974031.1287.9.camel@rosae.noveltis.fr> Hello, I'm trying to install Scipy-0.3.2 from sources on RedHat 8.0. I've installed ATLAS, LAPACK and BLAS as it's explained in Scipy web pages. But I have a problem when I build Scipy. I'm giving you some results of the building below. Do you know what is missing? Thank you, Rosa ----------------- blas_opt_info: atlas_blas_threads_info: scipy_distutils.system_info.atlas_blas_threads_info NOT AVAILABLE atlas_blas_info: scipy_distutils.system_info.atlas_blas_info FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/home/ruiloba/my_libs/src/ATLAS/lib/Linux_P4SSE2'] language = c ----------- ATLAS version 3.6.0 creating build/temp.linux-i686-2.2/config_pygist ============= begin top level configuration ============= compiling '_configtest.c': /* check whether libm is broken */ #include int main(int argc, char *argv[]) { return exp(-720.) > 1.0; /* typically an IEEE denormal */ } gcc -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o gcc _configtest.o -o _configtest _configtest.o: In function `main': /home/ruiloba/my_libs/SCIPY/SciPy_complete-0.3.2/_configtest.c:5: undefined reference to `exp' collect2: ld returned 1 exit status _configtest.o: In function `main': /home/ruiloba/my_libs/SCIPY/SciPy_complete-0.3.2/_configtest.c:5: undefined reference to `exp' collect2: ld returned 1 exit status failure. removing: _configtest.c _configtest.o ----------------------------- building extension "scipy.linalg.atlas_version" sources building extension "scipy.linalg._iterative" sources Traceback (most recent call last): File "setup.py", line 111, in ? 
setup_package(ignore_packages) File "setup.py", line 98, in setup_package url = "http://www.scipy.org", File "scipy_core/scipy_distutils/core.py", line 73, in setup return old_setup(**new_attr) File "/usr/src/build/143041-i386/install/usr/lib/python2.2/distutils/core.py", line 138, in setup File "/usr/src/build/143041-i386/install/usr/lib/python2.2/distutils/dist.py", line 893, in run_commands File "/usr/src/build/143041-i386/install/usr/lib/python2.2/distutils/dist.py", line 913, in run_command File "/usr/src/build/143041-i386/install/usr/lib/python2.2/distutils/command/build.py", line 107, in run File "/usr/lib/python2.2/cmd.py", line 330, in run_command print "\n" File "/usr/src/build/143041-i386/install/usr/lib/python2.2/distutils/dist.py", line 913, in run_command File "scipy_core/scipy_distutils/command/build_src.py", line 81, in run self.build_sources() File "scipy_core/scipy_distutils/command/build_src.py", line 88, in build_sources self.build_extension_sources(ext) File "scipy_core/scipy_distutils/command/build_src.py", line 124, in build_extension_sources sources = self.template_sources(sources, ext) File "scipy_core/scipy_distutils/command/build_src.py", line 197, in template_sources outstr = process_str(fid.read()) File "scipy_core/scipy_distutils/from_template.py", line 197, in process_str writestr += fix_capitals(newstr[oldend:sub[0]]) File "scipy_core/scipy_distutils/from_template.py", line 182, in fix_capitals astr = maxre.sub(r"MAX(\g<1>,\g<2>)",astr) File "/usr/lib/python2.2/pre.py", line 344, in sub return self.subn(repl, string, count)[0] File "/usr/lib/python2.2/pre.py", line 366, in subn repl = pcre_expand(_Dummy, repl) TypeError: 'NoneType' object is not callable -- -- NOVELTIS Parc Technologique du Canal 2, avenue de l'Europe 31526 RAMONVILLE SAINT AGNE CEDEX Tel: +(33) (0)5.62.88.11.23 From rkern at ucsd.edu Fri May 13 05:02:16 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 13 May 2005 02:02:16 -0700 Subject: [SciPy-user] Installing problems In-Reply-To: <1115974031.1287.9.camel@rosae.noveltis.fr> References: <1115974031.1287.9.camel@rosae.noveltis.fr> Message-ID: <42846D18.2020107@ucsd.edu> Rosa Ruiloba wrote: > Hello, > > I'm trying to install Scipy-0.3.2 from sources on RedHat 8.0. I've > installed ATLAS, LAPACK and BLAS as it's explained in Scipy web pages. > But I have a problem when I build Scipy. I'm giving you some results of > the building below. Do you know what is missing? > File "/usr/lib/python2.2/pre.py", line 366, in subn > repl = pcre_expand(_Dummy, repl) > TypeError: 'NoneType' object is not callable Odd. This problem should be confined to Python 2.2, though. Can you upgrade to 2.4.1? or even 2.3.5? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rosario.ruiloba at noveltis.fr Fri May 13 07:53:38 2005 From: rosario.ruiloba at noveltis.fr (Rosa Ruiloba) Date: 13 May 2005 13:53:38 +0200 Subject: [SciPy-user] Installing problems In-Reply-To: <42846D18.2020107@ucsd.edu> References: <1115974031.1287.9.camel@rosae.noveltis.fr> <42846D18.2020107@ucsd.edu> Message-ID: <1115985218.32576.9.camel@rosae.noveltis.fr> Hello, I've installed python 2.3.5 (as user) but I have a problem installing a new numeric version (23.5): see setup.py execution result below.In $BUILD_DIR~/my_libs/src/ATLAS/lib/Linux_P4SSE2 I have libcblas.a and I suppose that gcc is looking for a libcblas.so. Do you nom what objects must be used to create libcblas.so? 
Do you have that file for my RH 8.0-gcc 3.2 ? Thank you, Rosa

gcc -pthread -shared build/temp.linux-i686-2.4/Src/lapack_litemodule.o build/temp.linux-i686-2.4/Src/blas_lite.o build/temp.linux-i686-2.4/Src/f2c_lite.o build/temp.linux-i686-2.4/Src/zlapack_lite.o build/temp.linux-i686-2.4/Src/dlapack_lite.o -L$BUILD_DIR~/my_libs/src/ATLAS/lib/Linux_P4SSE2 -llapack -lcblas -lf77blas -latlas -lg2c -o build/lib.linux-i686-2.4/lapack_lite.so
/usr/bin/ld: cannot find -lcblas
collect2: ld returned 1 exit status
error: command 'gcc' failed with exit status 1

On Fri, 2005-05-13 at 11:02, Robert Kern wrote: > Rosa Ruiloba wrote: > > Hello, > > > > I'm trying to install Scipy-0.3.2 from sources on RedHat 8.0. I've > > installed ATLAS, LAPACK and BLAS as it's explained in Scipy web pages. > > But I have a problem when I build Scipy. I'm giving you some results of > > the building below. Do you know what is missing? > > > File "/usr/lib/python2.2/pre.py", line 366, in subn > > repl = pcre_expand(_Dummy, repl) > > TypeError: 'NoneType' object is not callable > > Odd. > > This problem should be confined to Python 2.2, though. Can you upgrade > to 2.4.1? or even 2.3.5? > > -- > Robert Kern > rkern at ucsd.edu > > "In the fields of hell where the grass grows high > Are the graves of dreams allowed to die." > -- Richard Harter > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > -- -- NOVELTIS Parc Technologique du Canal 2, avenue de l'Europe 31526 RAMONVILLE SAINT AGNE CEDEX Tel: +(33) (0)5.62.88.11.23

From rkern at ucsd.edu Fri May 13 08:09:44 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 13 May 2005 05:09:44 -0700 Subject: [SciPy-user] Installing problems In-Reply-To: <1115985218.32576.9.camel@rosae.noveltis.fr> References: <1115974031.1287.9.camel@rosae.noveltis.fr> <42846D18.2020107@ucsd.edu> <1115985218.32576.9.camel@rosae.noveltis.fr> Message-ID: <42849908.7050205@ucsd.edu>

Rosa Ruiloba wrote: > Hello, > > I've installed python 2.3.5 (as user) but I have a problem installing a > new numeric version (23.5): see setup.py execution result below. In > $BUILD_DIR~/my_libs/src/ATLAS/lib/Linux_P4SSE2 I have libcblas.a and I > suppose that gcc is looking for a libcblas.so. Do you know what objects > must be used to create libcblas.so? Do you have that file for my RH > 8.0-gcc 3.2 ?

No, your problem is that "$BUILD_DIR~/my_libs/src/ATLAS/lib/Linux_P4SSE2" is malformed. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter

From rosario.ruiloba at noveltis.fr Fri May 13 08:56:23 2005 From: rosario.ruiloba at noveltis.fr (Rosa Ruiloba) Date: 13 May 2005 14:56:23 +0200 Subject: [SciPy-user] No module named xplt Message-ID: <1115988983.32642.18.camel@rosae.noveltis.fr>

Hello, I have installed Scipy with python2.4.1 and Numeric 23.5, but I can't import the xplt module (see below). My PYTHONPATH seems OK. Do I need to set other variables?

echo $PYTHONPATH
/home/ruiloba/my_libs/Python-2.4.1/lib/python2.4/site-packages
ls /home/ruiloba/my_libs/Python-2.4.1/lib/python2.4/site-packages
dateutil matplotlib Numeric.pth pytz scipy_distutils f2py2e numarray pylab.py scipy scipy_test gui_thread Numeric pylab.pyc scipy_base weave
------------
import scipy
from xplt import *
Traceback (most recent call last):
File "", line 1, in ?
ImportError: No module named xplt -- -- NOVELTIS Parc Technologique du Canal 2, avenue de l'Europe 31526 RAMONVILLE SAINT AGNE CEDEX Tel: +(33) (0)5.62.88.11.23 From ckkart at hoc.net Fri May 13 09:00:08 2005 From: ckkart at hoc.net (Christian Kristukat) Date: Fri, 13 May 2005 15:00:08 +0200 Subject: [SciPy-user] No module named xplt In-Reply-To: <1115988983.32642.18.camel@rosae.noveltis.fr> References: <1115988983.32642.18.camel@rosae.noveltis.fr> Message-ID: <4284A4D8.8070301@hoc.net> Rosa Ruiloba wrote: > import scipy > from xplt import * > Traceback (most recent call last): > File "", line 1, in ? > ImportError: No module named xplt As xplt lives inside scipy you must use from scipy.xplt import * or from scipy import xplt Christian From ryanfedora at comcast.net Fri May 13 09:15:24 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Fri, 13 May 2005 09:15:24 -0400 Subject: [SciPy-user] structure arrays in SciPy In-Reply-To: <4284088A.9040101@comcast.net> References: <4284088A.9040101@comcast.net> Message-ID: <4284A86C.3010900@comcast.net> Maybe an example would make this clearer. What if I want to store a collection of objects with the following properties: entry1.input='a1' entry1.output='j2' entry1.datafile=['file1','file2','file3'] entry1.data1=[1.2,1.47,3.56,....] entry1.data2=[-1.5,-19.5,35.5,....] where data1 and data2 are actually Numeric or numarray arrays/matrices. How should I go about storing a collection of these objects in SciPy? Should I be defining my own class and simply make a list of them and pickle the list? Thanks for any help, Ryan Ryan Krauss wrote: > I need to store and retrieve a collection of test results where each > test would have some typical properties like the date it was run and > other scalar desriptors. But the main data from the test will be > vectors or matrices of data the I will later plot. Each vector or > matrix might have 20000 rows of floating point numbers and there may > be 10 or more of these test results that I want to store in a group. > Can SQLOjbect handle storing a matrix instance? I am quite new to SQL > as well as SQLObject, so I don't know if the matrices could somehow be > seperate tables or something. > > I started looking into hdf5 as a way to store this kind of data, but I > think it is more complicated than I need. > > Any help would be appreciated. > > Ryan > > > ------------------------------------------------------- > This SF.Net email is sponsored by Oracle Space Sweepstakes > Want to be the first software developer in space? > Enter now for the Oracle Space Sweepstakes! > http://ads.osdn.com/?ad_id=7393&alloc_id=16281&op=click > _______________________________________________ > sqlobject-discuss mailing list > sqlobject-discuss at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/sqlobject-discuss > Ryan Krauss wrote: > I have some data that I was storing in structure arrays in Matlab. It > seems like I can do everything I need to do with mat_struct in SciPy, > but I am wondering if there is a better way. Basically, I have some > experimental data I need to turn into Bode plots and then store with > some describing information. Each test would have a magnitude vector, > a phase vector, and then some miscellanious information such as the > input and output (which would be strings), and raw data filenames (a > list of strings), and maybe a date and a description blurb. A group > of these tests with different inputs and outputs would be one set that > I would save as a datafile. 
In Matlab, each test was an element of a > structure array. Each element of the structure array had unique > magnitude and phase vectors and input and output labels. The rest of > the data was the same but got repeated for each element in the > structure array (I want to keep it all together in one structure and > what was repeated was small compared to the data vectors). > > I think I can do all of this with a list of mat_struct's and I seem to > be able to pickle and unpickle them with io.pickler.objload and > objsave, but I didn't know if there is some more elegant way of doing > this in SciPy. > > Any thoughts? > > Thanks, > > Ryan > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From rkern at ucsd.edu Fri May 13 09:41:08 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 13 May 2005 06:41:08 -0700 Subject: [SciPy-user] structure arrays in SciPy In-Reply-To: <4284A86C.3010900@comcast.net> References: <4284088A.9040101@comcast.net> <4284A86C.3010900@comcast.net> Message-ID: <4284AE74.1020102@ucsd.edu> Ryan Krauss wrote: > Maybe an example would make this clearer. What if I want to store a > collection of objects with the following properties: > entry1.input='a1' > entry1.output='j2' > entry1.datafile=['file1','file2','file3'] > entry1.data1=[1.2,1.47,3.56,....] > entry1.data2=[-1.5,-19.5,35.5,....] > > where data1 and data2 are actually Numeric or numarray arrays/matrices. > How should I go about storing a collection of these objects in SciPy? > Should I be defining my own class and simply make a list of them and > pickle the list? I use and highly recommend PyTables for such tasks. http://pytables.sourceforge.net If you're restricted to Scipy only, you could use scipy.io.save(). In [2]:class Params(dict): ...: def __init__(self, **kwds): ...: dict.__init__(self, **kwds) ...: self.__dict__ = self ...: In [4]:entry1 = Params(input='a1', output='j2') In [5]:entry1.datafile = ['file1', 'file2', 'file3'] In [6]:entry1.data1=[1.2,1.47,3.56] In [7]:entry1.data2=[-1.5,-19.5,35.5] In [9]:io.save? Type: function Base Class: String Form: Namespace: Interactive File: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/io/data_store.py Definition: io.save(file_name=None, data=None) Docstring: Save the dictionary "data" into a module and shelf named save In [10]:io.save('entry1', entry1) In [13]:import entry1 In [14]:entry1. entry1.__class__ entry1.__new__ entry1.data_store entry1.__delattr__ entry1.__reduce__ entry1.datafile entry1.__dict__ entry1.__reduce_ex__ entry1.dir entry1.__doc__ entry1.__repr__ entry1.entry1 entry1.__file__ entry1.__setattr__ entry1.input entry1.__getattribute__ entry1.__str__ entry1.output entry1.__hash__ entry1.dat entry1.py entry1.__init__ entry1.data1 entry1.pyc entry1.__name__ entry1.data2 In [14]:entry1.input Out[14]:'a1' -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From dd55 at cornell.edu Fri May 13 09:44:43 2005 From: dd55 at cornell.edu (Darren Dale) Date: Fri, 13 May 2005 09:44:43 -0400 Subject: [SciPy-user] structure arrays in SciPy In-Reply-To: <4284A86C.3010900@comcast.net> References: <4284088A.9040101@comcast.net> <4284A86C.3010900@comcast.net> Message-ID: <200505130944.53437.dd55@cornell.edu> On Friday 13 May 2005 9:15 am, Ryan Krauss wrote: > Maybe an example would make this clearer. 
What if I want to store a > collection of objects with the following properties: > entry1.input='a1' > entry1.output='j2' > entry1.datafile=['file1','file2','file3'] > entry1.data1=[1.2,1.47,3.56,....] > entry1.data2=[-1.5,-19.5,35.5,....] > > where data1 and data2 are actually Numeric or numarray arrays/matrices. > How should I go about storing a collection of these objects in SciPy? > Should I be defining my own class and simply make a list of them and > pickle the list? As far as I know, there is not the equivalent of a Matlab struct for Python. You can do something like class foo: pass entry1 = foo() entry1.input='a1' ... That will get you most of the functionality of a Matlab struct (if I remember correctly.) It will not get you the slicing behavior of a Matlab struct (which I used, but never liked), you would need to define the class more carefully for that. Darren -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available URL: From stephen.walton at csun.edu Fri May 13 16:56:38 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 13 May 2005 13:56:38 -0700 Subject: [SciPy-user] cephesmodule build problem Message-ID: <42851486.7010202@csun.edu> I'm having a very strange and annoyingly intermittent problem building Scipy 0.3.4613 from CVS. Specifically, if I do "python setup.py build" all is well. If I do "python setup.py bdist_rpm" I get the following error: > gcc: Lib/special/_cephesmodule.c > Lib/special/_cephesmodule.c:17:30: _nc_cephesmodule.c: No such file or > directory > Lib/special/_cephesmodule.c:17:30: _nc_cephesmodule.c: No such file or > directory > error: Command "gcc -fno-strict-aliasing -DNDEBUG -O2 -g -pipe -m32 > -march=i386 -mtune=pentium4 -D_GNU_SOURCE -fPIC -O2 -g -pipe -m32 > -march=i386 -mtune=pentium4 -fPIC -DUSE_MCONF_LE > -DNUMERIC_VERSION="\"24.0b2\"" -DNUMERIC -I/usr/include/python2.3 > -I/usr/include/python2.3 -c Lib/special/_cephesmodule.c -o > build/temp.linux-i686-2.3/Lib/special/_cephesmodule.o" failed with > exit status 1 If I copy the above gcc command and paste it into a terminal window, it succeeds fine. As far as I can tell, the gcc commands for both build and bdist_rpm are the same. I do 'rm -rf build dist MANIFEST' in the root scipy directory before the build in all cases. This is on FC3. From jmiller at stsci.edu Fri May 13 17:59:13 2005 From: jmiller at stsci.edu (Todd Miller) Date: Fri, 13 May 2005 17:59:13 -0400 Subject: [SciPy-user] cephesmodule build problem In-Reply-To: <42851486.7010202@csun.edu> References: <42851486.7010202@csun.edu> Message-ID: <1116021553.31537.75.camel@halloween.stsci.edu> On Fri, 2005-05-13 at 16:56, Stephen Walton wrote: > I'm having a very strange and annoyingly intermittent problem building > Scipy 0.3.4613 from CVS. Specifically, if I do "python setup.py build" > all is well. 
If I do "python setup.py bdist_rpm" I get the following error: > > > gcc: Lib/special/_cephesmodule.c > > Lib/special/_cephesmodule.c:17:30: _nc_cephesmodule.c: No such file or > > directory > > Lib/special/_cephesmodule.c:17:30: _nc_cephesmodule.c: No such file or > > directory > > error: Command "gcc -fno-strict-aliasing -DNDEBUG -O2 -g -pipe -m32 > > -march=i386 -mtune=pentium4 -D_GNU_SOURCE -fPIC -O2 -g -pipe -m32 > > -march=i386 -mtune=pentium4 -fPIC -DUSE_MCONF_LE > > -DNUMERIC_VERSION="\"24.0b2\"" -DNUMERIC -I/usr/include/python2.3 > > -I/usr/include/python2.3 -c Lib/special/_cephesmodule.c -o > > build/temp.linux-i686-2.3/Lib/special/_cephesmodule.o" failed with > > exit status 1 > > If I copy the above gcc command and paste it into a terminal window, it > succeeds fine. As far as I can tell, the gcc commands for both build > and bdist_rpm are the same. I do 'rm -rf build dist MANIFEST' in the > root scipy directory before the build in all cases. This is on FC3. This problem is caused by some source file restructuring I did earlier this week. Although the MANIFEST is being regenerated, it doesn't automatically pick up two files which are only used indirectly in the build process via #include: _na_cephesmodule.c and _nc_cephesmodule.c, both in Lib/special. I think the other .c and .f files are picked up by virtue of being explicitly listed in extension object constructors. Modifying MANIFEST.in to explicitly include the two files in the MANIFEST took care of the problem for me. The fix is in CVS now. Regards, Todd From ryanfedora at comcast.net Sat May 14 19:56:55 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Sat, 14 May 2005 19:56:55 -0400 Subject: [SciPy-user] slow fft time Message-ID: <42869047.6060708@comcast.net> An HTML attachment was scrubbed... URL: From ryanfedora at comcast.net Sat May 14 20:20:47 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Sat, 14 May 2005 20:20:47 -0400 Subject: [SciPy-user] slow fft time In-Reply-To: <42869047.6060708@comcast.net> References: <42869047.6060708@comcast.net> Message-ID: <428695DF.4060202@comcast.net> An HTML attachment was scrubbed... URL: From perry at stsci.edu Sat May 14 20:51:37 2005 From: perry at stsci.edu (Perry Greenfield) Date: Sat, 14 May 2005 20:51:37 -0400 Subject: [SciPy-user] slow fft time In-Reply-To: <428695DF.4060202@comcast.net> References: <42869047.6060708@comcast.net> <428695DF.4060202@comcast.net> Message-ID: <5002e8b9697c19446648acab3bac5f8e@stsci.edu> You do realize that 21001 is prime, and that FFTs are not 'fast' for prime numbers of elements? Try padding the array to make it a power of 2 number of elements (or at least a product of many small primes) Perry Greenfield On May 14, 2005, at 8:20 PM, Ryan Krauss wrote: > I thought maybe I had screwed something up in my install and it > wasn't using ATLAS or something, but I ran this same code from windows > where I had in stalled from > SciPy_complete-0.3.2.win32P4SSE2-py2.3-num23.5.exe and > Numeric-23.8.win32-py2.3.exe.? It takes even longer in windows - maybe > 20secs (I am ordinarily running FC3 linux with kde). > > Ryan > > > Ryan Krauss wrote: > I am trying to process some experimental data and I need to do a > bunch of fft's.? The time domain matrices are column vectors that are > 21001 elements and there are 3 columns.? It is taking 7 seconds or so > to execute the fft command: > > In [73]: shape(curtmat) > Out[73]: (21001, 3) > > In [74]: type(curtmat) > Out[74]: > > In [75]: fftmat=fft(curtmat,None,0) > > Am I doing something wrong?? 
Does this seems like a normal ammount of > time to anyone else?? What can I do to make this better? > > Ryan > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From ryanfedora at comcast.net Sat May 14 21:20:31 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Sat, 14 May 2005 21:20:31 -0400 Subject: [SciPy-user] slow fft time In-Reply-To: <5002e8b9697c19446648acab3bac5f8e@stsci.edu> References: <42869047.6060708@comcast.net> <428695DF.4060202@comcast.net> <5002e8b9697c19446648acab3bac5f8e@stsci.edu> Message-ID: <4286A3DF.8040107@comcast.net> Thanks Perry. I did not know that prime number of elements was bad. I knew that 2^N was good, but Matlab always did them so fast that I never worried about making them 2^N. But there were almost always of an even length, so I never hit a prime number. There is already a little zero padding on this data, so just changing from 21001 element to 21000 took my FFT time from over 7secs to 0.045secs. So, I still won't worry about 2^N, but I will avoid prime number lengths. Thanks again. Ryan Perry Greenfield wrote: > You do realize that 21001 is prime, and that FFTs are not 'fast' for > prime numbers of elements? > Try padding the array to make it a power of 2 number of elements (or > at least a product of many > small primes) > > Perry Greenfield > > On May 14, 2005, at 8:20 PM, Ryan Krauss wrote: > >> I thought maybe I had screwed something up in my install and it >> wasn't using ATLAS or something, but I ran this same code from >> windows where I had in stalled from >> SciPy_complete-0.3.2.win32P4SSE2-py2.3-num23.5.exe and >> Numeric-23.8.win32-py2.3.exe. It takes even longer in windows - >> maybe 20secs (I am ordinarily running FC3 linux with kde). >> >> Ryan >> >> >> Ryan Krauss wrote: >> I am trying to process some experimental data and I need to do a >> bunch of fft's. The time domain matrices are column vectors that are >> 21001 elements and there are 3 columns. It is taking 7 seconds or so >> to execute the fft command: >> >> In [73]: shape(curtmat) >> Out[73]: (21001, 3) >> >> In [74]: type(curtmat) >> Out[74]: >> >> In [75]: fftmat=fft(curtmat,None,0) >> >> Am I doing something wrong? Does this seems like a normal ammount >> of time to anyone else? What can I do to make this better? >> >> Ryan >> >> >> >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.net >> http://www.scipy.net/mailman/listinfo/scipy-user >> >> >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.net >> http://www.scipy.net/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From benmcbride at acm.org Mon May 16 19:38:08 2005 From: benmcbride at acm.org (Benjamin McBride) Date: Mon, 16 May 2005 18:38:08 -0500 Subject: [SciPy-user] import error on OS X 10.4 Message-ID: I'm having a problems getting scipy to run on OS X Tiger. I've tried 0.3.2 as well as CVS sources, but get the same error when trying to import scipy. Has anyone had success building scipy on Tiger? 
Any help would be appreciated, Ben McBride Numeric Version 23.8 10:17 PM:Numeric-23.8:509$ python Python 2.3.5 (#1, Mar 20 2005, 20:38:20) [GCC 3.3 20030304 (Apple Computer, Inc. build 1809)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import scipy Traceback (most recent call last): File "", line 1, in ? File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/scipy/__init__.py", line 11, in ? from scipy_base import * File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/scipy_base/__init__.py", line 5, in ? import numerix File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/scipy_base/numerix.py", line 59, in ? from _nc_imports import * File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/scipy_base/_nc_imports.py", line 10, in ? from fastumath import alter_numeric, restore_numeric ImportError: cannot import name alter_numeric From oliphant at ee.byu.edu Mon May 16 20:17:52 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 16 May 2005 18:17:52 -0600 Subject: [SciPy-user] import error on OS X 10.4 In-Reply-To: References: Message-ID: <42893830.8060902@ee.byu.edu> Benjamin McBride wrote: > I'm having a problems getting scipy to run on OS X Tiger. I've tried > 0.3.2 as well as CVS sources, but get the same error when trying to > import scipy. Has anyone had success building scipy on Tiger? > > Any help would be appreciated, Todd needs to address these issues as they have something to do with the new numerix stuff. -Travis From rkern at ucsd.edu Mon May 16 23:57:45 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 16 May 2005 20:57:45 -0700 Subject: [SciPy-user] import error on OS X 10.4 In-Reply-To: References: Message-ID: <42896BB9.9050505@ucsd.edu> Benjamin McBride wrote: > I'm having a problems getting scipy to run on OS X Tiger. I've tried > 0.3.2 as well as CVS sources, but get the same error when trying to > import scipy. Has anyone had success building scipy on Tiger? Yes, with the current CVS of Scipy and a recent CVS of Numeric. I don't think my Python 2.4.1 versus your Python 2.3.5 is relevant. You might want to try deleting the installed packages and the build directories and reinstalling. Python 2.4.1 (#2, Mar 31 2005, 00:05:10) [GCC 3.3 20030304 (Apple Computer, Inc. build 1666)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import scipy numerix Numeric 24.0b2 >>> scipy.alter_numeric() >>> -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rosario.ruiloba at noveltis.fr Tue May 17 08:11:56 2005 From: rosario.ruiloba at noveltis.fr (Rosa Ruiloba) Date: 17 May 2005 14:11:56 +0200 Subject: [SciPy-user] gplt.plot problem with float data Message-ID: <1116331916.17733.23.camel@rosae.noveltis.fr> Hello, my data are not correctly plotted with plot command of gplt. Values are float with 10-4 precision. I'm giving you in attached file the result of the plot for the next values (see below). My command line is : gplt.plot(date,h,'with points') Do you know how to resolve the problem? Xplt plot data correctly. 
17663.41597 2.767 17663.41944 2.764 17663.42292 2.756 17663.42639 2.758 17663.42986 2.751 17663.43333 2.753 17663.43681 2.749 17663.44028 2.744 17663.44375 2.744 17663.44722 2.743 17663.45069 2.739 17663.45417 2.739 17663.45764 2.735 17663.46111 2.728 17663.46458 2.730 17663.46806 2.726 17663.47153 2.722 17663.47500 2.715 17663.47847 2.716 17663.48194 2.711 17663.48542 2.706 17663.48889 2.709 17663.49236 2.701 17663.49583 2.700 17663.49931 2.697 17663.50278 2.694 17663.50625 2.695 17663.50972 2.692 17663.51319 2.690 17663.51667 2.688 17663.52014 2.685 17663.52361 2.687 17663.52708 2.684 17663.53056 2.679 17663.53403 2.677 17663.53750 2.676 17663.54097 2.670 17663.54444 2.670 17663.54792 2.668 17663.55139 2.664 17663.55486 2.659 17663.55833 2.658 17663.56181 2.659 17663.56528 2.658 17663.56875 2.658 17663.57222 2.659 17663.57569 2.661 17663.57917 2.653 17663.58264 2.652 17663.58611 2.655 17663.58958 2.652 17663.59306 2.651 17663.59653 2.650 17663.60000 2.652 17663.60347 2.645 17663.60694 2.647 17663.61042 2.644 17663.61389 2.644 17663.61736 2.645 17663.62083 2.651 17663.62431 2.652 17663.62778 2.658 17663.63125 2.656 17663.63472 2.653 17663.63819 2.653 17663.64167 2.651 17663.64514 2.661 17663.64861 2.660 17663.65208 2.664 17663.65556 2.666 17663.65903 2.673 17663.66250 2.678 17663.66597 2.684 17663.66944 2.680 17663.67292 2.684 17663.67639 2.682 17663.67986 2.692 17663.68333 2.698 17663.68681 2.699 17663.69028 2.703 17663.69375 2.705 17663.69722 2.714 -- -- NOVELTIS Parc Technologique du Canal 2, avenue de l'Europe 31526 RAMONVILLE SAINT AGNE CEDEX Tel: +(33) (0)5.62.88.11.23 -------------- next part -------------- A non-text attachment was scrubbed... Name: myfigure.png Type: image/png Size: 4925 bytes Desc: not available URL: From jmiller at stsci.edu Tue May 17 12:07:50 2005 From: jmiller at stsci.edu (Todd Miller) Date: Tue, 17 May 2005 12:07:50 -0400 Subject: [SciPy-user] import error on OS X 10.4 In-Reply-To: <42893830.8060902@ee.byu.edu> References: <42893830.8060902@ee.byu.edu> Message-ID: <1116346069.29195.89.camel@halloween.stsci.edu> On Mon, 2005-05-16 at 20:17, Travis Oliphant wrote: > Benjamin McBride wrote: > > > I'm having a problems getting scipy to run on OS X Tiger. I've tried > > 0.3.2 as well as CVS sources, but get the same error when trying to > > import scipy. Has anyone had success building scipy on Tiger? > > > > Any help would be appreciated, > > Todd needs to address these issues as they have something to do with the > new numerix stuff. Unfortunately, I don't have access to OS-X Tiger anywhere yet so I can suggest but not test. What I know is this: as part of the scipy_base port to numarray, alter_numeric() moved from scipy_base._compiled_base to scipy_base.fastumath. If there is a "stale" fastumath.so lying around from an old installation of scipy, that might cause the symptoms seen here: scipy_base.fastumath exists and imports, but scipy_base.fastumath doesn't supply alter_numeric(). Since Robert was able to build scipy with OS-X Tiger, and since I don't see any conditional C code related to alter_numeric(), I agree with Robert's advice to remove (or move) the old scipy installation (scipy, scipy_base, scipy_distutils, ...) and the scipy/build directory and then to re-build and re-install. 
Todd From benmcbride at acm.org Tue May 17 20:21:53 2005 From: benmcbride at acm.org (Benjamin McBride) Date: Tue, 17 May 2005 19:21:53 -0500 Subject: [SciPy-user] import error on OS X 10.4 In-Reply-To: <1116346069.29195.89.camel@halloween.stsci.edu> References: <42893830.8060902@ee.byu.edu> <1116346069.29195.89.camel@halloween.stsci.edu> Message-ID: <6E6EA8C3-B703-4E6D-9EE5-31CC32D005B0@acm.org> Fixed! I followed Todd's advice and the install worked. However, after running the test suite there was one failure: ====================================================================== FAIL: check_round (scipy.special.basic.test_basic.test_round) ---------------------------------------------------------------------- Traceback (most recent call last): File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/scipy/special/tests/test_basic.py", line 1789, in check_round assert_array_equal(rnd,rndrl) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/scipy_test/testing.py", line 715, in assert_array_equal assert cond,\ AssertionError: Arrays are not equal (mismatch 25.0%): Array 1: [10 10 10 11] Array 2: [10 10 11 11] ---------------------------------------------------------------------- Thanks for all the help. Ben On May 17, 2005, at 11:07 AM, Todd Miller wrote: > On Mon, 2005-05-16 at 20:17, Travis Oliphant wrote: > >> Benjamin McBride wrote: >> >> >>> I'm having a problems getting scipy to run on OS X Tiger. I've >>> tried >>> 0.3.2 as well as CVS sources, but get the same error when trying to >>> import scipy. Has anyone had success building scipy on Tiger? >>> >>> Any help would be appreciated, >>> >> >> Todd needs to address these issues as they have something to do >> with the >> new numerix stuff. >> > > Unfortunately, I don't have access to OS-X Tiger anywhere yet so I > can > suggest but not test. > > What I know is this: as part of the scipy_base port to numarray, > alter_numeric() moved from scipy_base._compiled_base to > scipy_base.fastumath. If there is a "stale" fastumath.so lying around > from an old installation of scipy, that might cause the symptoms seen > here: scipy_base.fastumath exists and imports, but > scipy_base.fastumath doesn't supply alter_numeric(). > > Since Robert was able to build scipy with OS-X Tiger, and since I > don't > see any conditional C code related to alter_numeric(), I agree with > Robert's advice to remove (or move) the old scipy installation (scipy, > scipy_base, scipy_distutils, ...) and the scipy/build directory and > then > to re-build and re-install. > > Todd > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > From jdhunter at ace.bsd.uchicago.edu Wed May 18 13:45:49 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 18 May 2005 12:45:49 -0500 Subject: [SciPy-user] send me your blurbs Message-ID: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> If you are using python for high performance or scientific computing, could I ask you to send me a blurb (one or two sentences) on what you are using it for and if you have any publications or references related to this work (the refs don't have to be specifically about the python stuff) please include them (extra points for bibtex). I am writing some grant text justifying python in scientific computing and would like to have a range of examples across disciplines to provide. 
If I could get these today or tomorrow that would be very helpful! Please send these to me off list. Thanks! JDH From sransom at cv.nrao.edu Wed May 18 14:01:17 2005 From: sransom at cv.nrao.edu (Scott Ransom) Date: Wed, 18 May 2005 14:01:17 -0400 Subject: [SciPy-user] send me your blurbs In-Reply-To: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> References: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <200505181401.18014.sransom@cv.nrao.edu> Hi John, I use python for data processing, modelling, and for lots of scripting of high-performance computing jobs in order to search for and monitor binary and millisecond pulsars. The data are extremely voluminous (1TB for 10 hours of observing) and come from the world's largest radio telescopes (e,g, Arecibo and the Green Bank Telescope). In addition, all my figures for all my publications are generated using python and PGPLOT (I may soon move to matplotlib). There are a bunch of references to refereed journals in my CV: http://www.cv.nrao.edu/~sransom/s_ransom_cv.pdf But if you just want a couple recent ones: Ransom, S., et al. 2005, ``Twenty-One Millisecond Pulsars in Terzan~5 Using the Green Bank Telescope'', Science, 307, 892 Ransom, S., et al. 2004, ``Green Bank Telescope Measurement of the Systemic Velocity of the Double Pulsar Binary J0737-3039 and Implications for Its Formation'', Astrophysical Journal, 609, L71 Ransom, S., et al. 2004, ``Green Bank Telescope Discovery of Two Binary Millisecond Pulsars in the Globular Cluster M30'', Astrophysical Journal, 604, 328 Hope this works for you! Scott PS: Bibtex versions of the above are here: @ARTICLE{2005Sci...307..892R, author = {{Ransom}, S.~M. and {Hessels}, J.~W.~T. and {Stairs}, I.~H. and {Freire}, P.~C.~C. and {Camilo}, F. and {Kaspi}, V.~M. and {Kaplan}, D.~L. }, title = "{Twenty-One Millisecond Pulsars in Terzan 5 Using the Green Bank Telescope}", journal = {Science}, year = 2005, month = feb, volume = 307, pages = {892-896}, adsurl = {http://adsabs.harvard.edu/cgi-bin/nph-bib_query?bibcode=2005Sci...307..892R&db_key=AST}, adsnote = {Provided by the NASA Astrophysics Data System} } @ARTICLE{2004ApJ...609L..71R, author = {{Ransom}, S.~M. and {Kaspi}, V.~M. and {Ramachandran}, R. and {Demorest}, P. and {Backer}, D.~C. and {Pfahl}, E.~D. and {Ghigo}, F.~D. and {Kaplan}, D.~L.}, title = "{Green Bank Telescope Measurement of the Systemic Velocity of the Double Pulsar Binary J0737-3039 and Implications for Its Formation}", journal = {\apjl}, year = 2004, month = jul, volume = 609, pages = {L71-L74}, adsurl = {http://adsabs.harvard.edu/cgi-bin/nph-bib_query?bibcode=2004ApJ...609L..71R&db_key=AST}, adsnote = {Provided by the NASA Astrophysics Data System} } @ARTICLE{2004ApJ...604..328R, author = {{Ransom}, S.~M. and {Stairs}, I.~H. and {Backer}, D.~C. and {Greenhill}, L.~J. and {Bassa}, C.~G. and {Hessels}, J.~W.~T. 
and {Kaspi}, V.~M.}, title = "{Green Bank Telescope Discovery of Two Binary Millisecond Pulsars in the Globular Cluster M30}", journal = {\apj}, year = 2004, month = mar, volume = 604, pages = {328-338}, adsurl = {http://adsabs.harvard.edu/cgi-bin/nph-bib_query?bibcode=2004ApJ...604..328R&db_key=AST}, adsnote = {Provided by the NASA Astrophysics Data System} } On Wednesday 18 May 2005 01:45 pm, John Hunter wrote: > If you are using python for high performance or scientific computing, > could I ask you to send me a blurb (one or two sentences) on what you > are using it for and if you have any publications or references > related to this work (the refs don't have to be specifically about > the python stuff) please include them (extra points for bibtex). > > I am writing some grant text justifying python in scientific > computing and would like to have a range of examples across > disciplines to provide. > > If I could get these today or tomorrow that would be very helpful! > Please send these to me off list. > > Thanks! > JDH > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- Scott M. Ransom Address: NRAO Phone: (434) 296-0320 520 Edgemont Rd. email: sransom at nrao.edu Charlottesville, VA 22903 USA GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From oliphant at ee.byu.edu Wed May 18 21:41:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 18 May 2005 19:41:55 -0600 Subject: [SciPy-user] Tutorial on advanced slicing with the DCT Message-ID: <428BEEE3.6090308@ee.byu.edu> I just checked in a module implementing the Discrete Cosine Transform into the image library of scipy. The main reason I'm advertising on the scipy users list is that it shows an example of using slices to accomplish generic indexing. The DCT can be implemented using the FFT on a specifically sliced version of the sequence. Basically the DCT can be implemented using an FFT of xtilde where xtilde[:N/2] = x[::2] xtilde[N/2:] = x[N::-2] But, I wanted to do this kind of slicing on an arbitrary axis. So, if the user-provided axis was -2 I wanted the first of these to be: xtilde[...,:N/2,:] = x[...,:N/2,:] or if axis=0 this should be xtilde[:N/2,...] = x[:N/2,...] In order to do that I used a tuple of slice objects on each side xtilde[slice0] = xtilde[slice1] where slice0 is a tuple of slice objects with slice?[axis] set to the correct slicing behavior. slice objects can be contructed using sl = slice(arg1, arg2, arg3) where x[sl] will act equivalently to x[arg1:arg2:arg3] I thought that the solution provided a good example for how to do these sorts of things. If others have even better solutions, please let me know. Best regards, -Travis O. From lanceboyle at cwazy.co.uk Wed May 18 22:55:40 2005 From: lanceboyle at cwazy.co.uk (Lance Boyle) Date: Wed, 18 May 2005 19:55:40 -0700 Subject: [SciPy-user] send me your blurbs In-Reply-To: <200505181401.18014.sransom@cv.nrao.edu> References: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> <200505181401.18014.sransom@cv.nrao.edu> Message-ID: Perhaps it would be useful to have a store of such blurbs readily accessible on the SciPy and/or Python home pages, if such does not already exist. 
Lance On May 18, 2005, at 11:01 AM, Scott Ransom wrote: >> If you are using python for high performance or scientific computing, >> could I ask you to send me a blurb (one or two sentences) on what you >> are using it for and if you have any publications or references >> related to this work (the refs don't have to be specifically about >> the python stuff) please include them (extra points for bibtex). >> >> I am writing some grant text justifying python in scientific >> computing and would like to have a range of examples across >> disciplines to provide. >> >> If I could get these today or tomorrow that would be very helpful! >> Please send these to me off list. >> >> Thanks! >> JDH From vincent at ecovla.nl Thu May 19 03:09:01 2005 From: vincent at ecovla.nl (Vincent Schut) Date: Thu, 19 May 2005 09:09:01 +0200 Subject: [SciPy-user] send me your blurbs In-Reply-To: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> References: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <1116486541.17910.2.camel@localhost> Here we use python for remote sensing (satellite imagery) data processing, as an alternative for IDL/ENVI (which we also still use). Cheers, Vincent Schut (www.sarvision.nl) On Wed, 2005-05-18 at 12:45 -0500, John Hunter wrote: > If you are using python for high performance or scientific computing, > could I ask you to send me a blurb (one or two sentences) on what you > are using it for and if you have any publications or references > related to this work (the refs don't have to be specifically about the > python stuff) please include them (extra points for bibtex). > > I am writing some grant text justifying python in scientific computing > and would like to have a range of examples across disciplines to > provide. > > If I could get these today or tomorrow that would be very helpful! > Please send these to me off list. > > Thanks! > JDH > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From zunzun at zunzun.com Thu May 19 08:47:25 2005 From: zunzun at zunzun.com (zunzun at zunzun.com) Date: Thu, 19 May 2005 08:47:25 -0400 Subject: [SciPy-user] send me your blurbs In-Reply-To: References: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> <200505181401.18014.sransom@cv.nrao.edu> Message-ID: <20050519124725.GA29235@localhost.members.linode.com> On Wed, May 18, 2005 at 07:55:40PM -0700, Lance Boyle wrote: > If you are using python for high performance or scientific computing, > could I ask you to send me a blurb (one or two sentences) on what you > are using it for and if you have any publications or references > related to this work (the refs don't have to be specifically about > the python stuff) please include them (extra points for bibtex). You should see if my curve and surface fitting web site might do: http://zunzun.com - written in Python and using Scipy. James Phillips From skip at pobox.com Thu May 19 11:30:31 2005 From: skip at pobox.com (Skip Montanaro) Date: Thu, 19 May 2005 10:30:31 -0500 Subject: [SciPy-user] Matrix docs? Message-ID: <17036.45335.704078.815846@montanaro.dyndns.org> I'm the "Python guy" at work, not a scientific programmer. I'm trying to help our researchers start using Python and SciPy (they are used to R, Matlab and SPlus). We're all having trouble finding useful documentation for extremely basic stuff. There's a mat() function that returns a Matrix object, but I can't find any documentation on this Matrix object. 
The only example I have of mat() usage is in the SciPy tutorial and it appears to use some weird string form as input: mat('[1 3 5; 2 5 1; 2 3 8]') I thought, "surely there must be some way to create a Matrix from a list of lists?" The obvious experiment works: mat([[1, 3, 5], [2, 5, 1], [2, 3, 8]]) but it seems odd to me that this isn't the form shown in the tutorial. My feeble attempts to find something come up empty: http://oliphant.ee.byu.edu:81/scipy_base/mat http://www.scipy.org/documentation/apidocs/scipy/scipy.linalg.matfuncs.html >>> info(mat) Matrix(data, typecode=None, copy=1, savespace=0) None I eventually stumbled on the NumPy docs. It has an array, and I then noticed that the scipy package exports an array() function, which at first glance seems to be about the same as mat(). What's the difference between mat() and array()? Is there some more cohesive documentation? Thanks, -- Skip Montanaro skip at pobox.com From oliphant at ee.byu.edu Thu May 19 11:40:38 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 19 May 2005 09:40:38 -0600 Subject: [SciPy-user] Matrix docs? In-Reply-To: <17036.45335.704078.815846@montanaro.dyndns.org> References: <17036.45335.704078.815846@montanaro.dyndns.org> Message-ID: <428CB376.1030202@ee.byu.edu> Skip Montanaro wrote: >I'm the "Python guy" at work, not a scientific programmer. I'm trying to >help our researchers start using Python and SciPy (they are used to R, >Matlab and SPlus). We're all having trouble finding useful documentation >for extremely basic stuff. There's a mat() function that returns a Matrix >object, but I can't find any documentation on this Matrix object. The only >example I have of mat() usage is in the SciPy tutorial and it appears to use >some weird string form as input: > > mat('[1 3 5; 2 5 1; 2 3 8]') > >I thought, "surely there must be some way to create a Matrix from a list of >lists?" The obvious experiment works: > > mat([[1, 3, 5], [2, 5, 1], [2, 3, 8]]) > > Several things: 1) arrays are not matrices. 2) matrices use arrays for their internal storage 3) the numpy docs are very helpful. 4) mat is just short for Matrix.Matrix which comes from numpy 5) Matrix.Matrix has been expanded to allow this string format so that MATLAB users can feel more comfortable. 6) lists of lists can always be used to instantiate arrays and Matrices >but it seems odd to me that this isn't the form shown in the tutorial. > > Why is this odd? The list of lists approach is extensively documented in the numpy manuals, but this "string approach" which will be more familiar to matlab users is useful and so is documented in the tutorial. >My feeble attempts to find something come up empty: > > http://oliphant.ee.byu.edu:81/scipy_base/mat > http://www.scipy.org/documentation/apidocs/scipy/scipy.linalg.matfuncs.html > > >>> info(mat) > Matrix(data, typecode=None, copy=1, savespace=0) > > None > >I eventually stumbled on the NumPy docs. > The NumPy docs are a great source since scipy just adds features to NumPy. I would suggest always starting there. Again, scipy.mat is the same as numpy's Matrix.Matrix (but more convenient when using from scipy import *) and therefore gives you a matrix object. scipy.array is exactly the same as Numeric.array and therefore gives you an array object (which is not a matrix). The big difference between matrices and arrays is that multiplication for the one is matrix multiplication while multiplication for the other is element-by-element multiplication. 
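A quick interactive illustration of that difference (a hypothetical session with the Numeric-era scipy discussed here; the exact printed formatting of the Matrix object may differ):

    from scipy import mat, array

    A = mat([[1, 2], [3, 4]])      # Matrix object
    B = array([[1, 2], [3, 4]])    # plain array

    print A * A    # matrix product:       [[ 7 10]
                   #                        [15 22]]
    print B * B    # element-by-element:   [[ 1  4]
                   #                        [ 9 16]]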
You want cohesive documentation --- for now you have to bring your own glue :-) The Enthought people have a .chm file which contains a great deal of information and livedocs is helpful provided there are docstrings (which is mostly true throughout scipy --- but not throughout Numeric as you've noticed, the mat documentation comes from Matrix.Matrix which is sadly lacking). Post on this list if you have further questions. -Travis From skip at pobox.com Thu May 19 12:15:27 2005 From: skip at pobox.com (Skip Montanaro) Date: Thu, 19 May 2005 11:15:27 -0500 Subject: [SciPy-user] Matrix docs? In-Reply-To: <428CB376.1030202@ee.byu.edu> References: <17036.45335.704078.815846@montanaro.dyndns.org> <428CB376.1030202@ee.byu.edu> Message-ID: <17036.48031.91593.386392@montanaro.dyndns.org> >> but it seems odd to me that this isn't the form shown in the >> tutorial. >> >> Travis> Why is this odd? The list of lists approach is extensively Travis> documented in the numpy manuals, but this "string approach" Travis> which will be more familiar to matlab users is useful and so is Travis> documented in the tutorial. Yes, but this is in the Scipy Tutorial, which as far as I've been able to determine makes an assumption that someone new to SciPy will be experienced with Numeric Python. That's certainly not the case for me (an experienced Python programmer, but not a scientific programmer), nor will it be the case for our researchers. Some have used Python. None have used SciPy or Numeric before. It would be really helpful if the first statement of the SciPy Tutorial had a footnote referring the reader to Numeric Python documentation. (Is the NumTut package a Numeric tutorial? Is it available by default through scipy?) >> I eventually stumbled on the NumPy docs. >> Travis> The NumPy docs are a great source since scipy just adds features Travis> to NumPy. I would suggest always starting there. Sure, now I realize that. But as a rank beginner I started with the SciPy tutorial which didn't say, "Stop! We assume you're already familiar with Numeric Python. Better check out blah blah blah..." Travis> The Enthought people have a .chm file which contains a great Travis> deal of information and livedocs is helpful provided there are Travis> docstrings (which is mostly true throughout scipy --- but not Travis> throughout Numeric as you've noticed, the mat documentation Travis> comes from Matrix.Matrix which is sadly lacking). All the more reason to strongly direct newcomers to the Numeric docs. Thanks for the assistance, Skip From Jim.Vickroy at noaa.gov Thu May 19 14:18:49 2005 From: Jim.Vickroy at noaa.gov (Jim Vickroy) Date: Thu, 19 May 2005 12:18:49 -0600 Subject: [SciPy-user] scipy Microsoft Windows installer for Python 2.4? Message-ID: Hello scipy users, Could someone tell me when/if a MS Windows installer for scipy and Python 2.4 will be available? A search of past postings revealed a similar question posted 2005-Feb. Thanks, -- jv From joe at enthought.com Thu May 19 14:57:22 2005 From: joe at enthought.com (Joe Cooper) Date: Thu, 19 May 2005 13:57:22 -0500 Subject: [SciPy-user] scipy Microsoft Windows installer for Python 2.4? In-Reply-To: References: Message-ID: <428CE192.7030409@enthought.com> Jim Vickroy wrote: > Hello scipy users, > > Could someone tell me when/if a MS Windows installer for scipy and Python > 2.4 will be available? > > A search of past postings revealed a similar question posted 2005-Feb. 
I am 1.5 weeks into backporting the MSI installer from Python 2.4 to 2.3.5 and migrating our Wise installation into the new MSI installer. I expect it to be ready for testing in another few days. 2.4 will follow a couple of weeks after the official 2.3.5 release, assuming I don't get side-tracked by other issues (like migrating scipy.org to a new server, which is also a high priority item, since Plone eats all 768 MB available on the old server and has to be restarted daily, and still runs dog slow by midday). In other words, a 2.4 based Enthon release is being very actively worked on, though indirectly (indirectly, because I'm focusing on 2.3.5 for the time being--internally Enthought needs to stay with some older versions for a while longer, and so that gets my immediate attention). I can only say it will be made available when it is ready, but I expect it to be ready in a month or so. From tom.denniston at gmail.com Thu May 19 16:00:39 2005 From: tom.denniston at gmail.com (Tom Denniston) Date: Thu, 19 May 2005 15:00:39 -0500 Subject: [SciPy-user] strange behavior in RandomArray.randint() Message-ID: I am running under python 1.4 and windows XP. I get the following results in my python interpretter: -------------------------------------------- n=0 import RandomArray while 1: arr = RandomArray.randint(0, 2509, 252) for x in arr: if x>2508: raise "Found number greater than 2508: " + str(x) + " on iteration " + str(n) n=n+1 Traceback (most recent call last): File "", line 5, in -toplevel- raise "Found number greater than 2508: " + str(x) + " on iteration " + str(n) Found number greater than 2508: 2509 on iteration 91719 ------------------------------------------- Even though the Numeric reference (see below) on RandomArray class claims that the number is strictly less than the max. Does anyone know what might cause this problem. Does SciPy have an alternative approach to generating uniform random numbers that is more reliable? randint(minimum, maximum, shape=ReturnFloat) The randint() function returns an array of the specified shape and containing random (standard) integers greater than or equal to minimum and strictly less than maximum . If no shape is specified, a single number is returned. (exerpt from http://www.pfdubois.com/numpy/html2/numpy-19.html#pgfId-303108) From Jim.Vickroy at noaa.gov Thu May 19 17:12:45 2005 From: Jim.Vickroy at noaa.gov (Jim Vickroy) Date: Thu, 19 May 2005 15:12:45 -0600 Subject: [SciPy-user] scipy Microsoft Windows installer for Python 2.4? In-Reply-To: <428CE192.7030409@enthought.com> Message-ID: Thanks very much for your quick response. Best of luck with your server issue. -- jv -----Original Message----- From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net]On Behalf Of Joe Cooper Sent: Thursday, May 19, 2005 12:57 PM To: SciPy Users List Subject: Re: [SciPy-user] scipy Microsoft Windows installer for Python 2.4? Jim Vickroy wrote: > Hello scipy users, > > Could someone tell me when/if a MS Windows installer for scipy and Python > 2.4 will be available? > > A search of past postings revealed a similar question posted 2005-Feb. I am 1.5 weeks into backporting the MSI installer from Python 2.4 to 2.3.5 and migrating our Wise installation into the new MSI installer. I expect it to be ready for testing in another few days. 
2.4 will follow a couple of weeks after the official 2.3.5 release, assuming I don't get side-tracked by other issues (like migrating scipy.org to a new server, which is also a high priority item, since Plone eats all 768 MB available on the old server and has to be restarted daily, and still runs dog slow by midday). In other words, a 2.4 based Enthon release is being very actively worked on, though indirectly (indirectly, because I'm focusing on 2.3.5 for the time being--internally Enthought needs to stay with some older versions for a while longer, and so that gets my immediate attention). I can only say it will be made available when it is ready, but I expect it to be ready in a month or so. _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user From jeremy at jeremysanders.net Thu May 19 17:13:33 2005 From: jeremy at jeremysanders.net (Jeremy Sanders) Date: Thu, 19 May 2005 22:13:33 +0100 (BST) Subject: [SciPy-user] ANN: Veusz-0.6 released Message-ID: SciPy users may be interested in the new feature of embedding Veusz in other Python programs. Veusz runs in its own thread, so that the other Python program can continue to run. Veusz 0.6 --------- Velvet Ember Under Sky Zenith ----------------------------- http://home.gna.org/veusz/ Veusz is Copyright (C) 2003-2005 Jeremy Sanders Licenced under the GPL (version 2 or greater) Veusz is a scientific plotting package written in Python (currently 100% Python). It uses PyQt for display and user-interfaces, and numarray for handling the numeric data. Veusz is designed to produce publication-ready Postscript output. Veusz provides a GUI, command line and scripting interface (based on Python) to its plotting facilities. The plots are built using an object-based system to provide a consistent interface. Changes from 0.5: Please refer to ChangeLog for all the changes. Highlights include: * Major UI enhancements - much faster to control now, more dialogs * Veusz can be embedded within other non-PyQt Python programs. Its plots can be updated at any time from the embedding program using the command line interface. * Dialogs for manipulating datasets using expressions, and direct editing * Multiple documents can be opened simultaneously * Lots of bug fixes (e.g. log axes improvement, rotation of labels) * Unicode support in plots Features of package: * X-Y plots (with errorbars) * Stepped plots (for histograms) * Line plots * Function plots * Fitting functions to data * Stacked plots and arrays of plots * Plot keys * Plot labels * LaTeX-like formatting for text * EPS output * Simple data importing * Scripting interface * Save/Load plots * Dataset manipulation * Embed Veusz within other programs To be done: * Contour plots * Images * UI improvements * Import filters (for qdp and other plotting packages, fits, csv) Requirements: Python (probably 2.3 or greater required) http://www.python.org/ Qt (free edition) http://www.trolltech.com/products/qt/ PyQt (SIP is required to be installed first) http://www.riverbankcomputing.co.uk/pyqt/ http://www.riverbankcomputing.co.uk/sip/ numarray http://www.stsci.edu/resources/software_hardware/numarray Microsoft Core Fonts (recommended) http://corefonts.sourceforge.net/ For documentation on using Veusz, see the "Documents" directory. The manual is in pdf, html and text format (generated from docbook). If you enjoy using Veusz, I would love to hear from you. 
Please join the mailing lists at https://gna.org/mail/?group=veusz to discuss new features or if you'd like to contribute code. The newest code can always be found in CVS. If non GPL projects are interested in using Veusz code, please contact me. I am happy to consider relicencing code for other free projects, if I am legally allowed to do so. Cheers Jeremy From ryanfedora at comcast.net Thu May 19 22:54:25 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Thu, 19 May 2005 22:54:25 -0400 Subject: [SciPy-user] weave inline functions recompiling on first run Message-ID: <428D5161.9020305@comcast.net> I have a function that uses weave inline that recompiles the first time I run a script every time I start ipython, even though I have not changed the script in some time. The first time the script I get the message: repairing catalog by removing key file changed What can cause this? The code is below. Thanks, Ryan def thresh(iterin, value, startind=0, above=1): # ind=-1 stopind=len(iterin) print('In thresh') print('type(value)='+str(type(value))) print('type(iterin[0])='+str(type(iterin[0]))) print('type(startind)='+str(type(startind))) print('type(stopind)='+str(type(stopind))) print('type(above)='+str(type(above))) code=""" #line 21 "rwkdataproc.py" int ind; for(int j=startind;j0){ if(iterin(j)>=value){ ind=j; break; } } else{ if(iterin(j)<=value){ ind=j; break; } } } return_val = ind; """ N=inline_tools.inline(code,['iterin','value','above','startind','stopind'], compiler='gcc', type_converters = cblitz, verbose = 1) return N From tom at kornack.com Thu May 19 23:40:52 2005 From: tom at kornack.com (Tom Kornack) Date: Thu, 19 May 2005 22:40:52 -0500 Subject: [SciPy-user] Re: send me your blurbs In-Reply-To: <20050519142954.D1B7A3EB22@www.scipy.com> References: <20050519142954.D1B7A3EB22@www.scipy.com> Message-ID: I use scipy and python in general to efficiently analyze large data sets from an experiment that searches for CPT and Lorentz Violation using an atomic magnetometer. Analysis typically involves lots of fitting, statistics and complex data manipulation. I have found that my python implementation of our analysis routines is more robust for very large datasets compared with similar C code. It's also just about as fast as the C routines, to the surprise of everyone. All my figures are now generated using PyX, which I find quite pleasing. Tom Kornack @Article{Kominis:2003, author = {Iannis K. Kominis and Thomas Whitmore Kornack and Joel C. 
Allred and Michael Valeriovich Romalis}, title = {A subfemtotesla multichannel atomic magnetometer}, journal = {Nature}, day = 10, month = {April}, year = 2003, volume = 422, number = 6932, pages = {596--599}, url = {http://www.nature.com/cgi-taf/DynaPage.taf?file=/nature/ journal/v422/n6932/abs/nature01484_fs.html} } @Article{Kornack:2002, author = {Thomas Whitmore Kornack and Michael Valeriovich Romalis}, title = {Dynamics of Two Overlapping Spin Ensembles Interacting by Spin Exchange}, journal = PRL, day = 4, month = {December}, year = 2002, volume = 89, number = 25, pages = {253002}, url = {http://link.aps.org/abstract/PRL/v89/e253002} } http://kornack.com Fundamental Symmetries Lab, Princeton University 609-716-7259 (h), 609-933-2186 (m), 609-258-0702 (w), 609-258-1625 (f) Thomas Kornack, 157 North Post Road, Princeton Junction, NJ 08550-5009 From brendansimons at yahoo.ca Thu May 19 23:52:46 2005 From: brendansimons at yahoo.ca (Brendan Simons) Date: Thu, 19 May 2005 23:52:46 -0400 Subject: [SciPy-user] send me your blurbs Message-ID: <58487a3071069431b4a1fe6146e1f92c@yahoo.ca> We use scipy to do non-linear curve-fitting of dimensional data, as part of our nuclear fuel inspection business. I don't have any references for you, as all our publications belong to our clients. Brendan -- Brendan Simons, Project Engineer Stern Laboratories, Hamilton Ontario From Fernando.Perez at colorado.edu Thu May 19 23:58:07 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 19 May 2005 21:58:07 -0600 Subject: [SciPy-user] weave inline functions recompiling on first run In-Reply-To: <428D5161.9020305@comcast.net> References: <428D5161.9020305@comcast.net> Message-ID: <428D604F.9020009@colorado.edu> Ryan Krauss wrote: > I have a function that uses weave inline that recompiles the first time > I run a script every time I start ipython, even though I have not > changed the script in some time. The first time the script I get the > message: > > repairing catalog by removing key > file changed > > What can cause this? I don't know exactly, but I've seen it a lot (and it happens even if you run the script outside ipython, so it's not ipython's fault). I'm pretty convinced it's a bug in weave. About a year ago I tracked and fixed a similar bug, but it took some serious effort: the weave code isn't particularly simple. Since I haven't been able to find a block of time I can sink into this, I've been avoiding it, unfortunately. It might not prove to be so tricky with this one, so give it a go if you have a bit of spare time. cheers, f From swisher at enthought.com Fri May 20 10:53:47 2005 From: swisher at enthought.com (Janet M. Swisher) Date: Fri, 20 May 2005 09:53:47 -0500 Subject: [SciPy-user] send me your blurbs In-Reply-To: References: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> <200505181401.18014.sransom@cv.nrao.edu> Message-ID: <428DF9FB.1030508@enthought.com> I'm happy to collect such blurbs for the SciPy site. I'll assume that if you posted a blurb to the list, then it would be OK to quote you on the site. (If that's not true for you, please email me.) JDH, when you get a chance after finishing your grant proposal, could you send me any blurbs you collected off-list? I'll follow up with those sources to get their consent for re-use. --Janet Lance Boyle wrote: > Perhaps it would be useful to have a store of such blurbs readily > accessible on the SciPy and/or Python home pages, if such does not > already exist. 
> > Lance > > >>> If you are using python for high performance or scientific computing, >>> could I ask you to send me a blurb (one or two sentences) on what you >>> are using it for and if you have any publications or references >>> related to this work (the refs don't have to be specifically about >>> the python stuff) please include them (extra points for bibtex). >>> >>> I am writing some grant text justifying python in scientific >>> computing and would like to have a range of examples across >>> disciplines to provide. >>> >>> If I could get these today or tomorrow that would be very helpful! >>> Please send these to me off list. >>> >>> Thanks! >>> JDH >> -- Janet Swisher --- Senior Technical Writer Enthought, Inc. http://www.enthought.com From jdhunter at ace.bsd.uchicago.edu Fri May 20 11:21:26 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Fri, 20 May 2005 10:21:26 -0500 Subject: [SciPy-user] send me your blurbs In-Reply-To: <428DF9FB.1030508@enthought.com> ("Janet M. Swisher"'s message of "Fri, 20 May 2005 09:53:47 -0500") References: <87ekc48m2a.fsf@peds-pc311.bsd.uchicago.edu> <200505181401.18014.sransom@cv.nrao.edu> <428DF9FB.1030508@enthought.com> Message-ID: <8764xdj53d.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Janet" == Janet M Swisher writes: Janet> I'm happy to collect such blurbs for the SciPy site. I'll Janet> assume that if you posted a blurb to the list, then it Janet> would be OK to quote you on the site. (If that's not true Janet> for you, please email me.) JDH, when you get a chance after Janet> finishing your grant proposal, could you send me any blurbs Janet> you collected off-list? I'll follow up with those sources Janet> to get their consent for re-use. Hey Janet, Will do, I've collected them in a folder, about 40 and counting... Only one person asked that the information be kept private until a site launch, and you will see that in the email. They are still trickling in, and I'll get these to you after next Friday (deadline!) In your followup email, it might be nice to ask for an image of screenshot if applicable -- this could make for a nice promotional page. Thanks, JDH From ryanfedora at comcast.net Fri May 20 11:45:39 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Fri, 20 May 2005 11:45:39 -0400 Subject: [SciPy-user] weave inline functions recompiling on first run In-Reply-To: <428D604F.9020009@colorado.edu> References: <428D5161.9020305@comcast.net> <428D604F.9020009@colorado.edu> Message-ID: <428E0623.2080507@comcast.net> How should I start tracking it if I were serious about it? Ryan Fernando Perez wrote: > Ryan Krauss wrote: > >> I have a function that uses weave inline that recompiles the first >> time I run a script every time I start ipython, even though I have >> not changed the script in some time. The first time the script I get >> the message: >> >> repairing catalog by removing key >> file changed >> >> What can cause this? > > > I don't know exactly, but I've seen it a lot (and it happens even if > you run the script outside ipython, so it's not ipython's fault). I'm > pretty convinced it's a bug in weave. About a year ago I tracked and > fixed a similar bug, but it took some serious effort: the weave code > isn't particularly simple. Since I haven't been able to find a block > of time I can sink into this, I've been avoiding it, unfortunately. It > might not prove to be so tricky with this one, so give it a go if you > have a bit of spare time. 
> > cheers, > > f > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From Fernando.Perez at colorado.edu Fri May 20 12:22:57 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 20 May 2005 10:22:57 -0600 Subject: [SciPy-user] weave inline functions recompiling on first run In-Reply-To: <428E0623.2080507@comcast.net> References: <428D5161.9020305@comcast.net> <428D604F.9020009@colorado.edu> <428E0623.2080507@comcast.net> Message-ID: <428E0EE1.1020009@colorado.edu> Ryan Krauss wrote: > How should I start tracking it if I were serious about it? Well, all I did was go into the weave code and blindly started putting print statements, until I figured out what was wrong :) best, f From Fernando.Perez at colorado.edu Fri May 20 12:29:21 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 20 May 2005 10:29:21 -0600 Subject: [SciPy-user] weave inline functions recompiling on first run In-Reply-To: <428E0EE1.1020009@colorado.edu> References: <428D5161.9020305@comcast.net> <428D604F.9020009@colorado.edu> <428E0623.2080507@comcast.net> <428E0EE1.1020009@colorado.edu> Message-ID: <428E1061.2040801@colorado.edu> Fernando Perez wrote: > Ryan Krauss wrote: > >>How should I start tracking it if I were serious about it? > > > Well, all I did was go into the weave code and blindly started putting print > statements, until I figured out what was wrong :) Sorry for the plug, but I should add that for this kind of complicated debugging in a codebase you aren't familiar with, using an embedded ipython can really help a lot. The instructions are here: http://ipython.scipy.org/doc/manual/node9.html This can really save you a lot of time, and it takes (if you have ipython installed) only two lines of code to implement (which you can copy/paste from the above link). Best, f From ryanfedora at comcast.net Fri May 20 15:52:37 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Fri, 20 May 2005 15:52:37 -0400 Subject: [SciPy-user] conj and element-by-element multiplication Message-ID: <428E4005.8030003@comcast.net> This is really more of a NumPy question, but I was calculating coherence and I need to multiply a vector by its own complex conjugate element-by-element. This should return a vector of real numbers in my mind, but instead returns a vector with imaginary parts identically equal to zero. Now maybe I have been spoiled by Matlab, but I ended up having to test that the imaginary parts really are 0 and then just kept the real part. Should this happen automatically, or am I just being a lazier whiner? Ryan From stephen.walton at csun.edu Fri May 20 15:59:57 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 20 May 2005 12:59:57 -0700 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <428E4005.8030003@comcast.net> References: <428E4005.8030003@comcast.net> Message-ID: <428E41BD.8080007@csun.edu> Ryan Krauss wrote: > This is really more of a NumPy question, but I was calculating > coherence and I need to multiply a vector by its own complex > conjugate element-by-element. This should return a vector of real > numbers in my mind, but instead returns a vector with imaginary parts > identically equal to zero. Are you absolutely sure? I do precisely the same calculations, often, and the imaginary parts are not zero; they are small, well within roundoff error. I usually just take scipy.real() of the result when I'm done. 
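A tiny check along the lines Stephen suggests (an illustrative session; for a vector multiplied element-by-element by its own conjugate the cancellation in the imaginary part is actually exact, which matches what Ryan reports):

    from scipy import array, conjugate, real, imag

    x = array([1+2j, 3-1j, -2+0.5j])
    p = x * conjugate(x)       # result still has a complex typecode ...
    print max(abs(imag(p)))    # ... but the imaginary parts are exactly 0.0 here
    print real(p)              # the real vector you want: [ 5. 10. 4.25]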
From ryanfedora at comcast.net Fri May 20 16:09:17 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Fri, 20 May 2005 16:09:17 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <428E41BD.8080007@csun.edu> References: <428E4005.8030003@comcast.net> <428E41BD.8080007@csun.edu> Message-ID: <428E43ED.4070803@comcast.net> So far, it seems to be true. I calculated the coherence of about 10 sets of data so far and included the following code to check this: cohnum=multiply(Gxyave,conj(Gxyave)) cohden=multiply(Gxxave,Gyyave) curavebode.coh=divide(cohnum,cohden) print('max(abs(imag(coh)))='+str(max(abs(imag(curavebode.coh))))) print(str(max(abs(imag(curavebode.coh)))==0.0)) and I am getting 0.0 as the max(abs(imag(coherence))) and True for the boolean test. Ryan Stephen Walton wrote: > Ryan Krauss wrote: > >> This is really more of a NumPy question, but I was calculating >> coherence and I need to multiply a vector by its own complex >> conjugate element-by-element. This should return a vector of real >> numbers in my mind, but instead returns a vector with imaginary parts >> identically equal to zero. > > > Are you absolutely sure? I do precisely the same calculations, often, > and the imaginary parts are not zero; they are small, well within > roundoff error. I usually just take scipy.real() of the result when > I'm done. > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From cookedm at physics.mcmaster.ca Fri May 20 16:30:00 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 20 May 2005 16:30:00 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <428E4005.8030003@comcast.net> (Ryan Krauss's message of "Fri, 20 May 2005 15:52:37 -0400") References: <428E4005.8030003@comcast.net> Message-ID: Ryan Krauss writes: > This is really more of a NumPy question, but I was calculating > coherence and I need to multiply a vector by its own complex > conjugate element-by-element. This should return a vector of real > numbers in my mind, but instead returns a vector with imaginary parts > identically equal to zero. Now maybe I have been spoiled by Matlab, > but I ended up having to test that the imaginary parts really are 0 > and then just kept the real part. There's no downcasting; if you want a vector of real numbers, you have to ask for it. The multiply doesn't actually know that the two things it's multiplying are conjugates of each other, and checking for that would slow down the general case. In this case, you're probably better off doing the norm explicitly. Using scipy, it'd look like this: def norm2(x): """compute |x|^2 = x*conjugate(x)""" if scipy.iscomplexobj(x): return x.real**2 + x.imag**2 else: return x**2 which gives you a real vector back. This is also theoretically twice as fast, as you don't do the imaginary part (which comes to zero). -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ryanfedora at comcast.net Fri May 20 17:08:00 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Fri, 20 May 2005 17:08:00 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: References: <428E4005.8030003@comcast.net> Message-ID: <428E51B0.60903@comcast.net> An HTML attachment was scrubbed... 
URL: From cookedm at physics.mcmaster.ca Fri May 20 17:12:57 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 20 May 2005 17:12:57 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <428E51B0.60903@comcast.net> (Ryan Krauss's message of "Fri, 20 May 2005 17:08:00 -0400") References: <428E4005.8030003@comcast.net> <428E51B0.60903@comcast.net> Message-ID: Ryan Krauss writes: > Thanks David, that seems like the right answer. But to get the speed benefit > you mention, I need to use multiply(x.real,x.real) instead of x.real**2 - same > for imag. There is about a factor of 10 time difference between the two. So > my norm2 function now looks like this: Are you on Windows? The pow() function there, well, sucks big time. On other platforms the two are pretty close in speed. I'm working on a better version of the power ufunc for Numeric, so hopefully in a later version this will be fixed (or rather, worked around). -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ryanfedora at comcast.net Fri May 20 17:20:31 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Fri, 20 May 2005 17:20:31 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: References: <428E4005.8030003@comcast.net> <428E51B0.60903@comcast.net> Message-ID: <428E549F.5050305@comcast.net> An HTML attachment was scrubbed... URL: From cookedm at physics.mcmaster.ca Fri May 20 18:03:01 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 20 May 2005 18:03:01 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <428E549F.5050305@comcast.net> (Ryan Krauss's message of "Fri, 20 May 2005 17:20:31 -0400") References: <428E4005.8030003@comcast.net> <428E51B0.60903@comcast.net> <428E549F.5050305@comcast.net> Message-ID: Ryan Krauss writes: > No, I am running Linux. Odd; I do too, and don't see this. glibc's pow doesn't have the same problem as Window's does. x**2 is at most 50% slower, not 10x. Which version of Numeric and python are you using, and which Linux distribution? (Numeric's version you can get from Numeric.__version__) Ideally, x**2 should run in the same time as x*x You're using scipy's multiply, which is the same (I think) as the multiply in Numeric 24.0b2, which are different from 23.8 and earlier (which does a time-wasting postcheck of the array for infinite values). > David M. Cooke wrote: > > Ryan Krauss writes: > > > > Thanks David, that seems like the right answer. But to get the speed benefit > you mention, I need to use multiply(x.real,x.real) instead of x.real**2 - same > for imag. There is about a factor of 10 time difference between the two. So > my norm2 function now looks like this: > > > > Are you on Windows? The pow() function there, well, sucks big time. On > other platforms the two are pretty close in speed. > > I'm working on a better version of the power ufunc for Numeric, so > hopefully in a later version this will be fixed (or rather, worked > around). -- |>|\/|< /--------------------------------------------------------------------------\ |David M. 
Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ryanfedora at comcast.net Fri May 20 18:35:31 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Fri, 20 May 2005 18:35:31 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: References: <428E4005.8030003@comcast.net> <428E51B0.60903@comcast.net> <428E549F.5050305@comcast.net> Message-ID: <428E6633.9090602@comcast.net> An HTML attachment was scrubbed... URL: From cookedm at physics.mcmaster.ca Fri May 20 22:23:30 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 20 May 2005 22:23:30 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <428E6633.9090602@comcast.net> References: <428E4005.8030003@comcast.net> <428E51B0.60903@comcast.net> <428E549F.5050305@comcast.net> <428E6633.9090602@comcast.net> Message-ID: <73B775E2-626C-4113-9C1D-9DAB4C3FE16C@physics.mcmaster.ca> On May 20, 2005, at 18:35, Ryan Krauss wrote: > I am running Numeric 24.02b on python 2.3.4. I am using Fedora > Core 3 and kde. > > I have modified the norm2 function to include printing both times: > > def norm2(x): > """compute |x|^2 = x*conjugate(x)""" > if iscomplexobj(x): > t1=time.time() > mat1=x.real**2 + x.imag**2 > t2=time.time() > mat2=multiply(x.real,x.real) + multiply(x.imag,x.imag) > t3=time.time() > print('pow time='+str(t2-t1)) > print('multiply time='+str(t3-t2)) > return mat2 > else: > return multiply(x,x) > > and get these times printed to the screen: > pow time=0.00401091575623 > multiply time=0.000787019729614 > pow time=0.00442790985107 > multiply time=0.000679016113281 > pow time=0.0130319595337 > multiply time=0.000170946121216 > pow time=0.00465989112854 > multiply time=0.000912189483643 > pow time=0.00434708595276 > multiply time=0.000770807266235 > pow time=0.00153613090515 > multiply time=0.000103950500488 > > Most of the calls are with 4200x3 matrices. The last one is with a > 4200x1 vector (I think). Curious. One more question: are these Complex32 matrices? I.e., are they matrices using C floats (single precision), as opposed to C doubles (double precision)? The reason is that if x is a vector of floats, x**2 upcasts x to doubles first (b/c 2 is an int -- ints are cast to doubles), so that kills the speed, and I can get timings like this. This isn't a good behaviour, and I'm going to see what I can do to fix this for the next version of Numeric. > David M. Cooke wrote: >> Ryan Krauss writes: >>> No, I am running Linux. >> Odd; I do too, and don't see this. glibc's pow doesn't have the >> same problem as Window's does. x**2 is at most 50% slower, not >> 10x. Which version of Numeric and python are you using, and which >> Linux distribution? (Numeric's version you can get from >> Numeric.__version__) Ideally, x**2 should run in the same time as >> x*x You're using scipy's multiply, which is the same (I think) as >> the multiply in Numeric 24.0b2, which are different from 23.8 and >> earlier (which does a time-wasting postcheck of the array for >> infinite values). >>> David M. Cooke wrote: Ryan Krauss >>> writes: Thanks David, that seems like the right answer. But to >>> get the speed benefit you mention, I need to use multiply >>> (x.real,x.real) instead of x.real**2 - same for imag. There is >>> about a factor of 10 time difference between the two. So my norm2 >>> function now looks like this: Are you on Windows? The pow() >>> function there, well, sucks big time. On other platforms the >>> two are pretty close in speed. 
I'm working on a better version of >>> the power ufunc for Numeric, so hopefully in a later version this >>> will be fixed (or rather, worked around). -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From sransom at nrao.edu Fri May 20 23:24:21 2005 From: sransom at nrao.edu (Scott Ransom) Date: Fri, 20 May 2005 23:24:21 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <73B775E2-626C-4113-9C1D-9DAB4C3FE16C@physics.mcmaster.ca> References: <428E4005.8030003@comcast.net> <428E51B0.60903@comcast.net> <428E549F.5050305@comcast.net> <428E6633.9090602@comcast.net> <73B775E2-626C-4113-9C1D-9DAB4C3FE16C@physics.mcmaster.ca> Message-ID: <20050521032421.GA14395@ssh.cv.nrao.edu> Hi David, You can prevent the upcasts using the .savespace() method. For example: In [8]:a = arange(10.0, typecode='F') In [9]:a**2 Out[9]: array([ 0.+0.j, 1.+0.j, 4.+0.j, 9.+0.j, 16.+0.j, 25.+0.j, 36.+0.j, 49.+0.j, 64.+0.j, 81.+0.j]) In [10]:a.savespace() In [11]:a**2 Out[11]: array([ 0.+0.j, 1.+0.j, 4.+0.j, 9.+0.j, 16.+0.j, 25.+0.j, 36.+0.j, 49.+0.j, 64.+0.j, 81.+0.j],'F') This is _very_ useful when dealing with very large arrays... Scott > Curious. One more question: are these Complex32 matrices? I.e., are > they matrices using C floats (single precision), as opposed to C > doubles (double precision)? The reason is that if x is a vector of > floats, x**2 upcasts x to doubles first (b/c 2 is an int -- ints are > cast to doubles), so that kills the speed, and I can get timings like > this. This isn't a good behaviour, and I'm going to see what I can do > to fix this for the next version of Numeric. > -- Scott M. Ransom Address: NRAO Phone: (434) 296-0320 520 Edgemont Rd. email: sransom at nrao.edu Charlottesville, VA 22903 USA GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From ryanfedora at comcast.net Sat May 21 10:58:30 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Sat, 21 May 2005 10:58:30 -0400 Subject: [SciPy-user] conj and element-by-element multiplication In-Reply-To: <73B775E2-626C-4113-9C1D-9DAB4C3FE16C@physics.mcmaster.ca> References: <428E4005.8030003@comcast.net> <428E51B0.60903@comcast.net> <428E549F.5050305@comcast.net> <428E6633.9090602@comcast.net> <73B775E2-626C-4113-9C1D-9DAB4C3FE16C@physics.mcmaster.ca> Message-ID: <428F4C96.1030009@comcast.net> So I am not sure about the single or double precision complex question. I changed my code to print out the types and some more diagnostic information. They all seem to be doubles. Most of the time multiply is 5-7 times faster for 4200x3 matrices and 10-12 times faster for 4200x1 vectors. Sometimes they are strangely nearly the same speed, but this just happens once in a while with no reason - I suspect other system processes?. 
Take a look at the code and the output: def norm2(x): """compute |x|^2 = x*conjugate(x)""" if iscomplexobj(x): t1=time.time() mat1=x.real**2 + x.imag**2 t2=time.time() mat2=multiply(x.real,x.real) + multiply(x.imag,x.imag) t3=time.time() print('---------------------------') print('pow time='+str(t2-t1)) print('multiply time='+str(t3-t2)) print('pow time/multiply time='+str((t2-t1)/(t3-t2))) print('shape(x)='+str(shape(x))) print('x.typecode='+str(x.typecode())) print('mat1.typecode='+str(mat1.typecode())) print('mat2.typecode='+str(mat2.typecode())) if len(shape(x))==1: print('type(x[0,0])='+str(type(x[0]))) print('type(mat1[0,0])='+str(type(mat1[0]))) print('type(mat2[0,0])='+str(type(mat2[0]))) else: print('type(x[0,0])='+str(type(x[0,0]))) print('type(mat1[0,0])='+str(type(mat1[0,0]))) print('type(mat2[0,0])='+str(type(mat2[0,0]))) return mat2 else: return multiply(x,x) and I now get this output: --------------------------- pow time=0.00702810287476 multiply time=0.00100302696228 pow time/multiply time=7.00689327312 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00710797309875 multiply time=0.00673699378967 pow time/multiply time=1.05506600134 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.0021538734436 multiply time=0.000178098678589 pow time/multiply time=12.093708166 shape(x)=(4200,) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00687909126282 multiply time=0.0013599395752 pow time/multiply time=5.05838008415 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00610399246216 multiply time=0.00104904174805 pow time/multiply time=5.81863636364 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00214910507202 multiply time=0.000160932540894 pow time/multiply time=13.3540740741 shape(x)=(4200,) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= curamp=0.5 --------------------------- pow time=0.00592398643494 multiply time=0.0013701915741 pow time/multiply time=4.32347311641 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00587201118469 multiply time=0.000941038131714 pow time/multiply time=6.23992906005 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00174999237061 multiply time=0.0001380443573 pow time/multiply time=12.677029361 shape(x)=(4200,) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00701999664307 multiply time=0.00106501579285 pow time/multiply time=6.59144839937 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00612807273865 multiply time=0.000999927520752 pow time/multiply time=6.12851692895 shape(x)=(4200, 3) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= --------------------------- pow time=0.00207996368408 multiply 
time=0.0001540184021 pow time/multiply time=13.5046439628 shape(x)=(4200,) x.typecode=D mat1.typecode=d mat2.typecode=d type(x[0,0])= type(mat1[0,0])= type(mat2[0,0])= curamp=0.75 --------------------------- David M. Cooke wrote: > On May 20, 2005, at 18:35, Ryan Krauss wrote: > >> I am running Numeric 24.02b on python 2.3.4. I am using Fedora Core 3 >> and kde. >> >> I have modified the norm2 function to include printing both times: >> >> def norm2(x): >> """compute |x|^2 = x*conjugate(x)""" >> if iscomplexobj(x): >> t1=time.time() >> mat1=x.real**2 + x.imag**2 >> t2=time.time() >> mat2=multiply(x.real,x.real) + multiply(x.imag,x.imag) >> t3=time.time() >> print('pow time='+str(t2-t1)) >> print('multiply time='+str(t3-t2)) >> return mat2 >> else: >> return multiply(x,x) >> >> and get these times printed to the screen: >> pow time=0.00401091575623 >> multiply time=0.000787019729614 >> pow time=0.00442790985107 >> multiply time=0.000679016113281 >> pow time=0.0130319595337 >> multiply time=0.000170946121216 >> pow time=0.00465989112854 >> multiply time=0.000912189483643 >> pow time=0.00434708595276 >> multiply time=0.000770807266235 >> pow time=0.00153613090515 >> multiply time=0.000103950500488 >> >> Most of the calls are with 4200x3 matrices. The last one is with a >> 4200x1 vector (I think). > > > Curious. One more question: are these Complex32 matrices? I.e., are > they matrices using C floats (single precision), as opposed to C > doubles (double precision)? The reason is that if x is a vector of > floats, x**2 upcasts x to doubles first (b/c 2 is an int -- ints are > cast to doubles), so that kills the speed, and I can get timings like > this. This isn't a good behaviour, and I'm going to see what I can do > to fix this for the next version of Numeric. > >> David M. Cooke wrote: >> >>> Ryan Krauss writes: >>> >>>> No, I am running Linux. >>> >>> Odd; I do too, and don't see this. glibc's pow doesn't have the same >>> problem as Window's does. x**2 is at most 50% slower, not 10x. Which >>> version of Numeric and python are you using, and which Linux >>> distribution? (Numeric's version you can get from >>> Numeric.__version__) Ideally, x**2 should run in the same time as >>> x*x You're using scipy's multiply, which is the same (I think) as >>> the multiply in Numeric 24.0b2, which are different from 23.8 and >>> earlier (which does a time-wasting postcheck of the array for >>> infinite values). >>> >>>> David M. Cooke wrote: Ryan Krauss writes: >>>> Thanks David, that seems like the right answer. But to get the >>>> speed benefit you mention, I need to use multiply (x.real,x.real) >>>> instead of x.real**2 - same for imag. There is about a factor of 10 >>>> time difference between the two. So my norm2 function now looks >>>> like this: Are you on Windows? The pow() function there, well, >>>> sucks big time. On other platforms the two are pretty close in >>>> speed. I'm working on a better version of the power ufunc for >>>> Numeric, so hopefully in a later version this will be fixed (or >>>> rather, worked around). >>> > From NadavH at VisionSense.com Sun May 22 08:41:09 2005 From: NadavH at VisionSense.com (Nadav Horesh) Date: Sun, 22 May 2005 15:41:09 +0300 Subject: [SciPy-user] Error in 2D interpolation Message-ID: <42907DE5.9070106@VisionSense.com> I am getting errors using interpolate.bisplrep function. I've followed the scipy tutorial on 2D interpolation (page 23) and have the following piece of code: . . . 
old_x, old_y = N.mgrid[boundries[2]:boundries[3]:1j*mat.shape[0],
                       boundries[0]:boundries[2]:1j*mat.shape[1]]
print old_x.shape, old_y.shape, mat.shape
tck = interpolate.bisplrep(old_x, old_y, mat, s=0)
. . .

generates the following error:

(59, 127) (59, 127) (59, 127)   # <== The output of the above print statement
iopt,kx,ky,m= 0 3 3 7493
nxest,nyest,nmax= 152 152 152
lwrk1,lwrk2,kwrk= 22908510 13011569 28518
xb,xe,yb,ye= 0.000182 0.011776 0.002325 0.000182
eps,s 1.E-16 0.
Traceback (most recent call last):
  File "process_mats.py", line 81, in ?
    interpolated = same_grid_interpolation(expanded, boundries, coord, 0.00008)
  File "process_mats.py", line 68, in same_grid_interpolation
    tck = interpolate.bisplrep(old_x, old_y, mat, s=0)
  File "/usr/local/lib/python2.4/site-packages/scipy/interpolate/fitpack.py", line 611, in bisplrep
    tx,ty,nxest,nyest,wrk,lwrk1,lwrk2)
SystemError: error return without exception set

System: Dual P4 PC + 512MB Gentoo linux (kernel 2.6.9) Python 2.4.1 scipy version 0.3.2_300.4521 Numeric 24.0

In another system I get MemoryError exception in the same place. The system is P3 + 192MB Gentoo linux (Kernel 2.4.20) Python 2.4.1 Numeric 23.7 Latest scipy from CVS.

Any ideas? Nadav.

From daniel at rozengardt.net Mon May 23 11:34:49 2005 From: daniel at rozengardt.net (daniel at rozengardt.net) Date: Mon, 23 May 2005 12:34:49 -0300 (ART) Subject: [SciPy-user] Scipy and Python 2.4a3 Message-ID: <33077.170.155.3.4.1116862489.squirrel@www.rozengardt.net>

Hello, I'm a newbie with scipy. I am working with Python 2.4a3 in a Windows XP environment. I couldn't install scipy because it seems to work only with Python 2.3. Any help about it? Thanks in advance Daniel Rozengardt

From perry at stsci.edu Mon May 23 14:25:04 2005 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 23 May 2005 14:25:04 -0400 Subject: [SciPy-user] Re: [AstroPy] Re: gui/thread issues In-Reply-To: <42921AD5.7050508@nrao.edu> References: <42921AD5.7050508@nrao.edu> Message-ID: <729cb13ce6781f439540fdcdd72e6a7b@stsci.edu>

On May 23, 2005, at 2:03 PM, David King wrote: > > In the short run, I would be ok just using Tk alone on IPython or plain > Python, although I'd be happiest having the whole picture. In the > long run, > our project will want to serve varied users, who will no doubt want to > use > python in their own way. >

Note that Tkinter windows under the standard Python interpreter are only 'alive' when the interpreter prompt is active. If you invoke a function that takes a long time, the window will be unresponsive since the Tk events are only handled within the Python interpreter loop. In this respect IPython should have an advantage with threading.

> > I will continue to study what I have (that forum-thread link was > helpful > too--thanks). But I can't help feeling other python gui programmers > must > run up against this problem too, and wonder why practically all the > standard > Tkinter doc (including Grayson's book) just says "you gotta call > mainloop() > to make anything actually appear and respond in Tkinter" (manifestly > untrue).

I think the problem is that, by and large, almost all uses of GUIs are centered on their being standalone applications and thus little thought is given to having a command line coexisting with the GUI. Our needs are pretty atypical in this regard.
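One common workaround for the situation described above -- a Tkinter window going dead while a long Python-level computation runs -- is to break the work into chunks and let Tk drain its event queue in between. A minimal, self-contained sketch (plain Tkinter, Python 2 style; the time.sleep call just stands in for real work):

    import time
    import Tkinter

    root = Tkinter.Tk()
    label = Tkinter.Label(root, text="working...")
    label.pack()

    def long_computation(n=100):
        for i in range(n):
            time.sleep(0.05)                          # one chunk of "real work"
            label.config(text="step %d of %d" % (i + 1, n))
            root.update()                             # let Tk redraw and handle events

    long_computation()
    root.mainloop()

This only helps when the work can be chopped up at the Python level; a single long call into a C extension that does not release the GIL will still freeze the window, which is the point made in the follow-ups below.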
Perry Greenfield From Fernando.Perez at colorado.edu Mon May 23 14:58:28 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 23 May 2005 12:58:28 -0600 Subject: [SciPy-user] Re: [AstroPy] Re: gui/thread issues In-Reply-To: <729cb13ce6781f439540fdcdd72e6a7b@stsci.edu> References: <42921AD5.7050508@nrao.edu> <729cb13ce6781f439540fdcdd72e6a7b@stsci.edu> Message-ID: <429227D4.20208@colorado.edu> Perry Greenfield wrote: > On May 23, 2005, at 2:03 PM, David King wrote: > >>In the short run, I would be ok just using Tk alone on IPython or plain >>Python, although I'd be happiest having the whole picture. In the >>long run, >>our project will want to serve varied users, who will no doubt want to >>use >>python in their own way. >> > > Note that Tkinter windows under the standard Python interpreter are > only 'alive' when the intepreter prompt is active. If you invoke a > function that takes a long time, the window will be unresponsive since > the Tk events are only handled within the Python interpreter loop. In > this respect IPython should have an advantage with threading. Well, the advantage provided by ipython is that, in addition to Tk (which you get 'for free' with plain python), you can have interactive WX, GTK, and as of current CVS, also QT GUIs controlled from a command line. The issues with the GIL are still there, so any long-running command which doesn't release the GIL will still block. But it does give some advantages. Regards, f From perry at stsci.edu Mon May 23 15:08:20 2005 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 23 May 2005 15:08:20 -0400 Subject: [SciPy-user] Re: [AstroPy] Re: gui/thread issues In-Reply-To: <429227D4.20208@colorado.edu> References: <42921AD5.7050508@nrao.edu> <729cb13ce6781f439540fdcdd72e6a7b@stsci.edu> <429227D4.20208@colorado.edu> Message-ID: On May 23, 2005, at 2:58 PM, Fernando Perez wrote: > > The issues with the GIL are still there, so any long-running command > which doesn't release the GIL will still block. But it does give some > advantages. > > Isn't that mainly an issue as to whether things can run concurrently? As far as event handling, even when the GIL blocks, there still should be thread switches on a regular basis within Python code so that even though only one thread runs at a time, the events will be handled during most long-running functions (perhaps not long-running extension functions). But maybe I'm mixed up on that. Perry From Fernando.Perez at colorado.edu Mon May 23 16:45:12 2005 From: Fernando.Perez at colorado.edu (Fernando.Perez at colorado.edu) Date: Mon, 23 May 2005 14:45:12 -0600 Subject: [SciPy-user] Re: [AstroPy] Re: gui/thread issues In-Reply-To: References: <42921AD5.7050508@nrao.edu> <729cb13ce6781f439540fdcdd72e6a7b@stsci.edu> <429227D4.20208@colorado.edu> Message-ID: <1116881112.429240d84416a@webmail.colorado.edu> Quoting Perry Greenfield : > > On May 23, 2005, at 2:58 PM, Fernando Perez wrote: > > > > The issues with the GIL are still there, so any long-running command > > which doesn't release the GIL will still block. But it does give some > > advantages. > > > > > Isn't that mainly an issue as to whether things can run concurrently? > As far as event handling, even when the GIL blocks, there still should > be thread switches on a regular basis within Python code so that even > though only one thread runs at a time, the events will be handled > during most long-running functions (perhaps not long-running extension > functions). But maybe I'm mixed up on that. You are perfectly correct. 
What I had in mind (but I wasn't explicit enough) is the common (for scientific users) case of calling a long-running C extension (say an eigenvalue calculation for a large matrix). This will block the GUI, since there are no thread switches happening at that point. Regards, f

From rosario.ruiloba at noveltis.fr Tue May 24 09:09:08 2005 From: rosario.ruiloba at noveltis.fr (Rosa Ruiloba) Date: 24 May 2005 15:09:08 +0200 Subject: [SciPy-user] Help on polar plot Message-ID: <1116940148.30802.10.camel@rosae.noveltis.fr>

Hello, I'm plotting vectors using the gplt.polar function. I want to know if it's possible to plot several curves in the same plot with a command different than: polar(theta1,rho1,theta2,rho2) Is it possible to add a polar curve in an existing polar plot? Thank you, Rosa -- -- NOVELTIS Parc Technologique du Canal 2, avenue de l'Europe 31526 RAMONVILLE SAINT AGNE CEDEX Tel: +(33) (0)5.62.88.11.23

From Jim.Vickroy at noaa.gov Tue May 24 10:12:56 2005 From: Jim.Vickroy at noaa.gov (Jim Vickroy) Date: Tue, 24 May 2005 08:12:56 -0600 Subject: [SciPy-user] Scipy and Python 2.4a3 In-Reply-To: <33077.170.155.3.4.1116862489.squirrel@www.rozengardt.net> Message-ID:

Hello Daniel, Recently, I made a similar inquiry and received a reply that a Python 2.4-compatible release would be available in ~ 4-6 weeks. -- jv

-----Original Message----- From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net]On Behalf Of daniel at rozengardt.net Sent: Monday, May 23, 2005 9:35 AM To: scipy-user at scipy.net Subject: [SciPy-user] Scipy and Python 2.4a3

Hello, I'm a newbie with scipy. I am working with Python 2.4a3 in a Windows XP environment. I couldn't install scipy because it seems to work only with Python 2.3. Any help about it? Thanks in advance Daniel Rozengardt

_______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user

From rolf.wester at ilt.fraunhofer.de Wed May 25 10:09:51 2005 From: rolf.wester at ilt.fraunhofer.de (Rolf Wester) Date: Wed, 25 May 2005 16:09:51 +0200 Subject: [SciPy-user] plt multiple plots? Message-ID: <4294872F.90003@ilt.fraunhofer.de>

Hi, I would like to use plt to make multiple plots in one window and/or in multiple windows. I couldn't find anything about this in the docs. I would be very appreciative for any help on this problem. Thank you in anticipation Regards Rolf Wester

From robidoux at cs.laurentian.ca Thu May 26 17:46:11 2005 From: robidoux at cs.laurentian.ca (robidoux at cs.laurentian.ca) Date: Thu, 26 May 2005 17:46:11 -0400 (EDT) Subject: [SciPy-user] Is it reasonable to ignore a "mild looking" FAILED scipy.test? Message-ID: <200505262146.j4QLkBqX027982@altair.cs.laurentian.ca>

Scipy virgin here: Is it reasonable to ignore a "mild looking" FAILED scipy.test and get on with learning to use scipy?
The story: After fairly careful linux install, scipy.test(level=1,verbosity=2) gives

>>>>>BEGIN OUTPUT
... [snip]
FAIL: check_cdf (scipy.stats.distributions.test_distributions.test_fatiguelife)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "", line 10, in check_cdf
AssertionError: D = 0.362327598842; pval = 0.000245540127335; alpha = 0.01 args = (1.0213303917789236,)
----------------------------------------------------------------------
Ran 743 tests in 5.665s

FAILED (failures=1, errors=15)
<<<<<END OUTPUT

>>>>>BEGIN OUTPUT
!! No test file 'test_quadpack.py' found for
<<<<<END OUTPUT

Hello, I would need some help or advice on the following point : I'm trying to fit some X-ray scattering data with either gaussian or lorentzian functions using the least square algorithm (optimize.leastsq). I started with something quite simple, i.e. defining the residuals for the different functions in a similar way as in the tutorial and then calling the leastsq routine. In the end, I find the results a bit puzzling : if I define the data to be fitted as a gaussian + random noise (as in the tutorial), the fit with gaussian functions is perfect. The fit of the same data with lorentzian is not very good, I have the impression that the fit position is quite OK and the integrated intensity as well, but the maximum intensity is way off... If I use real data where I have two peaks and linear background, as in the example below, then I simply cannot get correct fit, no matter how good the initial estimates are... Can anyone give me a bit of light on this one ? Many thanks Aure

def multifuncresiduals(multifunc,y,x,gaunum=2,lornum=0,linenum=0):
    err = y
    for i in range(gaunum):    #gaussian params come first in list
        print multifunc[2*i],multifunc[2*i+1]
        err = err-1/(sqrt(2*pi)*multifunc[2*i+1])*\
              exp(-(x-multifunc[2*i])**2/(2*multifunc[2*i+1]**2))
    for i in range(lornum):    #lorentzian params come first in list
        i += gaunum            #need to correct for index
        print multifunc[2*i],multifunc[2*i+1]
        err = err-1/pi*0.5*multifunc[2*i+1]/((x-multifunc[2*i])**2+(0.5*multifunc[2*i+1])**2)
    for i in range(linenum):
        i += gaunum + lornum   #need to correct for index
        print multifunc[2*i],multifunc[2*i+1]
        err = err-(multifunc[2*i+1]+multifunc[2*i]*x)
    print '----------------'
    return err

def multifunc(y,x,multifuncinit):
    result = leastsq(multifuncresiduals,multifuncinit,args=(y,x))
    return result

if __name__ == '__main__':
    ########################################
    #test for multiple gaussian fit based on input file
    #
    #read chiplot file content
    datafilename = 'D:/Boulot/Software/Python/azimuth.chi'
    x,y = readChi(datafilename)
    #define init values
    multigauinit = [90,100,270,100]   #initial estimates
    result = multifunc(y,x,multigauinit)
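One thing worth noting about the residual function above is that the Gaussian and Lorentzian terms are normalized to unit area, and with the defaults (linenum=0) there is no background either, so the model has no free amplitude and cannot match data of arbitrary scale; that may be why the real-data fits fail and why the Lorentzian peak height looks off even when position and area come out roughly right. Below is a minimal sketch of the same kind of fit with explicit amplitude parameters and a synthetic two-peak data set; the parameter layout and names are mine, not the original poster's:

    from scipy import arange, exp
    from scipy.optimize import leastsq

    def model(p, x, ngau=1, nlor=1):
        # p = [amp, centre, width]*ngau + [amp, centre, width]*nlor + [slope, offset]
        y = 0.0 * x
        for i in range(ngau):
            amp, mu, sig = p[3*i:3*i+3]
            y = y + amp * exp(-(x - mu)**2 / (2.0 * sig**2))
        for i in range(ngau, ngau + nlor):
            amp, mu, gam = p[3*i:3*i+3]
            y = y + amp * (0.5*gam)**2 / ((x - mu)**2 + (0.5*gam)**2)
        slope, offset = p[3*(ngau+nlor)], p[3*(ngau+nlor)+1]
        return y + slope * x + offset

    def residuals(p, y, x):
        return y - model(p, x)

    # synthetic "two peaks on a linear background" test data
    x = arange(0.0, 360.0, 1.0)
    ptrue = [120.0, 90.0, 8.0,  60.0, 270.0, 12.0,  0.01, 5.0]
    y = model(ptrue, x)

    p0 = [100.0, 85.0, 10.0,  50.0, 260.0, 10.0,  0.0, 0.0]   # rough initial estimates
    pfit = leastsq(residuals, p0, args=(y, x))[0]
    print pfit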