From david.schneider at bivab.de Tue Sep 3 20:04:15 2013 From: david.schneider at bivab.de (David Schneider) Date: Tue, 3 Sep 2013 20:04:15 +0200 Subject: [pypy-dev] Buildbot update Message-ID: <85123D5D-1C1F-4C4D-874B-E4122B75CA52@bivab.de> Hi all, I just upgraded the PyPy buildbot on buildbot.pypy.org to run on version 0.8.8 of buildbot. This update should not create any significant visible changes and mainly simplifies some things in the codebase. Also the update should make it easier to add some features, such as decoupling the translation from the tests, hopefully allowing to reuse binaries for different test runs. As usual with updates there might be some things that do not work as expected or at all. In case you notice something please let me know either by email or on IRC. Cheers, David From n210241048576 at gmail.com Wed Sep 4 05:53:19 2013 From: n210241048576 at gmail.com (Robert Grosse) Date: Tue, 3 Sep 2013 23:53:19 -0400 Subject: [pypy-dev] Windows 7 x64 development In-Reply-To: References: <1B12946B-109E-4105-B1CB-60FDD1A59E49@gmail.com> Message-ID: It looks like CPython assumes the use of Visual Studio on Windows, but the express edition does not support 64bit compilation. Would it be feasible to use Mingw instead? I've looked around online, but it seems pretty discouraging. On Fri, Aug 23, 2013 at 1:40 PM, Armin Rigo wrote: > Hi again, > > On Wed, Aug 21, 2013 at 9:46 AM, Armin Rigo wrote: > > I finally wrote out the details of what I think is a reasonable plan. > > > > https://bitbucket.org/pypy/pypy/raw/default/pypy/doc/windows.rst > > "What is missing for a full 64-bit translation" > > Updated the file. Anyone with an interest in helping on Win64, please > start by looking there --- the first step does not require any PyPy > knowledge, because it's hacking at *CPython* :-) > > > A bient?t, > > Armin. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From clay.sweetser at gmail.com Wed Sep 4 06:15:40 2013 From: clay.sweetser at gmail.com (Clay Sweetser) Date: Wed, 4 Sep 2013 00:15:40 -0400 Subject: [pypy-dev] Windows 7 x64 development In-Reply-To: References: <1B12946B-109E-4105-B1CB-60FDD1A59E49@gmail.com> Message-ID: Um, I believe you're mistaken. Though it's true that visual studio express doesn't *come * with a 64 bit compiler (at least on windows 7 and below, the latest one for win8 does) you can still download the windows 64 bit SDK and use the one that comes with that. Sincerely, Clay Sweetser "Evil begins when you begin to think of people as things." - Terry Pratchett On Sep 3, 2013 11:53 PM, "Robert Grosse" wrote: > It looks like CPython assumes the use of Visual Studio on Windows, but the > express edition does not support 64bit compilation. Would it be feasible to > use Mingw instead? I've looked around online, but it seems pretty > discouraging. > > > On Fri, Aug 23, 2013 at 1:40 PM, Armin Rigo wrote: > >> Hi again, >> >> On Wed, Aug 21, 2013 at 9:46 AM, Armin Rigo wrote: >> > I finally wrote out the details of what I think is a reasonable plan. >> > >> > https://bitbucket.org/pypy/pypy/raw/default/pypy/doc/windows.rst >> > "What is missing for a full 64-bit translation" >> >> Updated the file. Anyone with an interest in helping on Win64, please >> start by looking there --- the first step does not require any PyPy >> knowledge, because it's hacking at *CPython* :-) >> >> >> A bient?t, >> >> Armin. 
>> > > > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matti.picus at gmail.com Wed Sep 4 10:28:06 2013 From: matti.picus at gmail.com (Matti Picus) Date: Wed, 04 Sep 2013 11:28:06 +0300 Subject: [pypy-dev] Windows 7 x64 development In-Reply-To: References: <1B12946B-109E-4105-B1CB-60FDD1A59E49@gmail.com> Message-ID: <5226EF16.1040403@gmail.com> In a strange coincidence, compining for 64 bit windows was in today's numpy mailing list, Christoph Gohlke wrote: " I would not recommend the VS Express version. Instead use the "Microsoft Windows SDK for Windows 7 and .NET Framework 3.5 SP1" , which contains compatible 32 and 64 bit compilers for Python 2.6 to 3.2. Use the web installer or mount the ISO with VirtualCloneDrive . Then, on a command prompt in the numpy source directory type (not tested, but should work for 64 bit Python 2.7): setlocal EnableDelayedExpansion call "%ProgramFiles%\Microsoft SDKs\Windows\v7.0\Bin\SetEnv.Cmd" /Release /x64 /vista set DISTUTILS_USE_SDK=1 C:\Python27\python.exe setup.py build " Note that while using the VS Express version may give you a more pleasant IDE to debug with, adapting the steps above to building CPython should allow most of what's needed to implement the first steps of Armin's plan. Matti On 09/04/2013 07:15 AM, Clay Sweetser wrote: > > Um, I believe you're mistaken. Though it's true that visual studio > express doesn't *come * with a 64 bit compiler (at least on windows 7 > and below, the latest one for win8 does) you can still download the > windows 64 bit SDK and use the one that comes with that. > > Sincerely, Clay Sweetser > > "Evil begins when you begin to think of people as things." - Terry > Pratchett > > On Sep 3, 2013 11:53 PM, "Robert Grosse" > wrote: > > It looks like CPython assumes the use of Visual Studio on Windows, > but the express edition does not support 64bit compilation. Would > it be feasible to use Mingw instead? I've looked around online, > but it seems pretty discouraging. > > > On Fri, Aug 23, 2013 at 1:40 PM, Armin Rigo > wrote: > > Hi again, > > On Wed, Aug 21, 2013 at 9:46 AM, Armin Rigo > wrote: > > I finally wrote out the details of what I think is a > reasonable plan. > > > > https://bitbucket.org/pypy/pypy/raw/default/pypy/doc/windows.rst > > "What is missing for a full 64-bit translation" > > Updated the file. Anyone with an interest in helping on > Win64, please > start by looking there --- the first step does not require any > PyPy > knowledge, because it's hacking at *CPython* :-) > > > A bient?t, > > Armin. > > > > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev > > > > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev From matti.picus at gmail.com Wed Sep 4 10:08:55 2013 From: matti.picus at gmail.com (Matti Picus) Date: Wed, 04 Sep 2013 11:08:55 +0300 Subject: [pypy-dev] Buildbot update In-Reply-To: <85123D5D-1C1F-4C4D-874B-E4122B75CA52@bivab.de> References: <85123D5D-1C1F-4C4D-874B-E4122B75CA52@bivab.de> Message-ID: <5226EA97.5090508@gmail.com> TL;DR : Do you have any hints/instructions as to how to test a new buildbot task? Thanks for the update, it seems to have cleaned up much of the code base. 
I would like to add a task that installs the Python part of a numpy installation into pypy and runs numpy's tests. I have started the work on the numpy-tests branch. I see you improved the infrastructure for testing buildbot itself, but it is not clear to me how to test and debug the new task. Do I need to set up a master/slave pair, or can I test the tasks with only one process? What would a typical test command-line look like? And so on. Note that I would have to create a virtualenv and install nose into it since numpy's tests require nose, so any hints you could give would be helpful.
Matti

On 09/03/2013 09:04 PM, David Schneider wrote: > Hi all, > > I just upgraded the PyPy buildbot on buildbot.pypy.org to run on version 0.8.8 of buildbot. This update should not create any significant visible changes and mainly simplifies some things in the codebase. Also the update should make it easier to add some features, such as decoupling the translation from the tests, hopefully allowing to reuse binaries for different test runs. > > As usual with updates there might be some things that do not work as expected or at all. In case you notice something please let me know either by email or on IRC. > > Cheers, > > David > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev

From anto.cuni at gmail.com Wed Sep 4 11:38:32 2013 From: anto.cuni at gmail.com (Antonio Cuni) Date: Wed, 04 Sep 2013 10:38:32 +0100 Subject: [pypy-dev] [pypy-commit] pypy default: Enable inlining into the thread module so that Lock.acquire/release have a sane calling convention In-Reply-To: <20130903230304.866591C135D@cobra.cs.uni-duesseldorf.de> References: <20130903230304.866591C135D@cobra.cs.uni-duesseldorf.de> Message-ID: <5226FF98.3040000@gmail.com> Hi Alex, I think that this commit should come with a test_pypy_c test as well.
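For context, the test_pypy_c tests run a small program on a translated pypy-c and assert on the JIT traces it records. A rough sketch of the kind of test being asked for might look like the following; the class layout and helpers follow the usual pypy/module/pypyjit/test_pypy_c pattern, but the details here are illustrative, not taken from the actual suite:

    # Illustrative sketch only: check the trace produced by acquiring and
    # releasing a lock in a loop, now that the 'thread' module may be
    # inlined by the JIT policy.  The exact trace to match against would
    # have to be read off a real run.
    from pypy.module.pypyjit.test_pypy_c.test_00_model import BaseTestPyPyC


    class TestThreadModule(BaseTestPyPyC):
        def test_lock_acquire_release(self):
            def main(n):
                import thread
                lock = thread.allocate_lock()
                i = 0
                while i < n:
                    lock.acquire()
                    i += 1
                    lock.release()
                return i

            log = self.run(main, [500])
            assert log.result == 500
            loop, = log.loops_by_filename(self.filepath)
            # A real test would assert on the residual operations, e.g.:
            # assert loop.match("""...""")

The interesting assertion is on the residual calls for the acquire/release pair, which the policy-level check in test_policy.py alone cannot exercise.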
ciao, Anto On 04/09/13 00:03, alex_gaynor wrote: > Author: Alex Gaynor > Branch: > Changeset: r66780:a3e9a5394648 > Date: 2013-09-03 16:02 -0700 > http://bitbucket.org/pypy/pypy/changeset/a3e9a5394648/ > > Log: Enable inlining into the thread module so that Lock.acquire/release > have a sane calling convention > > diff --git a/pypy/module/pypyjit/policy.py b/pypy/module/pypyjit/policy.py > --- a/pypy/module/pypyjit/policy.py > +++ b/pypy/module/pypyjit/policy.py > @@ -109,7 +109,8 @@ > 'posix', '_socket', '_sre', '_lsprof', '_weakref', > '__pypy__', 'cStringIO', '_collections', 'struct', > 'mmap', 'marshal', '_codecs', 'rctime', 'cppyy', > - '_cffi_backend', 'pyexpat', '_continuation', '_io']: > + '_cffi_backend', 'pyexpat', '_continuation', '_io', > + 'thread']: > if modname == 'pypyjit' and 'interp_resop' in rest: > return False > return True > diff --git a/pypy/module/pypyjit/test/test_policy.py b/pypy/module/pypyjit/test/test_policy.py > --- a/pypy/module/pypyjit/test/test_policy.py > +++ b/pypy/module/pypyjit/test/test_policy.py > @@ -45,6 +45,10 @@ > from pypy.module._io.interp_bytesio import W_BytesIO > assert pypypolicy.look_inside_function(W_BytesIO.seek_w.im_func) > > +def test_thread(): > + from pypy.module.thread.os_lock import Lock > + assert pypypolicy.look_inside_function(Lock.descr_lock_acquire.im_func) > + > def test_pypy_module(): > from pypy.module._collections.interp_deque import W_Deque > from pypy.module._random.interp_random import W_Random > _______________________________________________ > pypy-commit mailing list > pypy-commit at python.org > https://mail.python.org/mailman/listinfo/pypy-commit > From anto.cuni at gmail.com Wed Sep 4 16:44:32 2013 From: anto.cuni at gmail.com (Antonio Cuni) Date: Wed, 04 Sep 2013 15:44:32 +0100 Subject: [pypy-dev] [pypy-commit] pypy refactor-translator: Remove fork_before option (unused). In-Reply-To: <20130904111332.C01451C135D@cobra.cs.uni-duesseldorf.de> References: <20130904111332.C01451C135D@cobra.cs.uni-duesseldorf.de> Message-ID: <52274750.7040405@gmail.com> Hi Manuel, did you actually kill support for this feature? I find it occasionally useful: e.g. when working on the JIT you can use --fork-before=pyjitpl and avoid to annotate/rtype the whole pypy interp when you change something. ciao, Anto On 04/09/13 12:13, Manuel Jacob wrote: > Author: Manuel Jacob > Branch: refactor-translator > Changeset: r66781:57dd91f9ccd9 > Date: 2013-09-02 18:08 +0100 > http://bitbucket.org/pypy/pypy/changeset/57dd91f9ccd9/ > > Log: Remove fork_before option (unused). > > diff --git a/rpython/config/translationoption.py b/rpython/config/translationoption.py > --- a/rpython/config/translationoption.py > +++ b/rpython/config/translationoption.py > @@ -127,11 +127,6 @@ > default=False, cmdline=None), > BoolOption("countmallocs", "Count mallocs and frees", default=False, > cmdline=None), > - ChoiceOption("fork_before", > - "(UNIX) Create restartable checkpoint before step", > - ["annotate", "rtype", "backendopt", "database", "source", > - "pyjitpl"], > - default=None, cmdline="--fork-before"), > BoolOption("dont_write_c_files", > "Make the C backend write everyting to /dev/null. 
" + > "Useful for benchmarking, so you don't actually involve the disk", > _______________________________________________ > pypy-commit mailing list > pypy-commit at python.org > https://mail.python.org/mailman/listinfo/pypy-commit > From arigo at tunes.org Thu Sep 5 10:15:46 2013 From: arigo at tunes.org (Armin Rigo) Date: Thu, 5 Sep 2013 10:15:46 +0200 Subject: [pypy-dev] [pypy-commit] pypy refactor-translator: Remove fork_before option (unused). In-Reply-To: <52274750.7040405@gmail.com> References: <20130904111332.C01451C135D@cobra.cs.uni-duesseldorf.de> <52274750.7040405@gmail.com> Message-ID: Hi, On Wed, Sep 4, 2013 at 4:44 PM, Antonio Cuni wrote: > Hi Manuel, > > did you actually kill support for this feature? I find it occasionally > useful: e.g. when working on the JIT you can use --fork-before=pyjitpl and > avoid to annotate/rtype the whole pypy interp when you change something. Uh? Yes, please don't randomly kill "fork_before". A bient?t, Armin. From david.schneider at bivab.de Thu Sep 5 15:43:48 2013 From: david.schneider at bivab.de (David Schneider) Date: Thu, 5 Sep 2013 15:43:48 +0200 Subject: [pypy-dev] Buildbot update In-Reply-To: <5226EA97.5090508@gmail.com> References: <85123D5D-1C1F-4C4D-874B-E4122B75CA52@bivab.de> <5226EA97.5090508@gmail.com> Message-ID: <5DFF835E-29C6-4D8F-8DB8-923D4AC222F6@bivab.de> On 04.09.2013, at 10:08, Matti Picus wrote: > TL;DR : Do you have any hints/instructions as to how to test a new buildbot task? > > Thanks for the update, it seems to have cleaned up much of the code base. > I would like to add a task that installs the python part of a numpy installation to pypy and runs numpy's tests. I have started the work on the numpy-tests branch. I see you improved the infrastructure for testing buildbot itself, but it is not clear to me how to test and debug the new task, > Do I need to set up a master/slave pair or can I test the tasks with only one process, what would a typical test command-line look like, etc. Note that I would have to create a virtualenv and install nose into it since numpy tests require nose, so any hints you could give would be helpful. > Matti > > On 09/03/2013 09:04 PM, David Schneider wrote: >> Hi all, >> >> I just upgraded the PyPy buildbot on buildbot.pypy.org to run on version 0.8.8 of buildbot. This update should not create any significant visible changes and mainly simplifies some things in the codebase. Also the update should make it easier to add some features, such as decoupling the translation from the tests, hopefully allowing to reuse binaries for different test runs. >> >> As usual with updates there might be some things that do not work as expected or at all. In case you notice something please let me know either by email or on IRC. >> >> Cheers, >> >> David >> _______________________________________________ >> pypy-dev mailing list >> pypy-dev at python.org >> https://mail.python.org/mailman/listinfo/pypy-dev > > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev Hi Matti, To some degree you can unit-test the new builds you configure, but this might turn out to be tricky when you want to make sure that the external tools are invoked correctly. Probably nothing new, but the tests in bot2/pypybuildbot/test/test_builds.py might serve as an example, they test among other things how the results returned by the pytest step are handled. 
To test that the builds actually work there is AFAIK no way to avoid having a master/slave pair, although it is possible, for development/debugging, to run both from the pypy-buildbot sources. Cheers, David -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 495 bytes Desc: Message signed with OpenPGP using GPGMail URL: From skip at pobox.com Thu Sep 5 17:29:33 2013 From: skip at pobox.com (Skip Montanaro) Date: Thu, 5 Sep 2013 10:29:33 -0500 Subject: [pypy-dev] Specifying lib locations? Message-ID: Trying to build from source, I get the complaint, "failed to guess where ncurses is installed." On our systems they are in decidedly nonstandard locations. A bit of googling didn't turn up an obvious solution, and rpython --help didn't spew any lines that looked promising. How do I specify nonstandard locations of the various dependent libraries? Thanks, Skip From ddvento at ucar.edu Thu Sep 5 18:07:56 2013 From: ddvento at ucar.edu (Davide Del Vento) Date: Thu, 05 Sep 2013 10:07:56 -0600 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: References: Message-ID: <5228AC5C.4010207@ucar.edu> Skip The way we solved this problem on our system is creating a compiler wrapper. This is a non-pypy-specific solution, which we believe is very effective and convenient. The "normal" gcc is installed in non-standard out-of-path location. A gcc shell script is installed instead. Such a script will call the actual gcc with all the proper -I -L -l of the other libraries (managed by lmod). In fact, in this way, it's a piece of cake to maintain several versions of the same library (and compiler and everything) on the system. I can elaborate more if this is not clear. Regards, NCAR Computational & Information Services Laboratory Consulting Services Software Engineer http://www2.cisl.ucar.edu/uss/csg/ SEA Chair http://sea.ucar.edu/ On 09/05/2013 09:29 AM, Skip Montanaro wrote: > Trying to build from source, I get the complaint, "failed to guess > where ncurses is installed." On our systems they are in decidedly > nonstandard locations. A bit of googling didn't turn up an obvious > solution, and rpython --help didn't spew any lines that looked > promising. How do I specify nonstandard locations of the various > dependent libraries? > > Thanks, > > Skip > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev > From arigo at tunes.org Thu Sep 5 18:27:19 2013 From: arigo at tunes.org (Armin Rigo) Date: Thu, 5 Sep 2013 18:27:19 +0200 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: References: Message-ID: Hi Skip, On Thu, Sep 5, 2013 at 5:29 PM, Skip Montanaro wrote: > Trying to build from source, I get the complaint, "failed to guess > where ncurses is installed." The logic is spread on pages upon pages of pypy/module/_minimal_curses/fficurses.py, including trying with the "pkg-config" or "ncurses5-config" tools, but does not contain a command-line-configurable option. I'd suggest to either edit this file directly, or to use a solution similar to Davide's. A bient?t, Armin. From skip at pobox.com Thu Sep 5 19:33:58 2013 From: skip at pobox.com (Skip Montanaro) Date: Thu, 5 Sep 2013 12:33:58 -0500 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: <5228AC5C.4010207@ucar.edu> References: <5228AC5C.4010207@ucar.edu> Message-ID: > The way we solved this problem on our system is creating a compiler wrapper. 
> This is a non-pypy-specific solution, which we believe is very effective and > convenient. > > The "normal" gcc is installed in non-standard out-of-path location. A gcc > shell script is installed instead. Such a script will call the actual gcc > with all the proper -I -L -l of the other libraries (managed by lmod). In > fact, in this way, it's a piece of cake to maintain several versions of the > same library (and compiler and everything) on the system. Thanks. Will give that a try. Skip From techtonik at gmail.com Thu Sep 5 20:53:55 2013 From: techtonik at gmail.com (anatoly techtonik) Date: Thu, 5 Sep 2013 21:53:55 +0300 Subject: [pypy-dev] Free SSL certificate for https://bugs.pypy.org Message-ID: *.python.org uses free SSL certificate from http://www.startssl.com/?app=1 PyPy can use this too to get rid of annoying security exceptions for https://bugs.pypy.org and other parts. -- anatoly t. From arigo at tunes.org Thu Sep 5 21:45:04 2013 From: arigo at tunes.org (Armin Rigo) Date: Thu, 5 Sep 2013 21:45:04 +0200 Subject: [pypy-dev] Free SSL certificate for https://bugs.pypy.org In-Reply-To: References: Message-ID: Hi Anatoly, On Thu, Sep 5, 2013 at 8:53 PM, anatoly techtonik wrote: > *.python.org uses free SSL certificate from http://www.startssl.com/?app=1 > > PyPy can use this too to get rid of annoying security exceptions for > https://bugs.pypy.org and other parts. Thanks ! Fwiw, we will soon move bugs.pypy.org somewhere else. I don't know the details (or if we *have* any so far), but the current hosting site is going down. (Thanks Holger for the continued contribution up to now!) A bient?t, Armin. From techtonik at gmail.com Thu Sep 5 21:51:40 2013 From: techtonik at gmail.com (anatoly techtonik) Date: Thu, 5 Sep 2013 22:51:40 +0300 Subject: [pypy-dev] Free SSL certificate for https://bugs.pypy.org In-Reply-To: References: Message-ID: Cool. Any prospects on when it can happen (or a link to tracker issue)? Is it a move to pydotorg infrastructure? Btw, I found a bug about SSL issue - https://bugs.pypy.org/issue723 -- anatoly t. On Thu, Sep 5, 2013 at 10:45 PM, Armin Rigo wrote: > Hi Anatoly, > > On Thu, Sep 5, 2013 at 8:53 PM, anatoly techtonik wrote: >> *.python.org uses free SSL certificate from http://www.startssl.com/?app=1 >> >> PyPy can use this too to get rid of annoying security exceptions for >> https://bugs.pypy.org and other parts. > > Thanks ! Fwiw, we will soon move bugs.pypy.org somewhere else. I > don't know the details (or if we *have* any so far), but the current > hosting site is going down. (Thanks Holger for the continued > contribution up to now!) > > > A bient?t, > > Armin. From alex.gaynor at gmail.com Thu Sep 5 23:45:40 2013 From: alex.gaynor at gmail.com (Alex Gaynor) Date: Thu, 5 Sep 2013 14:45:40 -0700 Subject: [pypy-dev] Free SSL certificate for https://bugs.pypy.org In-Reply-To: References: Message-ID: Yes, we're in the process of moving bugs.pypy.org to be handled by the PSF infrastructure group Alex On Thu, Sep 5, 2013 at 12:51 PM, anatoly techtonik wrote: > Cool. Any prospects on when it can happen (or a link to tracker issue)? > Is it a move to pydotorg infrastructure? > > Btw, I found a bug about SSL issue - https://bugs.pypy.org/issue723 > -- > anatoly t. 
> > > On Thu, Sep 5, 2013 at 10:45 PM, Armin Rigo wrote: > > Hi Anatoly, > > > > On Thu, Sep 5, 2013 at 8:53 PM, anatoly techtonik > wrote: > >> *.python.org uses free SSL certificate from > http://www.startssl.com/?app=1 > >> > >> PyPy can use this too to get rid of annoying security exceptions for > >> https://bugs.pypy.org and other parts. > > > > Thanks ! Fwiw, we will soon move bugs.pypy.org somewhere else. I > > don't know the details (or if we *have* any so far), but the current > > hosting site is going down. (Thanks Holger for the continued > > contribution up to now!) > > > > > > A bient?t, > > > > Armin. > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev > -- "I disapprove of what you say, but I will defend to the death your right to say it." -- Evelyn Beatrice Hall (summarizing Voltaire) "The people's good is the highest law." -- Cicero GPG Key fingerprint: 125F 5C67 DFE9 4084 -------------- next part -------------- An HTML attachment was scrubbed... URL: From anto.cuni at gmail.com Fri Sep 6 11:02:54 2013 From: anto.cuni at gmail.com (Antonio Cuni) Date: Fri, 06 Sep 2013 11:02:54 +0200 Subject: [pypy-dev] [pypy-commit] jitviewer argparse-collect: (RichardN, Edd) Add the jitviewer path to PYTHONPATH automatically. In-Reply-To: <20130905150943.D803D1C0710@cobra.cs.uni-duesseldorf.de> References: <20130905150943.D803D1C0710@cobra.cs.uni-duesseldorf.de> Message-ID: <52299A3E.6000107@gmail.com> Hi, On 05/09/13 17:09, vext01 wrote: > Log: (RichardN, Edd) Add the jitviewer path to PYTHONPATH automatically. > > diff --git a/bin/jitviewer.py b/bin/jitviewer.py > --- a/bin/jitviewer.py > +++ b/bin/jitviewer.py > @@ -1,4 +1,10 @@ > #!/usr/bin/env pypy > import sys > +import os.path > + > +script_path = os.path.abspath(__file__) > +pythonpath = os.path.dirname(os.path.dirname(script_path)) > +sys.path.append(pythonpath) this looks wrong. I think that the jitviewer is supposed to be installed as a normal package inside the pypy distribution to work well. You should do: $ /path/to/pypy/bin/pypy /path/to/jitviewer/setup.py develop this way, setuptools creates a link and the jitviewer package is installed in pypy even if it's physically in the repo (which is convenient for developing). ciao, Anto From max.lavrenov at gmail.com Thu Sep 5 14:30:42 2013 From: max.lavrenov at gmail.com (Max Lavrenov) Date: Thu, 5 Sep 2013 16:30:42 +0400 Subject: [pypy-dev] freeze in project under pypy Message-ID: Hello, I've tried running our application under pypy and found something strange in its behavior. It is a twisted project which accept users http requests, make request to redis, but most of the request time it spends in cpu intensive calculations. Compared average response time , pypy shows ~30% boost which is cool. But plot with longest response time in 10 seconds period looks weird. I've attached this plot. Blue line - pypy. Green line - cpython Y-axis is time in ms Both application run in parallel on the same host under the same loading. Looks like pypy freeze some single requests on 1500 ms time. I speak about single requests because plot with 95th percentile response time looks normally. My first thought was about GC, but looks like pypy lack of debug facility in the gc module. Could you suggest any ideas of why it happens? Best regards, Max Lavrenov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: v-statsd2.png Type: image/png Size: 118603 bytes Desc: not available URL: From mount.sarah at gmail.com Fri Sep 6 11:53:23 2013 From: mount.sarah at gmail.com (Sarah Mount) Date: Fri, 6 Sep 2013 10:53:23 +0100 Subject: [pypy-dev] Working on a recipe for using travis-ci with pypy projects Message-ID: Hi there, I haven't had much time to work on rcsp since the Sprint, but I've been tinkering with the infrastructure here and there. I have added the project to the travis-ci.org continuous integration platform here: https://travis-ci.org/snim2/rcsp/ The thing I'm interested in is that it would be nice to use travis to a) run automated tests, b) bulid documentation and c) use rpython to translate to an executable, at the very least to check that the translation doesn't error. This would be really useful for others I'm sure, and I'm happy to document this work for pypy and maybe set up a basic pypy project skeleton for github / travis users, which I can package separately or you might want to put into the repo or whatever -- assuming no one has done that already. The sticking point I've got at the moment is that if I get travis to use pypy and build the interpreter it cannot find the rpython executable. Presumably if it could it probably wouldn't be able to find rlib. The "right" fix for that problem is to install both rpython and its libraries in a virtualenv with pip. If I run "pip install" from the command line pip cannot find rpython, so does anyone have a recipe for this already? Thanks, Sarah -- Sarah Mount, Senior Lecturer, University of Wolverhampton website: http://www.snim2.org/ twitter: @snim2 -------------- next part -------------- An HTML attachment was scrubbed... URL: From vext01 at gmail.com Fri Sep 6 15:45:39 2013 From: vext01 at gmail.com (Edd Barrett) Date: Fri, 6 Sep 2013 14:45:39 +0100 Subject: [pypy-dev] [pypy-commit] jitviewer argparse-collect: (RichardN, Edd) Add the jitviewer path to PYTHONPATH automatically. In-Reply-To: <52299A3E.6000107@gmail.com> References: <20130905150943.D803D1C0710@cobra.cs.uni-duesseldorf.de> <52299A3E.6000107@gmail.com> Message-ID: <20130906134538.GH18567@edd-i386.kent.ac.uk> On Fri, Sep 06, 2013 at 11:02:54AM +0200, Antonio Cuni wrote: > Hi, > > On 05/09/13 17:09, vext01 wrote: > >Log: (RichardN, Edd) Add the jitviewer path to PYTHONPATH automatically. > > > >diff --git a/bin/jitviewer.py b/bin/jitviewer.py > >--- a/bin/jitviewer.py > >+++ b/bin/jitviewer.py > >@@ -1,4 +1,10 @@ > > #!/usr/bin/env pypy > > import sys > >+import os.path > >+ > >+script_path = os.path.abspath(__file__) > >+pythonpath = os.path.dirname(os.path.dirname(script_path)) > >+sys.path.append(pythonpath) > > this looks wrong. > > I think that the jitviewer is supposed to be installed as a normal > package inside the pypy distribution to work well. I was following the first installation method in the README.rst (albeit i was using pure virtualenv without virtualenvwrapper as it doesn't work on BSD). > You should do: > $ /path/to/pypy/bin/pypy /path/to/jitviewer/setup.py develop > > this way, setuptools creates a link and the jitviewer package is > installed in pypy even if it's physically in the repo (which is > convenient for developing). Doing the above I see that jitviewer is installed into the virtualenv. Jitviewer can then be run from the virtualenv just fine. 
>+script_path = os.path.abspath(__file__) >+pythonpath = os.path.dirname(os.path.dirname(script_path)) >+sys.path.append(pythonpath) Here we are appending to the path, not overriding it, hence this is safe for either method. Right? -- Best Regards Edd Barrett http://www.theunixzoo.co.uk From skip at pobox.com Fri Sep 6 16:11:54 2013 From: skip at pobox.com (Skip Montanaro) Date: Fri, 6 Sep 2013 09:11:54 -0500 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: <5228AC5C.4010207@ucar.edu> References: <5228AC5C.4010207@ucar.edu> Message-ID: > The way we solved this problem on our system is creating a compiler wrapper. > This is a non-pypy-specific solution, which we believe is very effective and > convenient. > > The "normal" gcc is installed in non-standard out-of-path location. A gcc > shell script is installed instead. Such a script will call the actual gcc > with all the proper -I -L -l of the other libraries (managed by lmod). In > fact, in this way, it's a piece of cake to maintain several versions of the > same library (and compiler and everything) on the system. > > I can elaborate more if this is not clear. I thought it was clear, but my initial attempts at a wrapper have yielded no improvement. Here's what my wrapper looks like at the moment: #!/bin/bash # Wrapper around GCC used to add a number of non-standard library and # include file locations to command lines when building pypy. gcc \ -L/opt/TWWfsw/ncurses57/lib -I/opt/TWWfsw/ncurses57/include \ -I/opt/TWWfsw/ncurses57/include/ncurses \ "$@" This seems to me what you were describing, and in fact, I've verified that at curses.h and term.h exist in the second -I directory. Lots of other compile commands before the failure succeed, and do use my minimal gcc-wrap script, so I'm sure it's being invoked, and doesn't have some stupid bug like a syntax error or typo. Hints appreciated... Thx, Skip From anto.cuni at gmail.com Fri Sep 6 16:56:37 2013 From: anto.cuni at gmail.com (Antonio Cuni) Date: Fri, 06 Sep 2013 16:56:37 +0200 Subject: [pypy-dev] [pypy-commit] jitviewer argparse-collect: (RichardN, Edd) Add the jitviewer path to PYTHONPATH automatically. In-Reply-To: <20130906134538.GH18567@edd-i386.kent.ac.uk> References: <20130905150943.D803D1C0710@cobra.cs.uni-duesseldorf.de> <52299A3E.6000107@gmail.com> <20130906134538.GH18567@edd-i386.kent.ac.uk> Message-ID: <5229ED25.4050801@gmail.com> Hi Edd, On 06/09/13 15:45, Edd Barrett wrote: >> +script_path = os.path.abspath(__file__) >> +pythonpath = os.path.dirname(os.path.dirname(script_path)) >> +sys.path.append(pythonpath) > > Here we are appending to the path, not overriding it, hence this is safe for > either method. Right? yes, if you do setup.py develop those lines are both safe and pointless :) But I saw that you removed then in a later checkin, so no problem. ciao, Anto From skip at pobox.com Fri Sep 6 17:10:17 2013 From: skip at pobox.com (Skip Montanaro) Date: Fri, 6 Sep 2013 10:10:17 -0500 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: References: <5228AC5C.4010207@ucar.edu> Message-ID: I believe I figured this out. On my system, the library is libncurses.so, not libcurses.so. I was only looking at the rpython traceback, not at the last gcc command run before the traceback was emitted. Once I ran that directly on the command line and saw its error message and figured out what was going on. I am not being entertained by the Mandelbrot set... 
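The concrete suggestion that follows from this, teaching the probing code about plain "ncurses" and about /usr/lib64, would boil down to yielding a few more ExternalCompilationInfo variants. A rough sketch is below; only the ExternalCompilationInfo keywords come from rpython.translator.tool.cbuild, while the surrounding generator and the concrete variants are assumptions rather than the real contents of fficurses.py:

    # Sketch of extra probe variants for ncurses; not the actual try_ldflags.
    from rpython.translator.tool.cbuild import ExternalCompilationInfo

    def extra_curses_variants():
        # systems that only ship libncurses.so, not libcurses.so
        yield ExternalCompilationInfo(libraries=['ncurses'])
        # 64-bit distributions that keep libraries in /usr/lib64
        yield ExternalCompilationInfo(libraries=['ncurses'],
                                      library_dirs=['/usr/lib64'])
        # headers installed under a separate ncurses include directory
        yield ExternalCompilationInfo(libraries=['ncurses'],
                                      library_dirs=['/usr/lib64'],
                                      include_dirs=['/usr/include/ncurses'])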
It seems that try_ldflags should also yield ExternalCompilationInfo instances where the main library is "ncurses", not "curses", and /usr/lib64 should be searched as well, at least by some of the ECI variants.

Skip

From techtonik at gmail.com Fri Sep 6 17:23:43 2013 From: techtonik at gmail.com (anatoly techtonik) Date: Fri, 6 Sep 2013 18:23:43 +0300 Subject: [pypy-dev] What stops ARMed PyPy 2.1 from working on Android? Message-ID: I am certain there is no roadmap for Android or else I'd already found it, but still, is there any list of issues to be addressed to get PyPy running there? http://morepypy.blogspot.com/2013/08/pypy-21-considered-armful.html Modern Android tablets are ARMv7, so after 2.1 hardware is not an issue. As for the platform, it looks like the android-scripting project is able to compile CPython to run on a tablet, http://code.google.com/p/android-scripting/source/browse/#hg%2Fpython-build It would be nice to plug a keyboard into an Android tablet and start polling its API with PyPy... -- anatoly t.

From arigo at tunes.org Fri Sep 6 17:19:23 2013 From: arigo at tunes.org (Armin Rigo) Date: Fri, 6 Sep 2013 17:19:23 +0200 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: References: <5228AC5C.4010207@ucar.edu> Message-ID: Hi Skip, On Fri, Sep 6, 2013 at 5:10 PM, Skip Montanaro wrote: > It seems that try_ldflags should also yield ExternalCompilationInfo > instances where the main library is "ncurses", not "curses", and > /usr/lib64 should be searched as well, at least by some of the ECI > variants. Feel free to suggest whatever new yield's, but please as a patch to bugs.pypy.org. We can't test them ourselves, obviously, which makes the exercise pointless unless you test them first. A bientôt, Armin.

From skip at pobox.com Fri Sep 6 17:47:20 2013 From: skip at pobox.com (Skip Montanaro) Date: Fri, 6 Sep 2013 10:47:20 -0500 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: References: <5228AC5C.4010207@ucar.edu> Message-ID: > Feel free to suggest whatever new yield's, but please as a patch to > bugs.pypy.org. We can't test them ourselves, obviously, which makes > the exercise pointless unless you test them first. Done: https://bugs.pypy.org/issue1596 Skip

From skip at pobox.com Fri Sep 6 22:15:47 2013 From: skip at pobox.com (Skip Montanaro) Date: Fri, 6 Sep 2013 15:15:47 -0500 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition Message-ID: Python 2.7's PyDateTime_Delta structure is defined like this:

    typedef struct
    {
        PyObject_HEAD
        long hashcode;          /* -1 when unknown */
        int days;               /* -MAX_DELTA_DAYS <= days <= MAX_DELTA_DAYS */
        int seconds;            /* 0 <= seconds < 24*3600 is invariant */
        int microseconds;       /* 0 <= microseconds < 1000000 is invariant */
    } PyDateTime_Delta;

In contrast, PyPy's version leaves out everything except PyObject_HEAD:

    typedef struct {
        PyObject_HEAD
    } PyDateTime_Delta;

That doesn't seem to be compatible with C extension modules that want to manipulate such objects. This is my first foray into trying to compile some Boost.Python wrappers for libraries at work which manipulate Python datetime objects. Why are the other four fields not defined? Is there a way to get around this?
Thx, Skip From amauryfa at gmail.com Fri Sep 6 23:34:47 2013 From: amauryfa at gmail.com (Amaury Forgeot d'Arc) Date: Fri, 6 Sep 2013 23:34:47 +0200 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: 2013/9/6 Skip Montanaro > Python 2.7's PyDateTime_Delta structure is defined like this: > > typedef struct > { > PyObject_HEAD > long hashcode; /* -1 when unknown */ > int days; /* -MAX_DELTA_DAYS <= days <= > MAX_DELTA_DAYS */ > int seconds; /* 0 <= seconds < 24*3600 is invariant */ > int microseconds; /* 0 <= microseconds < 1000000 is > invariant */ > } PyDateTime_Delta; > > In contrast, PyPy's version leaves out everything except > PyObject_HEAD: > > typedef struct { > PyObject_HEAD > } PyDateTime_Delta; > > That doesn't seem to be compatible with C extension modules that want > to manipulate such objects. This is my first foray into trying to > compile some Boost.Python wrappers for libraries at work which > manipulate Python datetime objects. Why are the other four fields not > defined? Is there a way to get around this? > Sure. Use macros like PyDateTime_DELTA_GET_DAYS(). The structure is not part of the public API anyway. With CPython the macros don't make any difference, but PyPy implements them as actual function calls. Remember that with PyPy, the C API needs to create a copy of each Python object (at a fixed address). So to implement the additional fields, it would be necessary to allocate more and copy more, even when the attributes are not needed. Oh, by the way, PyPy uses a pure Python implementation of datetime.py. So there is little chance we can implement some shortcut in C to directly reference some internal storage of the datetime types. -- Amaury Forgeot d'Arc -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip at pobox.com Sat Sep 7 00:05:56 2013 From: skip at pobox.com (Skip Montanaro) Date: Fri, 6 Sep 2013 17:05:56 -0500 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: > Sure. Use macros like PyDateTime_DELTA_GET_DAYS(). The structure is not part > of the public API anyway. > With CPython the macros don't make any difference, but PyPy implements them > as actual function calls. Ah, that makes sense. The code in question was written when we were still using Python 2.3 or 2.4, before there was any datetime C API. A little tweak is in order I guess. Why are those PyDateTime*GET* declarations in datetime.h? I might well have noticed them before asking a basic question. Skip From amauryfa at gmail.com Sat Sep 7 00:33:34 2013 From: amauryfa at gmail.com (Amaury Forgeot d'Arc) Date: Sat, 7 Sep 2013 00:33:34 +0200 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: 2013/9/7 Skip Montanaro > > Sure. Use macros like PyDateTime_DELTA_GET_DAYS(). The structure is not > part > > of the public API anyway. > > With CPython the macros don't make any difference, but PyPy implements > them > > as actual function calls. > > Ah, that makes sense. The code in question was written when we were > still using Python 2.3 or 2.4, before there was any datetime C API. A > little tweak is in order I guess. Why are those PyDateTime*GET* > declarations in datetime.h? I might well have noticed them before > asking a basic question. > Not sure I understand. Or did you mean "why are those declarations not in datetime.h"? Then the answer is: because all these functions are generated during PyPy translation. 
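As a rough illustration of what "generated during PyPy translation" means: on the PyPy side such an accessor is ordinary RPython code marked with the cpython_api decorator, along the lines of the sketch below. The decorator and rffi types are real, but the exact module layout and body are assumed here, not quoted from pypy/module/cpyext.

    # Sketch only: a PyDateTime_DELTA_* accessor exposed through cpyext.
    from pypy.module.cpyext.api import cpython_api, CANNOT_FAIL
    from pypy.module.cpyext.pyobject import PyObject
    from rpython.rtyper.lltypesystem import rffi

    @cpython_api([PyObject], rffi.INT_real, error=CANNOT_FAIL)
    def PyDateTime_DELTA_GET_DAYS(space, w_obj):
        # At translation time this becomes a plain C function that
        # extension modules call like a macro-less accessor.
        return space.int_w(space.getattr(w_obj, space.wrap("days")))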
The corresponding declarations are written in pypy_decl.h -- Amaury Forgeot d'Arc -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip at pobox.com Sat Sep 7 01:13:22 2013 From: skip at pobox.com (Skip Montanaro) Date: Fri, 6 Sep 2013 18:13:22 -0500 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: > Not sure I understand. Or did you mean "why are those declarations not in > datetime.h"? Yes, sorry about the typo. > Then the answer is: because all these functions are generated during PyPy > translation. > The corresponding declarations are written in pypy_decl.h Makes sense... Skip From skip at pobox.com Sat Sep 7 03:16:15 2013 From: skip at pobox.com (Skip Montanaro) Date: Fri, 6 Sep 2013 20:16:15 -0500 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: On Fri, Sep 6, 2013 at 6:13 PM, Skip Montanaro wrote: >> Not sure I understand. Or did you mean "why are those declarations not in >> datetime.h"? > > Yes, sorry about the typo. Alas, I am still confused. The PyDateTime_DELTA_GET_* macros aren't defined for CPython until the 3.x series. (There are macros in 2.7, but they are defined in datetimemodule.c, not in a public header file. PyPy declares them as functions. Is there some predefined "This is PyPy" macro I can check? Thx, Skip From arigo at tunes.org Sat Sep 7 09:17:17 2013 From: arigo at tunes.org (Armin Rigo) Date: Sat, 7 Sep 2013 09:17:17 +0200 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: Hi Skip, hi Amaury, On Sat, Sep 7, 2013 at 3:16 AM, Skip Montanaro wrote: > Alas, I am still confused. The PyDateTime_DELTA_GET_* macros aren't > defined for CPython until the 3.x series. (There are macros in 2.7, > but they are defined in datetimemodule.c, not in a public header file. > PyPy declares them as functions. Uh, yes, I'd like to ask this question too. In Python 2.7's Include/datetime.h I see the struct PyDateTime_Delta (which is public) and no macro PyDateTime_DELTA_*. There are macros PyDateTime_TIME_GET_* and PyDateTime_DATE_GET_* as well as some PyDateTime_GET_* for both date and datetime instances, which are declared as structs with a non-public name. (Why? No clue) It seems that "struct PyDateTime_Delta" is supposed to be purely read-only. In case situation, wouldn't it make sense to expose a structure with a similar layout? I don't think we care about the overhead of copying three words; the only annoying case is about read-write objects, when we want the changes in C to be reflected back to the Python world. > Is there some predefined "This is PyPy" macro I can check? In this case, as a workaround, you can check "#ifndef PyDateTime_DELTA_GET_..." and declare them yourself. A bient?t, Armin. From amauryfa at gmail.com Sat Sep 7 12:19:54 2013 From: amauryfa at gmail.com (Amaury Forgeot d'Arc) Date: Sat, 7 Sep 2013 12:19:54 +0200 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: 2013/9/7 Skip Montanaro > On Fri, Sep 6, 2013 at 6:13 PM, Skip Montanaro wrote: > >> Not sure I understand. Or did you mean "why are those declarations not > in > >> datetime.h"? > > > > Yes, sorry about the typo. > > Alas, I am still confused. The PyDateTime_DELTA_GET_* macros aren't > defined for CPython until the 3.x series. (There are macros in 2.7, > but they are defined in datetimemodule.c, not in a public header file. > PyPy declares them as functions. 
> > Is there some predefined "This is PyPy" macro I can check? > Now I feel embarrassed. I was the one who added these macros to CPython 3.3, exactly for this reason. See the patch we propose for cx_Oracle: https://bitbucket.org/pypy/pypy/src/tip/pypy/module/cpyext/patches/cx_Oracle.patch "#ifdef PyDateTime_DELTA_GET_DAYS" should do the trick for any version. -- Amaury Forgeot d'Arc -------------- next part -------------- An HTML attachment was scrubbed... URL:

From amauryfa at gmail.com Sat Sep 7 12:28:35 2013 From: amauryfa at gmail.com (Amaury Forgeot d'Arc) Date: Sat, 7 Sep 2013 12:28:35 +0200 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: 2013/9/7 Armin Rigo > It seems that "struct PyDateTime_Delta" is supposed to be purely > read-only. In case situation, wouldn't it make sense to expose a > structure with a similar layout? I don't think we care about the > overhead of copying three words; the only annoying case is about > read-write objects, when we want the changes in C to be reflected back > to the Python world. > This is right. But datetime classes are difficult to handle in cpyext, because they are pure-python classes. They don't have a specific "TypeDef", and cpyext ties the C structs to those typedefs. See sliceobject.py for a simple example of filling the C struct; I don't know how to do this for datetime. -- Amaury Forgeot d'Arc -------------- next part -------------- An HTML attachment was scrubbed... URL:

From skip at pobox.com Sat Sep 7 13:38:43 2013 From: skip at pobox.com (Skip Montanaro) Date: Sat, 7 Sep 2013 06:38:43 -0500 Subject: [pypy-dev] PyPy's PyDateTime_Delta definition In-Reply-To: References: Message-ID: Not a big deal. I did it this way:

    // PyPy introduced the PyDateTime_DELTA_* API as functions even though
    // they aren't available in CPython until v3.x. To make matters
    // slightly worse, in CPython 3.x they are defined as macros.
    #if !defined(PYPY_VERSION) && PY_MAJOR_VERSION < 3
    #define PyDateTime_DELTA_GET_DAYS(o) (((PyDateTime_Delta*)o)->days)
    #define PyDateTime_DELTA_GET_SECONDS(o) (((PyDateTime_Delta*)o)->seconds)
    #define PyDateTime_DELTA_GET_MICROSECONDS(o) (((PyDateTime_Delta*)o)->microseconds)
    #endif

(Apologies for the bad line wrappage. I am beginning to rue the day I moved to Gmail for all my mail...) Skip

On Sat, Sep 7, 2013 at 5:19 AM, Amaury Forgeot d'Arc wrote: > > 2013/9/7 Skip Montanaro >> >> On Fri, Sep 6, 2013 at 6:13 PM, Skip Montanaro wrote: >> >> Not sure I understand. Or did you mean "why are those declarations not >> >> in >> >> datetime.h"? >> > >> > Yes, sorry about the typo. >> >> Alas, I am still confused. The PyDateTime_DELTA_GET_* macros aren't >> defined for CPython until the 3.x series. (There are macros in 2.7, >> but they are defined in datetimemodule.c, not in a public header file. >> PyPy declares them as functions. >> >> Is there some predefined "This is PyPy" macro I can check? > > Now I feel embarrassed. > I was the one who added these macros to CPython 3.3, exactly for this > reason. > see the patch we propose for cx_Oracle: > https://bitbucket.org/pypy/pypy/src/tip/pypy/module/cpyext/patches/cx_Oracle.patch > "#ifdef PyDateTime_DELTA_GET_DAYS" should do the trick for any version.
> > > -- > Amaury Forgeot d'Arc From arigo at tunes.org Sun Sep 8 09:42:44 2013 From: arigo at tunes.org (Armin Rigo) Date: Sun, 8 Sep 2013 09:42:44 +0200 Subject: [pypy-dev] LLVM next steps Message-ID: Hi Manuel, We've been suitably impressed by the results on the new llvm backend during the sprint (well, or suitably un-impressed by both gcc and clang's failure to reconstruct the SSA meaning of the C code). The current issue seems to be debugging. It would be nice if gdb presented at least source ".ll" code rather than just the assembler instructions --- actually, it would be more than nice: debugging at the level of assembler is a no-no. I was thinking: would it make sense to emit from the translation toolchain some files that are not in ".ll" format, but that are more or less a straightforward text file representation of the flow graphs and constants, and then have a separate tool that converts these files (seen as source) into .ll files? There are some advantages of doing it this way, even if it looks like Yet Another intermediate step. The first advantage is that it would give some inspectable files, which we could also request during testing for cases where the pygame flow graph inspector doesn't really work (e.g. too many graphs). Another advantage is that we might refactor the C backend to also be a separate tool that inputs these flowgraph text files, minimizing the amount of duplicate work between the C and the LLVM backends. And of course, the point is that the flowgraph-to-".ll" conversion would insert file line numbers, so that we can debug it in gdb seeing the flowgraph source lines. (I can even think about hacks to do the same even if we go via C...) Of course the drawback is that it's some non-trivial refactorization. Does it make sense? A bient?t, Armin. From arigo at tunes.org Sun Sep 8 17:17:25 2013 From: arigo at tunes.org (Armin Rigo) Date: Sun, 8 Sep 2013 17:17:25 +0200 Subject: [pypy-dev] LLVM next steps In-Reply-To: References: Message-ID: Hi again, On Sun, Sep 8, 2013 at 9:42 AM, Armin Rigo wrote: > We've been suitably impressed by the results on the new llvm backend > during the sprint (well, or suitably un-impressed by both gcc and > clang's failure to reconstruct the SSA meaning of the C code). I have investigated a bit more and it's quite unclear that this would be the source of the difference. It seems that the "-flto" option of gcc, enabling link-time optimization, actually gives very good improvements over the same compilation without this option --- some 11-14%, more so than, say, the typical 5% reported with CPython. If I had to guess, I'd say it is because of the particularly disorganized kind of C code produced by RPyhon. About the llvm backend, one detail hints that it might be the reason for the speed improvement: the fact that the current llvm backend produces most of the source code in a single file. This may be what gives llvm extra room for improvements. This is precisely the same room for improvement that "-flto" also gives gcc, considering that we generate many C files with never-"static" functions. I tried to compile a no-jit version of PyPy from the llvm-translation-backend branch, for comparison, but this fails right now with "NotImplementedError: v585190 = debug_offset()". It successfully compiles targetrpystonedalone (in -O2 mode), though. 
I get the following results (with the argument "100000000"):

    plain gcc 4.7.3: 1.95 seconds
    llvm 3.3: 1.75 seconds
    gcc with -flto: 1.66 seconds

If we get similar results on the whole PyPy, then I fear the llvm backend is going back to where it already went several times: "not useful enough". We can simply add the -flto flag to the generated Makefiles. Manuel, do you feel like trying to compare? I'm modifying the Makefile manually as follows:

    CFLAGS = ...... -flto -fno-fat-lto-objects
    LDFLAGS = ..... -flto=8 -O3

A bientôt,

Armin.

From alex.gaynor at gmail.com Sun Sep 8 17:33:58 2013 From: alex.gaynor at gmail.com (Alex Gaynor) Date: Sun, 8 Sep 2013 08:33:58 -0700 Subject: [pypy-dev] LLVM next steps In-Reply-To: References: Message-ID: LLVM also has a link time optimization, is it on by default in LLVM, or do we need to benchmark with it enabled explicitly? Alex

On Sun, Sep 8, 2013 at 8:17 AM, Armin Rigo wrote: > Hi again, > > On Sun, Sep 8, 2013 at 9:42 AM, Armin Rigo wrote: > > We've been suitably impressed by the results on the new llvm backend > > during the sprint (well, or suitably un-impressed by both gcc and > > clang's failure to reconstruct the SSA meaning of the C code). > > I have investigated a bit more and it's quite unclear that this would > be the source of the difference. It seems that the "-flto" option of > gcc, enabling link-time optimization, actually gives very good > improvements over the same compilation without this option --- some > 11-14%, more so than, say, the typical 5% reported with CPython. If I > had to guess, I'd say it is because of the particularly disorganized > kind of C code produced by RPyhon. > > About the llvm backend, one detail hints that it might be the reason > for the speed improvement: the fact that the current llvm backend > produces most of the source code in a single file. This may be what > gives llvm extra room for improvements. This is precisely the same > room for improvement that "-flto" also gives gcc, considering that we > generate many C files with never-"static" functions. > > I tried to compile a no-jit version of PyPy from the > llvm-translation-backend branch, for comparison, but this fails right > now with "NotImplementedError: v585190 = debug_offset()". It > successfully compiles targetrpystonedalone (in -O2 mode), though. I > get the following results (with the argument "100000000"): > > plain gcc 4.7.3: 1.95 seconds > llvm 3.3: 1.75 seconds > gcc with -flto: 1.66 seconds > > If we get similar results on the whole PyPy, then I fear the llvm > backend is going back to where it already went several times: "not > useful enough". We can simply add the -flto flag to the generated > Makefiles. Manuel, do you feel like trying to compare? I'm modifying > the Makefile manually as follows: > > CFLAGS = ...... -flto -fno-fat-lto-objects > LDFLAGS = ..... -flto=8 -O3 > > > A bientôt, > > Armin. > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev > -- "I disapprove of what you say, but I will defend to the death your right to say it." -- Evelyn Beatrice Hall (summarizing Voltaire) "The people's good is the highest law." -- Cicero GPG Key fingerprint: 125F 5C67 DFE9 4084 -------------- next part -------------- An HTML attachment was scrubbed...
URL: From arigo at tunes.org Sun Sep 8 17:40:22 2013 From: arigo at tunes.org (Armin Rigo) Date: Sun, 8 Sep 2013 17:40:22 +0200 Subject: [pypy-dev] freeze in project under pypy In-Reply-To: References: Message-ID: Hi Max, As a starting point, try to run pypy with the following environment variable: PYPYLOG=gc-collect:- Then look interactively if it seems that the pauses correspond to blocks {gc-collect ... gc-collect} in the output. I would say that 1500ms looks on the high end of GC times, but is not completely impossible. You can also direct the output to a file with PYPYLOG=gc-collect:logfile Then you need the tool "rpython/tool/logparser.py" from the PyPy repository to make sense of the hexadecimal numbers (they are timestamps) and display the result in a nice graph. You can also try it with more information, like: PYPYLOG=:logfile pypy some_command.py python logparser.py draw-time logfile out.png --mainwidth 8000 A bient?t, Armin. From arigo at tunes.org Sun Sep 8 17:42:56 2013 From: arigo at tunes.org (Armin Rigo) Date: Sun, 8 Sep 2013 17:42:56 +0200 Subject: [pypy-dev] LLVM next steps In-Reply-To: References: Message-ID: Hi Alex, On Sun, Sep 8, 2013 at 5:33 PM, Alex Gaynor wrote: > LLVM also has a link time optimization, is it on by default in LLVM, or do > we need to benchmark with it enabled explicitly? The point I made in my mail was that the llvm backend is written in a way that makes link-time optimizations unnecessary. We could also not rely on "-flto" and instead write a single big .c file with the word "static" added everywhere. A bient?t, Armin. From ericvrp at gmail.com Sun Sep 8 19:00:20 2013 From: ericvrp at gmail.com (Eric van Riet Paap) Date: Sun, 8 Sep 2013 19:00:20 +0200 Subject: [pypy-dev] LLVM next steps In-Reply-To: References: Message-ID: <6F9C6634-EC5B-4C5B-97A8-9BC6128DCC90@gmail.com> Hi, I am missing some background information to follow what is being discussed here, so... What is the PyPy speed difference after using gcc versus llvm for the compilation of the PyPy-c backend? Would generating .ll instead of .c files really give any benefit? More interesting would still be using llvm as a PyPy-jit-backend. Is there anything new in the llvm world that would make this feasible? There used to be various issues with our previous attempts of using llvm, as we know all to clearly. Eric Op 8 sep. 2013 om 17:42 heeft Armin Rigo het volgende geschreven: > Hi Alex, > > On Sun, Sep 8, 2013 at 5:33 PM, Alex Gaynor wrote: >> LLVM also has a link time optimization, is it on by default in LLVM, or do >> we need to benchmark with it enabled explicitly? > > The point I made in my mail was that the llvm backend is written in a > way that makes link-time optimizations unnecessary. We could also not > rely on "-flto" and instead write a single big .c file with the word > "static" added everywhere. > > > A bient?t, > > Armin. > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev From me at manueljacob.de Mon Sep 9 02:26:12 2013 From: me at manueljacob.de (Manuel Jacob) Date: Mon, 09 Sep 2013 02:26:12 +0200 Subject: [pypy-dev] =?utf-8?q?=5Bpypy-commit=5D_pypy_refactor-translator?= =?utf-8?q?=3A_Remove_fork=5Fbefore_option_=28unused=29=2E?= In-Reply-To: References: <20130904111332.C01451C135D@cobra.cs.uni-duesseldorf.de> <52274750.7040405@gmail.com> Message-ID: On 2013-09-05 10:15, Armin Rigo wrote: >> did you actually kill support for this feature? 
I find it >> occasionally >> useful: e.g. when working on the JIT you can use >> --fork-before=pyjitpl and >> avoid to annotate/rtype the whole pypy interp when you change >> something. > > Uh? Yes, please don't randomly kill "fork_before". Hi, I removed it some time ago because it was untested. The plan was to readd it later with tests. When I removed the option I forgot these plans and just saw that this option was unused. Probably it's too hard to test anyway, so I'll just readd it without tests. -Manuel From fijall at gmail.com Mon Sep 9 09:52:37 2013 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 9 Sep 2013 09:52:37 +0200 Subject: [pypy-dev] LLVM next steps In-Reply-To: References: Message-ID: On Sun, Sep 8, 2013 at 5:42 PM, Armin Rigo wrote: > Hi Alex, > > On Sun, Sep 8, 2013 at 5:33 PM, Alex Gaynor wrote: >> LLVM also has a link time optimization, is it on by default in LLVM, or do >> we need to benchmark with it enabled explicitly? > > The point I made in my mail was that the llvm backend is written in a > way that makes link-time optimizations unnecessary. We could also not > rely on "-flto" and instead write a single big .c file with the word > "static" added everywhere. > > > A bient?t, > > Armin. One C file sounds bad, but we can add -ftlo and add a word "static" a bit everywhere too (I don't think we care for non-exported symbols at all). To be honest, a separate intermediate file is a very good idea (tm), for various reasons, like it would be trivial to parallelize the C-generation-from-something step. If we can make the low-level graphs a file format, we can even kinda-parallelize other steps, like JIT or GC. Cheers, fijal From fijall at gmail.com Mon Sep 9 09:55:13 2013 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 9 Sep 2013 09:55:13 +0200 Subject: [pypy-dev] Working on a recipe for using travis-ci with pypy projects In-Reply-To: References: Message-ID: On Fri, Sep 6, 2013 at 11:53 AM, Sarah Mount wrote: > Hi there, > > I haven't had much time to work on rcsp since the Sprint, but I've been > tinkering with the infrastructure here and there. I have added the project > to the travis-ci.org continuous integration platform here: > > https://travis-ci.org/snim2/rcsp/ > > The thing I'm interested in is that it would be nice to use travis to a) run > automated tests, b) bulid documentation and c) use rpython to translate to > an executable, at the very least to check that the translation doesn't > error. > > This would be really useful for others I'm sure, and I'm happy to document > this work for pypy and maybe set up a basic pypy project skeleton for github > / travis users, which I can package separately or you might want to put into > the repo or whatever -- assuming no one has done that already. > > The sticking point I've got at the moment is that if I get travis to use > pypy and build the interpreter it cannot find the rpython executable. > Presumably if it could it probably wouldn't be able to find rlib. The > "right" fix for that problem is to install both rpython and its libraries in > a virtualenv with pip. If I run "pip install" from the command line pip > cannot find rpython, so does anyone have a recipe for this already? > > Thanks, > > Sarah Hi Sarah I believe Alex did something like that. 
Find him as Alex_Gaynor on IRC or maybe he'll show up here ;-) Cheers, fijal From fijall at gmail.com Mon Sep 9 09:57:22 2013 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 9 Sep 2013 09:57:22 +0200 Subject: [pypy-dev] What stops ARMed PyPy 2.1 from working on Android? In-Reply-To: References: Message-ID: On Fri, Sep 6, 2013 at 5:23 PM, anatoly techtonik wrote: > I am certain there is no roadmap for Android or else I'd already found > it, but still is there any list of issues to be addressed to get PyPy > running there? > > http://morepypy.blogspot.com/2013/08/pypy-21-considered-armful.html > > Modern Android tablets are ARMv7, so after 2.1 hardware is not an > issue. As for platform, it looks like android-scripting project is > able to compile CPython to be run on tablet, > http://code.google.com/p/android-scripting/source/browse/#hg%2Fpython-build > > > It would be nice to plug keyboard into Android tablet and start polling its API > with PyPy... > -- > anatoly t. Android has a custom buildchain and some differences from normal Linux. The short answer is "some work" and I don't really know the extent. Cheers, fijal From david.schneider at bivab.de Mon Sep 9 10:16:22 2013 From: david.schneider at bivab.de (David Schneider) Date: Mon, 9 Sep 2013 10:16:22 +0200 Subject: [pypy-dev] What stops ARMed PyPy 2.1 from working on Android? In-Reply-To: References: Message-ID: <8ECE557F-AB3E-4555-B6F7-0AA4CD47F2F4@bivab.de> On 06.09.2013, at 17:23, anatoly techtonik wrote: > I am certain there is no roadmap for Android or else I'd already found > it, but still is there any list of issues to be addressed to get PyPy > running there? > > http://morepypy.blogspot.com/2013/08/pypy-21-considered-armful.html > > Modern Android tablets are ARMv7, so after 2.1 hardware is not an > issue. As for platform, it looks like android-scripting project is > able to compile CPython to be run on tablet, > http://code.google.com/p/android-scripting/source/browse/#hg%2Fpython-build > > > It would be nice to plug keyboard into Android tablet and start polling its API > with PyPy... > -- > anatoly t. > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev Hi Anatoly, currently nobody is working on PyPy for Android, although contributions are welcome. From the hardware point of view there should be no problem in running PyPy on a modern Android device. There is no comprehensive list of issues to get PyPy running on Android. The first issue to solve is to get the cross-translation process working with Android in order to be able to create a binary. The translation process creates several small c-programs that need to be compiled and run on the target platform to collect information about it before cross-compiling the final binary. For ARM/Linux we do this using scratchbox2 that uses qemu to emulate an ARM processor. A similar approach might work to create a binary for android using the android NDK as a cross-compiler and a tool like scratchbox2 or sbrsh to run pieces of code in an Android environment. Cheers David From mount.sarah at gmail.com Mon Sep 9 10:32:21 2013 From: mount.sarah at gmail.com (Sarah Mount) Date: Mon, 9 Sep 2013 09:32:21 +0100 Subject: [pypy-dev] Working on a recipe for using travis-ci with pypy projects In-Reply-To: References: Message-ID: Thanks. I had a look through some past threads, "pip install rpython" would solve my problem. 
I'm not sure what that's currently blocked on but I'd be happy to help. Regards, Sarah On 9 Sep 2013 08:55, "Maciej Fijalkowski" wrote: > On Fri, Sep 6, 2013 at 11:53 AM, Sarah Mount > wrote: > > Hi there, > > > > I haven't had much time to work on rcsp since the Sprint, but I've been > > tinkering with the infrastructure here and there. I have added the > project > > to the travis-ci.org continuous integration platform here: > > > > https://travis-ci.org/snim2/rcsp/ > > > > The thing I'm interested in is that it would be nice to use travis to a) > run > > automated tests, b) bulid documentation and c) use rpython to translate > to > > an executable, at the very least to check that the translation doesn't > > error. > > > > This would be really useful for others I'm sure, and I'm happy to > document > > this work for pypy and maybe set up a basic pypy project skeleton for > github > > / travis users, which I can package separately or you might want to put > into > > the repo or whatever -- assuming no one has done that already. > > > > The sticking point I've got at the moment is that if I get travis to use > > pypy and build the interpreter it cannot find the rpython executable. > > Presumably if it could it probably wouldn't be able to find rlib. The > > "right" fix for that problem is to install both rpython and its > libraries in > > a virtualenv with pip. If I run "pip install" from the command line pip > > cannot find rpython, so does anyone have a recipe for this already? > > > > Thanks, > > > > Sarah > > Hi Sarah > > I believe Alex did something like that. Find him as Alex_Gaynor on IRC > or maybe he'll show up here ;-) > > Cheers, > fijal > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fijall at gmail.com Mon Sep 9 11:05:43 2013 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 9 Sep 2013 11:05:43 +0200 Subject: [pypy-dev] Working on a recipe for using travis-ci with pypy projects In-Reply-To: References: Message-ID: On Mon, Sep 9, 2013 at 10:32 AM, Sarah Mount wrote: > Thanks. I had a look through some past threads, "pip install rpython" would > solve my problem. I'm not sure what that's currently blocked on but I'd be > happy to help. that's a thing we have as a goal, but we're not there yet. We need to split the repos (easy), but also make our infrastructure understand the difference and make it work. > > Regards, > > Sarah > > On 9 Sep 2013 08:55, "Maciej Fijalkowski" wrote: >> >> On Fri, Sep 6, 2013 at 11:53 AM, Sarah Mount >> wrote: >> > Hi there, >> > >> > I haven't had much time to work on rcsp since the Sprint, but I've been >> > tinkering with the infrastructure here and there. I have added the >> > project >> > to the travis-ci.org continuous integration platform here: >> > >> > https://travis-ci.org/snim2/rcsp/ >> > >> > The thing I'm interested in is that it would be nice to use travis to a) >> > run >> > automated tests, b) bulid documentation and c) use rpython to translate >> > to >> > an executable, at the very least to check that the translation doesn't >> > error. >> > >> > This would be really useful for others I'm sure, and I'm happy to >> > document >> > this work for pypy and maybe set up a basic pypy project skeleton for >> > github >> > / travis users, which I can package separately or you might want to put >> > into >> > the repo or whatever -- assuming no one has done that already. 
>> > >> > The sticking point I've got at the moment is that if I get travis to use >> > pypy and build the interpreter it cannot find the rpython executable. >> > Presumably if it could it probably wouldn't be able to find rlib. The >> > "right" fix for that problem is to install both rpython and its >> > libraries in >> > a virtualenv with pip. If I run "pip install" from the command line pip >> > cannot find rpython, so does anyone have a recipe for this already? >> > >> > Thanks, >> > >> > Sarah >> >> Hi Sarah >> >> I believe Alex did something like that. Find him as Alex_Gaynor on IRC >> or maybe he'll show up here ;-) >> >> Cheers, >> fijal From cfbolz at gmx.de Mon Sep 9 15:00:11 2013 From: cfbolz at gmx.de (Carl Friedrich Bolz) Date: Mon, 09 Sep 2013 15:00:11 +0200 Subject: [pypy-dev] Working on a recipe for using travis-ci with pypy projects In-Reply-To: References: Message-ID: <522DC65B.5070503@gmx.de> On 09/09/13 09:55, Maciej Fijalkowski wrote: > I believe Alex did something like that. Find him as Alex_Gaynor on IRC > or maybe he'll show up here ;-) Yes, Topaz is using Travis: https://github.com/topazproject/topaz Cheers, Carl Friedrich From alex.gaynor at gmail.com Mon Sep 9 15:37:05 2013 From: alex.gaynor at gmail.com (Alex Gaynor) Date: Mon, 9 Sep 2013 06:37:05 -0700 Subject: [pypy-dev] Working on a recipe for using travis-ci with pypy projects In-Reply-To: <522DC65B.5070503@gmx.de> References: <522DC65B.5070503@gmx.de> Message-ID: Yup, right now it uses the cool hack of "Download the tarball, unpack, and manually put it on the Python path" Alex On Mon, Sep 9, 2013 at 6:00 AM, Carl Friedrich Bolz wrote: > On 09/09/13 09:55, Maciej Fijalkowski wrote: > > I believe Alex did something like that. Find him as Alex_Gaynor on IRC > > or maybe he'll show up here ;-) > > Yes, Topaz is using Travis: > > https://github.com/topazproject/topaz > > Cheers, > > Carl Friedrich > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev > -- "I disapprove of what you say, but I will defend to the death your right to say it." -- Evelyn Beatrice Hall (summarizing Voltaire) "The people's good is the highest law." -- Cicero GPG Key fingerprint: 125F 5C67 DFE9 4084 -------------- next part -------------- An HTML attachment was scrubbed... URL: From jfcaron at phas.ubc.ca Mon Sep 9 18:08:54 2013 From: jfcaron at phas.ubc.ca (=?iso-8859-1?Q?Jean-Fran=E7ois_Caron?=) Date: Mon, 9 Sep 2013 09:08:54 -0700 Subject: [pypy-dev] Cannot use MacPorts PyPy to Translate pypy-c Message-ID: Hi, I am following the instructions here: http://doc.pypy.org/en/latest/cppyy.html#installation and here: http://root.cern.ch/phpBB3/viewtopic.php?f=14&t=17018 In the first link, it says I can use a "built-in pypy" to do the translation to make it go faster. 
I have pypy 2.1.0 from MacPorts, but it fails with the following message: <<< jfcaron at jfcaron-MacBook:~/Projects/PyPyRoot/pypy$ pypy rpython/translator/goal/translate.py --opt=jit pypy/goal/targetpypystandalone.py --withmod-cppyy Traceback (most recent call last): File "app_main.py", line 72, in run_toplevel File "rpython/translator/goal/translate.py", line 89, in log = py.log.Producer("translation") File "/Users/jfcaron/Projects/PyPyRoot/pypy/py/_apipkg.py", line 114, in __makeattr result = importobj(modpath, attrname) File "/Users/jfcaron/Projects/PyPyRoot/pypy/py/_apipkg.py", line 37, in importobj module = __import__(modpath, None, None, ['__doc__']) File "/Users/jfcaron/Projects/PyPyRoot/pypy/py/_log/log.py", line 184, in setattr(Syslog, _prio, getattr(py.std.syslog, _prio)) File "/Users/jfcaron/Projects/PyPyRoot/pypy/py/_std.py", line 13, in __getattr__ m = __import__(name) File "/opt/local/lib/pypy/lib_pypy/syslog.py", line 68, in lib = ffi.verify(""" File "/opt/local/lib/pypy/lib_pypy/cffi/api.py", line 311, in verify lib = self.verifier.load_library() File "/opt/local/lib/pypy/lib_pypy/cffi/verifier.py", line 68, in load_library self.compile_module() File "/opt/local/lib/pypy/lib_pypy/cffi/verifier.py", line 55, in compile_module self._write_source() File "/opt/local/lib/pypy/lib_pypy/cffi/verifier.py", line 117, in _write_source file = open(self.sourcefilename, 'w') IOError: [Errno 2] No such file or directory: '/opt/local/lib/pypy/lib_pypy/__pycache__/_cffi__g7019d5d3xad93c709.c' >>> I am now doing the slow translation with CPython (also from MacPorts) with no problems so far. Is this just a version mismatch between the MacPorts pypy and that expected by the pypy on mercurial? Jean-Fran?ois From ddvento at ucar.edu Mon Sep 9 19:14:08 2013 From: ddvento at ucar.edu (Davide Del Vento) Date: Mon, 09 Sep 2013 11:14:08 -0600 Subject: [pypy-dev] Specifying lib locations? In-Reply-To: References: <5228AC5C.4010207@ucar.edu> Message-ID: <522E01E0.9000200@ucar.edu> Sorry for the late reply. Yes that is the bare-bone I was suggesting you to do. It looks like you figured your problem out yourself anyway, so I'd leave that part alone. I have a few suggestions, not sure they are relevant for you, but they might for others reading this thread now or in the future: - make these locations changeable, i.e. put them in an environmental variables (in my case the env vars are managed by lmod) instead of hard-coding them in the wrapper - add something like a --show option, which would exactly print what would be executed (particularly important if you do the environmental variable thing mentioned above) - add also the -Wl,-rpath,/opt/TWWfsw/ncurses57/lib so your final executable won't have to scramble to find its libraries (and the user won't have to scramble with the messy LD_LIBRARY_PATH) - let the wrapper tell the user it is a wrapper, or tell people about it - install other versions of whatever library you like, use the env vars to cherry pick what you want for any particular application (or other library) and enjoy them all working nicely together :-) Regards, Davide Del Vento, NCAR Computational & Information Services Laboratory Consulting Services Software Engineer http://www2.cisl.ucar.edu/uss/csg/ SEA Chair http://sea.ucar.edu/ On 09/06/2013 08:11 AM, Skip Montanaro wrote: >> The way we solved this problem on our system is creating a compiler wrapper. >> This is a non-pypy-specific solution, which we believe is very effective and >> convenient. 
>> >> The "normal" gcc is installed in non-standard out-of-path location. A gcc >> shell script is installed instead. Such a script will call the actual gcc >> with all the proper -I -L -l of the other libraries (managed by lmod). In >> fact, in this way, it's a piece of cake to maintain several versions of the >> same library (and compiler and everything) on the system. >> >> I can elaborate more if this is not clear. > > > I thought it was clear, but my initial attempts at a wrapper have yielded no > improvement. Here's what my wrapper looks like at the moment: > > #!/bin/bash > > # Wrapper around GCC used to add a number of non-standard library and > # include file locations to command lines when building pypy. > > gcc \ > -L/opt/TWWfsw/ncurses57/lib -I/opt/TWWfsw/ncurses57/include \ > -I/opt/TWWfsw/ncurses57/include/ncurses \ > "$@" > > This seems to me what you were describing, and in fact, I've verified that > at curses.h and term.h exist in the second -I directory. Lots of other compile > commands before the failure succeed, and do use my minimal gcc-wrap > script, so I'm sure it's being invoked, and doesn't have some stupid bug like > a syntax error or typo. > > Hints appreciated... > > Thx, > > Skip > From matti.picus at gmail.com Mon Sep 9 21:37:45 2013 From: matti.picus at gmail.com (Matti Picus) Date: Mon, 09 Sep 2013 22:37:45 +0300 Subject: [pypy-dev] ndarray cpyext api on the pypy-pyarray branch Message-ID: <522E2389.8000705@gmail.com> I reverted the changes I made to the pypy-pyarray branch that changed c-api functions like PyArray_NDIM(arr). The original code had no real answer to what happens if these are called when arr is not an ndarray. While the reviewers of the branch suggested that these should raise an exception if arr is not a ndarray, Amaury Forgeot d'Arc convinced me that we should be consistent with numpy (perhaps dangerous) practice since in numpy these functions are c-defined macros with no type checking and the numerous example uses scattered through the numpy source code as well as scipy contain no error checking. In fact error checking is not possible since there is no clear idea of what an error condition in the return value would look like. The discussion we had on IRC starts here http://www.tismer.com/pypy/irc-logs/pypy/pypy.2013-09-08.log.html#t22:15 so if my explanation is unclear please read the log. Any objections to keeping with the original numpy-compatible dangerous behaviour? Matti From romain.py at gmail.com Mon Sep 9 21:58:42 2013 From: romain.py at gmail.com (Romain Guillebert) Date: Mon, 9 Sep 2013 21:58:42 +0200 Subject: [pypy-dev] ndarray cpyext api on the pypy-pyarray branch In-Reply-To: <522E2389.8000705@gmail.com> References: <522E2389.8000705@gmail.com> Message-ID: Hi Matti I think it's fine to copy numpy's behaviour. Cheers Romain On Mon, Sep 9, 2013 at 9:37 PM, Matti Picus wrote: > I reverted the changes I made to the pypy-pyarray branch that changed c-api > functions like PyArray_NDIM(arr). > > The original code had no real answer to what happens if these are called > when arr is not an ndarray. > > While the reviewers of the branch suggested that these should raise an > exception if arr is not a ndarray, Amaury Forgeot d'Arc convinced me that we > should be consistent with numpy (perhaps dangerous) practice since in numpy > these functions are c-defined macros with no type checking and the numerous > example > uses scattered through the numpy source code as well as scipy contain no > error checking. 
> > In fact error checking is not possible since there is no clear idea of what > an error condition > in the return value would look like. > > The discussion we had on IRC starts here > http://www.tismer.com/pypy/irc-logs/pypy/pypy.2013-09-08.log.html#t22:15 > so if my explanation is unclear please read the log. > > Any objections to keeping with the original numpy-compatible dangerous > behaviour? > Matti > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev From n210241048576 at gmail.com Mon Sep 9 21:58:56 2013 From: n210241048576 at gmail.com (Robert Grosse) Date: Mon, 9 Sep 2013 15:58:56 -0400 Subject: [pypy-dev] Windows 7 x64 development In-Reply-To: References: <1B12946B-109E-4105-B1CB-60FDD1A59E49@gmail.com> Message-ID: Are you sure that building CPython with the Windows SDK is actually supported? I've found instructions online for building various Python extensions with the free compiler, but I haven't been able to find anything that works for building Python itself. Running setup.py fails because sysconfig.get_config_var("CONFIG_ARGS") returns None. On Wed, Sep 4, 2013 at 12:15 AM, Clay Sweetser wrote: > Um, I believe you're mistaken. Though it's true that visual studio express > doesn't *come * with a 64 bit compiler (at least on windows 7 and below, > the latest one for win8 does) you can still download the windows 64 bit SDK > and use the one that comes with that. > > Sincerely, Clay Sweetser > > "Evil begins when you begin to think of people as things." - Terry > Pratchett > On Sep 3, 2013 11:53 PM, "Robert Grosse" wrote: > >> It looks like CPython assumes the use of Visual Studio on Windows, but >> the express edition does not support 64bit compilation. Would it be >> feasible to use Mingw instead? I've looked around online, but it seems >> pretty discouraging. >> >> >> On Fri, Aug 23, 2013 at 1:40 PM, Armin Rigo wrote: >> >>> Hi again, >>> >>> On Wed, Aug 21, 2013 at 9:46 AM, Armin Rigo wrote: >>> > I finally wrote out the details of what I think is a reasonable plan. >>> > >>> > https://bitbucket.org/pypy/pypy/raw/default/pypy/doc/windows.rst >>> > "What is missing for a full 64-bit translation" >>> >>> Updated the file. Anyone with an interest in helping on Win64, please >>> start by looking there --- the first step does not require any PyPy >>> knowledge, because it's hacking at *CPython* :-) >>> >>> >>> A bient?t, >>> >>> Armin. >>> >> >> >> _______________________________________________ >> pypy-dev mailing list >> pypy-dev at python.org >> https://mail.python.org/mailman/listinfo/pypy-dev >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From clay.sweetser at gmail.com Tue Sep 10 04:50:08 2013 From: clay.sweetser at gmail.com (Clay Sweetser) Date: Mon, 9 Sep 2013 22:50:08 -0400 Subject: [pypy-dev] Windows 7 x64 development In-Reply-To: References: <1B12946B-109E-4105-B1CB-60FDD1A59E49@gmail.com> Message-ID: On Sep 9, 2013 3:58 PM, "Robert Grosse" wrote: > > Are you sure that building CPython with the Windows SDK is actually supported? I've found instructions online for building various Python extensions with the free compiler, but I haven't been able to find anything that works for building Python itself. The tools provided by the windows SDK are the same tools that Visual Studio uses. In that capacity, VS is little more than a GUI for the underlying tools and configs. 
Clay Sweetser "Evil begins when you begin to treat other people as things." - Terry Pratchett > > Running setup.py fails because sysconfig.get_config_var("CONFIG_ARGS") returns None. > > > On Wed, Sep 4, 2013 at 12:15 AM, Clay Sweetser wrote: >> >> Um, I believe you're mistaken. Though it's true that visual studio express doesn't *come * with a 64 bit compiler (at least on windows 7 and below, the latest one for win8 does) you can still download the windows 64 bit SDK and use the one that comes with that. >> >> Sincerely, Clay Sweetser >> >> "Evil begins when you begin to think of people as things." - Terry Pratchett >> >> On Sep 3, 2013 11:53 PM, "Robert Grosse" wrote: >>> >>> It looks like CPython assumes the use of Visual Studio on Windows, but the express edition does not support 64bit compilation. Would it be feasible to use Mingw instead? I've looked around online, but it seems pretty discouraging. >>> >>> >>> On Fri, Aug 23, 2013 at 1:40 PM, Armin Rigo wrote: >>>> >>>> Hi again, >>>> >>>> On Wed, Aug 21, 2013 at 9:46 AM, Armin Rigo wrote: >>>> > I finally wrote out the details of what I think is a reasonable plan. >>>> > >>>> > https://bitbucket.org/pypy/pypy/raw/default/pypy/doc/windows.rst >>>> > "What is missing for a full 64-bit translation" >>>> >>>> Updated the file. Anyone with an interest in helping on Win64, please >>>> start by looking there --- the first step does not require any PyPy >>>> knowledge, because it's hacking at *CPython* :-) >>>> >>>> >>>> A bient?t, >>>> >>>> Armin. >>> >>> >>> >>> _______________________________________________ >>> pypy-dev mailing list >>> pypy-dev at python.org >>> https://mail.python.org/mailman/listinfo/pypy-dev >>> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sean at seanfisk.com Mon Sep 9 15:00:30 2013 From: sean at seanfisk.com (Sean Fisk) Date: Mon, 9 Sep 2013 09:00:30 -0400 Subject: [pypy-dev] Working on a recipe for using travis-ci with pypy projects In-Reply-To: References: Message-ID: Hi Sarah, Maciej is right; Alex did do something like that. He has Travis-CI running on RPLY: https://github.com/alex/rply/blob/master/.travis.yml I refined the installation slightly for my summer project: https://github.com/seanfisk/rpython-stencil-language/blob/master/.travis.yml Using this recipe, you will be able to use rlib without any problems. There is also an RPLY test that tests translation. I didn't look into that too much, but maybe Alex or someone else would be able to explain that as well: https://github.com/alex/rply/blob/master/tests/test_ztranslation.py For my project as a whole, I used a modified version of my Python project template: https://github.com/seanfisk/python-project-template ... with an added Paver task for translating an executable: https://github.com/seanfisk/rpython-stencil-language/blob/master/pavement.py#L183 Hope this helps! Sincerely, -- Sean Fisk On Mon, Sep 9, 2013 at 4:32 AM, Sarah Mount wrote: > Thanks. I had a look through some past threads, "pip install rpython" > would solve my problem. I'm not sure what that's currently blocked on but > I'd be happy to help. > > Regards, > > Sarah > On 9 Sep 2013 08:55, "Maciej Fijalkowski" wrote: > >> On Fri, Sep 6, 2013 at 11:53 AM, Sarah Mount >> wrote: >> > Hi there, >> > >> > I haven't had much time to work on rcsp since the Sprint, but I've been >> > tinkering with the infrastructure here and there. 
I have added the >> project >> > to the travis-ci.org continuous integration platform here: >> > >> > https://travis-ci.org/snim2/rcsp/ >> > >> > The thing I'm interested in is that it would be nice to use travis to >> a) run >> > automated tests, b) bulid documentation and c) use rpython to translate >> to >> > an executable, at the very least to check that the translation doesn't >> > error. >> > >> > This would be really useful for others I'm sure, and I'm happy to >> document >> > this work for pypy and maybe set up a basic pypy project skeleton for >> github >> > / travis users, which I can package separately or you might want to put >> into >> > the repo or whatever -- assuming no one has done that already. >> > >> > The sticking point I've got at the moment is that if I get travis to use >> > pypy and build the interpreter it cannot find the rpython executable. >> > Presumably if it could it probably wouldn't be able to find rlib. The >> > "right" fix for that problem is to install both rpython and its >> libraries in >> > a virtualenv with pip. If I run "pip install" from the command line pip >> > cannot find rpython, so does anyone have a recipe for this already? >> > >> > Thanks, >> > >> > Sarah >> >> Hi Sarah >> >> I believe Alex did something like that. Find him as Alex_Gaynor on IRC >> or maybe he'll show up here ;-) >> >> Cheers, >> fijal >> > > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arigo at tunes.org Tue Sep 10 09:49:30 2013 From: arigo at tunes.org (Armin Rigo) Date: Tue, 10 Sep 2013 09:49:30 +0200 Subject: [pypy-dev] Cannot use MacPorts PyPy to Translate pypy-c In-Reply-To: References: Message-ID: Hi Jean-Fran?ois, On Mon, Sep 9, 2013 at 6:08 PM, Jean-Fran?ois Caron wrote: > Hi, I am following the instructions here: > http://doc.pypy.org/en/latest/cppyy.html#installation and here: > http://root.cern.ch/phpBB3/viewtopic.php?f=14&t=17018 > > In the first link, it says I can use a "built-in pypy" to do the translation to make it go faster. I have pypy 2.1.0 from MacPorts, but it fails with the following message: This is a mis-installed PyPy. To fix it, run PyPy as root and type: import syslog You may have to also import a few other modules as needed. ("syslog" appears in the traceback above.) Note also that cppyy is now included in PyPy by default (on non-Windows platforms), so you don't need to retranslate if that's the only reason. A bient?t, Armin. From anto.cuni at gmail.com Tue Sep 10 10:48:22 2013 From: anto.cuni at gmail.com (Antonio Cuni) Date: Tue, 10 Sep 2013 10:48:22 +0200 Subject: [pypy-dev] ndarray cpyext api on the pypy-pyarray branch In-Reply-To: <522E2389.8000705@gmail.com> References: <522E2389.8000705@gmail.com> Message-ID: <522EDCD6.1070505@gmail.com> Hi, On 09/09/13 21:37, Matti Picus wrote: > I reverted the changes I made to the pypy-pyarray branch that changed c-api > functions like PyArray_NDIM(arr). > > The original code had no real answer to what happens if these are called when > arr is not an ndarray. > [cut] > The discussion we had on IRC starts here > http://www.tismer.com/pypy/irc-logs/pypy/pypy.2013-09-08.log.html#t22:15 > so if my explanation is unclear please read the log. 
the discussion convinced me as well, so I think it's fine to leave things as they are now :) ciao, Anto From wlavrijsen at lbl.gov Tue Sep 10 17:02:50 2013 From: wlavrijsen at lbl.gov (wlavrijsen at lbl.gov) Date: Tue, 10 Sep 2013 08:02:50 -0700 (PDT) Subject: [pypy-dev] Cannot use MacPorts PyPy to Translate pypy-c In-Reply-To: References: Message-ID: Hi Armin, > This is a mis-installed PyPy. To fix it, run PyPy as root and type: > > import syslog > > You may have to also import a few other modules as needed. ("syslog" > appears in the traceback above.) thanks for the recipe! > Note also that cppyy is now included in PyPy by default (on > non-Windows platforms), so you don't need to retranslate if that's the > only reason. Is for the CINT backend. There are a couple of optimizations in RPython for that backend, so those need to be translated, and the latest pieces are on the reflex-support branch, not in trunk at the moment. On the CPython side, we're closing in (finally, yay! :) ) on having an LLVM (Cling, that is: http://root.cern.ch/drupal/content/cling) backend. After that, I can consolidate; dependencies and re-packaging is going to take a bit of time. Way nicer, though. Not only C++11, but also since Cling is dynamic, it is a much better fit to Python. Think cross inheritance, calling Python from C++, automatic template instantiations, the cffi interface for C++ as well, etc. Best regards, Wim -- WLavrijsen at lbl.gov -- +1 (510) 486 6411 -- www.lavrijsen.net From jfcaron at phas.ubc.ca Tue Sep 10 17:43:49 2013 From: jfcaron at phas.ubc.ca (=?iso-8859-1?Q?Jean-Fran=E7ois_Caron?=) Date: Tue, 10 Sep 2013 08:43:49 -0700 Subject: [pypy-dev] Cannot use MacPorts PyPy to Translate pypy-c In-Reply-To: References: Message-ID: <79065031-11B7-4A23-9D8E-E487B1564F16@phas.ubc.ca> Yes, I was re-translating in order to test the CINT backend for PyPyROOT. After importing syslog as root, I can now import syslog as non-root. I can send an email or bug report to MacPorts, but I'm not sure how to describe the bug, as I'm unfamiliar with the syslog module. Jean-Fran?ois On 2013-09-10, at 08:02 , wlavrijsen at lbl.gov wrote: > Hi Armin, > >> This is a mis-installed PyPy. To fix it, run PyPy as root and type: >> >> import syslog >> >> You may have to also import a few other modules as needed. ("syslog" >> appears in the traceback above.) > > thanks for the recipe! > >> Note also that cppyy is now included in PyPy by default (on >> non-Windows platforms), so you don't need to retranslate if that's the >> only reason. > > Is for the CINT backend. There are a couple of optimizations in RPython for > that backend, so those need to be translated, and the latest pieces are on > the reflex-support branch, not in trunk at the moment. > > On the CPython side, we're closing in (finally, yay! :) ) on having an LLVM > (Cling, that is: http://root.cern.ch/drupal/content/cling) backend. After > that, I can consolidate; dependencies and re-packaging is going to take a > bit of time. Way nicer, though. Not only C++11, but also since Cling is > dynamic, it is a much better fit to Python. Think cross inheritance, calling > Python from C++, automatic template instantiations, the cffi interface for > C++ as well, etc. 
> > Best regards, > Wim > -- > WLavrijsen at lbl.gov -- +1 (510) 486 6411 -- www.lavrijsen.net
From dje.gcc at gmail.com Tue Sep 10 19:53:29 2013 From: dje.gcc at gmail.com (David Edelsohn) Date: Tue, 10 Sep 2013 13:53:29 -0400 Subject: [pypy-dev] LLVM next steps In-Reply-To: References: Message-ID: On Sun, Sep 8, 2013 at 11:17 AM, Armin Rigo wrote: > Hi again, > > On Sun, Sep 8, 2013 at 9:42 AM, Armin Rigo wrote: >> We've been suitably impressed by the results on the new llvm backend >> during the sprint (well, or suitably un-impressed by both gcc and >> clang's failure to reconstruct the SSA meaning of the C code). > > I have investigated a bit more and it's quite unclear that this would > be the source of the difference. It seems that the "-flto" option of > gcc, enabling link-time optimization, actually gives very good > improvements over the same compilation without this option --- some > 11-14%, more so than, say, the typical 5% reported with CPython. If I > had to guess, I'd say it is because of the particularly disorganized > kind of C code produced by RPython. > > About the llvm backend, one detail hints that it might be the reason > for the speed improvement: the fact that the current llvm backend > produces most of the source code in a single file. This may be what > gives llvm extra room for improvements. This is precisely the same > room for improvement that "-flto" also gives gcc, considering that we > generate many C files with never-"static" functions. > > I tried to compile a no-jit version of PyPy from the > llvm-translation-backend branch, for comparison, but this fails right > now with "NotImplementedError: v585190 = debug_offset()". It > successfully compiles targetrpystonedalone (in -O2 mode), though. I > get the following results (with the argument "100000000"): > > plain gcc 4.7.3: 1.95 seconds > llvm 3.3: 1.75 seconds > gcc with -flto: 1.66 seconds > > If we get similar results on the whole PyPy, then I fear the llvm > backend is going back to where it already went to several times: "not > useful enough". We can simply add the -flto flag to the generated > Makefiles. Manuel, do you feel like trying to compare? I'm modifying > the Makefile manually as follows: > > CFLAGS = ...... -flto -fno-fat-lto-objects > LDFLAGS = ..... -flto=8 -O3 The type of machine-generated code produced by PyPy is difficult for compilers to optimize (lots of seemingly unstructured gotos, state machines, unusual basic block heuristics) when presented in a high-level language like C. The distribution of the source code across a large number of source files also complicates the optimization process. GCC and LLVM link-time optimization can overcome some of these problems by allowing the compiler to "see" more of the program and optimize across the source files. Directly generating LLVM IR accomplishes a similar benefit. With some of the recent changes to GCC, one could also directly generate GCC IR. LLVM makes it very convenient to directly input the IR and take advantage of optimization opportunities allowed by such an input method, but the performance benefit is not likely due to other differences in optimization pipelines and code generation capabilities. In addition to the GCC -flto option, you should consider whether -fwhole-program is also appropriate (I believe that it is). GCC has additional optimizations that can help with the style of code generated by programs like PyPy.
PyPy does not generate code with computed gotos, but the aggressive use of gotos are different than normal user-written code and probably can benefit from non-default compiler optimization heuristics. There is no obvious recommendation, but experiments with enabling / disabling some forms of GCSE (-fgcse, -fgcse-lm, -fgcse-sm, -fgcse-las, -fgcse-after-reload) as well as some of the parameters (crossjumping, goto-duplication, inlining limits) might benefit PyPy. One can achieve performance gains with either compiler through adjustments to the generated code and the compiler optimization heuristics. Thanks, David From arigo at tunes.org Wed Sep 11 09:10:16 2013 From: arigo at tunes.org (Armin Rigo) Date: Wed, 11 Sep 2013 09:10:16 +0200 Subject: [pypy-dev] Cannot use MacPorts PyPy to Translate pypy-c In-Reply-To: <79065031-11B7-4A23-9D8E-E487B1564F16@phas.ubc.ca> References: <79065031-11B7-4A23-9D8E-E487B1564F16@phas.ubc.ca> Message-ID: Hi Jean-Fran?ois, On Tue, Sep 10, 2013 at 5:43 PM, Jean-Fran?ois Caron wrote: > I can send an email or bug report to MacPorts, but I'm not sure how to describe the bug, as I'm unfamiliar with the syslog module. Just reporting the error is plenty enough. The issue is that it's a CFFI module which gets built (by calling the C compiler) the first time it is imported. The library built is saved into some directory where (after installation) a normal user cannot write. It's supposed to be imported once as part of the installation process, to pregenerate it. It's what our own installation tool does (pypy/tool/release/package.py). A bient?t, Armin. From arigo at tunes.org Wed Sep 11 09:34:09 2013 From: arigo at tunes.org (Armin Rigo) Date: Wed, 11 Sep 2013 09:34:09 +0200 Subject: [pypy-dev] LLVM next steps In-Reply-To: <6F9C6634-EC5B-4C5B-97A8-9BC6128DCC90@gmail.com> References: <6F9C6634-EC5B-4C5B-97A8-9BC6128DCC90@gmail.com> Message-ID: Hi Eric, On Sun, Sep 8, 2013 at 7:00 PM, Eric van Riet Paap wrote: > What is the PyPy speed difference after using gcc versus llvm for the compilation of the PyPy-c backend? Currently, it seems that using the LLVM IR static translation backend of PyPy gives higher performance. We're still trying to figure out why. I'm quite unsure that it's solely because LLVM is better [citation needed]. In particular it's strange because, at the same time, generating .c files and compiling with clang is worse than compiling with GCC. That's why I currently think the performance difference can be fully attributed to details in the two backends. If anything, it seems that someone motivated could extract some critical information from comparing the optimized llvm code produced by clang and by Manuel's .ll backend. Depending on what he finds, he can then fix our C backend to reduce the difference --- and then the C files, compiled by GCC, might be correspondingly faster as well. Alternatively, we should also try to play with the GCC options pointed to by David. On a higher-level note, LLVM still has nothing concrete enough to give us for the topics of (1) root stack scanning and (2) tracing JIT. These two areas might get traction if someone is really motivated to go into LLVM development land. So far there has been no progress that I know of, since several years. This e-mail was written with my long-term experience of 4 or 5 failed attempts at using LLVM :-) If anyone is offended by the negativity of it, feel free to prove me wrong with some backing (I know that LLVM has progressed a lot). 
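[A sketch for readers who want to try the GCC options discussed earlier in this thread -- Armin's -flto Makefile change plus David's -fwhole-program and GCSE suggestions. It assumes a finished translation whose generated C sources and Makefile are still sitting in the usession directory under /tmp; the exact path varies per run and user, so treat the path and the sed patterns as assumptions rather than a tested recipe:

    cd /tmp/usession-*/testing_1      # assumed location of the generated sources and Makefile
    # append the extra flags to the existing CFLAGS/LDFLAGS lines, as Armin describes doing by hand
    sed -i -e 's/^CFLAGS *=.*/& -flto -fno-fat-lto-objects -fwhole-program/' \
           -e 's/^LDFLAGS *=.*/& -O3 -flto=8 -fwhole-program/' Makefile
    make clean && make

The GCSE-related options David lists (-fgcse, -fgcse-lm, -fgcse-sm, -fgcse-las, -fgcse-after-reload) can be appended to CFLAGS in the same way, one experiment at a time, to see which of them help the RPython-generated C code.]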
Manuel came up with an unexpected performance difference between clang and direct generation of equivalent LLVM IR. That's concrete enough, but until someone can really explain it, I fear that we won't really progress. A bient?t, Armin. From bokr at oz.net Wed Sep 11 13:42:56 2013 From: bokr at oz.net (Bengt Richter) Date: Wed, 11 Sep 2013 13:42:56 +0200 Subject: [pypy-dev] Cannot use MacPorts PyPy to Translate pypy-c In-Reply-To: References: <79065031-11B7-4A23-9D8E-E487B1564F16@phas.ubc.ca> Message-ID: <52305740.7000707@oz.net> Hi Armin, On 09/11/2013 09:10 AM Armin Rigo wrote: [...] > Just reporting the error is plenty enough. The issue is that it's a > CFFI module which gets built (by calling the C compiler) the first > time it is imported. The library built is saved into some directory > where (after installation) a normal user cannot write. It's supposed > to be imported once as part of the installation process, to > pregenerate it. It's what our own installation tool does > (pypy/tool/release/package.py). (Please excuse ignorance deriving from not having tried to build and install pypy recently ;) Is it not currently possible to build and install everything pypy in user mode, in a user directory, e.g. ~/.pypy/versionstring/... and chmod selectively to protect against accidental mods/deletes? Will it not run properly via symlink from a directory in $PATH, or maybe via an alias defined in ~/.bash_profile? Regards, Bengt Richter From arigo at tunes.org Wed Sep 11 16:00:21 2013 From: arigo at tunes.org (Armin Rigo) Date: Wed, 11 Sep 2013 16:00:21 +0200 Subject: [pypy-dev] Cannot use MacPorts PyPy to Translate pypy-c In-Reply-To: <52305740.7000707@oz.net> References: <79065031-11B7-4A23-9D8E-E487B1564F16@phas.ubc.ca> <52305740.7000707@oz.net> Message-ID: Hi Bengt, On Wed, Sep 11, 2013 at 1:42 PM, Bengt Richter wrote: > Is it not currently possible to build and install everything pypy > in user mode, in a user directory, e.g. ~/.pypy/versionstring/... > and chmod selectively to protect against accidental mods/deletes? > > Will it not run properly via symlink from a directory in $PATH, or > maybe via an alias defined in ~/.bash_profile? Sure, all this should work fine. The original question I'm answering is about a system-(mis)installed pypy, which of course doesn't have permissions to write extra files in its own libs when run as a user. A bient?t, Armin. From kostia.lopuhin at gmail.com Sun Sep 15 07:15:30 2013 From: kostia.lopuhin at gmail.com (=?KOI8-R?B?68/T1NEg7M/Q1cjJzg==?=) Date: Sun, 15 Sep 2013 09:15:30 +0400 Subject: [pypy-dev] pypy-stm - how do I know when transactions are turning inevitable? Message-ID: Hello! As far as I understand, pypy-stm reports only conflicts, but not transactions that turn inevitable. But if transaction turns inevitable, it means that other transactions can not proceed. So it is hard to debug loss of parallelization when some transactions turn inevitable but other transactions do not conflict - the program is just silently slow. Is this correct? Are there any plans on giving some hooks for this case, or stacktraces similar to aborting transactions? Or maybe there is some other method? From arigo at tunes.org Sun Sep 15 11:23:28 2013 From: arigo at tunes.org (Armin Rigo) Date: Sun, 15 Sep 2013 11:23:28 +0200 Subject: [pypy-dev] pypy-stm - how do I know when transactions are turning inevitable? In-Reply-To: References: Message-ID: Hi, On Sun, Sep 15, 2013 at 7:15 AM, ????? ??????? 
wrote: > As far as I understand, pypy-stm reports only conflicts, but not > transactions that turn inevitable. But if transaction turns > inevitable, it means that other transactions can not proceed. So it is > hard to debug loss of parallelization when some transactions turn > inevitable but other transactions do not conflict - the program is > just silently slow. Yes, that's correct. In fact even the aborts are not reported any more on a "pypy-stm-jit" because recording the location is more difficult with the JIT than without. ("pypy-stm-jit" also causes extra aborts and inevitable transactions right now; it should not.) I realized recently that seeing inevitable transactions would be as useful as seeing aborts. So right now, if you translate the "stmgc-c4" branch at 131f1071ef10 (with or without the jit), then you get one-line reports to stderr for both aborts and inevitable transactions. It's already better than nothing but it contains only very minimal information. (The "[N]" prefix is the thread number.) We definitely need to figure out how to record more details. Once this is done, the second issue is how to report it to the user. For example we need to report for how much time a transaction ran as inevitable, a traceback pointing to the Python line, and so on. We also need (likely) a way to store this data into a file, and query the file for the most costly entries in term of time. Obviously, implementation-wise, this depends on the previous part; but as this part is more accessible, I'd expect to get more feedback and contributions :-) A bient?t, Armin. From cfbolz at gmx.de Tue Sep 17 16:57:12 2013 From: cfbolz at gmx.de (Carl Friedrich Bolz) Date: Tue, 17 Sep 2013 16:57:12 +0200 Subject: [pypy-dev] What a VM does Message-ID: <52386DC8.3040501@gmx.de> http://www.youtube.com/watch?v=G30vNCmMP1k Just remembered that that's the video that Dave Ungar (the main person behind the Self project) pointed me to when I met him for the first time, telling me that that's pretty much exactly what a VM does: Making it appear that something simple is happening while internally a lot of complex mechanisms are at work to uphold the illusion. Cheers, Carl Friedrich From wkyjyy at gmail.com Sat Sep 21 02:38:08 2013 From: wkyjyy at gmail.com (Weikun Yang) Date: Sat, 21 Sep 2013 08:38:08 +0800 Subject: [pypy-dev] lang-scheme development Message-ID: To developers of pypy/lang-scheme, I wonder if anyone still maintains lang-scheme on bitbucket? I'm a university student facing a project for my compilers course, and would like to use lang-scheme as a starting point. Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From arigo at tunes.org Sat Sep 21 10:33:43 2013 From: arigo at tunes.org (Armin Rigo) Date: Sat, 21 Sep 2013 10:33:43 +0200 Subject: [pypy-dev] lang-scheme development In-Reply-To: References: Message-ID: Hi Weikun, On Sat, Sep 21, 2013 at 2:38 AM, Weikun Yang wrote: > To developers of pypy/lang-scheme, > I wonder if anyone still maintains lang-scheme on bitbucket? > I'm a university student facing a project for my compilers course, > and would like to use lang-scheme as a starting point. No-one is maintaining or developing lang-scheme. You are most welcome to use it at a starting point! It is nicely suited for a university project. Note that it is old code. I updated it just now to fix the imports, and fixed an issue in rpython.rlib.parsing. Make sure you update to the lastest version of both pypy/lang-scheme and pypy/pypy. 
You can make a fork of the pypy/lang-scheme repository on bitbucket, and send us your progress with pull requests. A bient?t, Armin. From matti.picus at gmail.com Sun Sep 22 17:26:54 2013 From: matti.picus at gmail.com (Matti Picus) Date: Sun, 22 Sep 2013 18:26:54 +0300 Subject: [pypy-dev] merging pypy-pyarray branch Message-ID: <523F0C3E.7060102@gmail.com> I am getting close to being ready to merge the pypy-pyarray branch. I related to all the review items from https://bitbucket.org/pypy/pypy/src/1a379a6f127bd6df8a4b628575c7124e83007e60/TODO.txt Does anyone have objections to merging pypy-pyarray into default (assuming the buildbots come up with no failures)? Executive summary - I took over Stefan__'s work to implement the c api interface from numpy. This together with a fork of numpy and a fork of matplotlib got him to a point where he could use non-interpretive plotting (png, pdf) in matplotlib. Matti From fijall at gmail.com Mon Sep 23 11:16:11 2013 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 23 Sep 2013 11:16:11 +0200 Subject: [pypy-dev] merging pypy-pyarray branch In-Reply-To: <523F0C3E.7060102@gmail.com> References: <523F0C3E.7060102@gmail.com> Message-ID: On Sun, Sep 22, 2013 at 5:26 PM, Matti Picus wrote: > I am getting close to being ready to merge the pypy-pyarray branch. I > related to all the review items from > https://bitbucket.org/pypy/pypy/src/1a379a6f127bd6df8a4b628575c7124e83007e60/TODO.txt > > Does anyone have objections to merging pypy-pyarray into default (assuming > the buildbots come up with no failures)? > > Executive summary - I took over Stefan__'s work to implement the c api > interface from numpy. This together with a fork of numpy and a fork of > matplotlib got him to a point where he could use non-interpretive plotting > (png, pdf) in matplotlib. > Matti > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev I would say "go ahead" From romain.py at gmail.com Mon Sep 23 15:01:29 2013 From: romain.py at gmail.com (Romain Guillebert) Date: Mon, 23 Sep 2013 15:01:29 +0200 Subject: [pypy-dev] merging pypy-pyarray branch In-Reply-To: References: <523F0C3E.7060102@gmail.com> Message-ID: Hi Matti +1 for me. On Mon, Sep 23, 2013 at 11:16 AM, Maciej Fijalkowski wrote: > On Sun, Sep 22, 2013 at 5:26 PM, Matti Picus wrote: >> I am getting close to being ready to merge the pypy-pyarray branch. I >> related to all the review items from >> https://bitbucket.org/pypy/pypy/src/1a379a6f127bd6df8a4b628575c7124e83007e60/TODO.txt >> >> Does anyone have objections to merging pypy-pyarray into default (assuming >> the buildbots come up with no failures)? >> >> Executive summary - I took over Stefan__'s work to implement the c api >> interface from numpy. This together with a fork of numpy and a fork of >> matplotlib got him to a point where he could use non-interpretive plotting >> (png, pdf) in matplotlib. 
>> Matti >> _______________________________________________ >> pypy-dev mailing list >> pypy-dev at python.org >> https://mail.python.org/mailman/listinfo/pypy-dev > > I would say "go ahead" > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev From matti.picus at gmail.com Mon Sep 23 15:05:22 2013 From: matti.picus at gmail.com (Matti Picus) Date: Mon, 23 Sep 2013 16:05:22 +0300 Subject: [pypy-dev] merging pypy-pyarray branch In-Reply-To: References: <523F0C3E.7060102@gmail.com> Message-ID: <52403C92.7050102@gmail.com> On 09/23/2013 04:01 PM, Romain Guillebert wrote: > Hi Matti > > +1 for me. > > On Mon, Sep 23, 2013 at 11:16 AM, Maciej Fijalkowski wrote: >> On Sun, Sep 22, 2013 at 5:26 PM, Matti Picus wrote: >>> I am getting close to being ready to merge the pypy-pyarray branch. I >>> related to all the review items from >>> https://bitbucket.org/pypy/pypy/src/1a379a6f127bd6df8a4b628575c7124e83007e60/TODO.txt >>> >>> Does anyone have objections to merging pypy-pyarray into default (assuming >>> the buildbots come up with no failures)? >>> >>> Executive summary - I took over Stefan__'s work to implement the c api >>> interface from numpy. This together with a fork of numpy and a fork of >>> matplotlib got him to a point where he could use non-interpretive plotting >>> (png, pdf) in matplotlib. >>> Matti >>> _______________________________________________ >>> pypy-dev mailing list >>> pypy-dev at python.org >>> https://mail.python.org/mailman/listinfo/pypy-dev >> I would say "go ahead" >> _______________________________________________ >> pypy-dev mailing list >> pypy-dev at python.org >> https://mail.python.org/mailman/listinfo/pypy-dev Done. I ran a test build, which came up clean Yay! Thanks, Stefan From amauryfa at gmail.com Mon Sep 23 15:38:36 2013 From: amauryfa at gmail.com (Amaury Forgeot d'Arc) Date: Mon, 23 Sep 2013 15:38:36 +0200 Subject: [pypy-dev] merging pypy-pyarray branch In-Reply-To: <523F0C3E.7060102@gmail.com> References: <523F0C3E.7060102@gmail.com> Message-ID: 2013/9/22 Matti Picus > Does anyone have objections to merging pypy-pyarray into default (assuming > the buildbots come up with no failures)? I think I have some remarks about some details in the change, I will have a closer look later. Nothing that prevents the merge though. +1 -- Amaury Forgeot d'Arc -------------- next part -------------- An HTML attachment was scrubbed... URL: From jgustak at gmail.com Wed Sep 25 15:58:48 2013 From: jgustak at gmail.com (Jakub Gustak) Date: Wed, 25 Sep 2013 14:58:48 +0100 Subject: [pypy-dev] lang-scheme development In-Reply-To: References: Message-ID: Hi Weikun, I am the original author of this interpreter and I am the one to blame for abandoning it. If you have any particular questions or need some help please do not hesitate to contact me. Regards, Jakub On Sat, Sep 21, 2013 at 9:33 AM, Armin Rigo wrote: > Hi Weikun, > > On Sat, Sep 21, 2013 at 2:38 AM, Weikun Yang wrote: >> To developers of pypy/lang-scheme, >> I wonder if anyone still maintains lang-scheme on bitbucket? >> I'm a university student facing a project for my compilers course, >> and would like to use lang-scheme as a starting point. > > No-one is maintaining or developing lang-scheme. You are most welcome > to use it at a starting point! It is nicely suited for a university > project. > > Note that it is old code. I updated it just now to fix the imports, > and fixed an issue in rpython.rlib.parsing. 
Make sure you update to > the latest version of both pypy/lang-scheme and pypy/pypy. > > You can make a fork of the pypy/lang-scheme repository on bitbucket, > and send us your progress with pull requests. > > > A bientôt, > > Armin. > _______________________________________________ > pypy-dev mailing list > pypy-dev at python.org > https://mail.python.org/mailman/listinfo/pypy-dev
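[For readers who, like Weikun, want to start from lang-scheme: one possible way to set things up, following Armin's advice to use up-to-date checkouts of both pypy/lang-scheme and pypy/pypy. The clone URLs, the PYTHONPATH arrangement and the target file name are assumptions about the repository layout, not something stated in this thread, so check the checkout for the actual entry points:

    hg clone https://bitbucket.org/pypy/pypy            # provides the rpython/ toolchain
    hg clone https://bitbucket.org/pypy/lang-scheme     # the Scheme interpreter itself
    cd lang-scheme
    export PYTHONPATH=$PWD:$PWD/../pypy
    # translate the interpreter to a standalone binary (the target file name is assumed)
    python ../pypy/rpython/translator/goal/translate.py targetscheme.py

This is the same translate.py entry point used for pypy-c earlier in this archive; running it under pypy instead of python just makes the translation faster.]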